How to Build an ELK Log Center for Dockerized Apps: Step‑by‑Step Guide
Learn how to containerize ELK components, configure Rsyslog, deploy Elasticsearch, Logstash, Kibana, and an Nginx logging container, then visualize and query application logs—all with Docker commands and configuration snippets for a complete log‑collection pipeline.
Overview
After an application is containerized, the next step is to collect the logs it prints inside its Docker container, such as the output of a Spring Boot application, for operations analysis.
This article explains how to use the ELK stack to collect logs generated by containerized applications and visualize them for querying and analysis.
Image Preparation
Elasticsearch image
Logstash image
Kibana image
Nginx image (used as a containerized application that produces logs)
Enable Linux Rsyslog Service
Edit the Rsyslog configuration file:
vim /etc/rsyslog.conf
Enable the following three parameters:
<code>$ModLoad imtcp
$InputTCPServerRun 514
*.* @@localhost:4560</code>
The intent is simple: load the imtcp module to accept TCP input, listen on port 514, and forward all collected messages to local port 4560 (the double @@ denotes TCP forwarding).
Restart the Rsyslog service:
systemctl restart rsyslog
Check the Rsyslog status:
netstat -tnl
Deploy Elasticsearch Service
<code>docker run -d -p 9200:9200 \
-v ~/elasticsearch/data:/usr/share/elasticsearch/data \
--name elasticsearch elasticsearch</code>
Deploy Logstash Service
Add a configuration file ~/logstash/logstash.conf with the following content:
<code>input {
syslog {
type => "rsyslog"
port => 4560
}
}
output {
elasticsearch {
hosts => [ "elasticsearch:9200" ]
}
}</code>
This configuration makes Logstash listen on port 4560 for the log messages forwarded by the local Rsyslog service and send them on to Elasticsearch.
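Before starting the container, the pipeline syntax can be checked with Logstash's config-test flag (a sketch mirroring the run command used below; -t is Logstash's test-and-exit option):

```shell
# Validate the pipeline configuration without starting the pipeline (-t = test and exit).
docker run --rm \
  -v ~/logstash/logstash.conf:/etc/logstash.conf \
  logstash \
  logstash -t -f /etc/logstash.conf
```

If the configuration parses cleanly, Logstash reports success and exits; otherwise it prints the offending line.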
Start the Logstash container:
<code>docker run -d -p 4560:4560 \
-v ~/logstash/logstash.conf:/etc/logstash.conf \
--link elasticsearch:elasticsearch \
--name logstash logstash \
logstash -f /etc/logstash.conf</code>
Deploy Kibana Service
<code>docker run -d -p 5601:5601 \
--link elasticsearch:elasticsearch \
-e ELASTICSEARCH_URL=http://elasticsearch:9200 \
--name kibana kibana</code>
Run Nginx Container to Produce Logs
<code>docker run -d -p 90:80 \
--log-driver syslog \
--log-opt syslog-address=tcp://localhost:514 \
--log-opt tag="nginx" \
--name nginx nginx</code>
The Nginx container forwards its logs to the local syslog service, which then passes them to Logstash for collection.
At this point, four containers are running: Elasticsearch, Logstash, Kibana, and Nginx.
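A quick way to confirm the whole stack is up (a sketch using standard Docker and curl commands):

```shell
# List the four containers and their status.
docker ps --format '{{.Names}}: {{.Status}}' \
  | grep -E 'elasticsearch|logstash|kibana|nginx'

# Elasticsearch should answer on its HTTP port.
curl -s http://localhost:9200/
```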
Experiment Verification
Open http://localhost:90 in a browser and refresh the page several times to generate GET request logs.
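To confirm the requests actually reached Elasticsearch, the indices can be listed directly (logstash-* is the default index name used by the Logstash Elasticsearch output):

```shell
# Generate a request against the containerized Nginx.
curl -s http://localhost:90/ > /dev/null

# List indices; a logstash-* index with a growing document count should appear.
curl -s 'http://localhost:9200/_cat/indices?v'
```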
Open the Kibana visual interface at http://localhost:5601.
Collect Nginx application logs.
Query application logs.
Enter program=nginx in the Kibana query box to retrieve the specific logs.
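The same filter can also be issued directly against Elasticsearch (a sketch; program is the field the Logstash syslog input extracts from the tag set on the Docker log driver, which is why it matches "nginx" here):

```shell
# Query the Logstash indices for documents whose program field is "nginx".
curl -s 'http://localhost:9200/logstash-*/_search?q=program:nginx&pretty'
```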
Efficient Ops
This public account is maintained by Xiaotianguo and friends, regularly publishing widely read original technical articles. We focus on operations transformation and accompany you throughout your operations career as we grow together.