
How to Build an ELK Log Center for Dockerized Applications

This guide walks you through setting up an ELK stack to collect, forward, and visualize logs from Docker containers, covering Rsyslog configuration, Elasticsearch, Logstash, Kibana, and an Nginx example for end-to-end log monitoring.


Overview

After containerizing an application, you still need to collect the logs it prints for operations analysis. A typical case is collecting logs from a Spring Boot application.

This article shows how to use an ELK log center to collect logs generated by containerized applications, then query and analyze them visually. The pipeline looks like this: container log driver → host Rsyslog → Logstash → Elasticsearch → Kibana.

Preparing Images

- Elasticsearch image
- Logstash image
- Kibana image
- Nginx image (as a containerized application producing logs)

Enable Linux Rsyslog Service

Edit the Rsyslog configuration file:

vim /etc/rsyslog.conf

Enable (uncomment) the following three directives:

$ModLoad imtcp
$InputTCPServerRun 514

*.* @@localhost:4560

These directives load the imtcp module, listen for TCP syslog traffic on port 514, and forward all facilities and priorities (*.*) over TCP (the @@ prefix; a single @ would mean UDP) to local port 4560, where Logstash will listen.
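As a sketch of what travels over these ports: Rsyslog forwards classic RFC 3164 lines, where a <PRI> prefix encodes facility and severity. The hostname, tag, and message below are made-up values for illustration:

```shell
# Illustrative only: build an RFC 3164-style syslog line like those
# rsyslog forwards to port 4560. PRI = facility * 8 + severity;
# local0 (16) with severity info (6) gives 134.
PRI=$((16 * 8 + 6))
LINE="<${PRI}>$(date '+%b %e %H:%M:%S') dockerhost nginx: 172.17.0.1 - - \"GET / HTTP/1.1\" 200"
echo "$LINE"
```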

Restart the Rsyslog service:

systemctl restart rsyslog

Confirm that Rsyslog is now listening on port 514:

netstat -tnl
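To exercise the forwarding chain before deploying anything else, you can hand Rsyslog a test message with the standard logger utility (the tag here is made up):

```shell
# Send a test message through the local syslog socket; rsyslog's *.* rule
# will forward it to port 4560 once Logstash is listening there.
logger -t elk-test "hello from rsyslog"
```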

Deploy Elasticsearch Service

docker run -d -p 9200:9200 \
 -v ~/elasticsearch/data:/usr/share/elasticsearch/data \
 --name elasticsearch elasticsearch
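Once the container is up (it can take a minute to start), a quick way to check that Elasticsearch is answering is its root and cluster-health endpoints:

```shell
# Should return a JSON banner with the node name and version
curl -s http://localhost:9200

# Cluster health: a status of "green" or "yellow" means the node is usable
curl -s 'http://localhost:9200/_cluster/health?pretty'
```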

Deploy Logstash Service

Create the configuration file ~/logstash/logstash.conf:

input {
  syslog {
    type => "rsyslog"
    port => 4560
  }
}

output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
  }
}

This configuration makes Logstash listen on port 4560 for messages forwarded by the local Rsyslog service and ship them to Elasticsearch.
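If you later want to separate Nginx events from other syslog traffic, here is a hedged sketch of an optional filter block you could add to the same file (field names follow the syslog input's defaults; adjust to taste):

```
filter {
  if [program] == "nginx" {
    mutate { add_tag => ["nginx-access"] }
  }
}
```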

Start the Logstash container:

docker run -d -p 4560:4560 \
 -v ~/logstash/logstash.conf:/etc/logstash.conf \
 --link elasticsearch:elasticsearch \
 --name logstash logstash \
 logstash -f /etc/logstash.conf

Deploy Kibana Service

docker run -d -p 5601:5601 \
 --link elasticsearch:elasticsearch \
 -e ELASTICSEARCH_URL=http://elasticsearch:9200 \
 --name kibana kibana
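Kibana also takes a little while to start; its status endpoint reports when it has connected to Elasticsearch:

```shell
# Returns JSON describing Kibana's overall state once it is ready
curl -s http://localhost:5601/api/status
```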

Start Nginx Container to Produce Logs

docker run -d -p 90:80 --log-driver syslog --log-opt \
 syslog-address=tcp://localhost:514 \
 --log-opt tag="nginx" --name nginx nginx

The Nginx container's syslog log driver sends its logs to the host's Rsyslog service on port 514, which then forwards them to Logstash for collection.

Verification

Open a browser to localhost:90 to reach the Nginx welcome page; each visit generates GET request log entries.

Open the Kibana interface at localhost:5601 and configure the default logstash-* index pattern so the collected Nginx logs become searchable.

Query the logs in Kibana with program: nginx — the syslog input records the Docker --log-opt tag value in the program field.
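Instead of clicking around in the browser, you can generate a burst of access-log entries from the shell (the request count is arbitrary):

```shell
# Fire a few GET requests at the Nginx container to produce log lines
for i in $(seq 1 5); do
  curl -s -o /dev/null http://localhost:90/
done
```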

Written by Efficient Ops

This public account is maintained by Xiaotianguo and friends and regularly publishes widely read original technical articles. We focus on operations transformation and accompany you throughout your operations career, growing together along the way.
