Building a Real‑Time ELK Log Analysis Platform and Integrating It with Spring Boot and Nginx
This tutorial explains why centralized log collection is essential for micro‑service systems, introduces the ELK stack (Elasticsearch, Logstash, Kibana), provides step‑by‑step installation on Ubuntu, shows how to configure Logstash shipper and indexer pipelines, and demonstrates integration with Spring Boot and Nginx logs for real‑time monitoring.
When troubleshooting micro‑service applications, logs are scattered across many machines, making analysis difficult; a unified real‑time log platform greatly improves efficiency.
ELK Overview
ELK is an open‑source log analysis suite composed of Elasticsearch, Logstash, and Kibana.
Logstash
Logstash collects logs and processes them through an input‑filter‑output pipeline.
Input: supports files, syslog, MySQL, message queues, etc.
Filter: parses and transforms data in real time.
Output: can send data to Elasticsearch or other destinations.
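Putting the three stages together, a minimal pipeline configuration might look like the following (the file path and field names here are placeholders for illustration, not taken from the setup below):

```conf
input {
  file { path => ["/var/log/app/app.log"] }
}
filter {
  grok { match => { "message" => "%{TIMESTAMP_ISO8601:time} %{LOGLEVEL:level} %{GREEDYDATA:msg}" } }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
```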
Elasticsearch
A distributed RESTful search and analytics engine offering structured and unstructured queries, aggregations, millisecond‑level response times, and horizontal scalability from a laptop to petabyte‑scale clusters.
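To make the "structured queries and aggregations" concrete, here is a sketch of the kind of request body the Elasticsearch query DSL uses; the index name `logback` and the fields `@timestamp` and `level` are assumptions matching the setup later in this guide:

```python
import json

# Count log events per level within the last 15 minutes,
# using a range query plus a terms aggregation.
query = {
    "size": 0,
    "query": {"range": {"@timestamp": {"gte": "now-15m"}}},
    "aggs": {"by_level": {"terms": {"field": "level.keyword"}}},
}

body = json.dumps(query)
# This body would be POSTed to an endpoint such as:
#   curl -H 'Content-Type: application/json' \
#        -XPOST 'http://localhost:9200/logback/_search' -d "$body"
print(body)
```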
Kibana
A browser‑based UI that visualizes Elasticsearch data, allowing users to create dashboards and explore log trends without writing code.
Implementation Diagram
Logstash (shipper) runs on each service host, reads log files and pushes events to a Redis channel; Logstash (indexer) reads from Redis, parses logs with Grok, and stores structured records in Elasticsearch; Kibana queries Elasticsearch to display logs.
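The decoupling this architecture provides can be illustrated in miniature: the shipper only publishes raw lines, and the indexer independently consumes, parses, and stores them. The sketch below uses an in-process queue in place of the Redis channel and a plain regular expression in place of Grok; the sample line and field names are illustrative:

```python
import queue
import re

channel = queue.Queue()  # stands in for the Redis channel

# Shipper side: read raw lines and publish them unparsed.
raw_lines = [
    "2019-08-11 18:01:31.602 [http-nio-8080-exec-2] INFO "
    "c.i.s.aop.WebLogAspect sb-elk - elapsed=11ms",
]
for line in raw_lines:
    channel.put(line)

# Indexer side: consume, parse into structured fields, "index" the result.
pattern = re.compile(
    r"(?P<time>\S+ \S+) \[(?P<threadName>\S+)\] (?P<level>\w+) "
    r"(?P<logger>\S+) (?P<applicationName>\S+) -(?:.*=(?P<timetaken>\d+)ms)?"
)
indexed = []
while not channel.empty():
    m = pattern.match(channel.get())
    if m:
        indexed.append(m.groupdict())
```

Because the two sides meet only at the channel, either one can be restarted or scaled without touching the other, which is the point of the Redis hop in the real setup.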
ELK Platform Setup
All components can be installed on a single Ubuntu machine for a quick start.
Install Logstash
tar -xzvf logstash-7.3.0.tar.gz
Test the installation:
cd logstash-7.3.0
bin/logstash -e 'input { stdin {} } output { stdout {} }'
Install Elasticsearch
tar -xzvf elasticsearch-7.3.0-linux-x86_64.tar.gz
cd elasticsearch-7.3.0
bin/elasticsearch
Verify with curl http://localhost:9200 to see the cluster information.
Install Kibana
tar -xzvf kibana-7.3.0-linux-x86_64.tar.gz
cd kibana-7.3.0-linux-x86_64
./bin/kibana
Access http://<host>:5601 to confirm the UI loads.
Integrating Spring Boot Logs
Create spring-logback.xml to output logs to /log/sb-log.log, package the application with Maven, and deploy it to the Ubuntu server.
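A minimal spring-logback.xml could look like the following; only the /log/sb-log.log path comes from this guide, while the appender and encoder pattern are an assumption chosen to produce lines in the format shown below:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>/log/sb-log.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>/log/sb-log.%d{yyyy-MM-dd}.log</fileNamePattern>
        </rollingPolicy>
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{40} sb-elk - %msg%n</pattern>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="FILE"/>
    </root>
</configuration>
```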
# Build
mvn package -Dmaven.test.skip=true
# Deploy
java -jar sb-elk-start-0.0.1-SNAPSHOT.jar
Sample Logback entry:
2019-08-11 18:01:31.602 [http-nio-8080-exec-2] INFO c.i.s.aop.WebLogAspect sb-elk - interface log: POST request to test endpoint finished: elapsed=11ms, result=BaseResponse{code=10000, message='operation successful'}
Shipper Logstash Configuration (Spring Boot)
input {
file { path => ["/log/sb-log.log"] }
}
output {
redis { host => "192.168.142.131" port => 6379 db => 8 data_type => "channel" key => "sb-logback" }
}
Note that the Redis host, db, data_type, and key must match the indexer's input below, or events will never be picked up.
Indexer Logstash Configuration (Spring Boot)
input {
redis { host => "192.168.142.131" port => 6379 db => 8 data_type => "channel" key => "sb-logback" }
}
filter {
grok { match => { "message" => "%{TIMESTAMP_ISO8601:time} \[%{NOTSPACE:threadName}\] %{LOGLEVEL:level} %{DATA:logger} %{NOTSPACE:applicationName} -(?:.*=%{NUMBER:timetaken}ms|)" } }
}
output {
elasticsearch { hosts => ["localhost:9200"] index => "logback" }
stdout {}
}
Integrating Nginx Logs
Use a similar shipper configuration to read /var/log/nginx/access.log and send to Redis, then add a Grok pattern to parse Nginx fields.
%{IPV4:ip} - - \[%{HTTPDATE:time}\] "%{NOTSPACE:method} %{DATA:requestUrl} HTTP/%{NUMBER:httpVersion}" %{NUMBER:httpStatus} %{NUMBER:bytes} "%{DATA:referer}" "%{DATA:agent}"
Combined Indexer Configuration
input {
redis { type => "logback" ... }
redis { type => "nginx" ... }
}
filter {
if [type] == "logback" { ... grok for Spring Boot ... }
if [type] == "nginx" { ... grok for Nginx ... }
}
output {
if [type] == "logback" { elasticsearch { index => "logback" } }
if [type] == "nginx" { elasticsearch { index => "nginx" } }
}
Running ELK as Daemons
Use Supervisor to manage Elasticsearch, Logstash (shipper & indexer) and Kibana, ensuring they start on boot and stay alive.
[program:elasticsearch]
command=/home/elk/elasticsearch/bin/elasticsearch
[program:logstash]
command=/home/elk/logstash/bin/logstash -f /home/elk/logstash/indexer-logstash.conf
[program:kibana]
command=/home/elk/kibana/bin/kibana
Reload Supervisor with sudo supervisorctl reload to apply the configuration.
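The text mentions managing both Logstash pipelines, but the snippet above starts only the indexer. A second program entry for the shipper could look like this; the shipper-logstash.conf filename is an assumption, and the separate --path.data is needed because two Logstash instances sharing one installation cannot share the default data directory:

```ini
[program:logstash-shipper]
command=/home/elk/logstash/bin/logstash -f /home/elk/logstash/shipper-logstash.conf --path.data /home/elk/logstash/shipper-data
autorestart=true
```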
Conclusion
The guide walks through installing the ELK stack, configuring Logstash shipper and indexer pipelines for Spring Boot and Nginx logs, visualizing them in Kibana, and managing the services with Supervisor, providing a complete real‑time log analysis solution.
Top Architect
Top Architect focuses on sharing practical architecture knowledge, covering enterprise, system, website, large‑scale distributed, and high‑availability architectures, plus architecture adjustments using internet technologies. We welcome idea‑driven, sharing‑oriented architects to exchange and learn together.