20 Essential Linux Commands for Server Log Analysis and Threat Detection
This guide compiles a comprehensive set of Linux shell commands that let you examine web server logs, count unique IPs, identify top‑requested pages, filter bots, monitor connection states, and spot performance or security anomalies, helping you keep your site secure and performant.
If you run a personal website on an Alibaba Cloud ECS instance, you may occasionally want to analyze server logs to monitor traffic, detect attacks, and assess performance. Below is a curated list of useful command‑line snippets for log analysis.
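Most of the snippets below assume the Apache/Nginx combined log format and awk's default whitespace splitting. The sketch below shows which field number holds what, using a made‑up log line (the IP, path, and sizes are illustrative only):

```shell
# A hypothetical combined-log line; field numbers are what awk sees
# with default whitespace splitting:
line='203.0.113.5 - - [16/Aug/2015:14:02:11 +0800] "GET /index.php HTTP/1.1" 200 5124 "http://example.com/" "Mozilla/5.0"'
# $1 = client IP, $4 = start of timestamp, $7 = request path,
# $9 = status code, $10 = bytes sent, $11 = referer, $12 = start of user agent
echo "$line" | awk '{print $1, $7, $9, $10}'
# → 203.0.113.5 /index.php 200 5124
```

If your server logs in a different format, adjust the field numbers in the commands below accordingly.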
1. Count unique visiting IPs:
<code>awk '{print $1}' log_file | sort | uniq | wc -l</code>
2. Count accesses to a specific page (e.g., /index.php):
<code>grep "/index.php" log_file | wc -l</code>
3. Show how many pages each IP accessed:
<code>awk '{++S[$1]} END {for (a in S) print a, S[a]}' log_file > log.txt
sort -n -t ' ' -k 2 log.txt</code>
4. Sort IPs by the number of pages they accessed (ascending):
<code>awk '{++S[$1]} END {for (a in S) print S[a], a}' log_file | sort -n</code>
5. List pages visited by a particular IP (replace with the target IP):
<code>grep '^111\.111\.111\.111' log_file | awk '{print $1,$7}'</code>
(Escaping the dots matters: an unescaped <code>.</code> is a regex wildcard and would also match other IPs.)
6. Exclude search‑engine bot requests (identified by the User‑Agent field):
<code>awk '{print $12,$1}' log_file | grep '^"Mozilla' | awk '{print $2}' | sort | uniq | wc -l</code>
(The User‑Agent field begins with a literal double quote in the log, so the quote must be part of the grep pattern; browser agents start with "Mozilla", which most crawler agents do not.)
7. Count unique IPs that accessed the site during a specific hour (example: 16/Aug/2015:14):
<code>awk '{print $4,$1}' log_file | grep 16/Aug/2015:14 | awk '{print $2}' | sort | uniq | wc -l</code>
8. Show the top 10 most frequent IP addresses:
<code>awk '{print $1}' log_file | sort | uniq -c | sort -nr | head -10</code>
uniq -c groups identical adjacent lines and prefixes each group with its count; the preceding sort makes duplicates adjacent.
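A minimal demonstration of the sort | uniq -c | sort -nr pattern, on a few made‑up IPs:

```shell
# Three made-up IPs with different frequencies; sort groups duplicates,
# uniq -c counts each group, sort -nr puts the biggest count first:
printf '%s\n' 10.0.0.1 10.0.0.2 10.0.0.1 10.0.0.1 10.0.0.2 10.0.0.3 \
  | sort | uniq -c | sort -nr
# Prints each count followed by the IP, most frequent first:
#   3 10.0.0.1
#   2 10.0.0.2
#   1 10.0.0.3
# (the width of the count column varies between uniq implementations)
```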
9. List the top 10 most requested files or pages:
<code>cat log_file | awk '{print $7}' | sort | uniq -c | sort -nr | head -10</code>
(In the combined log format the request path is field $7; adjust the field number if your log format differs.)
10. Count accesses per sub‑domain (using the Referer field):
<code>cat access.log | awk '{print $11}' | sed -e 's|http://||' -e 's|/.*||' | sort | uniq -c | sort -rn | head -20</code>
11. List files with the largest transferred size (example for PHP files):
<code>cat www.access.log | awk '($7~/\.php/){print $10, $1, $4, $7}' | sort -nr | head -100</code>
12. Show pages larger than 200 KB and how often they were requested:
<code>cat www.access.log | awk '($10 > 200000 && $7~/\.php/){print $7}' | sort | uniq -c | sort -nr | head -100</code>
13. Identify the pages that took the longest time to transfer (if the last column records transfer time):
<code>cat www.access.log | awk '($7~/\.php/){print $NF, $1, $4, $7}' | sort -nr | head -100</code>
14. List pages whose response time exceeded 60 seconds and their request counts:
<code>cat www.access.log | awk '($NF > 60 && $7~/\.php/){print $7}' | sort | uniq -c | sort -nr | head -100</code>
15. List pages whose response time exceeded 30 seconds:
<code>cat www.access.log | awk '($NF > 30){print $7}' | sort | uniq -c | sort -nr | head -20</code>
16. Show the number of processes per command, sorted descending:
<code>ps -ef | awk '{print $8, $9}' | sort | uniq -c | sort -nr | head -20</code>
17. Get the current number of concurrent Apache connections:
<code>netstat -an | grep ESTABLISHED | wc -l</code>
18. Count Apache processes (each request may spawn a process):
<code>ps -ef | grep httpd | grep -v grep | wc -l</code>
(The grep -v grep excludes the grep process itself from the count.) With the prefork MPM, each request is served by its own process, so Apache can handle roughly as many concurrent requests as there are worker processes.
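A self‑contained sketch of the process‑counting idea, with a few hypothetical ps -ef lines standing in for live output:

```shell
# Fabricated ps -ef lines: three httpd workers and one unrelated daemon.
# grep keeps only the httpd lines; wc -l counts them.
printf '%s\n' \
  'root    1001    1 0 10:00 ? 00:00:00 /usr/sbin/httpd' \
  'apache  1002 1001 0 10:00 ? 00:00:00 /usr/sbin/httpd' \
  'apache  1003 1001 0 10:00 ? 00:00:00 /usr/sbin/httpd' \
  'root    1100    1 0 10:00 ? 00:00:00 /usr/sbin/sshd' \
  | grep httpd | wc -l
# → 3
```

On a live box, pipe real ps output instead and keep the grep -v grep filter, since the grep process would otherwise appear in its own ps listing.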
19. Count total connections on port 80:
<code>netstat -nat | grep ':80 ' | wc -l</code>
(Matching <code>':80 '</code> rather than a bare "80" avoids counting lines where 80 merely appears inside an address or another port number.)
20. Count established connections on port 80:
<code>netstat -an | grep ':80 ' | grep ESTABLISHED | wc -l</code>
Additional useful snippets include monitoring TCP state distribution, identifying the IPs with the most connections, extracting URLs that contain a specific domain, and measuring bandwidth or 404 errors. These commands can be combined, filtered by date ranges, or wrapped in <code>watch</code> for real‑time monitoring.
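One common way to get the TCP state distribution mentioned above is an awk tally over netstat output; here a few fabricated netstat lines stand in for the live data:

```shell
# Count connections per TCP state ($NF is the last field, the state).
# The printf lines are fabricated stand-ins for real netstat output.
printf '%s\n' \
  'tcp 0 0 10.0.0.1:80 203.0.113.5:51000 ESTABLISHED' \
  'tcp 0 0 10.0.0.1:80 203.0.113.6:51001 TIME_WAIT' \
  'tcp 0 0 10.0.0.1:80 203.0.113.7:51002 ESTABLISHED' \
  | awk '/^tcp/ {++S[$NF]} END {for (a in S) print a, S[a]}'
# Prints one "STATE count" pair per state, e.g. ESTABLISHED 2, TIME_WAIT 1
# (awk's for-in iteration order is unspecified, so line order may vary)
```

On a live server, replace the printf with <code>netstat -n</code>, and wrap the whole pipeline in <code>watch -n 5 '…'</code> to refresh the tally every five seconds.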
Efficient Ops
This public account is maintained by Xiaotianguo and friends, who regularly publish original technical articles. We focus on operations transformation and aim to accompany you throughout your operations career.