Add nginx log analyzer

pull/7382/head
Kamran Ahmed 1 week ago
parent 4125c28f5a
commit ae7ea40f74
src/data/projects/nginx-log-analyser.md

@@ -1,5 +1,5 @@
---
-title: 'Log Analysis Tool'
+title: 'Nginx Log Analyser'
description: 'Write a simple tool to analyze logs from the command line.'
isNew: true
sort: 3
@@ -10,14 +10,13 @@ skills:
- 'bash'
- 'shell scripting'
seo:
-title: 'Log Analysis Tool'
+title: 'Nginx Log Analyser'
description: 'Build a simple CLI tool to analyze logs from the command line.'
keywords:
- 'log analysis tool'
- 'Nginx Log Analyser'
- 'devops project idea'
roadmapIds:
- 'devops'
- 'linux'
---
The goal of this project is to help you practice some basic shell scripting skills. You will write a simple tool to analyze logs from the command line.
@@ -36,6 +35,13 @@ Download the sample nginx access log file from [here](https://gist.githubusercon
You are required to create a shell script that reads the log file and provides the following information:
1. Top 5 IP addresses with the most requests
2. Top 5 most requested paths
3. Top 5 response status codes
4. Top 5 user agents
Here is an example of what the output should look like:
```text
Top 5 IP addresses with the most requests:
45.76.135.253 - 1000 requests
@@ -44,5 +50,19 @@ Top 5 IP addresses with the most requests:
43.224.43.187 - 30 requests
178.128.94.113 - 20 requests
Top 5 most requested paths:
/api/v1/users - 1000 requests
/api/v1/products - 600 requests
/api/v1/orders - 50 requests
/api/v1/payments - 30 requests
/api/v1/reviews - 20 requests
Top 5 response status codes:
200 - 1000 requests
404 - 600 requests
500 - 50 requests
401 - 30 requests
304 - 20 requests
```
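For reference, if the sample file uses nginx's default combined log format (an assumption worth verifying against the file you downloaded), each line looks roughly like this; the field positions in the sketch below rely on it:

```text
203.0.113.7 - - [10/Oct/2024:13:55:36 +0000] "GET /api/v1/users HTTP/1.1" 200 512 "-" "Mozilla/5.0 (X11; Linux x86_64)"
```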
There are multiple ways to solve this challenge. Do some research on the `awk`, `sort`, `uniq`, `head`, `grep`, and `sed` commands. A stretch goal is to come up with multiple solutions to the above problem. For example, instead of using `awk`, you can use `grep` and `sed` to filter and count the requests.
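Below is a minimal sketch of one possible solution using `awk`, `sort`, `uniq`, `head`, and `sed`. The log file name `nginx-access.log` and the field positions are assumptions, so adjust them to match the file you downloaded:

```bash
#!/usr/bin/env bash
# Sketch of one approach, assuming nginx's default combined log format:
#   $1 = client IP, $7 = requested path, $9 = status code,
#   and the user agent is the last double-quoted field.
set -eu

LOG_FILE="${1:-nginx-access.log}"   # assumed file name; pass the real path as the first argument

top5() {
  # Count occurrences, sort descending, keep the top 5,
  # and reshape "  COUNT value" into "value - COUNT requests".
  sort | uniq -c | sort -rn | head -n 5 |
    sed -E 's/^ *([0-9]+) (.*)$/\2 - \1 requests/'
}

echo "Top 5 IP addresses with the most requests:"
awk '{ print $1 }' "$LOG_FILE" | top5

echo
echo "Top 5 most requested paths:"
awk '{ print $7 }' "$LOG_FILE" | top5

echo
echo "Top 5 response status codes:"
awk '{ print $9 }' "$LOG_FILE" | top5

echo
echo "Top 5 user agents:"
# Splitting on double quotes puts the user agent in field 6 of the combined format.
awk -F'"' '{ print $6 }' "$LOG_FILE" | top5
```

Run it as, for example, `./nginx-log-analyser.sh nginx-access.log` (the script name is only illustrative). Swapping the `awk` stages for `grep -oE`/`sed` pipelines is one way to approach the stretch goal.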