# Log Management Toolkit
Production-ready ELK stack deployment with structured logging pipelines and operational dashboards.
Complete Elasticsearch, Logstash, Kibana, and Filebeat configurations for centralized log management. Includes ready-to-use parsing pipelines for Nginx, application JSON logs, and system logs, plus operational scripts for index management, backups, and a comprehensive log strategy guide.
## What You Get
- Docker Compose stack — One-command ELK deployment with resource limits
- 5 Logstash pipelines — Parse Nginx, JSON apps, syslog, and custom formats
- Filebeat configuration — Ship logs from any server to your stack
- Kibana dashboards — Pre-built visualizations for immediate insight
- 3 operational scripts — Setup, index rotation, and Kibana backup
- Strategy guide — Log levels, retention, alerting, and compliance
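To give a flavor of the pipeline style, here is a minimal Beats-to-Elasticsearch pipeline with Nginx grok parsing. This is a simplified sketch, not the shipped `nginx-access.conf`:

```conf
# Minimal sketch: receive from Filebeat, grok-parse Nginx access logs,
# index into Elasticsearch. The bundled pipelines are more complete.
input {
  beats { port => 5044 }
}

filter {
  grok {
    # Nginx's default access log format matches the Apache "combined" pattern
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "nginx-access-%{+YYYY.MM.dd}"
  }
}
```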
## File Tree

```
log-management-toolkit/
├── README.md
├── LICENSE
├── manifest.json
├── .env.example
├── docker-compose.yml
├── elasticsearch/
│   └── elasticsearch.yml          # ES node configuration
├── logstash/
│   ├── pipeline/
│   │   ├── main.conf              # Main routing pipeline
│   │   ├── nginx-access.conf      # Nginx access log parser
│   │   └── app-json.conf          # JSON application log parser
│   └── patterns/
│       └── custom-patterns        # Custom grok patterns
├── filebeat/
│   ├── filebeat.yml               # Filebeat shipper config
│   └── modules.d/
│       ├── nginx.yml              # Nginx module config
│       └── system.yml             # System log module config
├── kibana/
│   └── export/
│       └── dashboards.ndjson      # Pre-built dashboards
├── scripts/
│   ├── setup.sh                   # Stack deployment script
│   ├── rotate-indices.sh          # Index lifecycle management
│   └── backup-kibana.sh           # Kibana saved objects backup
└── guides/
    └── log-management-strategy.md # Logging strategy & best practices
```
## Getting Started

### 1. Configure environment

```bash
cp .env.example .env
# Edit .env with your settings (passwords, retention, memory limits)
vim .env
```
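The keys below illustrate what such a file typically holds; the authoritative list is in `.env.example`:

```
# Illustrative values -- the real keys live in .env.example
ELASTIC_PASSWORD=change-me        # elastic superuser password
ES_JAVA_OPTS=-Xms2g -Xmx2g        # Elasticsearch JVM heap
RETENTION_DAYS=30                 # index retention window (days)
```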
### 2. Deploy the stack

```bash
# Run the automated setup
bash scripts/setup.sh

# Or deploy manually
docker compose up -d
```
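The per-service resource limits live in the compose file. A representative fragment, where the image tag, heap size, and memory cap are illustrative rather than the shipped values:

```yaml
# Illustrative fragment only -- see docker-compose.yml for the real definitions
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.14.0
    environment:
      - ES_JAVA_OPTS=-Xms2g -Xmx2g   # JVM heap; keep at roughly half the container limit
    mem_limit: 4g
```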
### 3. Access the interfaces
| Service | URL | Default Credentials |
|---|---|---|
| Kibana | http://localhost:5601 | elastic / (from .env) |
| Elasticsearch | http://localhost:9200 | elastic / (from .env) |
### 4. Ship logs from other servers

Copy `filebeat/filebeat.yml` to your application servers and configure it:

```bash
# Deploy the config
sudo cp filebeat.yml /etc/filebeat/filebeat.yml
sudo systemctl enable --now filebeat
```
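At minimum, the shipper needs an input path and your stack's Logstash address. A trimmed sketch, in which the input id, log paths, and host are placeholders:

```yaml
# filebeat.yml (trimmed sketch) -- paths and host are placeholders
filebeat.inputs:
  - type: filestream
    id: app-logs
    paths:
      - /var/log/app/*.log

output.logstash:
  hosts: ["your-elk-host:5044"]   # the stack's Logstash Beats input
```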
### 5. Import dashboards

```bash
# Dashboards auto-import during setup, or import manually:
curl -X POST "localhost:5601/api/saved_objects/_import" \
  -H "kbn-xsrf: true" \
  --form file=@kibana/export/dashboards.ndjson
```
## Requirements
- Docker 20.10+ and Docker Compose v2.0+
- RAM: Minimum 4GB (8GB+ recommended for production)
- Disk: 20GB+ for Elasticsearch data (depends on log volume)
- Network: Ports 5601, 9200, 5044 available
Architecture
┌──────────────┐ ┌──────────────┐ ┌──────────────┐
│ App Server │ │ Web Server │ │ DB Server │
│ (Filebeat) │ │ (Filebeat) │ │ (Filebeat) │
└──────┬───────┘ └──────┬───────┘ └──────┬───────┘
│ │ │
└────────────────────┼────────────────────┘
│ Port 5044 (Beats)
┌────────▼────────┐
│ Logstash │
│ (Parse/Route) │
└────────┬────────┘
│
┌────────▼────────┐
│ Elasticsearch │
│ (Store/Index) │
└────────┬────────┘
│
┌────────▼────────┐
│ Kibana │
│ (Visualize) │
└─────────────────┘
## Index Lifecycle Management

The included scripts manage index retention automatically:

```bash
# Rotate indices older than 30 days (configurable)
bash scripts/rotate-indices.sh

# Add as a daily cron job (append to the existing crontab rather than replacing it)
(crontab -l 2>/dev/null; echo "0 2 * * * /path/to/scripts/rotate-indices.sh") | crontab -
```
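Under the hood, rotation boils down to comparing each daily index's date suffix against a retention cutoff. A simplified sketch of that logic, with illustrative variable names and the destructive delete left commented out:

```shell
#!/usr/bin/env bash
# Simplified sketch of rotate-indices.sh: drop daily indices older than
# the retention window. Variable names here are illustrative.
RETENTION_DAYS=30
ES_HOST="localhost:9200"

# Cutoff in the same YYYY.MM.DD format Filebeat uses for daily indices (GNU date).
CUTOFF=$(date -d "-${RETENTION_DAYS} days" +%Y.%m.%d)

for idx in "filebeat-2024.01.01" "filebeat-$(date +%Y.%m.%d)"; do
  suffix=${idx#filebeat-}
  # Zero-padded dates compare correctly as plain strings.
  if [[ "$suffix" < "$CUTOFF" ]]; then
    echo "would delete: $idx"
    # curl -s -X DELETE "http://${ES_HOST}/${idx}"   # destructive; dry run here
  fi
done
```

Because the date components are zero-padded, lexicographic string comparison matches chronological order, so no date parsing is needed in the loop.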
## Related Products
- Monitoring Stack Setup — Prometheus, Grafana, and alerting setup
- Nginx Config Templates — Production Nginx configurations and patterns
- Backup & Disaster Recovery — Backup strategies and DR runbooks
This is 1 of 6 resources in DevOps Toolkit Pro. Get the complete Log Management Toolkit with all files, templates, and documentation for $XX.
Or grab the entire DevOps Toolkit Pro bundle (6 products) for $178 — save 30%.