Grafana Loki is a horizontally scalable, highly available log aggregation system. Unlike Elasticsearch, it only indexes metadata (labels), making it cost-effective to operate at scale.
What Is Loki?
Loki is designed to be very cost-effective and easy to operate. It does not index the contents of the logs, but rather a set of labels for each log stream — like Prometheus does for metrics.
Key Features:
- Label-based log indexing (like Prometheus)
- LogQL query language
- Native Grafana integration
- Multi-tenant support
- S3/GCS/Azure object storage backend
- Promtail, Fluentd, Fluent Bit agents
- Alert rules on log patterns
- Structured metadata
Quick Start
# Docker
docker run -d --name loki -p 3100:3100 grafana/loki:latest
# Kubernetes via Helm
helm repo add grafana https://grafana.github.io/helm-charts
helm install loki grafana/loki-stack -n monitoring --create-namespace \
  --set promtail.enabled=true \
  --set grafana.enabled=true
Loki API: Push and Query Logs
Push Logs
import requests
import time

LOKI = "http://localhost:3100"

# Push log entries; Loki expects nanosecond-precision string timestamps
now_ns = int(time.time() * 1e9)
resp = requests.post(f"{LOKI}/loki/api/v1/push", json={
    "streams": [{
        "stream": {
            "app": "web-api",
            "env": "production",
            "level": "error"
        },
        "values": [
            [str(now_ns), "Connection refused to database on port 5432"],
            [str(now_ns + 1_000_000), "Retrying database connection in 5s"],
            [str(now_ns + 5_000_000_000), "Database connection established"]
        ]
    }]
})
resp.raise_for_status()  # Loki answers 204 No Content on success
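Building that payload by hand gets repetitive. Here is a minimal helper sketch that turns (unix-seconds, message) pairs into the streams structure shown above; `build_push_payload` is my own name, not part of any Loki client library:

```python
import time

def build_push_payload(labels: dict, entries: list) -> dict:
    """Build a Loki push payload from a label dict and (unix_seconds, line) pairs.

    Loki expects timestamps as nanosecond-precision strings; entries within a
    stream are sent in ascending time order.
    """
    values = [
        [str(int(ts * 1e9)), line]
        for ts, line in sorted(entries)  # sort so timestamps ascend
    ]
    return {"streams": [{"stream": labels, "values": values}]}

payload = build_push_payload(
    {"app": "web-api", "level": "info"},
    [(time.time(), "service started"), (time.time() - 1, "loading config")],
)
```

The result can be POSTed to `/loki/api/v1/push` exactly like the literal dict above.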
Query Logs with LogQL
# Simple label query
result = requests.get(f"{LOKI}/loki/api/v1/query_range", params={
    "query": '{app="web-api", level="error"}',
    "start": str(int((time.time() - 3600) * 1e9)),  # last hour
    "end": str(int(time.time() * 1e9)),
    "limit": 100
}).json()

for stream in result["data"]["result"]:
    labels = stream["stream"]
    print(f"Stream: {labels}")
    for ts, line in stream["values"]:
        print(f"  {line}")
# Log line filter: keep lines containing "error", parse JSON fields, filter by status
result = requests.get(f"{LOKI}/loki/api/v1/query", params={
    "query": '{app="web-api"} |= "error" | json | status >= 500'
}).json()

# Metrics from logs: per-second error rate over a 5-minute window
result = requests.get(f"{LOKI}/loki/api/v1/query", params={
    "query": 'rate({app="web-api", level="error"}[5m])'
}).json()
for r in result["data"]["result"]:
    print(f"Error rate: {r['value'][1]} errors/sec")

# Top 5 error messages over the last hour
result = requests.get(f"{LOKI}/loki/api/v1/query", params={
    "query": 'topk(5, sum by (message) (count_over_time({app="web-api", level="error"} | json [1h])))'
}).json()
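Every log query returns the same nested streams/values shape, so it is handy to flatten responses into plain tuples before further processing. A sketch; `flatten_streams` is a hypothetical helper, not part of Loki's API:

```python
def flatten_streams(result: dict) -> list:
    """Flatten a Loki log-query response into (labels, ts_ns, line) tuples."""
    rows = []
    for stream in result.get("data", {}).get("result", []):
        labels = stream["stream"]
        for ts, line in stream["values"]:
            rows.append((labels, int(ts), line))
    rows.sort(key=lambda r: r[1])  # chronological order across all streams
    return rows
```

This merges entries from multiple streams into one time-ordered list, which is usually what you want when printing or exporting logs.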
List Labels and Values
# Get all label names
labels = requests.get(f"{LOKI}/loki/api/v1/labels").json()
print(f"Labels: {labels['data']}")
# Get values for a label
values = requests.get(f"{LOKI}/loki/api/v1/label/app/values").json()
print(f"Apps: {values['data']}")
# Get log streams
streams = requests.get(f"{LOKI}/loki/api/v1/series", params={
    "match[]": '{env="production"}'
}).json()
for stream in streams["data"]:
    print(f"Stream: {stream}")
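Since `/series` returns one label map per stream, you can derive the distinct values of any label client-side from that response. A small sketch; `label_values` is my own helper name:

```python
def label_values(series: list, label: str) -> set:
    """Distinct values a label takes across the series maps returned by /series."""
    return {s[label] for s in series if label in s}
```

For example, `label_values(streams["data"], "app")` lists every app emitting logs in production.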
Promtail Configuration
# promtail-config.yml
server:
  http_listen_port: 9080

positions:
  filename: /tmp/positions.yaml

clients:
  - url: http://loki:3100/loki/api/v1/push

scrape_configs:
  - job_name: containers
    docker_sd_configs:
      - host: unix:///var/run/docker.sock
    relabel_configs:
      - source_labels: ['__meta_docker_container_name']
        target_label: 'container'
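Docker service discovery is one option; for plain log files, Promtail's `static_configs` works the same way. A minimal fragment (the paths and label values here are illustrative):

```yaml
scrape_configs:
  - job_name: system
    static_configs:
      - targets: [localhost]
        labels:
          job: varlogs
          __path__: /var/log/*.log
```

The special `__path__` label tells Promtail which files to tail; the remaining labels are attached to every line it ships to Loki.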
Resources
- Loki Docs
- Loki GitHub — 24K+ stars
- LogQL Reference
Need to scrape web data for your logging pipeline? Check out my web scraping tools on Apify — production-ready actors for Reddit, Google Maps, and more. Questions? Email me at spinov001@gmail.com