
Alex Spinov

Grafana Loki Has a Free API — Log Aggregation Without Indexing Full Text

Grafana Loki is a cost-effective log aggregation system that indexes only metadata (labels), not the full log text. Because the index stays small, storage and operating costs are typically a fraction of a full-text system like Elasticsearch, while label-scoped queries remain fast.

Free, open source, by Grafana Labs. The 'Prometheus for logs.'

Why Use Loki?

  • Cheap storage — doesn't index log content, only labels
  • Prometheus-like — same label-based querying as Prometheus
  • LogQL — powerful query language for filtering and aggregating logs
  • Kubernetes-native — auto-discovers pod logs via Promtail/Alloy
  • Multi-tenant — built-in multi-tenancy support

Quick Setup

1. Install

# Docker
docker run -d --name loki -p 3100:3100 grafana/loki:latest

# Helm
helm repo add grafana https://grafana.github.io/helm-charts
helm install loki grafana/loki-stack --set promtail.enabled=true

2. Push Logs

# Push a log entry
curl -s -X POST http://localhost:3100/loki/api/v1/push \
  -H "Content-Type: application/json" \
  -d '{
    "streams": [{
      "stream": {"app": "my-scraper", "env": "production"},
      "values": [
        ["'$(date +%s)000000000'", "Scraping completed: 1500 pages in 45s"],
        ["'$(date +%s)000000001'", "Found 342 new products"]
      ]
    }]
  }'
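If you'd rather build the push payload programmatically than inline shell timestamps, here's a minimal Python sketch. The helper name is my own, not part of any Loki client; the payload shape matches the curl example above (Loki expects nanosecond-epoch timestamps as strings):

```python
import json
import time

def loki_push_payload(labels: dict, lines: list) -> dict:
    """Build a /loki/api/v1/push body from labels and log lines."""
    now_ns = time.time_ns()
    return {
        "streams": [{
            "stream": labels,
            # Offset each line by 1 ns so ordering within the batch is stable.
            "values": [[str(now_ns + i), line] for i, line in enumerate(lines)],
        }]
    }

payload = loki_push_payload(
    {"app": "my-scraper", "env": "production"},
    ["Scraping completed: 1500 pages in 45s", "Found 342 new products"],
)
print(json.dumps(payload, indent=2))
```

POST this dict as JSON to `/loki/api/v1/push`, exactly as the curl command does.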

3. Query Logs (LogQL)

LOKI="http://localhost:3100"

# Simple label query
curl -s -G "$LOKI/loki/api/v1/query_range" \
  --data-urlencode 'query={app="my-scraper"}' \
  --data-urlencode 'limit=10' | jq '.data.result[].values[][1]'

# Filter by content
curl -s -G "$LOKI/loki/api/v1/query_range" \
  --data-urlencode 'query={app="my-scraper"} |= "error"' | jq '.data.result[].values[][1]'

# Regex filter
curl -s -G "$LOKI/loki/api/v1/query_range" \
  --data-urlencode 'query={app="my-scraper"} |~ "Found [0-9]+ new"' | jq

# Aggregation — log rate per minute
curl -s -G "$LOKI/loki/api/v1/query" \
  --data-urlencode 'query=rate({app="my-scraper"}[5m])' | jq '.data.result'
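Without explicit `start`/`end` parameters, `query_range` defaults to roughly the last hour. For a specific window, pass both as nanosecond-epoch timestamps. A small sketch (the helper name is mine, not a Loki API):

```python
import time

def query_range_params(query: str, minutes: int = 60, limit: int = 100) -> dict:
    """Parameters for /loki/api/v1/query_range covering the last `minutes`.

    Loki accepts start/end as nanosecond-epoch strings.
    """
    end_ns = time.time_ns()
    start_ns = end_ns - minutes * 60 * 1_000_000_000
    return {
        "query": query,
        "start": str(start_ns),
        "end": str(end_ns),
        "limit": str(limit),
    }

params = query_range_params('{app="my-scraper"} |= "error"', minutes=15)
print(params["query"])
```

Pass the dict as `params=` to `requests.get`, or URL-encode it for curl.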

4. Get Labels & Series

# List all labels
curl -s "$LOKI/loki/api/v1/labels" | jq '.data'

# Label values
curl -s "$LOKI/loki/api/v1/label/app/values" | jq '.data'

# Series
curl -s -G "$LOKI/loki/api/v1/series" --data-urlencode 'match[]={app="my-scraper"}' | jq

Python Example

import requests
import time

LOKI = "http://localhost:3100"

# Push logs
requests.post(f"{LOKI}/loki/api/v1/push", json={
    "streams": [{
        "stream": {"app": "scraper", "level": "info"},
        "values": [[str(time.time_ns()), "Started scraping amazon.com"]]
    }]
})

# Query logs
result = requests.get(f"{LOKI}/loki/api/v1/query_range", params={
    "query": '{app="scraper"}',
    "limit": 50
}).json()

for stream in result["data"]["result"]:
    labels = stream["stream"]
    for ts, line in stream["values"]:
        print(f"[{labels.get('level','?')}] {line}")
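The nested response shape (streams, each with a list of `[timestamp, line]` pairs) is easy to flatten before handing logs to the rest of your pipeline. A sketch using a hardcoded sample response instead of a live Loki instance (the timestamp is illustrative):

```python
def flatten_result(result: dict) -> list:
    """Flatten a query_range response into (labels, timestamp, line) tuples."""
    rows = []
    for stream in result.get("data", {}).get("result", []):
        labels = stream["stream"]
        for ts, line in stream["values"]:
            rows.append((labels, ts, line))
    return rows

# Sample response in the shape Loki returns.
sample = {
    "data": {
        "result": [{
            "stream": {"app": "scraper", "level": "info"},
            "values": [["1700000000000000000", "Started scraping amazon.com"]],
        }]
    }
}

for labels, ts, line in flatten_result(sample):
    print(f"[{labels['level']}] {line}")
```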

Key Endpoints

| Endpoint | Description |
|---|---|
| `/loki/api/v1/push` | Push log entries |
| `/loki/api/v1/query` | Instant query |
| `/loki/api/v1/query_range` | Range query |
| `/loki/api/v1/labels` | List label names |
| `/loki/api/v1/label/{name}/values` | Values for a label |
| `/loki/api/v1/series` | List series |
| `/loki/api/v1/tail` | WebSocket live tail |
| `/ready` | Readiness check |
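Note that `/loki/api/v1/tail` speaks WebSocket, so plain `curl` or `requests` won't work; you need a WebSocket client. A small sketch that just derives the tail URL from the HTTP base URL (the helper is my own; `query` and `limit` are standard tail parameters):

```python
from urllib.parse import urlencode

def tail_url(base: str, query: str, limit: int = 100) -> str:
    """Build the WebSocket URL for /loki/api/v1/tail (live log streaming)."""
    ws_base = base.replace("https://", "wss://").replace("http://", "ws://")
    return f"{ws_base}/loki/api/v1/tail?" + urlencode({"query": query, "limit": limit})

print(tail_url("http://localhost:3100", '{app="my-scraper"}'))
```

Connect to the resulting URL with any WebSocket client library to stream log lines as they arrive.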

Need a custom data extraction or scraping solution? I build production-grade scrapers for any website. Email: Spinov001@gmail.com | My Apify Actors
