Grafana's HTTP API lets you create dashboards, manage alerts, query data sources, and automate monitoring — all programmatically. No clicking through the UI.
## Authentication

Grafana 9.1+ deprecates API keys in favor of service account tokens, but both are sent the same way in the `Authorization` header:

```bash
# Create an API key in Grafana UI → Configuration → API Keys
# (or a service account token under Administration → Service accounts)
export GRAFANA_TOKEN="your-api-key"

curl -H "Authorization: Bearer $GRAFANA_TOKEN" http://localhost:3000/api/org
```
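For scripting, it helps to wrap the token in a reusable `requests.Session` so every call carries the right headers. A minimal Python sketch, assuming the token lives in the `GRAFANA_TOKEN` environment variable:

```python
import os

import requests


def grafana_session(token=None):
    """Return a requests.Session pre-configured with the Grafana Bearer token."""
    token = token or os.environ["GRAFANA_TOKEN"]
    session = requests.Session()
    session.headers.update({
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    })
    return session
```

Every example below that uses plain `requests.get`/`requests.post` with a `headers` dict could use this session instead.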
## Dashboard API

```bash
# Create (or overwrite) a dashboard
curl -X POST http://localhost:3000/api/dashboards/db \
  -H "Authorization: Bearer $GRAFANA_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "dashboard": {
      "title": "API Metrics",
      "panels": [{
        "type": "timeseries",
        "title": "Request Rate",
        "gridPos": {"h": 8, "w": 12, "x": 0, "y": 0},
        "targets": [{
          "expr": "rate(http_requests_total[5m])",
          "legendFormat": "{{method}} {{path}}"
        }]
      }]
    },
    "overwrite": true
  }'

# Search dashboards
curl -H "Authorization: Bearer $GRAFANA_TOKEN" \
  "http://localhost:3000/api/search?query=api"
```
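The same create call is easy to drive from Python once the payload is built by small helper functions. A sketch (the `dashboard_payload`, `timeseries_panel`, and `post_dashboard` names are illustrative helpers, not part of any Grafana client library):

```python
import os

import requests

GRAFANA = "http://localhost:3000"


def dashboard_payload(title, panels, overwrite=True):
    """Build the JSON body expected by POST /api/dashboards/db."""
    return {"dashboard": {"title": title, "panels": panels}, "overwrite": overwrite}


def timeseries_panel(title, expr, x=0, y=0):
    """One timeseries panel with a single Prometheus query."""
    return {
        "type": "timeseries",
        "title": title,
        "gridPos": {"h": 8, "w": 12, "x": x, "y": y},
        "targets": [{"expr": expr, "legendFormat": "{{method}} {{path}}"}],
    }


def post_dashboard(payload):
    """Send the payload to Grafana; raises on HTTP errors."""
    headers = {"Authorization": f"Bearer {os.environ['GRAFANA_TOKEN']}"}
    r = requests.post(f"{GRAFANA}/api/dashboards/db", json=payload, headers=headers)
    r.raise_for_status()
    return r.json()
```

Building payloads in functions like this makes it easy to generate one dashboard per service from a loop.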
## Alerting API

The provisioning API rejects incomplete rules: it needs a `Content-Type` header, an existing `folderUID`, a `ruleGroup`, the datasource UID for each query, no-data/error states, and a `condition` that references one of the query `refId`s (the `my-folder-uid` and `my-prometheus-uid` values below are placeholders for your own UIDs):

```bash
# Create alert rule
curl -X POST http://localhost:3000/api/v1/provisioning/alert-rules \
  -H "Authorization: Bearer $GRAFANA_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "High Error Rate",
    "ruleGroup": "api-alerts",
    "folderUID": "my-folder-uid",
    "condition": "A",
    "data": [{
      "refId": "A",
      "relativeTimeRange": {"from": 600, "to": 0},
      "datasourceUid": "my-prometheus-uid",
      "model": {"expr": "rate(http_errors_total[5m])", "refId": "A"}
    }],
    "noDataState": "NoData",
    "execErrState": "Error",
    "for": "5m"
  }'

# List alert rules
curl -H "Authorization: Bearer $GRAFANA_TOKEN" \
  http://localhost:3000/api/v1/provisioning/alert-rules
```
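Because the rule body has so many required fields, it is worth generating it from a function. A sketch under the same assumptions as above (the `alert_rule` helper is illustrative; `datasource_uid` and `folder_uid` must reference objects that already exist in your instance):

```python
def alert_rule(title, expr, datasource_uid, folder_uid, group="api-alerts"):
    """Build a minimal body for POST /api/v1/provisioning/alert-rules.

    The condition points at query "A", so the rule fires based on that
    single query's result.
    """
    return {
        "title": title,
        "ruleGroup": group,
        "folderUID": folder_uid,
        "condition": "A",
        "data": [{
            "refId": "A",
            "relativeTimeRange": {"from": 600, "to": 0},
            "datasourceUid": datasource_uid,
            "model": {"expr": expr, "refId": "A"},
        }],
        "noDataState": "NoData",
        "execErrState": "Error",
        "for": "5m",
    }
```

POST the returned dict with the same `requests` pattern used for dashboards.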
## Data Source Proxy

```bash
# Query Prometheus through Grafana
curl -H "Authorization: Bearer $GRAFANA_TOKEN" \
  "http://localhost:3000/api/datasources/proxy/1/api/v1/query?query=up"

# Query InfluxDB 1.x through Grafana — the /query endpoint takes
# URL-encoded db and q parameters, not a JSON body
curl -G -H "Authorization: Bearer $GRAFANA_TOKEN" \
  http://localhost:3000/api/datasources/proxy/2/query \
  --data-urlencode "db=mydb" \
  --data-urlencode "q=SELECT mean(value) FROM cpu WHERE time > now() - 1h GROUP BY time(5m)"
```

The numeric path segments (`1`, `2`) are data source IDs; Grafana 9+ also accepts `/api/datasources/proxy/uid/:uid`, which is more stable across instances.
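The proxy returns each data source's native response format. For a Prometheus instant query that is `{"status": "success", "data": {"resultType": "vector", "result": [...]}}`, which you usually want flattened before use. A small parser sketch (`parse_instant_query` is an illustrative helper):

```python
def parse_instant_query(response):
    """Flatten a Prometheus instant-query response into (labels, value) pairs.

    Expects the standard vector shape returned by /api/v1/query.
    """
    if response.get("status") != "success":
        raise ValueError(f"query failed: {response}")
    rows = []
    for series in response["data"]["result"]:
        # Each result entry carries its label set and a [timestamp, "value"] pair
        _timestamp, value = series["value"]
        rows.append((series.get("metric", {}), float(value)))
    return rows
```

Feed it the `.json()` of the proxy call above to get, for `up`, one `(labels, 0.0/1.0)` row per scrape target.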
## Automation Script

```python
import json
import os

import requests

GRAFANA = "http://localhost:3000"
TOKEN = os.environ["GRAFANA_TOKEN"]
headers = {"Authorization": f"Bearer {TOKEN}"}

# Export all dashboards (type=dash-db excludes folders from the search results)
os.makedirs("dashboards", exist_ok=True)
dashboards = requests.get(
    f"{GRAFANA}/api/search", params={"type": "dash-db"}, headers=headers
).json()

for d in dashboards:
    full = requests.get(f"{GRAFANA}/api/dashboards/uid/{d['uid']}", headers=headers).json()
    with open(f"dashboards/{d['uid']}.json", "w") as f:
        json.dump(full, f, indent=2)
```
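The export's natural counterpart is a restore script. `GET /api/dashboards/uid/:uid` returns `{"dashboard": ..., "meta": ...}`, and to re-import you post only the `dashboard` object, drop its instance-specific numeric `id`, and set `overwrite`. A sketch (the `import_payload` and `restore_all` helpers are illustrative):

```python
import json
import os
import pathlib

import requests

GRAFANA = "http://localhost:3000"


def import_payload(exported):
    """Turn an exported /api/dashboards/uid/:uid response into a body
    for POST /api/dashboards/db.

    The numeric id is instance-specific, so it is dropped; the uid is
    kept so Grafana matches and updates the existing dashboard.
    """
    dashboard = dict(exported["dashboard"])
    dashboard.pop("id", None)
    return {"dashboard": dashboard, "overwrite": True}


def restore_all(directory="dashboards"):
    """Re-import every exported dashboard JSON file in the directory."""
    headers = {"Authorization": f"Bearer {os.environ['GRAFANA_TOKEN']}"}
    for path in pathlib.Path(directory).glob("*.json"):
        exported = json.loads(path.read_text())
        r = requests.post(f"{GRAFANA}/api/dashboards/db",
                          json=import_payload(exported), headers=headers)
        r.raise_for_status()
```

Together, the two scripts give you a crude but effective backup/restore cycle you can run from CI.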
## Why This Matters

- Infrastructure as code: version-control your dashboards
- Automated alerts: create alerting rules programmatically
- Multi-datasource: query any data source through one API
- Dashboard provisioning: deploy monitoring alongside your code
Need custom monitoring tools or Grafana automation? I build developer tools. Check out my web scraping actors on Apify or reach out at spinov001@gmail.com for custom solutions.