# My Stack Runs Itself
I automated 80% of my repetitive work with 5 APIs. No complex tools — just Python scripts and cron jobs.
## 1. GitHub API — Auto-Create Repos and Push Code
```python
import requests

GITHUB_TOKEN = "ghp_token"  # personal access token with repo scope

def create_repo(name, description):
    # POST /user/repos creates a repo under the authenticated user;
    # auto_init=True gives it an initial commit with a README.
    r = requests.post(
        "https://api.github.com/user/repos",
        headers={"Authorization": f"Bearer {GITHUB_TOKEN}"},
        json={"name": name, "description": description, "auto_init": True},
    )
    r.raise_for_status()
    return r.json()["html_url"]
```
I use this to auto-create tutorial repos with standardized READMEs.
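A sketch of how the standardized README part can work: since `auto_init=True` already commits a README, overwriting it through the Contents API requires the existing file's `sha`. The function names and template here are illustrative, not my actual scripts.

```python
import base64
import requests

GITHUB_TOKEN = "ghp_token"  # same token as above

def render_readme(name, description):
    # Minimal standardized template; adjust to taste.
    return f"# {name}\n\n{description}\n"

def standardize_readme(owner, repo, description):
    url = f"https://api.github.com/repos/{owner}/{repo}/contents/README.md"
    headers = {"Authorization": f"Bearer {GITHUB_TOKEN}"}
    # The file already exists, so fetch its sha before overwriting it.
    sha = requests.get(url, headers=headers).json()["sha"]
    r = requests.put(url, headers=headers, json={
        "message": "Standardize README",
        # The Contents API expects base64-encoded file content.
        "content": base64.b64encode(render_readme(repo, description).encode()).decode(),
        "sha": sha,
    })
    r.raise_for_status()
```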
## 2. Dev.to API — Publish Articles Programmatically
```python
DEVTO_API_KEY = "your_key"  # generated under Settings -> Extensions

def publish_article(title, body, tags):
    # POST /api/articles publishes immediately when "published" is true.
    r = requests.post(
        "https://dev.to/api/articles",
        headers={"api-key": DEVTO_API_KEY, "Content-Type": "application/json"},
        json={"article": {"title": title, "body_markdown": body,
                          "tags": tags, "published": True}},
    )
    r.raise_for_status()
    return r.json()["url"]
```
600+ articles published this way. Zero manual copy-pasting.
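One guard worth adding before `publish_article`: Dev.to caps a post at four tags and is picky about tag characters, so a failed publish is usually a tag problem. A small normalizer (my helper name, not part of the original scripts) catches that before the request goes out:

```python
def normalize_tags(tags):
    # Dev.to allows at most four tags per article; keep them lowercase,
    # alphanumeric, and de-duplicated so the API does not reject the post.
    cleaned = []
    for tag in tags:
        tag = "".join(ch for ch in tag.lower() if ch.isalnum())
        if tag and tag not in cleaned:
            cleaned.append(tag)
    return cleaned[:4]
```

Call it as `publish_article(title, body, normalize_tags(raw_tags))`.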
## 3. Telegram Bot API — Get Notifications Anywhere
```python
# TOKEN comes from @BotFather; CHAT_ID is the target chat's numeric id.
def notify(message):
    requests.post(
        f"https://api.telegram.org/bot{TOKEN}/sendMessage",
        json={"chat_id": CHAT_ID, "text": message},
    )
```
I get alerts when a repo gets a new GitHub star, a Dev.to article hits 50 views, or a deployment fails.
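The star alert, for example, is just a comparison between cron runs: `GET /repos/{owner}/{repo}` exposes a `stargazers_count` field, so each run compares it with the last value it saw and hands any message to `notify()` above. The function names here are mine, a sketch rather than the original code:

```python
import requests

def star_message(repo, last_seen, current):
    # Build an alert only when the count actually went up.
    if current > last_seen:
        return f"{repo} gained {current - last_seen} star(s), now {current}"
    return None

def check_stars(owner, repo, last_seen):
    # Public repos need no auth for this endpoint.
    r = requests.get(f"https://api.github.com/repos/{owner}/{repo}")
    r.raise_for_status()
    # Returns the alert text (or None); pass it to the notify() helper.
    return star_message(repo, last_seen, r.json()["stargazers_count"])
```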
## 4. Open-Meteo API — Daily Context
```python
def weather_summary(lat, lon):
    # Open-Meteo needs no API key; "daily" selects aggregate variables.
    r = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={"latitude": lat, "longitude": lon,
                "daily": "temperature_2m_max,precipitation_sum"},
    )
    r.raise_for_status()
    return r.json()["daily"]
```
My morning Telegram bot sends weather along with my task list.
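Formatting that brief is a pure function over Open-Meteo's `daily` block, which is a dict of parallel lists (index 0 is today). `format_brief` is an illustrative name, a sketch of the idea rather than my exact script:

```python
def format_brief(daily, tasks):
    # daily comes straight from weather_summary(); index 0 is today.
    high = daily["temperature_2m_max"][0]
    rain = daily["precipitation_sum"][0]
    lines = [f"Good morning. High {high}°C, {rain} mm precipitation."]
    lines += [f"- {task}" for task in tasks]
    return "\n".join(lines)
```

The result goes straight into `notify()` from section 3.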
## 5. Wayback Machine API — Monitor Changes
```python
def check_archived(url):
    # "closest" is the nearest snapshot; empty dict if never archived.
    # Passing the target via params keeps it properly URL-encoded.
    r = requests.get("https://archive.org/wayback/available",
                     params={"url": url})
    return r.json()["archived_snapshots"].get("closest", {})
```
I track competitor landing pages. When they change pricing, I know.
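Detecting a change then reduces to comparing timestamps between cron runs: each `closest` snapshot carries a 14-digit `YYYYMMDDhhmmss` timestamp, and a new timestamp means a new capture exists. (This only catches changes the Wayback Machine has actually archived.) The helper name is my own:

```python
def snapshot_changed(previous_ts, snapshot):
    # snapshot is the dict returned by check_archived(); its "timestamp"
    # is a YYYYMMDDhhmmss string. A new value means a new capture.
    ts = snapshot.get("timestamp")
    return ts is not None and ts != previous_ts
```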
## The Automation Stack
```text
cron (every 6 hours):
  -> check GitHub notifications -> Telegram
  -> check Dev.to stats         -> log to file
  -> check weather              -> Telegram morning brief
  -> check competitor pages     -> alert if changed
```
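In crontab form, the schedule above could look like this (paths and script names are placeholders, not my actual layout):

```shell
# Every 6 hours: GitHub notifications, Dev.to stats, competitor pages
0 */6 * * * /usr/bin/python3 /home/me/automation/checks.py
# 07:30 daily: weather plus task list to Telegram
30 7 * * * /usr/bin/python3 /home/me/automation/morning_brief.py
```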
Total code: ~200 lines of Python. Saves me 2+ hours per day.
What APIs power YOUR workflow?
More from me: 10 Dev Tools I Use Daily | 77 Scrapers on a Schedule | 150+ Free APIs