How I Automated My Entire Dev Workflow with Python (15 Hours Saved Per Week)
Every Friday I used to spend 3 hours on deployments, reports, cleanup, and notifications. Now a Python script does it all in 4 minutes.
Here is every automation, with code you can steal.
1. Auto-Deploy on Git Push (Saves 30 min/deploy)
```python
import subprocess

def deploy():
    subprocess.run(["git", "pull", "origin", "main"], check=True)
    result = subprocess.run(["pytest", "--tb=short"], capture_output=True)
    if result.returncode != 0:
        notify("Tests failed!")  # notify() is whatever alert helper you use
        return
    subprocess.run(["docker", "compose", "up", "-d", "--build"], check=True)
    notify("Deployed successfully!")
```
Pair with a Raspberry Pi 5 as a dedicated deploy server.
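The deploy script calls a `notify()` helper without defining it. One common choice is a Slack incoming webhook; the sketch below assumes that approach, and the webhook URL is a placeholder you would replace with one from your own workspace:

```python
import requests

# Placeholder: create an incoming webhook in your own Slack workspace.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def notify(message, webhook_url=SLACK_WEBHOOK_URL):
    """Post message to Slack; fall back to stdout if the call fails."""
    try:
        resp = requests.post(webhook_url, json={"text": message}, timeout=5)
        return resp.ok
    except requests.RequestException:
        print(message)  # last resort: at least it lands in the logs
        return False
```

Swap the body for email or a Discord webhook if that is where your team lives; the rest of the scripts only care that `notify(message)` exists.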
2. Weekly Report Generator (Saves 1 hr/week)
```python
import git  # pip install GitPython
from datetime import datetime, timedelta
from collections import Counter

def weekly_report(repo_path):
    repo = git.Repo(repo_path)
    week_ago = datetime.now() - timedelta(days=7)
    commits = list(repo.iter_commits("main", since=week_ago))
    files_changed = Counter()
    for c in commits:
        for f in c.stats.files:
            files_changed[f] += 1
    report = f"# Weekly Report - {len(commits)} commits\n"
    for f, count in files_changed.most_common(10):
        report += f"- {f}: {count} changes\n"
    return report
```
3. Log Monitor + Alert (Saves 2 hrs/week debugging)
```python
import re
import time

ERROR_PATTERN = re.compile(r"(ERROR|FATAL|Exception)", re.IGNORECASE)

def monitor_logs(log_file, alert_func):
    with open(log_file) as f:
        f.seek(0, 2)  # jump to the end of the file, like tail -f
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.1)
                continue
            if ERROR_PATTERN.search(line):
                alert_func(f"Error detected: {line.strip()}")
```
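Note that `monitor_logs` never returns, so it cannot share a process with the scheduler as written. One way around that is to push it onto a daemon thread; a sketch (it repeats the monitor so it stands alone):

```python
import threading
import re
import time

ERROR_PATTERN = re.compile(r"(ERROR|FATAL|Exception)", re.IGNORECASE)

def monitor_logs(log_file, alert_func):
    # Tail the file: start at the end and read new lines as they appear.
    with open(log_file) as f:
        f.seek(0, 2)
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.1)
                continue
            if ERROR_PATTERN.search(line):
                alert_func(f"Error detected: {line.strip()}")

def start_log_watcher(log_file, alert_func):
    # Daemon thread: exits with the main program, never blocks it.
    watcher = threading.Thread(
        target=monitor_logs, args=(log_file, alert_func), daemon=True
    )
    watcher.start()
    return watcher
```

With this, `start_log_watcher("/var/log/app.log", notify)` runs alongside the scheduler loop at the end of the post.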
4. Database Backup (Saves 30 min/week)
```python
import subprocess
import gzip
from datetime import datetime
from pathlib import Path

def backup_db(db_name, backup_dir="/backups"):
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    backup_path = Path(backup_dir) / f"{db_name}_{timestamp}.sql.gz"
    # check=True: don't write an empty "backup" if pg_dump fails
    dump = subprocess.run(["pg_dump", db_name], capture_output=True, check=True)
    with gzip.open(backup_path, "wb") as f:
        f.write(dump.stdout)
    # Keep only the 30 most recent backups
    for old in sorted(Path(backup_dir).glob(f"{db_name}_*.sql.gz"))[:-30]:
        old.unlink()
```
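A backup is only as good as its restore path, so test that too. A matching restore sketch (assumes the target database already exists; the `psql` argument is a hypothetical knob I added so the plumbing can be exercised with a different command):

```python
import gzip
import subprocess

def restore_db(backup_path, db_name, psql=("psql",)):
    # Decompress the dump in Python and feed it to the client via stdin.
    with gzip.open(backup_path, "rb") as f:
        sql = f.read()
    return subprocess.run([*psql, db_name], input=sql, check=True)
```

Run it against a scratch database now and then; the first time you need a restore is the wrong time to discover the dumps are empty.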
5. PR Review Reminder (Saves 1 hr/week)
```python
import requests
from datetime import datetime, timedelta, timezone

def check_stale_prs(owner, repo, token):
    url = f"https://api.github.com/repos/{owner}/{repo}/pulls"
    headers = {"Authorization": f"token {token}"}
    prs = requests.get(url, headers=headers, timeout=10).json()
    now = datetime.now(timezone.utc)  # aware, to match the parsed timestamps
    return [
        pr for pr in prs
        if now - datetime.fromisoformat(pr["created_at"].replace("Z", "+00:00"))
        > timedelta(hours=24)
    ]
```
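To turn that list into an actual reminder, you need to format it for whatever channel you post to. A hypothetical helper (not part of the original script); the dict keys match the GitHub REST pull-request payload (`number`, `title`, `html_url`):

```python
def format_stale_pr_message(stale_prs):
    """Build a one-line-per-PR reminder from GitHub API PR dicts."""
    if not stale_prs:
        return "No stale PRs. Nice work!"
    lines = [f"{len(stale_prs)} PR(s) waiting on review for 24h+:"]
    for pr in stale_prs:
        lines.append(f"- #{pr['number']} {pr['title']} ({pr['html_url']})")
    return "\n".join(lines)
```

Being a pure function, it is trivial to unit-test without hitting the API.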
6. Daily Standup Prep (Saves 15 min/day)
```python
import subprocess

def standup_prep():
    yesterday = subprocess.run(
        ["git", "log", "--since=yesterday", "--oneline"],
        capture_output=True, text=True
    ).stdout
    return (
        f"## Yesterday\n{yesterday}\n"
        "## Today\nWorking on current sprint.\n"
        "## Blockers\nNone"
    )
```
Putting It All Together
```python
import schedule  # pip install schedule
import time

schedule.every().day.at("06:00").do(deploy)
schedule.every().monday.at("09:00").do(weekly_report, "/projects/main")
schedule.every(5).minutes.do(check_stale_prs, "myorg", "myrepo", TOKEN)
schedule.every().day.at("07:00").do(backup_db, "production")

while True:
    schedule.run_pending()
    time.sleep(60)
```
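One gotcha with `schedule`: an uncaught exception in any job propagates out of `run_pending()` and kills the whole loop. A small hypothetical `safe()` wrapper (my addition, not part of the library) keeps one failing job from taking the rest down:

```python
import functools
import traceback

def safe(job):
    # Wrap a job so its exceptions are logged instead of killing the loop.
    @functools.wraps(job)
    def wrapper(*args, **kwargs):
        try:
            return job(*args, **kwargs)
        except Exception:
            traceback.print_exc()
    return wrapper

# Usage: schedule.every().day.at("06:00").do(safe(deploy))
```

Wrap every `.do()` target this way and a flaky network call at 6 a.m. becomes a stack trace in the logs instead of a dead scheduler.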
Time Saved
- Auto-deploy: 2.5 hrs/week
- Weekly report: 1 hr/week
- Log monitoring: 2 hrs/week
- DB backup: 0.5 hr/week
- PR reminders: 1 hr/week
- Standup prep: 1.25 hrs/week
- Total: ~8 hrs/week from these six (15+ counting other automations)
For monitoring all this, I run Grafana on a mini PC -- $150.
Start Small
Do not automate everything at once. Pick the task you hate most. Write a 20-line script. Run it for a week. Then automate the next one.
Recommended: Automate the Boring Stuff with Python -- still the best intro to practical automation.
What do you automate? Share your scripts below.
Disclosure: Amazon affiliate links. I earn a small commission at no extra cost to you.