Alex Spinov

10 Python Automation Scripts That Save Me Hours Every Week

I used to waste 3 hours a day on repetitive tasks

Last year I was manually renaming files, checking APIs, parsing CSVs, and monitoring websites. Then I wrote 10 Python scripts that now run on autopilot.

Here's the exact list — with working code you can copy right now.


1. Bulk File Renamer (15 lines)

I had 2,000 client photos with names like IMG_3847.jpg and needed them renamed to project-001.jpg, project-002.jpg, and so on.

import os
from pathlib import Path

folder = Path('photos')
for i, f in enumerate(sorted(folder.glob('*.jpg')), 1):
    f.rename(folder / f'project-{i:03d}{f.suffix}')
    print(f'Renamed: {f.name} -> project-{i:03d}{f.suffix}')

Time saved: 45 minutes per batch.
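Renaming 2,000 files in one shot is nerve-wracking, so a dry run is worth a few extra lines. This sketch (the `plan_renames` name and its defaults are mine, not part of the script above) builds the rename plan without touching the disk so you can eyeball it first:

```python
from pathlib import Path

def plan_renames(folder, prefix='project', pattern='*.jpg'):
    """Build (old, new) path pairs without touching the disk: a dry run."""
    folder = Path(folder)
    return [(f, folder / f'{prefix}-{i:03d}{f.suffix}')
            for i, f in enumerate(sorted(folder.glob(pattern)), 1)]

# Review the plan first, then apply each pair with old.rename(new).
demo = Path('photos')
if demo.exists():
    for old, new in plan_renames(demo):
        print(f'{old.name} -> {new.name}')
```

Once the printed plan looks right, applying it is a one-line loop.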


2. API Health Checker

I monitor 12 APIs for a side project. This script pings all of them and sends me a Slack alert if any goes down.

import requests

endpoints = [
    'https://api.github.com',
    'https://jsonplaceholder.typicode.com/posts/1',
    'https://httpbin.org/get',
]

for url in endpoints:
    try:
        r = requests.get(url, timeout=5)
        status = '✅' if r.status_code == 200 else '⚠️'
        print(f'{status} {url} [{r.status_code}]')
    except requests.RequestException:
        print(f'❌ {url} [UNREACHABLE]')

Time saved: 20 minutes/day (no more manual checking).
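The Slack side isn't shown above, so here's a minimal sketch. It assumes you've created a Slack incoming webhook; the `SLACK_WEBHOOK` URL is a placeholder, and `build_alert`/`notify` are hypothetical helper names:

```python
import requests

# Placeholder: create an incoming webhook in Slack and paste its URL here.
SLACK_WEBHOOK = 'https://hooks.slack.com/services/XXX/YYY/ZZZ'

def build_alert(url, status):
    """Build the Slack message payload for a failing endpoint."""
    return {'text': f':rotating_light: {url} is down (status: {status})'}

def notify(url, status):
    """POST the alert to the webhook; call this from the except branch."""
    requests.post(SLACK_WEBHOOK, json=build_alert(url, status), timeout=5)
```

Drop a `notify(url, 'UNREACHABLE')` call into the failure path of the checker and you're done.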


3. CSV Deduplicator

Client sends me CSVs with duplicate rows every week. This fixes it in seconds.

import pandas as pd

df = pd.read_csv('data.csv')
before = len(df)
df = df.drop_duplicates()
print(f'Removed {before - len(df)} duplicates. {len(df)} rows remain.')
df.to_csv('data_clean.csv', index=False)

Time saved: 30 minutes per file.
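One caveat: `drop_duplicates()` with no arguments only removes rows identical in every column. If duplicates share a key (say, an email) but differ elsewhere, deduping on a subset is usually what you want. A small sketch with made-up data:

```python
import pandas as pd

# Made-up sample: two rows share an email but differ in signup date.
df = pd.DataFrame({
    'email': ['a@x.com', 'a@x.com', 'b@x.com'],
    'signup': ['2024-01-01', '2024-02-01', '2024-01-15'],
})

# Keep the most recent row per email: sort, then dedupe on the key only.
clean = (df.sort_values('signup')
           .drop_duplicates(subset='email', keep='last')
           .reset_index(drop=True))
print(clean)
```

`subset` controls which columns define a duplicate; `keep='last'` combined with the sort decides which copy survives.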


4. Website Change Monitor

I track competitor pricing pages. Run this hourly (cron or Task Scheduler works fine) and it flags any change since the last check.

import requests
import hashlib
from pathlib import Path

url = 'https://example.com/pricing'
hash_file = Path('page_hash.txt')  # persists state between runs

def check():
    content = requests.get(url, timeout=10).text
    current = hashlib.md5(content.encode()).hexdigest()
    previous = hash_file.read_text() if hash_file.exists() else None
    if previous and current != previous:
        print(f'🔔 Change detected on {url}!')
    hash_file.write_text(current)

check()

Time saved: Hours of manual page refreshing.
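For the emailing part, a minimal `smtplib` sketch. The addresses and the Gmail SMTP settings are assumptions; `build_email` and `send_alert` are my names, and you'd log in with an app password, never your real one:

```python
import smtplib
from email.message import EmailMessage

def build_email(url, sender='me@example.com', to='me@example.com'):
    """Build the alert email; the addresses here are placeholders."""
    msg = EmailMessage()
    msg['Subject'] = f'Change detected: {url}'
    msg['From'] = sender
    msg['To'] = to
    msg.set_content(f'The page at {url} changed since the last check.')
    return msg

def send_alert(url):
    # Assumed Gmail SMTP settings; adjust host/port for other providers.
    with smtplib.SMTP_SSL('smtp.gmail.com', 465) as server:
        server.login('me@example.com', 'app-password')
        server.send_message(build_email(url))
```

Call `send_alert(url)` from the change-detection branch instead of (or alongside) the print.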


5. Git Repo Stats Generator

I wanted a quick summary of any repo without reading through everything.

import subprocess

def repo_stats():
    commits = subprocess.getoutput('git rev-list --count HEAD')
    contributors = subprocess.getoutput('git shortlog -sn | wc -l').strip()
    first = subprocess.getoutput('git log --reverse --format=%ci | head -1')
    print(f'Commits: {commits}')
    print(f'Contributors: {contributors}')
    print(f'First commit: {first}')

repo_stats()

6. JSON Flattener

Nested JSON from APIs is painful. This flattens nested dicts into a flat dictionary with dotted keys.

def flatten(d, parent='', sep='.'):
    items = {}
    for k, v in d.items():
        key = f'{parent}{sep}{k}' if parent else k
        if isinstance(v, dict):
            items.update(flatten(v, key, sep))
        else:
            items[key] = v
    return items

nested = {'user': {'name': 'Alex', 'address': {'city': 'Berlin'}}}
print(flatten(nested))
# {'user.name': 'Alex', 'user.address.city': 'Berlin'}
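The version above only descends into dicts, so lists inside the JSON are left untouched. A variant (I've named it `flatten_json` to keep it distinct) that also walks lists, turning indices into path segments:

```python
def flatten_json(value, parent='', sep='.'):
    """Flatten nested dicts and lists; list indices become path segments."""
    if isinstance(value, dict):
        pairs = value.items()
    elif isinstance(value, list):
        pairs = enumerate(value)
    else:
        return {parent: value}
    items = {}
    for k, v in pairs:
        key = f'{parent}{sep}{k}' if parent else str(k)
        if isinstance(v, (dict, list)):
            items.update(flatten_json(v, key, sep))
        else:
            items[key] = v
    return items

print(flatten_json({'user': {'name': 'Alex', 'tags': ['dev', 'python']}}))
# {'user.name': 'Alex', 'user.tags.0': 'dev', 'user.tags.1': 'python'}
```

One edge case to be aware of: empty dicts and lists disappear entirely, since they produce no leaf values.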

7. Screenshot Any Webpage

For client reports, I need screenshots of live pages. Playwright does this in a few lines.

from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page(viewport={'width': 1280, 'height': 720})
    page.goto('https://news.ycombinator.com')
    page.screenshot(path='screenshot.png')
    browser.close()
    print('Screenshot saved!')

8. Email Attachment Downloader

I get invoices as email attachments. This script downloads them all to a folder.

import imaplib
import email
from pathlib import Path

mail = imaplib.IMAP4_SSL('imap.gmail.com')
mail.login('you@gmail.com', 'app-password')
mail.select('inbox')

_, nums = mail.search(None, 'SUBJECT "invoice"')
for num in nums[0].split()[-5:]:  # Last (most recent) 5
    _, data = mail.fetch(num, '(RFC822)')
    msg = email.message_from_bytes(data[0][1])
    for part in msg.walk():
        if part.get_filename():
            Path('invoices').mkdir(exist_ok=True)
            Path(f'invoices/{part.get_filename()}').write_bytes(
                part.get_payload(decode=True))
            print(f'Downloaded: {part.get_filename()}')

9. Markdown Table Generator

I write a lot of docs. This converts any list of dicts to a markdown table.

def to_markdown(data):
    headers = data[0].keys()
    lines = ['| ' + ' | '.join(headers) + ' |']
    lines.append('| ' + ' | '.join(['---'] * len(headers)) + ' |')
    for row in data:
        lines.append('| ' + ' | '.join(str(row[h]) for h in headers) + ' |')
    return '\n'.join(lines)

data = [
    {'Tool': 'Python', 'Stars': '60k', 'Use': 'Everything'},
    {'Tool': 'Node.js', 'Stars': '100k', 'Use': 'Web'},
]
print(to_markdown(data))
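One gotcha with the generator above: a literal `|` in any cell breaks the table layout. A variant (the `to_markdown_safe` name is mine) that escapes pipes and tolerates missing keys:

```python
def to_markdown_safe(data):
    """Like to_markdown, but escapes '|' and tolerates missing keys."""
    def esc(v):
        return str(v).replace('|', '\\|')
    headers = list(data[0].keys())
    lines = ['| ' + ' | '.join(esc(h) for h in headers) + ' |']
    lines.append('| ' + ' | '.join(['---'] * len(headers)) + ' |')
    for row in data:
        lines.append('| ' + ' | '.join(esc(row.get(h, '')) for h in headers) + ' |')
    return '\n'.join(lines)

print(to_markdown_safe([{'Tool': 'grep|sed', 'Use': 'Text'}]))
```

Rows missing a header's key render as an empty cell instead of raising a `KeyError`.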

10. System Resource Monitor

A quick snapshot of CPU, RAM, and disk usage, handy for keeping an eye on servers.

import psutil

print(f'CPU: {psutil.cpu_percent()}%')
print(f'RAM: {psutil.virtual_memory().percent}%')
print(f'Disk: {psutil.disk_usage("/").percent}%')

# Bonus: top 5 processes by memory
for p in sorted(psutil.process_iter(['name', 'memory_percent']),
                key=lambda x: x.info['memory_percent'] or 0,
                reverse=True)[:5]:
    print(f"  {p.info['name']}: {p.info['memory_percent']:.1f}%")
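To turn the snapshot into an actual alert, compare each reading against a limit. The thresholds below are arbitrary, and `over_threshold` is a hypothetical helper that works on any metrics dict:

```python
THRESHOLDS = {'cpu': 90.0, 'ram': 85.0, 'disk': 90.0}  # assumed limits

def over_threshold(metrics, thresholds=THRESHOLDS):
    """Return the names of metrics at or above their limit."""
    return [name for name, value in metrics.items()
            if value >= thresholds.get(name, 100.0)]

try:
    import psutil
    live = {
        'cpu': psutil.cpu_percent(interval=0.1),
        'ram': psutil.virtual_memory().percent,
        'disk': psutil.disk_usage('/').percent,
    }
    print(over_threshold(live))
except ImportError:
    pass  # psutil not installed; the check still works on any plain dict
```

Pipe the non-empty result into whatever alert channel you already have (the Slack webhook from script 2 works nicely here).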

Why I'm sharing this

I'm a developer who builds automation tools and web scrapers. I've published 77 scrapers on Apify Store and write about Python productivity.

If you want more scripts like these — follow me here or check out my GitHub where I open-source everything.

What's your most useful Python automation script? Drop it in the comments — I'm always looking for new ideas to add to my toolkit.


If you're building automation tools and need a developer — reach out. I help companies build data pipelines, scrapers, and API integrations.


More from me: 10 Dev Tools I Use Daily | 77 Scrapers on a Schedule | 150+ Free APIs
