5 Python Automation Scripts That Save Me Hours Every Week

Practical recipes you can steal and customize today


We all have those repetitive tasks that eat into our productive hours. The meeting notes that need formatting. The logs that need parsing. The files that need organizing. After years of writing automation scripts, I've distilled my collection down to the ones I actually use daily.

Here are five automation recipes that genuinely save me hours every week—complete with code you can copy and customize.

1. Auto-Organize Downloads Folder

My downloads folder used to be a graveyard of random files. This script watches the folder and automatically sorts files into subdirectories based on type.

```python
import shutil
from pathlib import Path
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

DOWNLOAD_DIR = Path.home() / "Downloads"
FILE_CATEGORIES = {
    "Images": [".jpg", ".jpeg", ".png", ".gif", ".webp", ".svg"],
    "Documents": [".pdf", ".doc", ".docx", ".txt", ".xlsx", ".csv"],
    "Code": [".py", ".js", ".ts", ".json", ".yaml", ".sh"],
    "Archives": [".zip", ".tar", ".gz", ".rar", ".7z"],
    "Videos": [".mp4", ".mkv", ".avi", ".mov"],
}

class DownloadHandler(FileSystemEventHandler):
    def on_created(self, event):
        if event.is_directory:
            return
        self._sort(Path(event.src_path))

    def on_moved(self, event):
        # Browsers often download to a temp name (.part, .crdownload)
        # and rename when finished, which fires a "moved" event.
        if event.is_directory:
            return
        self._sort(Path(event.dest_path))

    def _sort(self, file_path: Path):
        ext = file_path.suffix.lower()
        for category, extensions in FILE_CATEGORIES.items():
            if ext in extensions:
                dest_dir = DOWNLOAD_DIR / category
                dest_dir.mkdir(exist_ok=True)
                shutil.move(str(file_path), str(dest_dir / file_path.name))
                print(f"Moved {file_path.name} to {category}/")
                break

if __name__ == "__main__":
    observer = Observer()
    observer.schedule(DownloadHandler(), str(DOWNLOAD_DIR), recursive=False)
    observer.start()
    print("Watching Downloads folder...")
    try:
        observer.join()
    except KeyboardInterrupt:
        observer.stop()
        observer.join()
```

Time saved: ~15 minutes/week of manual file organization.
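One gap worth knowing: the watcher only reacts to files that arrive while it is running, so anything already sitting in the folder stays put. A one-shot sweep can clear the backlog. This is a sketch under the same category-mapping idea as the watcher (the mapping is abbreviated to two categories here; `sweep` is my name, not part of the script above):

```python
import shutil
from pathlib import Path

FILE_CATEGORIES = {
    "Images": [".jpg", ".jpeg", ".png", ".gif", ".webp", ".svg"],
    "Documents": [".pdf", ".doc", ".docx", ".txt", ".xlsx", ".csv"],
}

def sweep(folder: Path) -> int:
    """Move every existing file into its category subfolder; return count moved."""
    moved = 0
    for file_path in folder.iterdir():
        if file_path.is_dir():
            continue
        ext = file_path.suffix.lower()
        for category, extensions in FILE_CATEGORIES.items():
            if ext in extensions:
                dest_dir = folder / category
                dest_dir.mkdir(exist_ok=True)
                shutil.move(str(file_path), str(dest_dir / file_path.name))
                moved += 1
                break
    return moved
```

Run it once before starting the watcher and the two together cover both old and new files.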

2. Meeting Notes Formatter

I take rough notes during meetings, then this script transforms them into a clean, structured format using local AI.

```python
import subprocess
import sys
from datetime import datetime

def format_notes(raw_notes: str) -> str:
    prompt = f"""Format these meeting notes into a clean structure:
- Add a summary section (2-3 sentences)
- Organize into Topics Discussed, Action Items, and Decisions Made
- Keep it concise

Raw notes:
{raw_notes}"""

    result = subprocess.run(
        ["ollama", "run", "llama3:8b", prompt],
        capture_output=True,
        text=True
    )
    if result.returncode != 0:
        raise RuntimeError(f"ollama failed: {result.stderr.strip()}")
    return result.stdout

if __name__ == "__main__":
    if len(sys.argv) < 2:
        sys.exit("Usage: python format_notes.py <raw_notes_file>")

    with open(sys.argv[1]) as f:
        raw = f.read()

    formatted = format_notes(raw)

    output_file = f"formatted_{datetime.now():%Y%m%d_%H%M}.md"
    with open(output_file, "w") as f:
        f.write(formatted)

    print(f"Formatted notes saved to {output_file}")
```

Time saved: ~20 minutes/week on post-meeting cleanup.
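When Ollama isn't installed on a machine, a deterministic fallback can still impose rough structure. This is a sketch under an assumption of mine, not something the script above requires: that action items in the raw notes start with markers like `TODO` or `ACTION`:

```python
def rough_format(raw_notes: str) -> str:
    """Split raw notes into action items and everything else (no AI involved)."""
    actions, topics = [], []
    for line in raw_notes.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.upper().startswith(("TODO", "ACTION")):
            actions.append(line)
        else:
            topics.append(line)
    parts = ["## Topics Discussed"] + [f"- {t}" for t in topics]
    parts += ["", "## Action Items"] + [f"- {a}" for a in actions]
    return "\n".join(parts)
```

It won't write a summary the way the model does, but it keeps the pipeline working offline.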

3. Git Repo Health Check

Before starting work, this script scans all my project directories and tells me which repos have uncommitted changes, unpushed commits, or are behind remote.

```python
import subprocess
from pathlib import Path

PROJECTS_DIR = Path.home() / "projects"

def check_repo(path: Path) -> dict:
    def git(*args):
        result = subprocess.run(
            ["git", "-C", str(path)] + list(args),
            capture_output=True, text=True
        )
        return result.stdout.strip()

    status = git("status", "--porcelain")
    branch = git("branch", "--show-current")

    # Fetch, then count commits behind/ahead of the remote branch
    git("fetch", "--quiet")
    ahead_behind = git("rev-list", "--left-right", "--count", f"origin/{branch}...HEAD")

    return {
        "path": path.name,
        "branch": branch,
        "uncommitted": len(status.splitlines()) if status else 0,
        "ahead_behind": ahead_behind,
    }

def main():
    print("Git Repository Health Check")
    print("=" * 50)

    for repo in PROJECTS_DIR.iterdir():
        if (repo / ".git").exists():
            info = check_repo(repo)
            issues = []

            if info["uncommitted"] > 0:
                issues.append(f"{info['uncommitted']} uncommitted changes")

            if info["ahead_behind"] and info["ahead_behind"] != "0\t0":
                issues.append(f"sync needed: {info['ahead_behind']}")

            if issues:
                print(f"\n{info['path']} ({info['branch']})")
                for issue in issues:
                    print(f"  - {issue}")

if __name__ == "__main__":
    main()
```

Time saved: Prevents the "oh no, I forgot to commit that" panic. Priceless.
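The raw `ahead_behind` string is a tab-separated count pair from `git rev-list --left-right --count origin/<branch>...HEAD`: the left number is commits only on the remote (you're behind by that many), the right is commits only on HEAD (ahead). A small helper can translate it into plain English in the report; this is a sketch, and `describe_sync` is my name for it:

```python
def describe_sync(ahead_behind: str) -> str:
    """Turn git's 'behind<TAB>ahead' count pair into a readable summary."""
    behind, ahead = (int(n) for n in ahead_behind.split("\t"))
    if behind == 0 and ahead == 0:
        return "in sync"
    parts = []
    if ahead:
        parts.append(f"{ahead} to push")
    if behind:
        parts.append(f"{behind} to pull")
    return ", ".join(parts)
```

Swapping it into the `issues.append(...)` line makes the output readable at a glance.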

4. Log File Analyzer

When debugging, I often need to extract patterns from large log files. This script finds the most common errors and their frequency.

```python
import re
import sys
from collections import Counter

ERROR_PATTERNS = [
    r"ERROR.*?:(.*?)(?:\n|$)",
    r"Exception.*?:(.*?)(?:\n|$)",
    r"FAILED:(.*?)(?:\n|$)",
]

def analyze_logs(log_file: str, top_n: int = 10):
    with open(log_file, errors="replace") as f:
        content = f.read()

    errors = []
    for pattern in ERROR_PATTERNS:
        matches = re.findall(pattern, content, re.IGNORECASE)
        errors.extend(m.strip()[:100] for m in matches)  # Truncate long messages

    counter = Counter(errors)

    print(f"Top {top_n} errors in {log_file}:")
    print("-" * 60)

    for error, count in counter.most_common(top_n):
        print(f"{count:>5}x | {error}")

if __name__ == "__main__":
    if len(sys.argv) < 2:
        sys.exit("Usage: python analyze_logs.py <log_file>")
    analyze_logs(sys.argv[1])
```

Time saved: ~30 minutes every time I'm debugging a production issue.
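One caveat: `f.read()` loads the whole file into memory, which hurts on multi-gigabyte logs. A streaming variant can tally matches line by line instead. This is a sketch with a single combined pattern of my own devising (it accepts any iterable of strings, so a file handle works too):

```python
import re
from collections import Counter
from typing import Iterable

# One combined pattern standing in for the ERROR/Exception/FAILED list above
LINE_PATTERN = re.compile(r"(?:ERROR|Exception|FAILED)\s*.*?:\s*(.*)", re.IGNORECASE)

def count_errors(lines: Iterable[str]) -> Counter:
    """Tally error messages without holding the whole log in memory."""
    counter = Counter()
    for line in lines:
        match = LINE_PATTERN.search(line)
        if match:
            counter[match.group(1).strip()[:100]] += 1
    return counter
```

Passing an open file handle (`count_errors(open("app.log"))`) keeps memory flat no matter the log size.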

5. Scheduled Backup with Notification

This one runs via cron and backs up important directories to an external drive, then sends a notification.

```python
import subprocess
import shutil
from datetime import datetime
from pathlib import Path

BACKUP_DIRS = [
    Path.home() / "projects",
    Path.home() / "Documents",
    Path.home() / ".config",
]
BACKUP_DEST = Path("/mnt/backup")

def backup():
    if not BACKUP_DEST.exists():
        raise SystemExit(f"Backup destination not mounted: {BACKUP_DEST}")

    timestamp = datetime.now().strftime("%Y%m%d_%H%M")
    backup_path = BACKUP_DEST / f"backup_{timestamp}"

    for source in BACKUP_DIRS:
        if source.exists():
            dest = backup_path / source.name
            shutil.copytree(source, dest, ignore=shutil.ignore_patterns(
                "node_modules", ".git", "__pycache__", "*.pyc", ".venv"
            ))

    # Compress, then drop the uncompressed copy
    archive = shutil.make_archive(str(backup_path), "gztar", backup_path)
    shutil.rmtree(backup_path)

    # Clean old backups (keep last 7)
    backups = sorted(BACKUP_DEST.glob("backup_*.tar.gz"))
    for old in backups[:-7]:
        old.unlink()

    return archive

if __name__ == "__main__":
    result = backup()
    print(f"Backup complete: {result}")

    # Send a desktop notification (Linux)
    subprocess.run(["notify-send", "Backup Complete", result])
```

Time saved: Peace of mind (and the hours you'd spend recovering lost work).

Key Takeaways

  1. Start small. Pick one repetitive task and automate it. The compound effect is real.

  2. Use what's already there. Most of these scripts use standard library modules. No need for complex dependencies.

  3. Make it visible. Print output, send notifications. You'll trust automation more when you can see it working.

  4. Schedule it. The best automation runs without you. Use cron, systemd timers, or Task Scheduler.

  5. Iterate. My download organizer started with 3 categories. Now it handles 15. Let your scripts grow with your needs.

What repetitive task is eating your time? That's your next automation target.


What automation scripts save you the most time? Drop a comment below—I'm always looking for new recipes to add to my collection.
