Seena Sabti
Use Local LLMs to Eliminate Little Annoying Tasks

Over the past year, I’ve been slowly moving many of the little repetitive tasks in my engineering workflow over to local LLMs. These are the tiny chores that show up dozens of times a day and quietly wear you down. Automating them away has been a real blessing.

If you’re a software engineer who wants to eliminate the tedious parts and move faster, then I hope sharing some of my scripts inspires you to streamline your own workflow as well.

The core of my setup is Ollama, which runs several code-focused local models. You do need a reasonably powerful machine to run the higher-parameter models. On my M4 Mac, these two have been fantastic:

  • qwen2.5-coder:7b runs extremely fast and is more than enough for most tasks.
  • qwen2.5-coder:14b is a bit slower, but has the best overall reasoning and code quality.

Running models locally gives you automatic privacy, instant responses, and no API costs or external dependencies. Perfect for experimentation and heavy usage.
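Before wiring anything up, it helps to verify that Ollama is running and that the models are actually pulled. Here is a minimal sketch that queries Ollama's standard `/api/tags` model-listing endpoint (assuming the default localhost:11434 address; `available_models` is my own helper name):

```python
import requests

OLLAMA_BASE = "http://localhost:11434"


def available_models(base_url: str = OLLAMA_BASE) -> list:
    """Return the names of locally pulled Ollama models, or [] if the server is unreachable."""
    try:
        resp = requests.get(f"{base_url}/api/tags", timeout=5)
        resp.raise_for_status()
    except requests.RequestException:
        return []
    return [m["name"] for m in resp.json().get("models", [])]


if __name__ == "__main__":
    models = available_models()
    if models:
        print("Available models:", ", ".join(models))
    else:
        print("Ollama is not running, or no models are pulled.")
```

Running this once before a batch of scripted calls saves you from confusing connection errors mid-workflow.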

The Scripts

The following code snippet is a small Python script I use to automatically generate commit messages. It pulls the diff from staged files, prompts qwen2.5-coder:14b, and produces a one-line commit message. I commit often, which also helps the LLM by keeping the context window small and focused.

#!/usr/bin/env python3

import subprocess
import json
import sys
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "qwen2.5-coder:14b"


def get_staged_diff():
    result = subprocess.run(
        ["git", "diff", "--cached"],
        capture_output=True,
        text=True,
        check=True,  # fail loudly if git itself errors out
    )
    return result.stdout.strip()


def generate_commit_message(diff_text):
    payload = {
        "model": MODEL,
        "prompt": (
            "Generate a concise git commit message for this diff. "
            "Return ONLY a JSON object containing: {\"message\": \"...\"}.\n\n"
            f"Diff:\n{diff_text}"
        ),
        "format": {
            "type": "object",
            "properties": {
                "message": {"type": "string"}
            },
            "required": ["message"]
        },
        "stream": False,
        "options": {"temperature": "0"}
    }

    response = requests.post(OLLAMA_URL, json=payload)
    response.raise_for_status()
    data = response.json()

    try:
        obj = json.loads(data["response"])
        return obj["message"]
    except Exception:
        print("Error parsing structured output.\nFull response:")
        print(data)
        sys.exit(1)


def git_commit(message):
    subprocess.run(["git", "commit", "-m", message])


def main():
    diff = get_staged_diff()
    if not diff:
        print("No staged changes.")
        sys.exit(1)

    print("Generating AI commit message...")
    message = generate_commit_message(diff)

    print(f"\nCommit message:\n{message}\n")
    git_commit(message)

    print("Committed successfully!")


if __name__ == "__main__":
    main()
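One small hardening step worth considering (not in the script above): models occasionally wrap the message in quotes, add a trailing period, or spill onto multiple lines, so a quick cleanup pass before committing keeps the log tidy. A sketch, with `clean_message` as a hypothetical helper you could call between generation and `git_commit`:

```python
def clean_message(message: str) -> str:
    """Normalize a model-generated commit message: collapse to one line,
    strip wrapping quotes, and drop a trailing period."""
    msg = " ".join(message.split())          # collapse newlines and extra spaces
    msg = msg.strip().strip('"').strip("'")  # drop wrapping quotes
    return msg.rstrip(".")                   # drop a trailing period


print(clean_message('"Fix off-by-one in pagination."'))
# → Fix off-by-one in pagination
```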

The following snippet is another Python script I use to automatically generate PR descriptions. It compares my feature branch against a base branch, extracts the diffs, and sends them to qwen2.5-coder:14b running on Ollama. The model summarizes the technical changes into a short, high-signal PR description, which I then publish through the GitHub CLI. I also make sure to read the description to confirm the model captured the intent fully, which it usually does.

#!/usr/bin/env python3

import os
import sys
import subprocess
import re
import json
import requests
import textwrap
import tempfile

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "qwen2.5-coder:14b"

MY_GITHUB_USER = ""
PR_LABELS = []  


def run_git(*args) -> str:
    result = subprocess.run(
        ["git", *args],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        print(f"Error running git {' '.join(args)}:\n{result.stderr}")
        sys.exit(1)
    return result.stdout.strip()


def get_current_branch() -> str:
    return run_git("rev-parse", "--abbrev-ref", "HEAD")


def get_changed_files(base: str, head: str):
    out = run_git("diff", "--name-only", f"{base}...{head}")
    return [line for line in out.splitlines() if line.strip()]


def get_file_diff(base: str, head: str, path: str) -> str:
    result = subprocess.run(
        ["git", "diff", f"{base}...{head}", "--", path],
        capture_output=True,
        text=True,
    )
    return result.stdout.strip()


def generate_description(formatted_changes: str, pr_title: str) -> str:
    prompt = textwrap.dedent(f"""
    You are helping write the description for a GitHub Pull Request.

    PR title:
    "{pr_title}"

    Below is a formatted list of files and their diffs between the base branch and this feature branch.

    Your task:
    - Write a concise and short description explaining what this PR does.
    - Focus on behavior changes, intent, and key technical points.
    - Assume this will go under a heading 'What does this PR do?' already.
    - DO NOT include headings, bullet lists, or checkboxes.
    - DO NOT mention that you are an AI or describe the process.
    - Just output a few sentences or short paragraphs of description.
    - Be as short and concise as possible without sacrificing meaning or quality.

    Here are the changes:

    {formatted_changes}
    """)

    payload = {
        "model": MODEL,
        "prompt": prompt,
        "stream": False,
        "options": {"temperature": 0},
    }

    response = requests.post(OLLAMA_URL, json=payload)
    response.raise_for_status()
    data = response.json()
    desc = data.get("response", "").strip()
    if not desc:
        print("Ollama returned an empty description.")
        sys.exit(1)

    return desc


def create_pr_with_gh_cli(title: str, body: str, base: str = "main"):
    with tempfile.NamedTemporaryFile(mode="w", delete=False) as tmp:
        tmp.write(body)
        tmp_path = tmp.name

    cmd = [
        "gh", "pr", "create",
        "--title", title,
        "--body-file", tmp_path,
        "--base", base,
    ]

    if MY_GITHUB_USER:  # only self-assign if a username is configured
        cmd.extend(["--assignee", MY_GITHUB_USER])

    for label in PR_LABELS:
        cmd.extend(["--label", label])

    print("Running:", " ".join(cmd))
    result = subprocess.run(cmd, text=True)

    if result.returncode != 0:
        print("gh pr create failed.")
        sys.exit(result.returncode)

    print("PR created via GitHub CLI.")


def main():
    if len(sys.argv) < 2:
        print("Usage: ai_github_pr.py \"PR Title\" [--base main]")
        sys.exit(1)

    args = sys.argv[1:]
    base = "main"

    if "--base" in args:
        idx = args.index("--base")
        try:
            base = args[idx + 1]
        except IndexError:
            print("Error: --base requires a branch name argument.")
            sys.exit(1)

        args = args[:idx] + args[idx + 2:]

    if not args:
        print("Error: missing PR title.")
        sys.exit(1)

    raw_title = args[0]
    head = get_current_branch()

    if not raw_title.startswith(f"[{head}]"):
        title = f"[{head}] {raw_title}"
    else:
        title = raw_title

    files = get_changed_files(base, head)
    print(f"Changed files: {', '.join(files)}")
    if not files:
        print(f"No changed files between {base}...{head}.")
        sys.exit(1)

    chunks = []
    for path in files:
        diff = get_file_diff(base, head, path)
        if not diff:
            continue
        chunk = f"=== File: {path} ===\n\n{diff}\n\n"
        chunks.append(chunk)

    formatted_changes = "\n".join(chunks)
    if not formatted_changes.strip():
        print("No meaningful diffs found to describe.")
        sys.exit(1)

    print("Generating PR description from Ollama...")
    description = generate_description(formatted_changes, title)

    template = """#### What does this PR do?

{description}
"""

    body = template.format(
        description=description,
        branch_name=head,
    )

    print("\n--- Generated PR Body Preview ---\n")
    print(body)
    print("\n---------------------------------\n")

    answer = input("Create PR with this body via gh? [y/N]: ").strip().lower()
    if answer not in ("y", "yes"):
        print("Aborted.")
        sys.exit(0)

    create_pr_with_gh_cli(title=title, body=body, base=base)


if __name__ == "__main__":
    main()
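On large branches, the combined diffs can exceed the model's context window and degrade the summary. A simple guard I'd consider adding is capping each file's diff before it goes into a chunk. This is a sketch under assumed limits (`MAX_CHARS_PER_FILE` and `truncate_diff` are hypothetical names, and 4000 characters is just a starting point to tune for your model):

```python
MAX_CHARS_PER_FILE = 4000  # rough cap; tune for your model's context window


def truncate_diff(diff: str, limit: int = MAX_CHARS_PER_FILE) -> str:
    """Trim an oversized diff, keeping the head and noting how much was cut."""
    if len(diff) <= limit:
        return diff
    omitted = len(diff) - limit
    return diff[:limit] + f"\n... [{omitted} characters truncated] ..."
```

In the chunk-building loop, you would wrap the call as `truncate_diff(get_file_diff(base, head, path))`. Keeping the head of the diff works reasonably well because the hunk headers and early context usually carry the intent.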

I also use a set of Python scripts that query my company's custom "Jira-like" system. They pull project details, divide the items into actionable tasks, embed linked reference material, and compile everything into a "Cursor instruction document". These scripts are tightly tailored to my workflow, so I'm not sharing them here.

I hope the first two scripts are useful to you, since many of us share similar workflows. And even if they’re not a perfect fit, I hope they at least inspire you to automate the boring parts of your day and free up more time for meaningful, enjoyable work.
