DEV Community

Darian Vance

Posted on • Originally published at wp.me

Solved: Automate Dependency Updates: Creating a Custom Renovate-like Script

🚀 Executive Summary

TL;DR: Manually updating dependencies across numerous microservices is a significant time sink and prone to errors for DevOps engineers. This guide presents a custom Python script that automates the process of identifying outdated packages, creating a dedicated Git branch, committing the necessary updates, and opening a pull request, thereby reclaiming valuable engineering time.

🎯 Key Takeaways

  • Custom dependency update automation can be achieved using Python by leveraging pip list --outdated --format=json for reliable identification of outdated packages.
  • The GitPython library facilitates programmatic Git operations, including repository cloning, creating new branches, modifying files like requirements.txt, committing changes, and pushing the updated branch to the remote.
  • Pull request creation is managed through direct API calls to the Git provider using the requests library, requiring a Personal Access Token (PAT) with appropriate repository write permissions for authentication.

Automate Dependency Updates: Creating a Custom Renovate-like Script

Hey there, Darian Vance here. As a Senior DevOps Engineer at TechResolve, I’m always looking for ways to reclaim time. I used to spend my Monday mornings manually running checks for outdated packages across a dozen microservices. It was a tedious, error-prone time sink. That’s when I realized we could automate the most repetitive part: finding the outdated dependency, branching, and opening a pull request. This little script I’m about to show you saved my team hours every week and let us focus on what actually matters.

While tools like Renovate or Dependabot are fantastic, sometimes you need something simpler, more custom, or you’re in an environment where you can’t install third-party apps. This guide will help you build your own lightweight version from scratch.

Prerequisites

  • Python 3 installed on the machine that will run the script.
  • A good understanding of how your project manages Python dependencies (e.g., a requirements.txt file).
  • A Git provider like GitHub or GitLab.
  • A Personal Access Token (PAT) for your Git provider with repository write access. Treat this like a password!

The Guide: Step-by-Step

Step 1: Project Setup and Configuration

I’ll skip the usual project setup commands for creating directories or Python virtual environments. I trust you have your own preferred workflow for that. The important part is to create a dedicated folder for our script.

Inside that folder, you’ll need to install a few Python libraries. You can do this with pip, for example: pip install requests GitPython python-dotenv.

Next, create a configuration file named config.env to store our secrets. This keeps them out of the script itself, which is a critical security practice.

# config.env
GIT_PROVIDER_API_URL="https://api.github.com"
REPO_OWNER="your-username-or-org"
REPO_NAME="your-repo-name"
GIT_TOKEN="your_personal_access_token_here"
REPO_URL="https://your-username:${GIT_TOKEN}@github.com/${REPO_OWNER}/${REPO_NAME}.git"
MAIN_BRANCH="main"

Pro Tip: Notice how I’m embedding the token directly in the REPO_URL. This is a common pattern for CI/CD environments that makes authentication with Git seamless for the script. Just be sure this config.env file is in your .gitignore and never committed to source control.

Step 2: The Core Logic – Checking for Updates

Let’s create our Python script; call it update_checker.py. The first thing we need to do is check for outdated packages. The most reliable way I’ve found is to use pip’s own machinery.

import subprocess
import json
import os
from dotenv import load_dotenv

def get_outdated_packages():
    """Checks for outdated pip packages and returns them as a list of dicts."""
    print("Checking for outdated packages...")
    try:
        # Using --format=json is much easier to parse than the default output
        result = subprocess.run(
            ['python3', '-m', 'pip', 'list', '--outdated', '--format=json'],
            capture_output=True,
            text=True,
            check=True
        )
        outdated_packages = json.loads(result.stdout)
        if not outdated_packages:
            print("All packages are up to date. Nothing to do.")
            return []
        print(f"Found {len(outdated_packages)} outdated package(s).")
        return outdated_packages
    except (subprocess.CalledProcessError, json.JSONDecodeError) as e:
        print(f"Error checking for outdated packages: {e}")
        return []

if __name__ == '__main__':
    load_dotenv('config.env')
    updates = get_outdated_packages()
    if updates:
        # We will process the first one for simplicity in this tutorial
        package_to_update = updates[0]
        print(f"Next step would be to update {package_to_update['name']} to version {package_to_update['latest_version']}")

The logic here is straightforward: we run the pip list --outdated command but with the crucial --format=json flag. This gives us structured data we can easily loop through, which is far more robust than trying to parse plain text columns.
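To make the parsing step concrete, here is the shape of the JSON that pip prints. The package and versions shown are illustrative, not from a real run:

```python
import json

# Illustrative sample of what `pip list --outdated --format=json` emits;
# the actual packages and versions will differ on your machine.
sample_output = '''[
  {"name": "requests", "version": "2.28.0",
   "latest_version": "2.31.0", "latest_filetype": "wheel"}
]'''

for pkg in json.loads(sample_output):
    print(f"{pkg['name']}: {pkg['version']} -> {pkg['latest_version']}")
```

Each entry carries the current version and latest_version fields our script reads later.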

Step 3: Git Operations – Branching and Committing

Now for the fun part. Once we have an outdated package, we need to clone the repo, create a new branch, update the requirements.txt file, and commit the change. The GitPython library makes this incredibly clean.

# Add these imports to the top of your script
import re
from git import Repo, Actor

def update_requirements_file(repo_path, package_name, new_version):
    """Updates the version for a specific package in requirements.txt."""
    requirements_path = os.path.join(repo_path, 'requirements.txt')
    print(f"Updating {package_name} in {requirements_path}...")
    with open(requirements_path, 'r+') as f:
        lines = f.readlines()
        f.seek(0)
        f.truncate()
        for line in lines:
            # Replace an '==' or '~=' specifier with a pin to the new version
            if line.lower().startswith(package_name.lower()):
                updated_line = re.sub(r'(==|~=).*', f'=={new_version}', line, flags=re.IGNORECASE)
                f.write(updated_line)
                print(f"Updated line: {updated_line.strip()}")
            else:
                f.write(line)
    return True

def create_git_branch_and_commit(package_name, new_version):
    """Clones repo, creates branch, updates file, and commits."""
    repo_url = os.getenv('REPO_URL')
    repo_name = os.getenv('REPO_NAME')
    main_branch = os.getenv('MAIN_BRANCH')
    local_repo_path = f"./{repo_name}"

    # Clean up previous clone if it exists
    if os.path.exists(local_repo_path):
        import shutil
        shutil.rmtree(local_repo_path)

    print(f"Cloning {repo_name}...")
    repo = Repo.clone_from(repo_url, local_repo_path, branch=main_branch)

    branch_name = f"deps/update-{package_name}-to-{new_version}"
    print(f"Creating new branch: {branch_name}")
    new_branch = repo.create_head(branch_name)
    new_branch.checkout()

    if not update_requirements_file(local_repo_path, package_name, new_version):
        return None # Failed to update file

    print("Committing changes...")
    repo.git.add(update=True)
    author = Actor("Automated Updater", "bot@techresolve.com")
    commit_message = f"chore(deps): update {package_name} to {new_version}"
    repo.index.commit(commit_message, author=author, committer=author)

    print("Pushing branch to origin...")
    origin = repo.remote(name='origin')
    origin.push(refspec=f'{branch_name}:{branch_name}')

    return branch_name
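One caveat worth knowing about update_requirements_file: startswith also matches packages that merely share a prefix, so updating requests would rewrite a requests-oauthlib line too. Here is a stricter matcher, sketched as a hypothetical helper (not part of the script above) that anchors on the exact, PEP 503-normalized name:

```python
import re

def bump_requirement_line(line, package_name, new_version):
    """Rewrite the pin for `package_name` only when the line names that
    exact package. PEP 503 treats runs of '-', '_' and '.' in package
    names as interchangeable, so we normalize before comparing."""
    canonical = re.sub(r'[-_.]+', '-', package_name).lower()
    # Match a package name at the start of the line followed by a
    # version specifier -- not merely a shared prefix.
    match = re.match(r'^\s*([A-Za-z0-9._-]+)\s*(==|~=)', line)
    if match and re.sub(r'[-_.]+', '-', match.group(1)).lower() == canonical:
        return re.sub(r'(==|~=).*', f'=={new_version}', line)
    return line
```

If prefix collisions bite you, you could swap this in for the startswith check inside the loop in update_requirements_file.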

Step 4: Creating the Pull Request

With our branch pushed, the final step is to open a pull request. We’ll use the requests library to make a POST request to our Git provider’s API.

# Add this import
import requests

def create_pull_request(branch_name, package_name, new_version):
    """Creates a pull request on the Git provider."""
    api_url = os.getenv('GIT_PROVIDER_API_URL')
    owner = os.getenv('REPO_OWNER')
    repo = os.getenv('REPO_NAME')
    token = os.getenv('GIT_TOKEN')
    main_branch = os.getenv('MAIN_BRANCH')

    pr_url = f"{api_url}/repos/{owner}/{repo}/pulls"
    headers = {
        'Authorization': f'token {token}',
        'Accept': 'application/vnd.github.v3+json',
    }
    payload = {
        'title': f'chore(deps): Bump {package_name} to {new_version}',
        'head': branch_name,
        'base': main_branch,
        'body': f'This is an automated PR to update the `{package_name}` dependency to version `{new_version}`.',
    }

    print(f"Creating pull request for branch: {branch_name}")
    response = requests.post(pr_url, headers=headers, json=payload)

    if response.status_code == 201:
        print("Successfully created pull request.")
        print(f"URL: {response.json().get('html_url')}")
    else:
        print(f"Failed to create pull request. Status: {response.status_code}")
        print(f"Response: {response.text}")

Step 5: Putting It All Together and Scheduling

Now, let’s update our main execution block to run this whole workflow. We’ll process one dependency at a time to keep things simple and avoid merge conflicts between our own automated PRs.

# Update your if __name__ == '__main__': block
import sys  # add this import at the top of your script

if __name__ == '__main__':
    load_dotenv('config.env')

    updates = get_outdated_packages()
    if not updates:
        print("Exiting.")
        sys.exit(0)

    # Process only the first outdated package found
    package_to_update = updates[0]
    pkg_name = package_to_update['name']
    latest_version = package_to_update['latest_version']

    print(f"\nProcessing update for {pkg_name} to {latest_version}...")

    branch = create_git_branch_and_commit(pkg_name, latest_version)
    if branch:
        create_pull_request(branch, pkg_name, latest_version)
    else:
        print("Failed to create commit. Aborting PR creation.")

To run this automatically, you can use a scheduler like cron. To run it every Monday at 2 AM, for example, you would add a line to your cron table. Use absolute paths, since cron jobs don’t start in your project directory:

0 2 * * 1 cd /path/to/your/project && python3 update_checker.py

Common Pitfalls

Here’s where I’ve stumbled in the past, so hopefully you can avoid these pitfalls:

  • API Rate Limiting: If you run this script too frequently or against too many repos, you might hit your Git provider’s API rate limit. Check their documentation and schedule your script accordingly.
  • Existing PRs: This simple script doesn’t check if a PR for that dependency already exists. A more advanced version would query for open PRs from your bot user before creating a new one.
  • Complex Dependencies: My requirements.txt updater is simple. It won’t handle complex version specifiers like package>=1.0,<2.0. For that, you might need a more robust parsing library. Start simple and add complexity only when you need it.
  • Permissions: The number one issue is an invalid or improperly scoped Personal Access Token. Double-check that your token has repo or write permissions.
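For the existing-PRs pitfall above, GitHub’s list-pulls endpoint accepts a head filter in owner:branch form, so the script can check before opening a duplicate. A sketch under that assumption — the helper names here are mine, not part of the script above:

```python
import requests

def build_pr_query(api_url, owner, repo, branch_name):
    """Build the URL and query params for listing open PRs whose head
    branch is `branch_name` (GitHub expects the `owner:branch` form)."""
    url = f"{api_url}/repos/{owner}/{repo}/pulls"
    params = {'state': 'open', 'head': f'{owner}:{branch_name}'}
    return url, params

def pr_exists_for_branch(api_url, owner, repo, token, branch_name):
    """Return True if an open PR already exists for this branch."""
    url, params = build_pr_query(api_url, owner, repo, branch_name)
    headers = {
        'Authorization': f'token {token}',
        'Accept': 'application/vnd.github.v3+json',
    }
    response = requests.get(url, headers=headers, params=params)
    response.raise_for_status()
    # The endpoint returns a JSON array; non-empty means a PR exists.
    return len(response.json()) > 0
```

You could call pr_exists_for_branch just before create_pull_request and skip the POST when it returns True.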

Conclusion

And there you have it. A fully functional, albeit simple, dependency update bot. It handles the most annoying parts of dependency management for you, freeing you up to solve bigger problems. This script is a great foundation. You can expand it to handle different package managers, post notifications to Slack, or integrate it into a more extensive CI/CD pipeline. Happy automating!



👉 Read the original article on TechResolve.blog


☕ Support my work

If this article helped you, you can buy me a coffee:

👉 https://buymeacoffee.com/darianvance
