When cloning Python projects that I want to test or use for a demo, I often find out that the repository uses a requirements.txt file instead of a modern pyproject.toml. While requirements.txt has served the Python community well for years, I've come to appreciate the speed and simplicity of uv, a fast Python package manager that uses the standardized pyproject.toml format. But converting between these formats manually? That can be tedious and error-prone, especially when dealing with complex dependency specifications.
That's why I built requirements-to-uv: a command-line tool that automatically converts Python projects from requirements.txt to uv-managed pyproject.toml with a single command. Whether you're modernizing your own projects or quickly setting up cloned repositories, this tool handles the conversion details so you can focus on actually working with the code.
In this post, I'll walk you through what makes this conversion non-trivial, show you how to use the tool in seconds, and explain some of the intelligent logic that makes it work reliably across different project structures.
Why Move from requirements.txt to uv?
Before diving into the conversion process, it's worth understanding why this matters. The requirements.txt format has been the standard for dependency management in Python for many years, but it has limitations. For example, there's no standardized way (apart from file names) to separate development dependencies from production ones. And different tools interpret the format slightly differently, leading to inconsistencies.
Enter pyproject.toml: a standardized file introduced in PEP 518 and extended by PEP 621 to consolidate project metadata and dependencies in one place. When combined with uv, you get lightning-fast dependency resolution and installation. The uv tool provides a consistent development workflow across projects and uses lock files for truly reproducible environments.
The transition from requirements.txt to pyproject.toml with uv isn't just about following trends—it's about faster builds, more reliable environments, and better project organization.
The Challenge: The requirements.txt File Is More Complex Than It Looks
At first glance, converting a requirements.txt file to pyproject.toml seems straightforward. Just copy the package names and versions, right? Not quite. The requirements.txt files you can find in the wild are surprisingly complex, and a naive conversion would lose important information or break dependencies entirely.
Consider what a real-world requirements file might contain:
# Standard packages with version specifiers
requests>=2.28.0
flask[async]>=3.0.0
# Git dependencies with specific branches
git+https://github.com/user/repo.git@main#egg=mypackage
-e git+ssh://git@github.com/user/another.git@develop
# Local path dependencies
-e ./local-package
../another-package
# Environment markers for conditional installation
pytest>=8.0.0 ; python_version >= "3.8"
# Poetry-style caret constraints that aren't valid PEP 440 specifiers
django^4.2.0
# Package extras
celery[redis,msgpack]>=5.3.0
Each of these patterns requires different handling in the pyproject.toml format. Git dependencies need to be split into a regular dependency entry and a separate [tool.uv.sources] section that specifies the repository URL and branch. Poetry-style caret (^) constraints need to be converted to the equivalent range syntax. Local paths require special source declarations. Environment markers need to be preserved exactly.
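To make the caret rule concrete, here's a rough sketch of how such a conversion can work. This is an illustrative function I wrote for this post, not the tool's actual implementation, but it follows the standard caret semantics: the upper bound bumps the leftmost non-zero version component.

```python
def caret_to_range(version: str) -> str:
    """Convert a Poetry-style caret constraint like '^4.2.0'
    into the equivalent PEP 440 range specifier."""
    parts = [int(p) for p in version.lstrip("^").split(".")]
    # Pad to three components (major, minor, patch)
    while len(parts) < 3:
        parts.append(0)
    major, minor, patch = parts[:3]
    # The upper bound bumps the leftmost non-zero component
    if major > 0:
        upper = f"{major + 1}.0.0"
    elif minor > 0:
        upper = f"0.{minor + 1}.0"
    else:
        upper = f"0.0.{patch + 1}"
    lower = ".".join(str(p) for p in parts[:3])
    return f">={lower},<{upper}"

print(caret_to_range("^4.2.0"))  # >=4.2.0,<5.0.0
print(caret_to_range("^0.3.1"))  # >=0.3.1,<0.4.0
```

Note how `^0.3.1` caps at `<0.4.0` rather than `<1.0.0`: for pre-1.0 versions, the minor version is treated as the breaking-change boundary.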
Beyond parsing complexity, there's also the challenge of project structure. Many repositories have multiple requirements files: requirements.txt for production, requirements-dev.txt for development tools, requirements-test.txt for testing frameworks, and so on. A proper conversion should detect these patterns and organize them into the appropriate dependency groups in pyproject.toml.
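The detection itself can be as simple as mapping file-name suffixes to group names. The snippet below is a simplified sketch of the idea (the function name and exact group names are my own, not necessarily what the tool uses internally):

```python
import re

def dependency_group(filename: str) -> str:
    """Map a requirements file name to a dependency group name,
    e.g. requirements-dev.txt -> 'dev'. A plain requirements.txt
    (or requirements-prod.txt) maps to the main dependencies."""
    match = re.match(r"requirements[-_.]?(\w+)?\.txt$", filename)
    if not match or match.group(1) in (None, "prod"):
        return "main"          # production dependencies
    return match.group(1)      # e.g. 'dev', 'test', 'docs'

print(dependency_group("requirements.txt"))       # main
print(dependency_group("requirements-dev.txt"))   # dev
```

Everything that lands in a non-main group goes into `[dependency-groups]` in the generated pyproject.toml, so `uv sync --group dev` installs exactly what the old requirements-dev.txt described.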
Intelligent Metadata Detection
One of the tool's most useful features is automatic project metadata detection. When you run req2uv in a Python project directory, it doesn't just convert dependencies—it tries to build a complete, valid pyproject.toml by gathering information from multiple sources.
For the project name, the tool starts with the current directory name and normalizes it according to Python packaging standards (replacing spaces and special characters with hyphens). For the version, it searches for version declarations in __init__.py files, checks setup.py if present, looks at git tags, and falls back to 0.1.0 if nothing else is found. The Python version requirement is detected from .python-version files, setup.py classifiers, or defaults to the currently running Python version. The description comes from the first line of your README file, and author information is pulled from your git configuration.
This intelligent detection means you're not starting with a minimal skeleton file—you get a properly structured pyproject.toml that actually describes your project.
Smart Merging with Existing Files
Not every project is starting from scratch. Sometimes you're modernizing a project that already has a partial pyproject.toml file, perhaps created manually or by another tool. The requirements-to-uv tool handles this scenario carefully.
When a pyproject.toml already exists, the tool merges new information without destroying what's there. It preserves existing metadata and configuration sections, appends new dependencies to existing lists (detecting and warning about duplicates), and creates a backup file (pyproject.toml.backup) before making changes. The merge logic understands the structure of TOML files and dependency declarations, so manual customizations aren't lost during the conversion.
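The duplicate detection hinges on comparing normalized package names rather than raw strings, so that `requests>=2.28.0` and `Requests` are recognized as the same package. Here's a minimal sketch of that idea (my own simplified version, not the tool's actual merge code):

```python
import re

def merge_dependencies(existing: list[str], new: list[str]) -> tuple[list[str], list[str]]:
    """Merge new dependency strings into an existing list,
    returning (merged, duplicates). Matching is by normalized
    package name, so 'requests>=2.28' and 'Requests' collide."""
    def name(spec: str) -> str:
        # The package name is everything before the first
        # extras bracket, operator, marker, or URL separator
        return re.split(r"[\[<>=!~;@\s]", spec, maxsplit=1)[0].lower().replace("_", "-")

    seen = {name(dep) for dep in existing}
    merged, duplicates = list(existing), []
    for dep in new:
        (duplicates if name(dep) in seen else merged).append(dep)
        seen.add(name(dep))
    return merged, duplicates

merged, dupes = merge_dependencies(["requests>=2.28.0"], ["Requests", "flask>=3.0"])
print(dupes)   # ['Requests']
```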
Handling Edge Cases and Limitations
Python packaging has accumulated many special features over the years, and not all of them translate cleanly to the modern pyproject.toml format. The tool handles these cases transparently while keeping you informed.
Package hashes (the --hash=sha256:... format) aren't supported in pyproject.toml because uv relies on lock files for reproducibility instead. The tool strips them out but generates a comment explaining the change. Custom package indexes specified with --index-url or --extra-index-url also can't be stored directly in pyproject.toml, so the tool adds a comment with the original URL and you can configure the index through uv's CLI or configuration file instead.
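Stripping hashes is mechanically simple, but the key design point is returning a signal alongside the cleaned line so the converter knows to emit an explanatory comment. A sketch of that shape (illustrative, not the tool's code):

```python
def strip_hashes(line: str) -> tuple[str, bool]:
    """Remove --hash=... options from a requirements line.
    Returns the cleaned line and whether any hashes were
    removed, so the caller can add an explanatory comment."""
    parts = line.split()
    had_hashes = any(p.startswith("--hash=") for p in parts)
    # Drop hash options and any trailing backslash continuations
    kept = [p for p in parts if not p.startswith("--hash=") and p != "\\"]
    return " ".join(kept), had_hashes

cleaned, had = strip_hashes("requests==2.28.0 --hash=sha256:abc123")
print(cleaned)  # requests==2.28.0
```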
SSH git URLs present an interesting challenge. While GitHub, GitLab, and Bitbucket SSH URLs can be automatically converted to their HTTPS equivalents, URLs from other hosts generate warnings since the conversion might not be straightforward. For these cases, you may need to adjust the generated file manually.
The tool handles these limitations gracefully: it does the conversion, preserves as much information as possible, explains what couldn't be directly translated, and provides guidance on how to handle special cases in the uv ecosystem.
Getting Started in 30 Seconds
The tool is designed to get you working quickly. Installation is straightforward using uv itself:
# Install as a global uv tool
uv tool install git+https://github.com/danilop/requirements-to-uv.git
Then navigate to any Python project with a requirements.txt file and run:
req2uv
The tool automatically detects your project structure, finds all requirements files, gathers metadata, and generates a complete pyproject.toml. It runs in interactive mode by default, showing you what it found and asking for confirmation before writing files. Once the conversion is complete, you can immediately use uv to install dependencies:
uv sync
For CI/CD pipelines or scripts, you can use non-interactive mode:
req2uv --non-interactive
You can also preview what the tool would do without making changes:
req2uv --dry-run
Real-World Usage Patterns
After using this tool across various projects, certain patterns have proven particularly valuable. When modernizing existing projects, I start with a dry run to see what the tool will generate. This helps catch any issues before committing to changes. The tool creates a backup of any existing pyproject.toml, but I also like to commit my current state to git first, just to be safe.
For repositories with multiple requirements files, the automatic detection and categorization saves significant time. The tool recognizes common patterns like requirements-dev.txt, requirements-test.txt, and requirements-docs.txt, and organizes them into appropriate dependency groups. This structure aligns well with uv's dependency group feature, making it easy to install just the dependencies you need for a particular task.
When cloning open-source projects that still use the older format, I typically run req2uv as my first step after cloning. This lets me work with the project using my preferred tools without needing to maintain both formats or deal with inconsistencies between pip and uv behavior.
Architecture: Simple and Focused
The tool is built with a clear focus on doing one thing well. It uses the packaging library for parsing requirements and handling version specifiers according to Python packaging standards. Click provides the command-line interface with automatic help generation and parameter validation. Rich handles terminal formatting for readable output and progress indication. And questionary powers the interactive prompts when running in interactive mode.
The core logic separates concerns cleanly: a parser module handles requirements.txt parsing and normalization, a detector module gathers project metadata from various sources, and a generator module creates valid TOML structures. This modular design makes the tool maintainable and makes it easier to extend with new features.
Best Practices for Conversion
Working with various project structures, I've found a few practices particularly helpful. Before running the conversion, it's worth reviewing your requirements files to ensure they're current and removing any commented-out dependencies that you no longer need. This gives you a clean starting point.
After conversion, I recommend reviewing the generated pyproject.toml, especially the [tool.uv.sources] section if you have git or path dependencies. While the tool handles most cases automatically, some scenarios—like private git repositories or unusual URL patterns—might need manual adjustment.
It's also helpful to test the converted dependencies immediately by running uv sync and verifying that your application still works as expected. This catches any edge cases early while the conversion process is still fresh in your mind.
Wrapping Up
Converting Python projects from requirements.txt to modern pyproject.toml with uv support doesn't have to be a manual chore. The requirements-to-uv tool handles the complexity of parsing various dependency formats, intelligently detects project metadata, and generates complete, valid project files.
Whether you're modernizing your own projects or quickly setting up repositories you've cloned, this tool helps you move to a faster, more standardized Python development workflow. The complete code is available on GitHub, and contributions are welcome.
Give it a try the next time you encounter a Python project still using requirements.txt—you might be surprised how much smoother your workflow becomes with modern Python tooling.