This is part of a series describing the different ways to manage a Python project. To find out what I mean by "dependency stack", check out the first post.
TL;DR
The vanilla stack is the most basic choice, using standard and recommended Python tools. As a result it is the most stable option, but also the hardest to use: you have to do most of the work yourself.
Components
- pip: The package installer for Python. Used to install dependencies.
- requirements.txt: The file used to declare dependencies for a project.
- dev_requirements.txt: Used to declare dependencies for developers (not consumers). Same format as requirements.txt.
- setup.py: File used to set metadata for and create your package.
- setuptools: Used for building your project.
- twine: Used to upload your project to a PyPI server.
- venv: Used for creating virtual environments.
Summary
The vanilla stack contains all the most basic tools to manage a Python project. When you Google "how to do X in Python", these are the tools you'll probably find references to. As you can see from the components list, there are a lot of moving pieces. Because most tasks are broken up into individual packages, this stack is both the most complicated to learn and the most flexible.
Development
TL;DR
The development experience is rough. If being productive at development time is your primary goal, pick a different stack (posts coming soon).
Defining Requirements
- Dev dependencies: With this stack, if you want to separate the dependencies required only for developers (like a testing framework) from the dependencies required to run your app, you have to put them in different files.
- Reproducibility: With the requirements format, you have to make a choice. You can either pin every dependency to a specific version (`== 1.2.3`), meaning you have to manually look for updates to packages if you want bug fixes, or specify only compatibility (`~= 1.2.3`), meaning you can't guarantee that all developers are working with the same versions (or that a deployed app will have the same versions!), leading to the dreaded "it works on my machine" problem.
- Adding new packages: There isn't any good way to add a new package; you just have to write it into your requirements file by hand (meaning you have to know what version you want ahead of time), then install from that requirements file.
- Alternative sources: You can definitely define requirements from all the usual places, but the syntax is sort of all over the place.
  - To add a private PyPI repo, you need to add a line to the top of your file that looks something like `--extra-index-url=https://pypi.org/simple/`
  - To add a Git dependency: `-e git+https://github.com/<repo>.git@<tag>#egg=<package_name>`
  - To install from a local path: `-e /path/to/package`
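Putting those pieces together, a requirements.txt that mixes both pinning styles and the alternative sources might look like this sketch (all package names, versions, and URLs below are placeholders, not real recommendations):

```
--extra-index-url=https://pypi.example.com/simple/

requests == 2.28.1    # pinned: reproducible, but updated by hand
flask ~= 2.2.0        # compatible release: picks up bug fixes automatically
-e git+https://github.com/example/somepackage.git@v1.0.0#egg=somepackage
-e ./libs/localpackage
```

Note that a single file can freely mix these forms, which is part of why the format reads as inconsistent.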
Virtual Environments
- Setup: Creating a virtual environment is pretty simple with Python 3 (it's a couple more steps with Python 2). You just run the venv module with the path you want it to put the venv at, like `python3 -m venv .venv`
- Usage: Actually doing things in a default virtual environment is monotonous, and differs depending on your platform. You'll find yourself typing the same commands over and over.
  - Installing requirements:
    - Windows: `.venv\Scripts\pip.exe install -r dev_requirements.txt`
    - macOS/Linux: `.venv/bin/pip install -r dev_requirements.txt`
  - Running a script:
    - Windows: `.venv\Scripts\python.exe script.py`
    - macOS/Linux: `.venv/bin/python script.py`
  - Activating the environment (to make running other commands simpler):
    - Windows: `.venv\Scripts\activate`
    - macOS/Linux: `source .venv/bin/activate`
  - Deactivating (both platforms): `deactivate`
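Those platform differences can be papered over with a little scripting. A minimal sketch using the standard library's `venv` module (the `.venv` path is just a convention):

```python
import os
import subprocess
import venv

# Create a virtual environment at .venv, equivalent to `python3 -m venv .venv`.
# with_pip=True would also bootstrap pip; it's left off here to keep the sketch fast.
venv.EnvBuilder(with_pip=False).create(".venv")

# The interpreter lives under Scripts\ on Windows and bin/ on macOS/Linux
bindir = "Scripts" if os.name == "nt" else "bin"
python = os.path.join(".venv", bindir, "python")

# Run a command with the venv's own interpreter directly -- no activation needed
result = subprocess.run(
    [python, "-c", "import sys; print(sys.prefix)"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # the venv's prefix, a path ending in .venv
```

Invoking the venv's interpreter by path, as above, is exactly what activation does for you implicitly, which is why activation is optional.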
Distribution
TL;DR
Contrary to the Zen of Python, there are multiple non-obvious and complicated methods to do distribution using the vanilla stack. If you're going to deploy source, this is fine. If you're going to build your project for any other sort of distribution, there are better options.
Build
- Definitions: All of the metadata about your project goes in a setup.py file which, as you may have guessed, is a Python script. Everything goes into a single function call, which is not a particularly readable format. You can declare requirements directly here, but it's better practice to point this file at your requirements.txt so you have one source of truth.
- Building: In order to actually package your project so you can distribute it, you'll probably need to run `python setup.py sdist bdist_wheel`, a super non-obvious command that you'll probably want to alias to make life easier. This will create a wheel and a source distribution for later use.
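A minimal setup.py that points at requirements.txt as the single source of truth might look like this sketch (the name, version, and description are placeholder values):

```python
# setup.py -- a minimal sketch; name, version, and description are placeholders
from pathlib import Path
from setuptools import find_packages, setup

# Read requirements.txt so it stays the single source of truth, skipping
# comments and option lines such as --extra-index-url
requirements = [
    line.strip()
    for line in Path("requirements.txt").read_text().splitlines()
    if line.strip() and not line.strip().startswith(("#", "-"))
]

setup(
    name="my-package",
    version="0.1.0",
    description="Example app packaged with the vanilla stack",
    packages=find_packages(),
    install_requires=requirements,
)
```

Note the filtering step: option lines and editable installs are valid in requirements.txt but not in `install_requires`, which is one more inconsistency you have to handle yourself.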
Deploy
- PyPI: Deploying to PyPI requires you to use yet another tool, this time twine. For the public PyPI repo, this looks like `python -m twine upload dist/*`. If you are uploading to a private PyPI repo, you'll need to have credentials in your environment variables, .pypirc, or keyring (again, multiple options with no clear winner) and add a `--repository-url` option.
- Source distribution: This is definitely the easiest way to deploy an app using this stack. Just copy the source to wherever it's going, install from requirements.txt, and you're done. No need to install additional tools before setup.
- Prebuilt dependencies: Almost as easy as source distribution, pip gives you a couple of easy methods to bundle your dependencies with your app. This is super useful if you're trying to build a desktop app or for multi-stage Docker builds. You can run `pip install --target build/ -r requirements.txt` to install all your requirements into a folder, or `pip download --dest build/ -r requirements.txt` (why dest instead of target?) to download the built requirements (like wheels) into a folder.
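The `--target` approach pairs naturally with a tiny bit of startup code in the app itself. A minimal sketch, assuming dependencies were installed into a `build/` folder next to the app (the folder name is hypothetical):

```python
import sys
from pathlib import Path

# Put the bundled dependency folder first on the import path, so packages
# installed with `pip install --target build/ -r requirements.txt` resolve
# before anything in the system site-packages.
deps_dir = Path.cwd() / "build"
sys.path.insert(0, str(deps_dir))

# From here on, `import somepackage` would be found in build/ first
print(sys.path[0])
```

This is essentially what makes the bundle self-contained: the target machine only needs a Python interpreter, not pip or network access.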
Conclusion
The vanilla stack is... fine. It includes a lot of different tools, inconsistent syntax, and subjectively hard to read file formats. It doesn't have a "lock" file like the other two stacks I'll post about in the future, meaning that dependencies are either less predictable or less usable. On the upside, a lot of the tools required (e.g. pip and venv) are included with most Python installs, so you don't have to go and get them. They're also well supported and "stable" tools (for some reason the pip developers like to remove features that I use).