Originally published on peateasea.de.
Setting up an existing Python project from a fresh clone shouldn't be a chore. Automate the process with a setup script.
Simple setup
How do I set up this project again? Do I use virtualenv or just the stdlib's venv module? How do I install the dependencies? Are the documented setup instructions still up to date?
These are just some of the questions that whiz through my mind when setting up a Python project, either from a fresh clone or when I need to start from scratch. One way I make my life easier is by having a setup script which handles all these things for me.
This idea is, of course, not new1 and I've been using a variation of what I present below for several years. The thing is, I found my solution to be suboptimal: a script called install-deps residing in a sub-subdirectory called devops/bin/. Although this solution worked, it still felt somehow clunky and inefficient. I remember seeing a post from @b0rk a while ago (that I unfortunately can't find anymore) which mentioned using a simple setup.sh script located in the project's base directory. This seemed like a much better solution and is the pattern I now like to follow.
Here's what I use:
#!/bin/bash
# setup.sh - set up virtual environment and install dependencies
# create venv if it doesn't already exist
if [ ! -d venv ]
then
python3 -m venv venv
fi
# shellcheck source=/dev/null # don't check venv activate script
source venv/bin/activate
pip install -r requirements.txt
# vim: expandtab shiftwidth=4 softtabstop=4
Thus, if I've moved a project to a new directory (and hence have to rebuild the venv) or if I've checked out a fresh clone onto a new machine, running
$ ./setup.sh
will get me up to speed quickly and simply.
Script dissection
For those brave souls who would like more detail, let's pick the script apart a bit.
Shebang
#!/bin/bash
The first line is the shebang line and ensures that we use bash when running the script. Bash has been my shell of choice for over 25 years, so it's a hard habit to drop. It works well enough for my needs and is still in active development, so there's been little pressure for me to change to something newer. I can't say I haven't tried something else though! Even so, I keep gravitating back to bash. Oh well.
Quick docs
# setup.sh - set up virtual environment and install dependencies
Next, there's a quick comment to say what's getting set up. This is more helpful in complex situations where extra information comes in handy. In the basic, simple situation shown here, it's probably not necessary, although it could be useful background information for onboarding new project members.
A familiar environment
# create venv if it doesn't already exist
if [ ! -d venv ]
then
python3 -m venv venv
fi
This snippet creates and initialises the virtual environment directory if it doesn't already exist. The square brackets test the condition within them and pass a true/false result to the if statement. This then handles which code to run depending upon the result it receives. The test condition checks for the absence of a directory (! -d; i.e. "not directory exists") called venv. If the directory doesn't exist, then we initialise the virtual environment within a directory called venv by using the venv module from the Python standard library.
Once upon a time, I used to use virtualenv to create the virtual environment but at some point switched to the venv module from the standard library. Although virtualenv does have more features, the standard module is sufficient for my purposes. Thus, I avoid having to install virtualenv separately as an operating-system-level prerequisite before initialising a Python project. Now, Python is often the only OS-level prerequisite, which is a nice simplification.
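For comparison, the two invocations look like this (the virtualenv call shown is just its standard usage; it has to be installed separately, e.g. via pipx or the OS package manager):
# third-party tool, extra install required
virtualenv venv
# standard library module, ships with Python 3
python3 -m venv venv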
Environment activation
# shellcheck source=/dev/null # don't check venv activate script
source venv/bin/activate
To install the Python requirements (the following step), we first need to activate the virtual environment. This is as simple as sourcing the appropriate file.2
The comment above the source line tells shellcheck3 not to check the activate script. This isn't my code, hence it doesn't make any sense to check it for linter issues.
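If you haven't come across shellcheck before, running it is as simple as pointing it at the script:
$ shellcheck setup.sh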
Dependencies installation
pip install -r requirements.txt
Now we're ready to do the actual hard work: installing the upstream Python dependencies. This step assumes that a file called requirements.txt exists in the base project directory.
Using a single requirements file is fine for small projects, or when starting a new project. Yet, as a project gets larger, it is useful to separate the development- and production-related requirements into separate files. In that case, it's a good idea to create a requirements/ directory in the base project directory and put the (appropriately named) requirements files in there. In such a situation the dependencies installation step would look like this:
pip install -r requirements/base.txt
to install the base dependencies only required in the production environment, or
pip install -r requirements/dev.txt
to install the development-related dependencies in addition to the base dependencies.
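One way to wire this up (the package names below are only placeholders) is to have dev.txt pull in the base requirements via pip's -r include, so installing the development dependencies automatically installs the production ones too:
# requirements/base.txt
requests>=2.31

# requirements/dev.txt
-r base.txt
pytest
ruff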
Vim standardisation
# vim: expandtab shiftwidth=4 softtabstop=4
The final line is the vim coda. This is an old habit but a useful one: it ensures that vim expands tabs to spaces and sets how far to indent the code. Although I define this in my main vim config, I also find it helpful to specify this information explicitly in source files.
Always ready to run
One small thing: set the executable bit on the script so that you can run it directly, i.e. without specifying an explicit interpreter. In other words, make the script executable like so:
$ chmod 755 setup.sh
Extend as needed; avoid unnecessary docs
With the basic structure in place, one can now extend it to more complex setup situations. This is one of the great things about using a script for this purpose: we push all the gory details and complexity down to a lower level of abstraction. Thus, we put complexity behind a simple interface which does what it says on the box: set things up. This interface is simple and, when used across multiple projects, consistent, thus reducing cognitive load. Your brain is now free to focus on more interesting things.
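As an example of that kind of extension, here's a sketch of how the script might grow; it uses the requirements/ split from above, while the pre-commit hook installation and .env handling are hypothetical additions made up purely for illustration:
#!/bin/bash
# setup.sh - set up virtual environment, dependencies and dev tooling

# create venv if it doesn't already exist
if [ ! -d venv ]
then
    python3 -m venv venv
fi

# shellcheck source=/dev/null # don't check venv activate script
source venv/bin/activate

# development dependencies pull in the production ones as well
pip install -r requirements/dev.txt

# hypothetical extras: install git hooks and create a local config from a template
pre-commit install
if [ ! -f .env ]
then
    cp .env.example .env
fi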
Putting these steps into a script makes the setup repeatable and automated; there's no need to keep detailed setup instructions in a README or similar document. By avoiding detailed setup documentation, one avoids such instructions getting out of date. Also, one reduces the risk of human error through missed steps or misspelled commands. The setup documentation is then simply "run the setup script". In other words: keep it simple.
Summing up
In short, dump any project setup details into a script and automate away your setup documentation.
1. And I'm definitely not the first to have thought of it! ↩
2. source is a bash built-in command and is equivalent to . (dot-space) in POSIX-compatible shells. I find source <script-name> easier to understand than . <script-name> and less easy to confuse with script execution, i.e. ./<script-name>. ↩
3. shellcheck is a linter for shell scripts. It's awesome. You should use it! ↩