In today's article, we'll be looking at Python virtual environments.
What they are, why we need them and how to create them.
In essence, a virtual environment is an isolated environment for Python projects. Basically, it's a container for your specific project.
For those familiar with node modules, I find it quite similar to that.
You can have modules installed globally, where every project can access them, or installed per project so that only that project can find them.
Why do we need virtual environments?
It's because of the way Python installs packages that we want to separate projects from one another.
In general, I find project-based environments a good choice anyway, since they narrow down the sources of potential errors.
Python can only have one version of a given package installed in an environment at a time. So, say both of our projects depend on `packageA`, but `projectA` needs `v1.0.0` and `projectB` needs `v2.0.0`.
If the package were installed globally, there would be no way to satisfy both requirements.
If we install it in each project's virtual environment instead, each project uses its own specified version.
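As a sketch of how that plays out (the package and project names here are hypothetical, and the commands used are covered step by step below), each project can pin its own version in a `requirements.txt` and install it into its own environment:

```bash
# projectA/requirements.txt pins: packageA==1.0.0
# projectB/requirements.txt pins: packageA==2.0.0

# Each project installs into its own virtual environment:
cd projectA && source .venv/bin/activate
pip install -r requirements.txt   # this environment gets packageA 1.0.0
deactivate

cd ../projectB && source .venv/bin/activate
pip install -r requirements.txt   # this environment gets packageA 2.0.0
deactivate
```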
Python virtual environments
For this article, I'll be using `venv`. However, there are multiple other tools that can create virtual environments for you.
Open up your project in a terminal and run the following command.
```bash
python -m venv .venv
```
The last argument, `.venv`, is the virtual environment's location and can be anything you want. In this case, a folder called `.venv` is created.
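If you're curious what landed in there, the folder typically holds the environment's own `python` and `pip` plus a place for packages (the exact layout varies by platform and Python version):

```bash
ls .venv
# bin  include  lib  pyvenv.cfg   <- typical contents on Linux/macOS
```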
We can activate this virtual environment by running the following command.
```bash
source .venv/bin/activate
```
You will see it's active when the terminal places the environment name in front of your prompt.
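It might look something like this (the exact form depends on your shell; the user name and path here are made up):

```bash
(.venv) user@machine:~/my-project$
```

On Windows, the activation script lives at `.venv\Scripts\activate` instead, and you can leave the environment at any time by running `deactivate`.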
Now, if we run `pip install`, it will install packages inside our virtual environment rather than globally.
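As a quick sketch of the day-to-day workflow (`requests` is just an example package, not something this article depends on):

```bash
pip install requests              # lands in .venv, not the global site-packages
pip freeze > requirements.txt     # record the environment's pinned versions
```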
Thank you for reading, and let's connect!
Feel free to subscribe to my email newsletter and connect on Facebook or Twitter.
Top comments
Virtual environments are one of the most important parts of setting up any Python project, and they're all too often skipped.
It was quite a learning curve for me. Is it comparable to why one would use Docker?
Docker does so much more, but as a way of virtualization, it kind of is.
As packages are developed and maintained, things change. Developers do their best to avoid breaking changes, but they are inevitable. If you are working on several different projects in Python, you may be in various stages of adopting these breaking changes. Some projects will be on the latest and greatest, while others may be working perfectly fine, nearly untouched, for a few years. These projects potentially will not work with the same set of packages/versions.
I think your analogy to a node_modules directory is good; it's just much smaller and less dynamic than node modules. Generally, there is a whole different set of concerns on the front end that drives a different package ecosystem. Python users generally do not care if a package is 14 kB or 12 kB; in fact, they probably won't even notice if it's 12 kB or 12 MB. This leads to a smaller set of packages that solve the same problem.