Top comments (19)
I use Poetry.
An Introduction to Poetry
Maximilian Burszley ・ Apr 1 ・ 7 min read
Thanks for the link, I've considered moving to poetry or pipenv. I like how lock files are a normal part of the process, unlike conda. Conda can do it, but it's not part of the typical workflow and requires an extra step.
Does Poetry also need virtual environments?
Yes, it handles management of them.
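Roughly, the Poetry 1.x flow looks like this (project and package names here are illustrative):

poetry new demo       # scaffold a project with a pyproject.toml
cd demo
poetry add requests   # resolves, locks, and installs into a virtualenv poetry creates for you
poetry env list       # show the environment poetry manages for this project
poetry run python     # run a command inside that environment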
I use poetry, here are examples:

wemake-services / wemake-python-package
Bleeding edge cookiecutter template to create new python packages.

Purpose: this project is used to scaffold a python project structure, just like poetry new but better.

Features: python3.7+; flake8 and wemake-python-styleguide for linting; travis or Github Actions as the default CI.

Installation: firstly, you will need to install dependencies; then, create the project itself.

Projects using it: here's a nice list of real-life open-source usages of this template.

License: MIT. See LICENSE for more details.
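For a cookiecutter template like this, the usual flow is something like the following (a sketch, assuming standard cookiecutter usage; the README may pin additional dependencies):

pip install cookiecutter
cookiecutter gh:wemake-services/wemake-python-package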
I use Conda. Whenever I start a new project, I create a new directory, cd into it, and then run a custom bash function called 'new', which creates a new Python 3.8 Conda environment named after the directory, activates it, and installs any requirements if a requirements.txt already exists.
I also had a function for cd which would activate an environment when cd-ing into a directory, if an environment with the directory's name existed. But in some cases I didn't want that to happen, so I stopped using it. Instead I have an 'activate' alias which will activate the correct Conda env.
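A sketch of what that 'new' helper might look like (the actual function wasn't shared here, so the naming and flags are assumptions based on the description):

new () {
    # name the env after the current directory
    local env_name="${PWD##*/}"
    # create a python 3.8 conda env and activate it
    conda create -y -n "$env_name" python=3.8
    source activate "$env_name"
    # install requirements if the file already exists
    [ -f requirements.txt ] && pip install -r requirements.txt
}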
I like the idea of the auto-activation, but I can see where it could cause some frustrations as well. I made a fuzzy conda environment activator with fzf to make it a bit less verbose.
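a () {
    # list conda envs, fuzzy-pick one with fzf, and activate the selection
    source activate "$(conda info --envs | fzf | awk '{print $1}')"
}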
I'll start with me: I used to keep everything in one environment until it burnt me one too many times, then I did a 180. I now keep EVERY project separate. I do not install anything for one project into another project's environment. I also do a ton of exploration; at one point I had 70 conda environments installed on my machine.
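conda create -n project python=3.8
source activate project
# ensure it activated
which pip
which python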
I use pipenv and poetry. Actually, I'm in the process of moving from pipenv to poetry in my projects.
Why the move from pipenv to poetry? If I wasn't in data science I would likely be using pipenv or poetry.
Pipenv has its issues. The stable release is very old. Poetry was unacceptable for some use cases before 1.0.0. Now it just rocks :)
Mostly I use virtualenv, but for some use cases I also use Docker.
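For reference, a typical virtualenv flow looks something like this (paths illustrative):

virtualenv .venv                   # create an isolated environment
source .venv/bin/activate          # activate it
pip install -r requirements.txt    # install project deps into it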
Bonus points for using Docker. I have never used Docker for development; how do you like it as a Python environment? It seems like it would be kinda big for a standard use case that doesn't need to run other components (databases, web servers, etc.), but that could be my lack of experience with it.
I do prefer using virtualenv because, as you said, Docker is a bit heavy. I also think it is harder to debug inside a Docker container. I usually use the VS Code debugger, which automatically picks up virtual environments. What I do like about Docker is that it is guaranteed to work the same in production. For example, multiprocessing in Python behaves differently on Windows and Linux, and I am working with both.
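One lightweight way to try Docker as a throwaway Python environment (a sketch; the image tag and mount path are illustrative):

# drop into a shell in an official python image with the project mounted
docker run --rm -it -v "$PWD":/app -w /app python:3.8 bash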
Poetry
Poorly. I try to keep things more or less organized (an env for this ML project, an env for that major module), but conda seems to blow itself up if I so much as look at it funny.
Lol, I have had my fair share of conda environment blow-ups! I have recently disabled pip inside my base environment to prevent some issues. Sometimes paths don't update correctly and you install things into base even while your prompt tells you your env is activated.
At least this way, even if you only use one env, you can easily wipe it and start over without a full re-install.
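Wiping and recreating is quick (env name illustrative):

conda env remove -n project           # delete the broken environment
conda create -n project python=3.8    # recreate it from scratch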
For overall Python work, I use conda.
For model deployment, I use bentoml to manage my deployed services.
I have never heard of bentoml; I'll have to check that out.