Top comments (19)
I use Poetry.
An Introduction to Poetry, by Maximilian Burszley (Apr 1, 7 min read)
Thanks for the link. I've considered moving to poetry or pipenv. I like how lock files are a normal part of the process, unlike conda. Conda can do it, but it's not part of the typical workflow and requires an extra step.
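For context, that extra step is a manual export; Poetry and pipenv write a lock file as a side effect of adding packages. A minimal sketch of the conda equivalent (env name is a placeholder):

```bash
# Pinning a conda env is a separate, manual step:
conda env export > environment.yml        # full env spec, including pip-installed packages
conda list --explicit > spec-file.txt     # exact package URLs (same platform only)

# Recreating from either file later:
conda env create -f environment.yml
conda create --name myenv --file spec-file.txt
```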
Does Poetry also need virtual environments?
Yes, it handles management of them.
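For anyone wondering what that management looks like in practice, a rough sketch (project name is made up); Poetry creates and reuses a virtualenv per project behind the scenes:

```bash
poetry new demo-project && cd demo-project
poetry add requests          # resolves, writes poetry.lock, installs into the project's venv
poetry env info              # shows which virtualenv Poetry created for this project
poetry run python -c "import requests; print(requests.__version__)"   # runs inside that venv

# Optional: keep the venv inside the project folder instead of a cache directory
poetry config virtualenvs.in-project true
```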
I use poetry, here are examples: wemake-services/wemake-python-package

wemake-python-package
Bleeding edge cookiecutter template to create new python packages.
Purpose
This project is used to scaffold a python project structure. Just like poetry new, but better.

Features
- python3.7+
- flake8 and wemake-python-styleguide for linting
- travis or Github Actions as the default CI

Installation
Firstly, you will need to install dependencies:
Then, create a project itself:
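The commands for this template look roughly like the following; the jinja2-git dependency is from memory, so double-check the repo's README:

```bash
pip install cookiecutter jinja2-git                     # template dependencies
cookiecutter gh:wemake-services/wemake-python-package   # answer the prompts to scaffold a project
```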
Projects using it
Here's a nice list of real-life open-source usages of this template.
License
MIT. See LICENSE for more details.
I use Conda. Whenever I start a new project, I create a new directory, cd into the directory, and then run a custom bash function called 'new' which creates a new Python 3.8 Conda environment with the name of the directory, activates it, and installs any requirements if the requirements.txt already exists.
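A sketch of what such a helper could look like; the body is my guess at the described behaviour, not the author's actual code, and it assumes conda's shell hook is initialised:

```bash
# Create a conda env named after the current directory, activate it,
# and install requirements.txt if it exists.
new() {
    local env_name
    env_name=$(basename "$PWD")
    conda create --yes --name "$env_name" python=3.8
    conda activate "$env_name"
    [ -f requirements.txt ] && pip install -r requirements.txt
}
```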
I also had a function for cd which would activate the environment when cd-ing into a directory, if there was an environment with the name of the directory. But in some cases I didn't want that to happen, so I stopped using it. Instead I have an 'activate' alias which will activate the correct Conda env.
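Something along these lines would do what the 'activate' helper describes; treat it as an illustrative reconstruction rather than the original:

```bash
# Activate the conda env whose name matches the current directory, if one exists.
activate() {
    local env_name
    env_name=$(basename "$PWD")
    if conda env list | grep -v '^#' | awk '{print $1}' | grep -qx "$env_name"; then
        conda activate "$env_name"
    else
        echo "No conda env named '$env_name'"
    fi
}
```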
I like the idea of the auto-activation, but can see where it could cause some frustrations as well. I made a fuzzy conda environment activator with fzf to make it a bit less verbose.
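A fuzzy activator like that can be just a few lines with fzf; this is a hedged sketch (the function name is arbitrary), not the exact tool mentioned:

```bash
# Pick a conda env interactively with fzf and activate it.
ca() {
    local env_name
    env_name=$(conda env list | grep -v '^#' | awk 'NF {print $1}' | fzf) || return
    conda activate "$env_name"
}
```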
I'll start with me: I used to keep everything in one environment until it burnt me one too many times, and then I did a 180. I now keep EVERY project separate. I do not install anything for one project into another project's environment. I also do a ton of exploration; at one point I had 70 conda environments installed on my machine.
I use pipenv and poetry. Actually, I'm in the process of moving from pipenv to poetry in my projects.
Why the move from pipenv to poetry? If I wasn't in data science I would likely be using pipenv or poetry.
Pipenv has its issues. The stable release is very old. Poetry was unacceptable for some use cases before 1.0.0. Now it just rocks:)
Mostly I use virtualenv, but for some use cases I also use Docker
Bonus points for using Docker. I have never used Docker for development; how do you like it as a Python environment? Seems like it would be kinda big for a standard use case without needing to run other components (databases, web servers, etc.), but that could be my lack of experience with it.
I do prefer using virtualenv because, as you said, Docker is a bit heavy. I also think it is harder to debug inside a Docker container. I usually use the VS Code debugger, which automatically picks up virtual environments. What I do like about Docker is that it is guaranteed to work the same in production. E.g. multi-processing in Python behaves differently on Windows and Linux, and I am working with both.
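For a sense of scale, dockerised development for plain Python work can be as light as a throwaway container with the project mounted in; the file names here are only placeholders:

```bash
# Run the project inside an official Python image, mounting the working copy
# so edits on the host are picked up immediately. Code runs on Linux regardless
# of the host OS, which also sidesteps the Windows/Linux multiprocessing differences.
docker run --rm -it \
    -v "$PWD":/app -w /app \
    python:3.10-slim \
    bash -c "pip install -r requirements.txt && python main.py"
```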
Poetry
Poorly. I try to keep things more or less organized (an env for this ML project, an env for that major module) but conda seems to blow itself up if I so much as look at it funny.
Lol I have had my fair share of conda environment blow ups! I have recently disabled pip inside of my base environment to prevent some issues. Sometimes paths don't update correctly and you install things in base even while your prompt tells you that you have your env activated.
At least this way, even if you only use one env, you can easily wipe it and start over without a full re-install.
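The wipe-and-rebuild step is cheap because only the one env goes away; roughly (env name and requirements file are placeholders):

```bash
conda deactivate
conda env remove --name myproject        # delete just this env, base stays intact
conda create --yes --name myproject python=3.8
conda activate myproject
pip install -r requirements.txt
```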
For overall Python work I use conda.
For model deployment, I use bentoml to manage my deployed services.
I have never heard of bentoml, I'll have to check that out.