I've been using Docker and Docker-Compose for more than two years and I really can see a lot of benefits when it comes to CI/CD pipelines and deployment.
On the other hand, it seems like the use of Docker can slow down the local development process. Just a few examples off the top of my head:
- more complicated setup (if you want to have auto-reloading and avoid building a new image on every change)
- running a single test can be much slower (the cost of bringing up the containers)
- cost of installing new packages
- etc.
What is your experience when it comes to using Docker for development? What are the tricks that can make it easier and more efficient? I'm curious to see your ideas :)
Top comments (44)
I personally do this weird bit where I always make sure that development is 'possible' in Docker: I make sure that everything spins up with a single command with `docker-compose`. Then I expose the DB container through a port and connect to it via my dev environment, or my IDE.
This means one could completely depend on docker if they choose to and partially depend on it if they want to.
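Roughly, a minimal sketch of that kind of setup (the service names, images, and host ports here are placeholders, not anyone's actual config):

```yaml
# docker-compose.yml - illustrative sketch only
version: "3.8"
services:
  app:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: dev
    ports:
      - "5432:5432"   # exposed so an IDE or other local tools can connect directly
```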
I think this is the ultimate. You provide all the pieces so other devs can pick and choose the things they want to run locally. You don't have to work 100% in containers to get the benefits.
I like this approach, thanks for sharing :)
How do you do it? Can you link to some tutorials on how to have a good local dev workflow?
For web projects, I do, regardless of how simple they are.
Pros I see: I `cd` into my project directory, start `tmux` so I can split `yakuake` into 4-5 windows in the same tab (one for docker, one for front-end, one for back-end, one for git, usually), run `docker-compose up`, and use that for development. I (usually) have one django+rest-framework container, one postgres, one node for dev to work on the front-end and serve it, and one nginx to route calls to `/api/` to DRF and the rest to Vue. Start Django and Quasar (Vue framework) in dev mode, and they monitor changes and rebuild automatically. Then I open the whole project folder in VS Code.
Cons: mainly the auto-reloading issue mentioned in the edit below.
If you want to look at one of my dev environments in practice, I have an OSS one here - although it is missing deployment instructions.
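In outline, that kind of stack can look something like the sketch below (service names, images, and commands are assumptions, not the actual project):

```yaml
# docker-compose.yml - rough sketch of the stack described above
version: "3.8"
services:
  api:                     # Django + DRF, running the dev server so it auto-reloads
    build: ./backend
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./backend:/app
    depends_on:
      - db
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: dev
  frontend:                # Quasar/Vue dev server, also watching for changes
    image: node:18
    working_dir: /app
    command: npx quasar dev   # or whatever the project's dev script is
    volumes:
      - ./frontend:/app
  nginx:                   # routes /api/ to Django and everything else to the Vue dev server
    image: nginx:alpine
    ports:
      - "8080:80"
    volumes:
      - ./nginx.conf:/etc/nginx/conf.d/default.conf:ro
    depends_on:
      - api
      - frontend
```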
Edit: I have fixed the auto-reloading issue in most of my projects; you just have to tell node what address to use for auto-reloading, which in Quasar you can do in `quasar.conf.js` with something like `devServer: { public: "http://localhost:<host port number>" }`, with the port you defined in `docker-compose.yml`.
Thanks for the detailed answer! I'm also a Python developer and use PyCharm as my main IDE (Neovim when doing small edits).
PyCharm allows you to easily select a single test class or a single test to run, but under the hood it effectively does a `docker-compose up` before running `python manage.py test path.to.test`.
Having to `docker-compose up` every time slows the whole process down by at least a few seconds. It does not sound like a lot, but it can be inconvenient.
As an alternative, I could do what you suggest: keep the project running with `docker-compose up` and execute a single test by calling `docker-compose exec <service> python manage.py test path.to.test`.
That runs much faster, but it has the downside that I have to manually type the test name (or the path to it).
That's why I'm looking for something that could combine both solutions when working with PyCharm.
Do you find your workflow with tmux + VS Code convenient? Do you just switch tabs between them or keep them on separate monitors?
I only have one monitor, but I do a ton of work in the shell, so I use `yakuake`. I just press `F12` on my keyboard, and the terminal slides down from the top of the screen, covering 90% of it (I can easily change that). I run my commands (e.g. `git commit -m "..." && git push`), press `F12` again to hide it, and go back to my browser where I can monitor my GitLab pipeline. (Or back to VS Code, or whatever I was doing.)
I also use KDE Plasma with virtual desktops, so I have VS Code on Desktop 2, Chrome on Desktop 1... `<ctrl+alt+left>` or `<ctrl+alt+right>` switches to the previous/next desktop, so jumping from VS Code to Chrome is also really fast...
Never tried PyCharm properly, but VS Code is really simple to configure; with a few extensions it works great (linting my Python and JS code).
I do run the test commands manually, but if they are complex enough, I can write an alias or bash script. I usually run Jest in watch mode anyway, so I run the command once when I start working, and often just let it run for days.
Pytest also has at least one watcher, but I haven't tried it. Since my terminal is practically always running, I type the command once, and then my flow is: `F12` to bring up `yakuake`, press the up arrow to repeat the last command (or a couple of times to re-run an earlier one), and press enter. Press `F12` again to hide the terminal. Sometimes I wait for the tests to finish, sometimes I just go do my stuff and look at the results later.
As for me, at my company we use Docker for local development because it just makes things easy. We work with distributed systems with different components, so it's a headache to set everything up and build it constantly.
So we containerize each component and just start and stop the containers as we need them. Plus it makes onboarding new developers less of a hassle.
That's one of the main benefits for me - it makes the onboarding process much easier.
Onboarding new developers was easy at my previous employer because of this. New hires were up and running/compiling everything and anything within an hour instead of a week.
Amen on the ease of on-boarding.
And that's it, the entire tech stack is ready to go.
I'm a TypeScript full-stack developer and tried to learn how to use docker-compose+docker locally and just found a lot of pain and misery. I couldn't even think of what I could gain besides defining what node-version everything runs in and being able to "boot up" a database locally easily.
I tried to learn how to use both Docker and docker-compose from scratch and just had a tough time. Most tutorials I found go into a simple hello-world setup, or dive deep into shell scripts and commands for vastly more advanced use-cases.
I ended up giving up on using Docker locally.
Right now, beyond pinning a Node version (I use nvm), I don't see much benefit in using either technology, besides being able to set up a database locally.
I'm sure I'm missing the point, but then as a "dev" that wants as little ops as possible, Docker wasn't good enough for me :'(
Avoiding the ops is a major selling point for Docker on the desktop. Once the image (or at the very least, the Dockerfile) is created, there's no need to install anything on the local system.
I used to run VMs so I could install tools (e.g. database servers, web servers, etc.) without screwing up my primary system. With Docker, all the installation is done in a container.
Need to switch back and forth between multiple versions of node? Shut down the old container, and spin up another with the other version and you're ready to go. The time to switch is measurable in minutes and the host system is unaffected.
There is a paradigm shift, because you have to figure out which files need to live outside the container and which ones don't, but it's much lighter-weight than VMs and faster than uninstalling/reinstalling every time you want to update part of your stack.
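For example (a sketch, not taken from the comment; the image tags and paths are arbitrary), switching Node versions is just a matter of pointing the service at a different image:

```yaml
# compose sketch: the Node version lives in the image tag, not on the host
services:
  app:
    image: node:18        # change to node:20 and re-run `docker-compose up` to switch
    working_dir: /app
    volumes:
      - .:/app            # project source stays on the host
    command: npm run dev
```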
My company uses an internal tool that wraps `docker-compose` to provide a bunch of extras, and we've moved everything across to that. Fundamentally, though, you can spin up each project using `docker-compose` if you want.
It's pretty good.
Because we're stuck using Macs, it's quite slow when there are a lot of files in a volume, though - maybe 10 times slower at anything I/O-bound than on a Linux machine.
Could you share example "extras" that your internal tool offers? :)
We wrap things in a set of proxies that allow us to use MailHog and local development domain names (nginx-proxy, currently, but Traefik also works).
I made a post a while back about extending things: "Extendable heroes" (Ben Sinclair, Jul 16 '18).
That means we have one command that takes custom subcommands, some global and some project-specific, for that sort of thing - just the housekeeping stuff everyone has to do, but with consistent commands between different projects which might be running different languages or frameworks.
We might open source our system at some point. That was always the intent, but nowadays there are other products which do the same sort of thing, so we wouldn't be adding anything to the dev community. We're just a little tied up in using our own system.
I exclusively use docker-compose for local dev. It slows nothing down - if anything, it streamlines it because I don't have to maintain the server environments on my host; docker takes care of that. All I have locally is nvm, go, and python - just because I use those for dogfood and other general tools, though.
I use a lot of bind mounts to keep my local project in sync, and that really takes the edge off. To help that along, I've set my local uid to 1000 and I have dev images that set the container uid to the same (translation: no permissions trouble 🦄)
docker-compose lets me do other fun things to make life good. My favorite is stacking YAML files for different tasks: base YAML + dev YAML exposes extra ports or adds dev-only tooling services; + test YAML swaps the db data volumes for controlled alternates (see the sketch below).
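Roughly, the stacking can look like this (three separate files sketched together below; names, images, and paths are illustrative, not the actual setup):

```yaml
# docker-compose.yml (base file)
services:
  app:
    build: .
    user: "1000:1000"              # match the host uid so bind-mounted files keep sane permissions
    volumes:
      - .:/app                     # bind mount keeps the container in sync with local edits
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: dev
    volumes:
      - dbdata:/var/lib/postgresql/data
volumes:
  dbdata:
---
# docker-compose.dev.yml (stacked on top for development)
# run with: docker-compose -f docker-compose.yml -f docker-compose.dev.yml up
services:
  app:
    ports:
      - "8000:8000"                # extra ports only exposed in dev
  adminer:
    image: adminer                 # dev-only tooling service
---
# docker-compose.test.yml (swaps the db data volume for a controlled alternate)
services:
  db:
    volumes:
      - dbdata_test:/var/lib/postgresql/data
volumes:
  dbdata_test:
```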
Using `rsync -a` I can mirror data volumes - basically the db equivalent of `git stash` ♥️
With the right mix of Ansible and Make, this is all no-effort. I can remember the days before docker, but I don't like to lol. Lots of things that are no-effort now were impractical or outright impossible then.
Oh, another bonus: I have a single global/privileged stack running that has Traefik and Portainer. All running project stacks are published through Traefik, and Portainer is just nice to have around. No port conflicts, and life is easy ♥️
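Something along these lines, sketched with Traefik v2 conventions (network names, image tags, domains, and the label syntax here are assumptions, not the actual stack):

```yaml
# global "infrastructure" stack: started once and left running
services:
  traefik:
    image: traefik:v2.10
    command:
      - --providers.docker=true
      - --providers.docker.exposedbydefault=false
    ports:
      - "80:80"                    # the only host port projects need; no per-project conflicts
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
    networks:
      - proxy
  portainer:
    image: portainer/portainer-ce
    ports:
      - "9000:9000"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
networks:
  proxy:
    name: proxy
---
# then, in each project's compose file, publish the web service through Traefik
services:
  web:
    build: .
    labels:
      - traefik.enable=true
      - "traefik.http.routers.myproject.rule=Host(`myproject.localhost`)"
    networks:
      - proxy
networks:
  proxy:
    external: true
```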
I haven't heard about Portainer before - thanks for sharing :)
I have gone full-blown Docker for local development (consultant, 3-8 person dev teams) and overall it works. Some things aren't the most straightforward, but overall, being able to `docker-compose up` and have an entire dev env just work is worth it.
One thing I am starting to look into is Google's new Skaffold project, which takes a more Kubernetes-style approach but claims to make the tech in that area more dev-accessible.
I use docker and docker-compose for local development all the time.
I'm usually doing a web project and I've got a few `docker-compose.yml` templates that outline tech stacks I commonly use. That way I can easily spin up isolated environments with control over the versions of all stack components.
I find it useful, but I might be consuming more resources than necessary.
We have a Jekyll-based thing and there are at least three of us who work with it. Rather than deal with the hassle of installing Ruby and gems etc., I've set up a very simple Docker Compose configuration that lets anyone develop (or review a PR) locally with a single command. It's been an absolute godsend. This is far from all the magic that Docker can offer for local development, but the elegance and simplicity of `docker-compose up` can't be beat.
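For reference, that kind of setup can be as small as something like this (the image, flags, and port here are assumptions, using the public jekyll/jekyll image):

```yaml
services:
  site:
    image: jekyll/jekyll
    command: jekyll serve --host 0.0.0.0 --watch --force_polling   # polling makes file-watching reliable in a container
    volumes:
      - .:/srv/jekyll                                              # the site source, bind-mounted
    ports:
      - "4000:4000"                                                # Jekyll's default dev-server port
```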
I have Docker only on the dev and production builds.
If something goes wrong on the dev build, we try to reproduce and fix it in our local build and then build again in dev.
Docker on a local machine is painful; not all of the devs are familiar with it.
Using Docker locally is overwhelming, and it's not necessary.