When Docker first became popular a few years ago, I generally dismissed it as a fad: something the hipsters were adopting just so they could be cool.
I mean, why add this extra layer on top of everything and complicate your development workflow? Any benefit you might glean would be minimal.
I didn't dive right into containers back then, but things have changed. Over the past few years, cloud providers have rolled out some fantastic offerings that make me want to use Docker. Here are the reasons why using Docker containers in your developer workflow makes a lot of sense in 2022.
Stable development environments
Gone are the days when you had a single development environment that all your projects leveraged. If you start a handful of projects every year, then keeping those projects up to date can take a huge amount of time. Python now releases a new version every year, and Node.js ships a new version every six months!
What I've experienced is that if I take a break from one of my projects and come back to it a year later, I'm faced with numerous hours of just upgrading the environment to get the darn thing to work!
While keeping up with the latest security patches is important, I don't always need the latest and greatest features in the language. Sometimes I just want it to work so I can move on with whatever features I'm trying to implement. With containers, you can put a project on hold for a little bit and when you return, all the dependencies will still be there and, unless you depend on an external service, things should just work.
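As a minimal sketch of what that looks like in practice (assuming a small Python project with a requirements.txt and a main.py, both hypothetical names), the whole environment is pinned in a Dockerfile checked into the repository:

```dockerfile
# Pinning an exact base image tag means the environment is the same
# whether you build it today or a year from now.
FROM python:3.10-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy in the rest of the source code.
COPY . .

CMD ["python", "main.py"]
```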
Isolated development environments
Another issue I have run into is having multiple versions of the language installed on my system. Python 2? Python 3? Node 12? Node 14? Many projects use a wide spectrum of these tools. Nothing drives me crazier than when someone comes up with a new shiny tool and asks me to install it globally!
People have solutions for multiple environments on a single system. Just to name a few: virtual environments for Python; nvm and volta for Node.js.
Using Docker means that you can create a container that is dedicated to the environment and language version of your choice. Each of your projects can have different versions of languages and none of them will conflict with your host machine!
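For example, you can poke at two Node.js versions (or two Python versions) side by side without installing either one on your host:

```sh
# Each command runs in a throwaway container (--rm deletes it on exit),
# so neither version touches whatever is installed on the host machine.
docker run --rm node:12 node --version
docker run --rm node:14 node --version

docker run --rm python:2.7 python --version
docker run --rm python:3.10 python --version
```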
Easily repeatable development environments
If you're working with multiple folks or just working with more than one computer, then you may appreciate that containers make sharing easy. All the instructions for building your container are checked into your source code repository.
If a teammate or friend wants to collaborate with you on the project, just point them to the source code and they can have an exact replica of your container within minutes. No more fussing around with Windows vs Mac vs Linux desktops. They all can participate.
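As a rough sketch (the repository URL, image name, and port below are all placeholders), onboarding looks something like this:

```sh
# Grab the source, build the image from the checked-in Dockerfile,
# and run it -- the same three steps on Windows, Mac, or Linux.
git clone https://github.com/your-name/your-project.git
cd your-project
docker build -t your-project .
docker run --rm -p 8000:8000 your-project
```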
Another scenario could be that you'd like to share your project in a demonstrable stage, but not share the source code. You can hand over the built container image and your friend can run an instance locally to check out your project. While this does not secure your source code, it does make sharing your project more palatable to folks who maybe are not developers.
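One way to do that (sketched here with placeholder names) is to export the built image to a tarball and hand it over; the recipient loads it and runs a local instance without ever seeing the source:

```sh
# On your machine: save the built image to a file you can share.
docker save your-project:latest -o your-project.tar

# On their machine: load the image and start a local instance.
docker load -i your-project.tar
docker run --rm -p 8000:8000 your-project:latest
```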
Easier Continuous Integration
Since all the instructions on how to build your project are in the source code, building the container image is very easy. You can build it on your personal computer, or you can leverage one of the many cloud-based Continuous Integration (CI) offerings and have them build your container anytime you'd like. This is most helpful when working on a team, but it's important to point out that containers make this step much easier!
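Whichever CI service you choose, the job it runs for you boils down to roughly the same two commands (the registry name and tag variable here are placeholders for whatever your CI system provides):

```sh
# Build the image from the Dockerfile checked into the repository,
# tagging it with the commit SHA so every build is traceable.
# $COMMIT_SHA stands in for whatever variable your CI exposes.
docker build -t registry.example.com/your-project:"$COMMIT_SHA" .

# Push the image to your registry so it can be deployed anywhere.
docker push registry.example.com/your-project:"$COMMIT_SHA"
```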
Cloud-based hosting
One of the biggest reasons to use containers these days is the plethora of online services that will host them. The biggest drawback in the past was that even if you took the time and effort to make your project run in a container, you couldn't run it anywhere - at least not easily. For a while, Kubernetes was fighting for developer mindshare with Mesos and Docker Swarm. Ultimately Kubernetes became very popular, and now all major and many minor cloud providers offer some form of container hosting.
Based on your hosting needs, you can run a hobby project in the cloud for free (using various Free Tier offerings) or you can spin up a robust and geographically redundant flotilla of containers for your production project. The number of options is almost overwhelming. In fact, Corey Quinn at Last Week in AWS published a blog post on 17 ways to run containers on AWS - and then wrote another blog post on 17 more ways!
Serverless Ready
Finally, using containers for your deployment allows you to fully embrace the Serverless movement. Allow me to explain. In any commercial hosting relationship, you, as a project owner, delegate some responsibilities to your hosting provider. If you are renting physical rack space in a data center, then you rely on them to provide a roof, power, and air conditioning. By using containers, you delegate a lot more to your provider. Essentially you are just renting CPU cycles from these container providers (although how you are billed varies from provider to provider).
As a project owner, you are free of the responsibility of server management (mostly). If the load on your project goes up, then you just rent more CPU cycles. Conversely, if the load goes down, you rent fewer CPU cycles. This is the beauty of Serverless: the line of responsibility allows you to focus more on your project's delivered value and less on the boring tasks.
Why not run Docker containers?
Ok, so we've discussed the advantages of using containers. What about the downsides?
Docker Licensing
According to the Docker Pricing page:
Docker Desktop can be used for free as part of a Docker Personal subscription for: small companies (fewer than 250 employees AND less than $10 million in annual revenue), personal use, education, and non-commercial open source projects.
Docker Desktop requires a per user paid Pro, Team or Business subscription for professional use in larger companies with subscriptions available for as little as $5 per month.
This is a nominal cost for most developers, but something worth considering.
Docker Desktop overhead
When running Docker in the background, your computer will use more energy. I have found from personal experience that most modern computers handle the load easily.
Only runs Linux
Your Docker container can only run Linux. That's because Docker takes advantage of runC, which uses the Linux kernel. You can't run Windows inside of Docker, though of course you can run Docker on a Windows host machine. If you are running a .NET Framework project, you won't be able to use Docker. On the other hand, if you're running .NET Core then you're in luck!
While you may think being limited to Linux is a drawback, there are literally hundreds of Linux distributions from which to choose.
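You can see this for yourself: even from a Windows or Mac host, the kernel reported inside a container is Linux.

```sh
# Regardless of the host OS, the container sees a Linux kernel,
# because that's what runC (and Docker Desktop's VM) provides.
docker run --rm alpine uname -a
```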
Complex workflows
Most developers are taught to develop locally on their machines. While this may be slowly shifting toward online development, there is a lot of existing momentum. Exacerbating the problem is the myriad of options that Docker exposes. It's an extremely flexible tool and thus quite complex to set up properly. Facing the task of setting up Docker from scratch is intimidating, and it could give some developers a bad experience that makes them resist leveraging a fantastic tool.
Getting Started with containers
In the next posts, I would like to propose a flexible but opinionated developer workflow for the Python and Node.js environments.
Opinionated Docker development workflow for Node.js projects
Cover Photo by Venti Views on Unsplash