Introduction
In today’s fast-paced software development landscape, Docker is not just a tool for deployment but a transformative force in development workflows. While its role in streamlining production environments is well-known, Docker's impact on development is equally significant and often overlooked.
In this blog, we'll discover the often-overlooked benefits of using Docker in your everyday development work. We'll go over the problems that plague modern application development, so it becomes clear how Docker solves them and makes our lives less miserable. We'll also look at practical examples of setting up a Dockerized workflow in two different ways and compare them. By the end of this blog, you should be able to convince yourself, or your manager, to containerize your development workflow!
Let's dive into it!
Challenges in Traditional Development
Complex Setup and Configuration
Today, it has become the norm that onboarding a new team member, with all the installing and configuring of the project, can take a day or more: following a huge and possibly poorly documented README, plenty of debugging, and a lot of StackOverflow-ing. I'm not talking about simple Node.js apps that are installed with a single npm install; I'm talking about huge applications and open-source projects that require you to install dozens of dependencies and tools just to run them. I still remember spending days installing a particular open-source project and still not getting the tests to run properly...
Inconsistent Environments, Dependency Conflicts
Each developer has their own environment, software packages, and installed dependencies. This leads to problems and conflicts when installing and running the project, which means more debugging and wasted time.
Another major issue: what if you have a backend that runs on Node.js v16 and a frontend app that runs on Node.js v20? Which Node.js version do you install? Even with NVM (Node Version Manager), you end up juggling a different version per terminal, which quickly gets error-prone when you want to run the backend and frontend together to test them.
Learning, Testing and Experimentation Challenges
How would you go about testing the full project stack? You'd need to install all of the projects and run them together, and you're likely to run into one of the problems above.
Need to learn load balancing and test out a load balancer, or try scaling your application? You'd need to run multiple instances of your application; how difficult that is depends on your runtime, and you'll have to assign different ports manually.
Want to learn SQL databases or Redis? You're in for lengthy installation steps.
And at the end of it all, you just hope that all the new software and dependencies didn't mess up your system and impact your daily usage. By the way, good luck uninstalling a project's dependencies.
Now to the solution!
How Docker can fix your life (Err, your development problems)
Eases Onboarding and Contribution to Complex Projects
After cloning a project, a single command, docker compose up --watch, is all it takes (no matter how complex the project is) and the project will just magically run. Zero installations or configurations needed. Onboarding new members or contributing to new projects has never been easier!
And as a bonus, all changes they make to the code will instantly be reflected in the container, allowing for a seamless and effortless development process.
Simplifies README Files
No need to spend hours documenting how to run the project, only to find out that only you can understand what you wrote. The setup section of the README can now be as short as: "run docker compose up --watch to start the development workflow".
Ensures Environment Consistency, Simplifies Dependency Management, and Offers Portability
With Docker, you can ensure that all developers are running the same environment, the same software versions, and the same dependencies, no matter what OS or configuration they have. Develop everywhere, run everywhere.
Provides Isolation and Handles Conflicting Dependencies Securely
Remember the problem of different Node.js versions? Now you can run both apps in separate containers, each with its own Node.js version, and they won't interfere with each other. Everything is isolated; everyone is happy. Want to delete a project? Just run docker compose down and that's it, nothing to uninstall.
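As a sketch of what that looks like: assuming a hypothetical repository with backend/ and frontend/ folders, each containing its own Dockerfile (starting with FROM node:16 and FROM node:20 respectively), a single Compose file runs both versions side by side:

services:
  backend:
    build: ./backend    # its Dockerfile starts with FROM node:16
    ports:
      - "3000:3000"
  frontend:
    build: ./frontend   # its Dockerfile starts with FROM node:20
    ports:
      - "8080:8080"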
Enables Practical Learning and Experimentation
Want to learn a new technology or tool? Just pull a Docker image and run it. No need to install anything or worry about messing up your system. Could be a database, a cache, a load balancer, a message broker, etc.
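For example, spinning up Redis or PostgreSQL for learning takes one command each (redis and postgres are the official images on Docker Hub; --rm removes the container once you stop it, so nothing lingers on your system):

# Redis, reachable on localhost:6379
docker run --rm -p 6379:6379 redis:7

# PostgreSQL, reachable on localhost:5432
docker run --rm -e POSTGRES_PASSWORD=secret -p 5432:5432 postgres:16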
Learning frontend and want some ready-made backend to test your frontend with? Just pull a Docker image and run it. Same for backend developers.
Want to learn system design and how to scale your application? Just run multiple instances of your application in separate containers. No need to worry about ports or configurations.
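With Docker Compose, scaling is a single flag. A minimal sketch, assuming a backend service like the ones defined later in this post, but with container_name removed and the ports entry shortened to just the container port ("3000") so Docker can assign each replica a free host port:

# run three replicas of the backend service
docker compose up --scale backend=3

# see which host port each replica was assigned
docker compose ps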
Facilitates Consistent Testing Across Versions
Want to test something on an older version of your project? Just keep track of your Docker images and run that older version. No need to rollback your code or worry about conflicting dependencies.
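For example, if you tag the image on every release (the tag names below are hypothetical), running an older version is just a matter of picking the old tag:

# list the tags you have built for the app
docker image ls my-nodejs-app

# run a hypothetical older release on port 3000
docker run --rm -p 3000:3000 my-nodejs-app:v1.4.0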
Want to test across different environments? Just run your project in different containers with different configurations.
Want to test your whole stack? Easy as well. If you have some Docker knowledge, you can do anything you want.
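For the whole-stack case, a single Compose file wires everything together. A minimal sketch pairing an app with a database (the service names and the DATABASE_URL variable are illustrative, not from the project below):

services:
  backend:
    build: .
    ports:
      - "3000:3000"
    environment:
      - DATABASE_URL=postgres://postgres:secret@db:5432/postgres
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      - POSTGRES_PASSWORD=secret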
Let's get our hands dirty!
Setting up Dockerized Development Workflow
For this section, we'll work with a Node.js application, assuming it runs on Node.js v16.
First, creating the Dockerfile
Dockerfiles describe the installation and configuration steps that run to build the final, usable image. This is where you install all the dependencies and tools.
# Dockerfile
# Defines the base image to build on
FROM node:16.20.2-slim
# Creates a directory inside the image and switches to it
WORKDIR /app
# Copies package.json and package-lock.json into the image
COPY package*.json ./
# Sets an environment variable
ENV PORT=3000
# Documents the port the app listens on
EXPOSE $PORT
# Runs an arbitrary command, like installing npm modules
RUN npm install --force
# Copies the rest of the project files into the image
COPY . .
# Sets the entry point of the image
ENTRYPOINT [ "npm", "run" ]
# Sets the default command, which acts as an argument to the entry point here
CMD [ "dev:docker" ]
dev:docker is a package.json script defined as nodemon server.js --legacy-watch.
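For reference, the relevant part of package.json would look something like this (the nodemon version is illustrative; it just needs to be a dependency so that npm install brings it into the image):

{
  "scripts": {
    "dev:docker": "nodemon server.js --legacy-watch"
  },
  "devDependencies": {
    "nodemon": "^3.0.0"
  }
}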
Second, creating the .dockerignore file
Similar to .gitignore, we don't want to copy every file into the Docker image we are creating. For example, we shouldn't copy node_modules or secrets. The .dockerignore file lets us specify what not to copy.
For example:
node_modules
.git
.env
Notice that this very simple Dockerfile copies package.json and package-lock.json first, installs the packages, and only then copies the rest of the project. This ordering lets Docker cache the dependency layer, so npm install only re-runs when the package files change.
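If you want to try the build on its own before wiring up Compose, the image can be built and tagged directly; the tag here matches the image: field used in the Compose files below:

docker build -t my-nodejs-app:dev .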
Third, creating the Docker Compose file
Docker Compose files specify how the Docker image should be run and the specifications of the Docker container. This is where we define how changes propagate from our local machine into the container.
There are two ways to set up a Dockerized development workflow:
Using Volumes
The old way: you mount your project directory into the container, so any change you make to the project is reflected in the container.
Docker Compose file:
services:
  backend:
    container_name: my-nodejs-app
    image: my-nodejs-app:dev
    build: .
    ports:
      - "3000:3000"
    volumes:
      - ./:/app
      - NOT_USED:/app/node_modules

volumes:
  NOT_USED:
Notice the volumes section: the first line mounts the current directory to the /app directory in the container. However, we want to exclude the node_modules directory from the mount, so we mount an unused named volume over the node_modules directory in the container. The end result: any change made in our project is reflected in the container, except node_modules.
In other runtimes, you exclude that runtime's equivalent of the node_modules directory.
For this method to work, the command running inside the container must be a dev command that watches for changes and restarts the server automatically. And if you install a new dependency, you will need to rebuild the image manually (for example, with docker compose up --build).
You run it via docker compose up.
Using the Develop Specification
The new way: we use the new develop specification in Docker Compose, which was made exactly for this purpose.
Docker Compose file:
services:
  backend:
    container_name: my-nodejs-app
    image: my-nodejs-app:dev
    build: .
    ports:
      - "3000:3000"
    develop:
      watch:
        - action: sync
          path: ./
          target: /app
          ignore:
            - node_modules
        - action: rebuild
          path: package.json
In develop.watch, we have two items that define what happens when we change a file in the project directory. In each item, we define the action to take, the path to watch, and the target in the container. Actions can be sync, rebuild, or sync+restart.
Our first item tells Docker to sync any changed file in the current directory on our machine to the target path in the container, excluding node_modules. I am using sync here because I am running a dev command inside the container, but I could also have used a production command with sync+restart.
The second item tells Docker to rebuild the image and restart the container whenever we change the package.json file, for example when we install a new dependency.
You run it via docker compose watch or docker compose up --watch.
Learn more here: https://docs.docker.com/reference/compose-file/develop/
You can already see why the new method is superior: we declare what we need, and Docker does it for us. There is no workaround needed to exclude node_modules, we are not forced into using a dev command, and new dependencies are handled properly without manual intervention.
With the Dockerfile, .dockerignore, and Docker Compose file in place, you are ready to start developing! Every change you make will be reflected in your application.
Anyone cloning your project and running docker compose up --watch gets a working development setup immediately, with no additional steps.
Conclusion
With that, we've seen how Docker can transform your development workflow and make your life easier: the problems that plague traditional development, how Docker solves them, and two ways to set up a Dockerized development workflow, along with a comparison between them.
I would like to ask open-source maintainers in particular to consider Dockerizing their projects. Instead of pages of installation steps, provide the Docker image and Docker Compose file and let contributors run the project with a single command. It will make your project more accessible and easier to contribute to.
I hope you enjoyed this blog and learned something new!