Jakob Klamser

Originally published at klamser.dev

Dockerize your Development Environment for NodeJS

Using Docker in your development workflow has a positive impact on your productivity. It eliminates the typical "It worked on my machine" kind of bugs, and setting the project up on a different machine requires nothing more than a running Docker daemon.
Before we get started with the implementation, let's go over Docker real quick.

What is Docker?

Docker is a platform that runs containers, which are packages of software. To run these containers, Docker uses OS-level virtualization. You can think of a container as a lightweight version of a virtual machine.

All containers you run on your Docker platform are isolated from one another. For example, the host on which Docker runs and a container running on that host do not share the same filesystem unless you explicitly tell them to.

To start a container you need a Docker image. This image is the blueprint for your container. You can take predefined images from Docker Hub or build your own by writing a so-called Dockerfile.

This is just a quick overview of Docker. If you want to dig deeper, I encourage you to start here.
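
If you already have Docker installed, you can see the relationship between images and containers with two quick commands. This little check is only for illustration and not required for the rest of the tutorial:

$ docker pull node:latest
$ docker run --rm node:latest node --version

The first command downloads the official Node.js image from Docker Hub; the second starts a throwaway container from that image and prints the Node.js version.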

Why would you dockerize your development workflow?

In the introduction, I already touched on one benefit of using Docker in your development environment: it gets rid of the typical "It works on my machine" issue. Some other benefits are:

  • An even more standardized development workflow between team members
  • Fewer production-only bugs if you use Docker for deployment too (the production and development configurations can be kept quite similar)
  • Getting rid of the aforementioned "Works on my machine" type of bugs

Getting started

We start out by creating a new folder for our project and creating our Dockerfile inside it:

$ mkdir node-docker && cd node-docker
$ touch Dockerfile

Dockerfile

The container that runs our Express application will be configured in the Dockerfile. For that, we need to give it some life:

FROM node:latest

WORKDIR /usr/src/app
COPY package*.json ./
ENV PORT 5000

RUN npm cache clear --force && npm install

ENTRYPOINT ["npm", "start"]

FROM tells Docker to get an image called node (version: latest) from Docker Hub.

WORKDIR sets the directory in which all the upcoming commands will be executed.

COPY does exactly what it says: it takes package.json and package-lock.json and copies them into the WORKDIR.

ENV sets an environment variable inside the container with the name PORT and the value 5000.

RUN executes the commands we pass in. In this case, clearing the npm cache and then installing all the dependencies from package.json.

ENTRYPOINT defines the command that is executed right when the Docker container starts.

Simple Express App

Now that we have our Dockerfile ready to go, we need a simple Express application that we can run inside a container. For that, we create two new files like this:

$ touch server.js package.json

package.json will get two dependencies: express as a regular dependency and nodemon as a dev dependency:

{
  "name": "node-docker",
  "version": "1.0.0",
  "description": "",
  "main": "server.js",
  "scripts": {
    "start": "nodemon server.js"
  },
  "author": "Jakob Klamser",
  "license": "MIT",
  "dependencies": {
    "express": "^4.17.1"
  },
  "devDependencies": {
    "nodemon": "^2.0.4"
  }
}

The Express application will just return some simple HTML when you hit the main page. Therefore, server.js should look like this:

const express = require('express');

const app = express();

const PORT = process.env.PORT || 5000;

app.get('/', (req, res) => {
  res.send(`
    <h1>Express + Docker</h1>
    <span>This project runs inside a Docker container</span>
  `);
});

app.listen(PORT, () => {
  console.log(`Listening on port ${PORT}!`);
});
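
If you are curious, you can already build the image and run the container on its own. This is just a quick sanity check and not part of the workflow we are building; the image name node-docker is my own choice, and the volume flags mirror what docker-compose will do for us in a moment:

$ docker build -t node-docker .
$ docker run --rm -p 5000:5000 -v "$(pwd)":/usr/src/app -v /usr/src/app/node_modules node-docker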

.dockerignore

Before we start setting up a MongoDB container together with our Express container, we want to exclude some files from the image build. The syntax of a .dockerignore file is exactly the same as that of a .gitignore file:

# Git
.git
.gitignore

# Docker
Dockerfile
docker-compose.yml

# NPM dependencies
node_modules

docker-compose.yml

Last but not least, we want to define a docker-compose.yml. This file will contain all the information needed to run the Express application and MongoDB at the same time in two different containers. Let's go ahead and create the file.

$ touch docker-compose.yml

Now we configure it like this:

version: '3'
services:
  api:
    build: .
    ports:
      - "5000:5000"
    depends_on:
      - mongo
    volumes:
      - "./:/usr/src/app"
      - "reserved:/usr/src/app/node_modules"
  mongo:
    image: "mongo" 
    ports:
      - "27017:27017"
volumes:
  reserved:

version: First we define the version of the docker-compose file format we want to use. There are quite a lot of differences between versions 3 and 2, so be careful when picking a version!

services: This is the section in which we define our Express API (api) and the MongoDB instance (mongo).

build & image: build tells Docker to build an image out of a Dockerfile. In our case we want it to use the Dockerfile in the current directory, which is why we pass . as the parameter. image tells Docker to pull an already existing image from Docker Hub.

ports & volumes: As the name suggests, we define the ports here. The colon is a mapping operator: we map port 5000 of the container to port 5000 of our host system (in this case our local machine), so that we can access the application from outside the container. The same goes for the port mapping of the MongoDB container. volumes does something similar, but for directories instead of ports: we map the local directory in which we write our code into the WORKDIR of the container. This way the container reacts immediately whenever we change anything in the source code.

reserved: This is a named volume that makes sure the local node_modules folder, if it exists, does not override the node_modules folder inside the container.
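
By the way, the compose file starts a MongoDB container, but our example app does not connect to it yet. If you want to do that, keep in mind that from inside the api container the database is reachable via the service name mongo, not via localhost. Here is a minimal sketch, assuming you add mongoose as an extra dependency (it is not part of the package.json above):

// Hypothetical MongoDB connection via mongoose (not part of the original example)
const mongoose = require('mongoose');

// Inside docker-compose, the service name "mongo" resolves to the MongoDB container
const MONGO_URI = process.env.MONGO_URI || 'mongodb://mongo:27017/node-docker';

mongoose
  .connect(MONGO_URI)
  .then(() => console.log('Connected to MongoDB'))
  .catch((err) => console.error('MongoDB connection failed', err));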

If you run the following command, Docker will create an image from our Dockerfile and then run both containers (api and mongo):

$ docker-compose up

If you want to stop the containers just use this command:

$ docker-compose down
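
One thing to keep in mind: docker-compose only builds the api image if it does not exist yet. After you change the Dockerfile or package.json, tell it to rebuild explicitly:

$ docker-compose up --build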

Conclusion

This is a simple Docker development environment setup that can easily be extended. If you want to change the database or add an Nginx container to serve your frontend, just go ahead and add a new service to docker-compose.yml or change an existing one.

You can also dockerize .NET Core, Java, or Go applications if you want to. Tell me about your experience with Docker in the comment section down below; I'd love to hear it!

The code for this is up on my GitHub as usual.

Photo by Dominik Lückmann on Unsplash

Top comments (11)

Timo Ernst

I love Docker for prod deployments, but when using it in dev environments I'd have to rebuild and re-run the image every time I change something in the code in order to see the result, correct? Any solution to this?

Jakob Klamser

Actually no, in this example your volume is connected to the container. So every time you change something, for example, in server.js it automatically updates your "localhost".
I recommend trying out this setup ;)

Timo Ernst

Oh, I didn't notice that! That's a great way to solve this problem. Thanks, great post! :-)

Vinay Sudani

I tried Nodedock (github.com/nodedock/nodedock), which has some pre-configured services and can be configured further with a .env file.

Markus Reisenhofer

Keep in mind that some Node tools which would normally run on 127.0.0.1 might need to bind to 0.0.0.0 in their respective configs.

Greg Motyl

You have managed to not overcomplicate things and yet explain the topic very well. Great job and thanks!

Jakob Klamser

Thanks for the awesome feedback

Andrew Baisden

Good introduction to Docker.

Daniel Barwikowski

Nice one! I need to try it :)

devwhoruns

If you put the same article as a README.md in your repo, that would be great, Jakob.

Jakob Klamser

Maybe I will do that. Need to think about it