Sayuj Sehgal

Containers Made Easy: Mastering Docker with a Practical Project

Hello fellow developers! Today we are going to learn the basics of Docker and see how we can dockerize a Node.js app. But before that, let's first try to understand what Docker is, why we need it, and what problems it solves.

What is Docker?

Docker is a platform that allows you to develop, ship, and run applications inside containers. A container is a standalone, executable package that includes everything needed to run an application: the code, a runtime, libraries, environment variables, and config files.

Why Do We Need Docker?

Imagine you have an app that works perfectly on your computer. Now, your friend wants to run it on theirs, but, oh no! They have a different operating system, libraries, and configurations. This is where Docker swoops in like a superhero. It provides a standardized environment, ensuring that your app runs the same way, regardless of where it's deployed. It's like saying, "Hey, I don't care what computer you're on; I've got everything I need right here in my Docker container!"

Problems Docker Solves:

1. Dependency Hell:

  • Problem: Different environments have different libraries and dependencies, leading to the notorious "It works on my machine" issue.

  • Docker Solution: It encapsulates everything your app needs, eliminating the compatibility chaos.

2. Consistency Across Environments:

  • Problem: Development, testing, and production environments often differ, causing unexpected bugs.

  • Docker Solution: It ensures consistency, making sure your app behaves the same way, no matter where it runs.

3. Isolation and Security:

  • Problem: Apps on the same server can interfere with each other and create security risks.

  • Docker Solution: Containers provide isolation, like little sandboxes for your apps, keeping them safe and sound.

4. Efficient Resource Utilization:

  • Problem: Running multiple apps can be resource-intensive and messy.

  • Docker Solution: It optimizes resource usage by sharing the host OS kernel, making everything efficient and tidy.

5. Portability:

  • Problem: Moving apps between different environments is often a headache.

  • Docker Solution: It turns your app into a portable package, making it a breeze to move from your laptop to the cloud and beyond.

Let's Dockerize Our Node.js App

Let's take a practical example. Below is a simple Express.js application that listens on port 3000 and responds with "Hello World!" when you access the root URL.

// app.js
const express = require('express')

const app = express()

const port = 3000

app.get('/', (req, res) => res.send('Hello World!'))

app.listen(port, () => console.log(`Example app listening on port ${port}!`))
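
Before we write the Dockerfile, note that it will copy package*.json and run npm install, so the project also needs a package.json that lists Express as a dependency. A minimal sketch of that file (the name and version numbers here are only illustrative) could look like this:

{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.18.2"
  }
}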

To dockerize this application, we would need to create a Dockerfile. A Dockerfile is a text document that contains all the commands a user could call on the command line to assemble an image.

Here's an example Dockerfile for the given Express.js application:

# Use an official Node runtime as the base image
FROM node:14

# Set the working directory in the container to /app
WORKDIR /app

# Copy package.json and package-lock.json to the working directory
COPY package*.json ./

# Install any needed packages specified in package.json
RUN npm install

# Copy the rest of the application to the working directory
COPY . .

# Make port 3000 available to the world outside the container
EXPOSE 3000

# Define the command to run the application
CMD [ "node", "app.js" ]
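
One optional companion to the COPY . . step: because it copies everything in the project directory, it is common to add a .dockerignore file so that a locally installed node_modules folder (and other clutter) never ends up in the image. This isn't part of the Dockerfile above, just a small sketch of what such a file might contain:

# .dockerignore: files and folders that COPY . . should skip
node_modules
npm-debug.log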

Once you have this Dockerfile, you can build your Docker image by running docker build -t my-app . in your terminal. This command builds a Docker image named "my-app" from the Dockerfile in the current directory.

After the image is built, you can run your application inside a Docker container with the command docker run -p 3000:3000 my-app. This command starts a new container running the "my-app" image and maps port 3000 inside the Docker container to port 3000 on your local machine.

Now, if you navigate to http://localhost:3000 in your web browser, you should see "Hello World!" displayed, which is being served by your Express.js application running inside a Docker container.
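
Putting the whole workflow together, the commands run from the project directory look roughly like this (my-app is the image name we chose above, and curl is just another way of hitting the same URL):

# Build the image from the Dockerfile in the current directory
docker build -t my-app .

# Start a container, publishing container port 3000 on host port 3000
docker run -p 3000:3000 my-app

# In another terminal, check that the app responds
curl http://localhost:3000
# Hello World!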

Let's Understand Docker Commands

You might be wondering what the keywords FROM, WORKDIR, COPY, RUN, EXPOSE, and CMD in the Dockerfile are for.

These are the instructions that define the environment and the behavior of the Docker container. Here's what each of them does:

  1. FROM: This instruction initializes a new build stage and sets the base image for subsequent instructions. In the given Dockerfile, FROM node:14 means the base image will be the official Node.js image, version 14.

  2. WORKDIR: This instruction sets the working directory for any instructions that follow it in the Dockerfile. In the given Dockerfile, WORKDIR /app sets the working directory to /app.

  3. COPY: This instruction copies files or directories from the build context and adds them to the container's filesystem at the specified path. In the given Dockerfile, COPY package*.json ./ copies package.json and package-lock.json (if it exists) to the working directory in the container.

  4. RUN: This instruction will execute any commands in a new layer on top of the current image and commit the results. The resulting committed image will be used for the next step in the Dockerfile. In the given Dockerfile, RUN npm install installs the dependencies defined in package.json.

  5. EXPOSE: This instruction informs Docker that the container listens on the specified network ports at runtime. It acts as documentation rather than an actual port mapping: it does not publish the port by itself, which is why we pass -p 3000:3000 to docker run. In the given Dockerfile, EXPOSE 3000 signals that the application inside the container listens on port 3000.

  6. CMD: This instruction provides defaults for an executing container. These can include an executable, or they can omit the executable, in which case you must specify an ENTRYPOINT instruction. In the given Dockerfile, CMD [ "node", "app.js" ] means that Docker will execute node app.js when the container is run. This default can be overridden at runtime, as the example after this list shows.
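
To illustrate that last point, CMD only sets a default, so you can override it when starting a container:

# Runs the default CMD from the Dockerfile: node app.js
docker run -p 3000:3000 my-app

# Overrides the CMD for this one container and prints the Node version instead
docker run my-app node --version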

These instructions are executed in the order they appear in the Dockerfile, and each instruction creates a new layer in the Docker image. Layers are cached and reused between builds, which is why the Dockerfile copies package*.json and runs npm install before copying the rest of the code: as long as the dependencies haven't changed, rebuilds after editing app.js can skip straight past the install step. This layered, cached approach is what keeps Docker images quick to build.
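
If you want to see the layers and the build cache in action for the image we built earlier (assuming it is still tagged my-app), these commands show them:

# List the layers that make up the image
docker history my-app

# Rebuild after editing app.js; unchanged layers such as the npm install step come from the cache
docker build -t my-app .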

Conclusion

So, there you have it! Docker is the ultimate packing buddy for your applications, ensuring they run smoothly on any system, just like your favorite travel mug keeps your coffee perfectly warm (or cold) no matter where you go. With Docker, you can say goodbye to deployment headaches and hello to a more consistent, portable, and efficient development experience.

If you liked this post, you can visit my blog, sehgaltech, for more content.
