Ravi Bhuvan

Docker - one read to understand

Problem statement
A developer working on a project has his own configuration. He installs all the dependencies (Node.js, MongoDB and so on).
When another developer joins, he also has to install all of those dependencies.
He may be on a different OS, and he installs whatever the latest versions are at that time. Now both developers have different versions and different environments.
They also can't use OS-specific software.

Containers -

  • these containers have their own environment with the versions specified.
  • these containers can be shared by sharing the image.
  • these are lightweight, shareable and have their own env.

Docker Setup

Docker Daemon - this is the brain of Docker; it handles creating containers, pulling images and so on.

docker run -it ubuntu

This is a normal CLI command (on Windows or any other OS) that says: start an Ubuntu container, i.e. a container with Ubuntu as the OS. Docker first checks for the ubuntu image locally on your machine; if it's not there, it downloads it from Docker Hub, and you get an Ubuntu image. Now you have an Ubuntu container running in Docker and you can use it.
After the container is created, the Docker daemon uses your host kernel, not a full OS. Whatever changes you make inside the container are specific to that container only. This does not affect your local OS - anything you install stays inside the container.
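A quick sketch of what that looks like in practice (the apt-get install here is just an illustration):

# pull the image if needed and start an interactive Ubuntu container
docker run -it ubuntu

# inside the container: install something, it exists only in this container
apt-get update && apt-get install -y curl

# leave the container; your host OS is untouched
exit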

Containers and Images
You can create as many containers as you want from a single image. An image is like a blueprint for containers. Each container is isolated from the others: data created in one container is not accessible in another container created from the same image. In other words, each container's data stays inside that container.
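A small sketch to see that isolation (the container names are just examples):

# start a container from the ubuntu image and create a file inside it
docker run -it --name app1 ubuntu
touch /hello.txt
exit

# start a second container from the same image: /hello.txt is not there
docker run -it --name app2 ubuntu
ls /hello.txt
exit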

Containerizing an app
Create your application.
Now create a file named Dockerfile, without any extension.
This is basically your image configuration.

FROM ubuntu
RUN apt-get update
RUN apt-get install -y curl
RUN curl -fsSL https://deb.nodesource.com/setup_20.x | bash -
RUN apt-get upgrade -y
RUN apt-get install -y nodejs

COPY package.json package.json
COPY package-lock.json package-lock.json
COPY server.js server.js

RUN npm install
ENTRYPOINT ["node" , "server.js"]

FROM ubuntu - this is your base image; we build on top of this image.
RUN apt-get update - this updates the package lists.
RUN apt-get install -y curl - we are installing the curl tool inside the container, so we can now download files using the curl command in the container.
RUN curl -fsSL https://deb.nodesource.com/setup_20.x | bash - - this adds the Node.js 20.x package repository so apt can install it.
RUN apt-get upgrade -y - this upgrades the installed packages.
RUN apt-get install -y nodejs - this installs Node.js.

Up to this point we started from an Ubuntu base image and installed Node.js on top of it.

COPY package.json package.json
COPY package-lock.json package-lock.json
COPY server.js server.js
Each of these says: copy the file from your project folder into the container at the given path.

RUN npm install - installs the dependencies and creates the node_modules folder with the packages.

ENTRYPOINT ["node" , "server.js"] - this starts the node adn server.js

Now you have created a basic image configuration for your application.
In your terminal:

docker build -t image_name .

This builds your image; the . means the Dockerfile (the build context) is in the present working directory.
-t is the tag (name) you are giving the image.
build builds your Docker image using the configuration in the Dockerfile.
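For example, with a hypothetical tag name node-app:

# build the image from the Dockerfile in the current directory
docker build -t node-app .

# list local images to confirm it was created
docker images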

docker exec -it <container_ID> bash

This gives you an interactive (i) terminal (t) inside a running container.
Let's say you built the image with Ubuntu as the base; then this gives you an Ubuntu bash shell to interact with the container.
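A quick sketch (the container ID here is made up, take yours from docker ps):

# find the running container's ID
docker ps

# open a bash shell inside it
docker exec -it 3f2a1b4c5d6e bash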

docker run -it -e PORT=5000 -p 8000:5000 <image_tag> 

run creates and runs a container from your image
-it interactive terminal
-e sets an environment variable
PORT=5000 we are setting the env variable PORT to 5000
-p this is port mapping
8000:5000 here, 8000 ---> host port and 5000 ---> container's port
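So, assuming server.js starts an HTTP server listening on process.env.PORT, a request to the host port reaches the container (the image tag node-app is the hypothetical one from earlier):

# map host port 8000 to container port 5000
docker run -it -e PORT=5000 -p 8000:5000 node-app

# from the host, in another terminal, the app answers on port 8000
curl http://localhost:8000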

docker build -t test .

If you don't give a version after the tag name, Docker uses latest, so the image is referred to as test:latest.
Docker images are immutable: a rebuild doesn't modify the existing image, it creates a new image and moves the tag to point at the new one.
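A small illustration of that: rebuild with the same tag after a change and the old image is still there, just untagged (the exact output will vary):

# rebuild with the same tag after changing server.js
docker build -t test .

# the old image now shows up as <none>; test:latest points at the new image
docker images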

Layer Caching (each instruction in the Dockerfile creates a layer of the image being built).
So the order of the Dockerfile instructions is important. Let's say you change something in server.js: Docker reuses the cached layers up to the first instruction affected by the change, and every instruction after that line is executed again on the next build. So keep the common steps (installing Node or other dependencies) before the code that keeps changing, to avoid re-downloading dependencies on every rebuild; the sketch below shows the idea.
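A minimal sketch of that ordering, reusing the files from the Dockerfile above:

# dependencies first: these layers stay cached as long as package*.json doesn't change
COPY package.json package.json
COPY package-lock.json package-lock.json
RUN npm install

# app code last: changing server.js only re-runs the layers from here down
COPY server.js server.js
ENTRYPOINT ["node", "server.js"]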

Publishing to Docker Hub
First create a repo on Docker Hub.
Now build a local image using the name given in the repo.
Finally, docker push <image_name> to push it to your repo.
You need to be logged in to push the image.
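Roughly, the flow looks like this (the username and repo name are placeholders):

# log in to Docker Hub
docker login

# tag the local image with your Docker Hub repo name
docker tag node-app your-username/node-app

# push it to the repo
docker push your-username/node-app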

Let's say you are a developer working with multiple containers that work together, like Node.js, PostgreSQL or MongoDB. Manually managing these containers would be a huge headache. So,

Docker Compose - it is a tool used to define, configure and run multi-container applications easily.
This lets you configure your application to run with multiple containers.
docker-compose.yml - just like the Dockerfile, this file is used to configure and manage the containers so they can interact and communicate with each other seamlessly.

version: "3.8"

services: 
  postgres: 
    image: postgres # takes the postgres imag from docker hub
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER : username
      POSTGRES_DB : review
      POSTGRES_PASSWORD : password

  redis:
    image : redis
    ports : ["6379:6379"]


version is the Compose file format version (recent versions of Docker Compose ignore it).
services: lists the services (containers) you want.
postgres: the name of the service.
image: the image name on Docker Hub. You can use your own image too.
ports: port mapping.
environment: env variables.

This is a basic setup for Docker Compose.
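To run your own app alongside these services you could add it as another service; a rough sketch, assuming the hypothetical node-app image built earlier (service and variable names are just examples):

services:
  app:
    image: node-app # the image we built with docker build -t node-app .
    ports:
      - "8000:5000"
    environment:
      PORT: "5000"
    depends_on: # start postgres and redis before the app
      - postgres
      - redis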

docker compose up

This starts all the services defined in the file, with their port mappings.
It creates a stack of containers for your services, which you can see in the Containers page in Docker Desktop.
docker compose down stops and removes that stack of containers.
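Day to day, that usually looks something like this:

# start all services in the background
docker compose up -d

# see the running services in the stack
docker compose ps

# stop and remove the containers
docker compose down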
