P. Acharya

What Exactly Is Docker and Why It's Necessary

Why Docker Matters

As rookie developers, we have all been there: "It works on my system, so why doesn't it work on the server?" We don't know how to send the entire application code, along with its installed dependencies (services), to our colleagues so that they can test it without falling into "dependency hell." That is where Docker and containers come in.

Before containers, when a team of developers needed to test and run application code from other developers on the team, they had to install the services and libraries the code depends on, each with the correct version. For example, if your app is a Java app that uses PostgreSQL as a database, Redis for caching, and RabbitMQ for messaging, every developer would need to install the precise versions of each of these services on their local system. Another problem is that the installation process differs on every operating system and involves multiple steps where many things can go wrong. This may not sound like a big problem, but imagine that your application uses ten services: you would then need to install all ten on your local system with the correct versions, which can easily lead to unexpected errors.

How Docker Changed the Deployment Process

Before Docker, the development team would create an application artifact or package along with instructions on how to set it up on the server. The other services the application needed came with their own setup instructions, and all of this was handed to the operations team. The problem with this approach is that multiple services might depend on the same library but require different versions of it, leading to dependency conflicts. With Docker, everything the app needs is packaged inside the Docker artifact and sent to the operations team, and no extra configuration is required on the server itself.

Docker containers improved deployment as well as the development process because the container, being portable, can be easily shared with your DevOps and development team, streamlining the process. It solves dependency hell by packaging the relevant dependencies.
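As a rough sketch of what such a Docker artifact might look like for the Java example above, the following builds one image containing the app and its runtime. The base image, jar name, and tag are illustrative assumptions, not from the original post:

```shell
# Sketch only: package an app and its environment into one image.
# Skip gracefully when Docker is not installed.
command -v docker >/dev/null 2>&1 || { echo "docker not found; skipping"; exit 0; }

mkdir -p demo-app && cd demo-app
touch app.jar   # stand-in for a real compiled Java artifact

cat > Dockerfile <<'EOF'
# Base image providing the Java runtime
FROM eclipse-temurin:17-jre
# Copy the application artifact into the image
COPY app.jar /app.jar
# Command the container runs on start
CMD ["java", "-jar", "/app.jar"]
EOF

# Build the image: everything the app needs now travels as one artifact
docker build -t myapp:1.0 .
```

The resulting `myapp:1.0` image is the single artifact you hand to the operations team.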

What is a Container?

A container is a way to package a service with all its required dependencies in a single “box.” All the required configuration files, start scripts, and other dependencies for the service (for example PostgreSQL) are packaged in a container and installed with a single Docker command. Now, instead of downloading binaries for ten different services and going through the tedious installation process, you can just run ten Docker commands to start the ten services your application depends on.
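For instance, the three backing services from the Java example can each be started with one command. The image tags, container names, and password below are assumptions chosen for illustration:

```shell
# Sketch only: skip gracefully when Docker is not installed.
command -v docker >/dev/null 2>&1 || { echo "docker not found; skipping"; exit 0; }

# One command per service, no manual installation steps
docker run -d --name db -e POSTGRES_PASSWORD=example postgres:15   # database
docker run -d --name cache redis:7                                 # caching
docker run -d --name broker rabbitmq:3                             # messaging
```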

Docker containers enable you to spend more of your time and energy on development rather than being stuck installing and fixing dependencies of multiple services for your application to run.

Virtual Machines vs. Docker

Docker is a virtualization tool. You might be wondering how Docker enables us to run the services in a container on any operating system with a single command. To understand this, let us dissect the OS. The OS has two main layers: the OS kernel and the OS applications layer. The kernel interacts with the hardware and applications layer to enable communication between them.

Docker virtualizes the applications layer. When you run a Docker container, it contains the OS applications layer and uses the OS kernel of the host machine to interact with the hardware.

The virtual machine, on the other hand, has the OS applications layer and its own kernel. You can save a lot of disk space when using Docker.

The size of virtual machines is much larger than the size of Docker containers (images). Docker can start within seconds, while virtual machines can take minutes to start because they need to boot up their own kernel.

You can run a virtual machine of any OS on any other operating system, but you cannot do that in Docker, at least not directly. Let us say you have a Windows-based host machine and you want to run a Linux-based Docker container. The problem is that the virtual Linux applications layer will not be compatible with the Windows kernel. But there is a workaround for this. You can download Docker Desktop, which uses a hypervisor layer with a lightweight Linux distro providing the Linux kernel, letting you run Docker containers on Windows and Mac hosts easily.

Docker Images vs. Docker Containers

A Docker image can be thought of as an executable application artifact that not only includes the app source code but also the complete environment configuration. It includes the OS applications layer, any services the app needs, and the main app source code.

You can also add environment variables to a Docker image (though hard-coding secrets this way is not recommended) and create directories inside it.

A container, on the other hand, is nothing but a running instance of an image: a container exists when the application inside the Docker image is actually executing. The advantage of this is that we can create multiple containers, that is, multiple running instances, from a single Docker image.
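A quick sketch of that image/container relationship, using the official nginx image (the container names and tag are arbitrary choices): one image, several independent running instances.

```shell
# Skip gracefully when Docker is not installed (sketch only).
command -v docker >/dev/null 2>&1 || { echo "docker not found; skipping"; exit 0; }

# Two independent containers started from the same image
docker run -d --name web1 nginx:1.25
docker run -d --name web2 nginx:1.25

# Both instances show up, backed by a single local image
docker ps --filter ancestor=nginx:1.25
```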

Docker Registry

Now you might wonder: where do we get the image for a service in the first place? That is where registries come into play. A registry is storage specifically for Docker image artifacts. Registries contain official images maintained by the companies behind them. Docker hosts one of the biggest registries, called Docker Hub, where you can find many images that different companies and individual developers have created and shared.

Image Versioning

Technology changes, and as new features are added to a service, its Docker image changes too. Likewise, as you add new features to your own application code, you can build a new image version for it.

You can track and name the different versions of the Docker images using tags.
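For example, tags let you pull a pinned version alongside the moving `latest` tag; the version numbers here are just examples:

```shell
# Skip gracefully when Docker is not installed (sketch only).
command -v docker >/dev/null 2>&1 || { echo "docker not found; skipping"; exit 0; }

docker pull postgres:15       # pinned to a specific major version
docker pull postgres:latest   # moving tag that tracks the newest release
docker images postgres        # both tags now coexist on the local machine
```

Pinning a tag in this way is what keeps every developer on the team running the exact same version of a service.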

Main Docker Commands

docker pull {image name}:{tag}

This command is used to pull an image from the registry onto your local machine. Docker uses Docker Hub as the default registry to pull images from.

docker run {image name}:{tag}

Creates a container from the given image and starts it, pulling the image first if it is not available on the host machine. Running this command in the terminal will block the terminal while the container runs. You can use the “-d” or “--detach” flag (detached mode) to run the container in the background, and the “--name” flag to give the container a specific name instead of the random default one that Docker generates.
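A small sketch of those flags in action (the image tag and container name are assumptions):

```shell
# Skip gracefully when Docker is not installed (sketch only).
command -v docker >/dev/null 2>&1 || { echo "docker not found; skipping"; exit 0; }

# Foreground mode would block the terminal:
#   docker run redis:7
# Detached mode with a custom name instead:
docker run -d --name session-cache redis:7

# The container keeps running in the background
docker ps --filter name=session-cache
```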

Even when running the container in detached mode, you may want to see the container logs. For that you can run this command:

docker logs {container ID}

Using this command, you can view the logs of the service running inside the Docker container.
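For example (container name and image are assumed), you can start a container in the background and then inspect its output:

```shell
# Skip gracefully when Docker is not installed (sketch only).
command -v docker >/dev/null 2>&1 || { echo "docker not found; skipping"; exit 0; }

docker run -d --name log-demo nginx:1.25
docker logs log-demo        # print the logs collected so far
# docker logs -f log-demo   # -f streams new log lines live (Ctrl+C to stop)
```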

docker ps

This command is used to list the running containers. The “-a” flag is used with this command to see all the containers, even the ones that were stopped.

docker stop {container name or container ID}

This command stops the running container.
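Putting `docker ps` and `docker stop` together, a container's lifecycle might look like this sketch (name and image assumed):

```shell
# Skip gracefully when Docker is not installed (sketch only).
command -v docker >/dev/null 2>&1 || { echo "docker not found; skipping"; exit 0; }

docker run -d --name lifecycle-demo nginx:1.25
docker ps                   # the container appears in the running list
docker stop lifecycle-demo
docker ps -a                # stopped containers are only visible with -a
```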

Port Binding

Docker solves the problem of running multiple instances, or even different versions, of the same application with the help of port binding. Applications inside containers run in an isolated Docker network, so we need to expose a container port on a host machine port before the application inside the container can be accessed.

You can bind the container port to the host port at the time of running a Docker image:

docker run -d -p {host port}:{container port} {image name}

You can also bind two containers to different ports of the same localhost. For example, both containers may listen on port 80 internally, but you can bind container 1 to host port 80 and container 2 to host port 3000.
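A sketch of that setup with nginx; host port 8080 stands in for the privileged port 80, and the names and tag are assumptions:

```shell
# Skip gracefully when Docker is not installed (sketch only).
command -v docker >/dev/null 2>&1 || { echo "docker not found; skipping"; exit 0; }

# Both containers listen on port 80 internally...
docker run -d --name site-a -p 8080:80 nginx:1.25
docker run -d --name site-b -p 3000:80 nginx:1.25

# ...but each is reachable on its own host port
docker port site-a
docker port site-b
```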

Conclusion

At the end of the day, Docker just makes life easier. No more dependency nightmares, no more “but it worked on my system,” and no more hours wasted setting up services on every machine. You package everything once, run it anywhere, and focus on actually building stuff. That is the real power of Docker and why every developer should at least know the basics. Happy containerizing!
