Özcan Kara
Hello! Docker

Welcome to my new article! In this piece, I delve into Docker technology. I have compiled my notes on the fundamental concepts I learned while getting acquainted with Docker and want to share them with you. While writing, I researched Docker concepts in depth, and there is still more for me to learn. I hope these notes prove useful in your own Docker learning journey.


## What is Docker?

Docker, an open-source platform that facilitates application development and deployment, is a popular technology today with a history dating back to 2013. Docker provides lightweight and portable container technology, making software applications easily transportable and replicable across different environments.

Written in the Go programming language, Docker is software that allows you to quickly compile, test, and deploy your applications. It is commonly used in the implementation of Continuous Integration (CI) and Continuous Delivery (CD) processes.

Docker's container technology simplifies application development, testing, and deployment. Because it slots naturally into CI/CD pipelines, it makes software development processes more efficient and portable.

Reference: Openstack

## Docker CI/CD Processes

Continuous Integration (CI): CI processes provide an environment where each code change is automatically integrated and tested. Docker can assist in quickly creating an environment for each CI process.

Continuous Delivery (CD): It involves the automatic deployment of successfully tested code to a production or staging environment. Docker enables packaging the application in a containerized form, ensuring consistent operation across different environments.
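As a sketch of how Docker fits into such a pipeline, a CI step might build the image, run the test suite inside a container, and push the image on success. The registry address, image name, and test script below are placeholders, and the commands assume a running Docker daemon:

```shell
# Build an image from the repository's Dockerfile (the tag is a placeholder)
docker build -t registry.example.com/myapp:latest .

# Run the test suite inside the freshly built image; a non-zero exit fails the CI step
docker run --rm registry.example.com/myapp:latest ./run-tests.sh

# On success, push the image so the CD stage can deploy it
docker push registry.example.com/myapp:latest
```

Because the CD stage deploys the very image that was tested, the environment that passed CI is the environment that reaches production.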

## What is a Dockerfile?

A Dockerfile is a text-based file used to create Docker containers. This file defines step-by-step how a Docker image should be built and specifies the runtime, dependencies, environment, and other configurations of a container. The Dockerfile is read and executed by the Docker Engine to build a Docker image.
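As a minimal sketch, a Dockerfile for a hypothetical Node.js application might look like this (the base image and file names are illustrative):

```dockerfile
# Start from an official runtime base image
FROM node:20-alpine

# Set the working directory inside the image
WORKDIR /app

# Copy dependency manifests first so this layer is cached between builds
COPY package*.json ./
RUN npm install

# Copy the application source and define the container's startup command
COPY . .
CMD ["node", "server.js"]
```

Running `docker build -t myapp .` in the same directory tells Docker Engine to execute these instructions from top to bottom and produce an image tagged `myapp`.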

## What is Docker Registry?

Docker Registry is a service or repository used for storing and sharing Docker container images. Docker container images are packaged elements containing all the files and configurations required for an application to run. These images are stored in Docker Registry and can be pulled when needed.

Key purposes of Docker Registry:

I- Storage: Docker Registry stores built Docker container images. Developers build these images from Dockerfiles and push them to the registry for safekeeping.

II- Sharing: Docker Registry is utilized for sharing Docker container images. After developers upload their created images to Docker Registry, other developers or system administrators can pull these images and use them in their own environments.
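In practice, sharing an image through a registry is a tag-push-pull cycle. These commands assume a running Docker daemon, and the registry address and repository name are placeholders:

```shell
# Tag a local image with the registry address and repository name
docker tag myapp:latest registry.example.com/team/myapp:1.0

# Push the tagged image to the registry
docker push registry.example.com/team/myapp:1.0

# On another machine, pull the same image back down
docker pull registry.example.com/team/myapp:1.0
```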

## How Do Docker Containers Work?

A Docker container is an isolated working environment created and managed by Docker. Docker enables the portability and lightweight execution of applications using container technology. A Docker container is a package containing all the dependencies, code, and configurations needed for an application.

Docker containers can be quickly started and stopped, speeding up development, testing, and deployment processes. Additionally, Docker allows an application to run consistently across different environments (local development machine, test server, production server). These containers are managed in a runtime environment called Docker Engine.
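The lifecycle described above maps onto a handful of CLI commands. Here is a sketch using the public `nginx` image, assuming a local Docker installation:

```shell
# Start a container in the background, mapping host port 8080 to container port 80
docker run -d --name web -p 8080:80 nginx

# List the running containers
docker ps

# Stop and remove the container
docker stop web
docker rm web
```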

Docker containers are defined through a text file called a Dockerfile. This file specifies how the application will be built, which dependencies it will include, and how it will be configured. Images built from a Dockerfile can then be uploaded to and shared through a Docker Registry.

Reference: https://bytebytego.com

## What is Docker Compose?

Docker Compose is a tool used to define and manage multiple Docker containers as a single application. It is configured with YAML files, in which the configurations for different components (services) of the application are defined, including how they will operate and interact with each other. Docker Compose allows for the simultaneous execution of various containers, performing actions such as starting, stopping, or restarting them with a single command, using these configuration files. This proves particularly useful in managing complex application structures, such as microservices architectures.
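As a minimal sketch, a `docker-compose.yml` for a two-service application (a web service plus a Redis cache; the service names and ports are illustrative) could look like this:

```yaml
services:
  web:
    build: .            # build the web service from the local Dockerfile
    ports:
      - "8000:8000"     # host:container port mapping
    depends_on:
      - cache           # start the cache before the web service
  cache:
    image: redis:7      # use the official Redis image
```

With this file in place, `docker compose up -d` starts both containers together, and `docker compose down` stops and removes them with a single command.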

## Installing Docker Compose on Ubuntu

The steps below download Docker Compose from its official GitHub repository and install it from the Ubuntu terminal.

Open the terminal on your Ubuntu system.

Update the package list to ensure you have the latest information about available packages. Enter the following command and provide your password if prompted:

```shell
sudo apt update
```

Install the required dependencies for Docker Compose:

```shell
sudo apt install -y curl jq
```

Download the latest version of Docker Compose from the official GitHub repository:

```shell
sudo curl -L "https://github.com/docker/compose/releases/latest/download/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
```

This command downloads the latest version of Docker Compose and saves it in the /usr/local/bin/ directory.

Apply executable permissions to the Docker Compose binary:

```shell
sudo chmod +x /usr/local/bin/docker-compose
```

Verify the installation by checking the Docker Compose version:

```shell
docker-compose --version
```

## Docker Desktop Installation

Install Docker Desktop by following the procedure for your operating system: Windows, macOS, or Linux.

**Play with Docker:**
"Play with Docker" (PWD) is an online service provided by Docker, offering users the opportunity to experiment with and learn Docker in a browser-based environment. Users can run Docker commands directly without installing anything.

## Docker Engine

Docker Engine is the core component of the software platform that encompasses the fundamental elements of Docker and manages Docker containers. It combines a set of tools and services to facilitate the creation, distribution, and management of containerized applications.

The core components of Docker Engine are as follows:

### I- Docker Daemon

Docker Daemon is the main component running in the background of Docker Engine. It manages computer resources and performs basic operations such as creating, running, stopping, and deleting Docker containers. It ensures isolation of containers and effectively manages resource utilization.

### II- Docker CLI (Command Line Interface)

Docker Command Line Interface provides users with the ability to interact with Docker Daemon. Users can manage containers, create images, and perform other Docker Engine operations using Docker CLI commands. It simplifies the use of Docker from the command line for developers and system administrators.

### III- Docker API (Application Programming Interface)

Docker API is presented as an HTTP API following RESTful architectural principles. This allows the API to operate in a simple, lightweight, and web standards-compliant manner. Interaction with Docker is achieved through requests and responses over the HTTP protocol.
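Because the daemon listens on a local Unix socket by default, any HTTP client can talk to this API directly. For example, listing the running containers (assuming a local Docker installation with default settings):

```shell
# Query the Docker Engine API over the local Unix socket
curl --unix-socket /var/run/docker.sock http://localhost/containers/json
```

This is essentially the same endpoint the Docker CLI calls when you run `docker ps`.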

## Docker Image

A Docker image is a package containing all the software, tools, libraries, and dependencies required for an application to run. Docker provides a platform to create, distribute, and run these images. A Docker image enables the portable and repeatable execution of your application as all necessary components are isolated within a container.

Docker images are defined in text files called Dockerfiles, outlining step-by-step how the image should be built. Docker automatically creates images using these Dockerfiles.

Docker images allow for rapid distribution, scaling, and management of applications. They also facilitate easy transportation to different environments or consistent running in a development environment since Docker containers are independent of the environment they run in. This accelerates the software development process and enhances application reliability.

## Docker Hub

Docker Hub is a centralized repository provided by Docker. Users can download ready-made images from Docker Hub and share their own images.

## Docker Registry

Docker Registry is Docker's repository system for storing and distributing images. Docker Hub is the public registry operated by Docker, serving as a central place where users can share and download ready-made images.

## Docker Swarm

Docker Swarm is a container clustering and orchestration tool provided by Docker, offering integrated orchestration and management features. It aims to enable scalable and highly available application deployments by bringing together multiple Docker servers and facilitating the distribution and management of containers across these servers.

Key features of Docker Swarm include:

### I- Clustering

Docker Swarm transforms interconnected Docker servers (nodes) into a cluster, allowing an application to run on all servers within the cluster. This clustering enhances the high availability and scalability of the application.

### II- Services

Docker Swarm simplifies the deployment and management of applications using units called services. Services represent a specific container configuration and can be deployed across all nodes running on Docker Swarm.

### III- Rolling Updates and Rollbacks

Docker Swarm ensures secure updates for applications. The rolling updates feature replaces old containers with new versions sequentially, ensuring uninterrupted application functionality. Additionally, the rollback feature allows reverting to the previous version in case of errors.

### IV- Security

Docker Swarm ensures secure communication by using Transport Layer Security (TLS) when adding new servers to the cluster or facilitating communication between servers.

### V- Integration

Docker Swarm integrates seamlessly with Docker Compose and Docker CLI. This allows Docker Swarm users to continue using familiar tools.
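A short sketch of these features in action, assuming the commands run on a machine with Docker installed (the image tags are illustrative):

```shell
# Turn the current node into a swarm manager
docker swarm init

# Deploy a service with three replicas of the nginx image
docker service create --name web --replicas 3 -p 80:80 nginx:1.25

# Rolling update: replace the replicas one by one with a newer image
docker service update --image nginx:1.27 web

# Roll back to the previous version if the update misbehaves
docker service rollback web
```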

## Docker Volume

Docker Volume is a mechanism that provides data persistence and sharing in Docker containers. While containers typically have an isolated file system by default, data can be lost when a container is terminated. Docker Volume offers a flexible and robust solution for data management and sharing in Docker containers.

Key features of Docker Volume include:

### I- Data Persistence

Docker Volume ensures the persistence of data stored in containers, allowing data to be preserved even when a container is terminated or deleted.

### II- Data Sharing

Docker Volume facilitates data sharing among multiple containers. Containers sharing the same volume can access the data in that volume together.

### III- External Resource Binding

Docker Volume allows binding a directory from the host system or a remote source to a directory inside the container. This can be used to access data from external sources, such as a database directory or data from remote locations.
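The three features above can be sketched with a named volume and a bind mount; the commands assume a local Docker installation, and the image names and host paths are illustrative:

```shell
# I- Persistence: a named volume outlives any container that uses it
docker volume create appdata

# II- Sharing: containers mounting the same volume see the same data
docker run -d --name worker -v appdata:/data alpine sleep infinity

# III- External binding: mount a host directory into the container
docker run --rm -v /home/user/config:/etc/app alpine ls /etc/app
```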

## Summary

Docker is a platform that provides containerization technology, enabling applications to run in a lightweight, portable, and repeatable manner. The platform allows each application to run within packages called containers, containing all dependencies. This ensures consistency across different environments and accelerates the development, testing, and deployment processes.

The core components of Docker include Docker Engine, Dockerfile, Docker Compose, and Docker Registry. Docker Engine is the runtime that manages containers. A Dockerfile is a text file defining how an image is built. Docker Compose is a tool for defining and managing multiple containers as a single application. Docker Registry is a service or repository that stores and shares Docker container images.

Utilizing Docker simplifies the processes of running, transporting, and deploying applications in isolated environments. It also supports modern application development methods, such as microservices architectures, enabling the creation of more flexible and scalable applications.

**Thank you for reading my writing.**
