Have you ever used Docker in any of your projects? I prefer to run Docker in production and combine it with Kubernetes to handle scaling as the user base grows. Docker gives us a new way of handling the classic "But, it works on my system" issue.
After all, we can't hand our PC to every user, so we use Docker in production to put that issue to rest.
Prerequisite
Before anything else, you need to have Docker Desktop or the Docker CLI installed on your system; once that is done, you are good to go.
I am writing this tutorial for a Node.js project, but you can apply the same steps to any project you want with a few tweaks.
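To confirm that Docker is installed and the daemon is up, you can run the standard version and info commands as a quick sanity check:
docker --version
docker info
If docker info reports an error talking to the server, the Docker daemon (or Docker Desktop) is most likely not running yet.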
Check for running containers
The first step in setting up Docker for your project is to find out whether any containers are already running that you don't need right now.
For that, you can use the following Docker command to list all running containers:
docker ps
As you can see in the above image, I currently don't have any running containers.
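Keep in mind that docker ps only lists running containers. If you also want to see stopped containers before cleaning up, add the -a flag:
docker ps -a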
Delete all unused data
After listing the containers with docker ps, if you find containers you no longer need, or caches, networks, or images you want to get rid of, you can delete all of that unused data with the command below (add --volumes if you also want unused volumes removed).
docker system prune --all
As you can see, the docker system prune command deleted one of my unused networks.
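If a full system prune feels too aggressive, Docker also ships more targeted prune commands that you can run individually:
docker container prune
docker image prune
docker network prune
docker volume prune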
Creating the Dockerfile
Now, the next step in setting up Docker for your project is creating the Dockerfile, which is also the most important step of them all.
First, create a file named Dockerfile in the root of your project; it will contain all the instructions that Docker runs to build your image.
FROM node:18
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
ENV PORT=8080
EXPOSE 8080
CMD ["npm", "start"]
FROM node:18
: This instruction pulls the base image your project runs on. Here, node is the image and 18 is the Node.js version.
WORKDIR /app
: This instruction sets the working directory inside the container to /app; all subsequent instructions run from this directory, and your whole project will live in it.
COPY package*.json ./
: This instruction copies the package.json and package-lock.json files from your local project into the image.
RUN npm install
: This instruction runs npm install inside the image, installing everything specified in package.json and creating a node_modules directory in the image.
COPY . .
: This instruction copies the rest of your project code into the working directory, /app.
ENV PORT=8080
: This instruction sets the PORT environment variable inside the container to 8080, similar to what you might otherwise do with a .env file.
EXPOSE 8080
: This instruction documents that the container listens on the same port 8080 defined above, so it can be mapped to a port on your host machine when you run the container.
CMD ["npm", "start"]
: This instruction sets the default command the container runs on startup, which here is npm start, so the project starts automatically.
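Note that CMD ["npm", "start"] assumes your package.json defines a start script. A minimal sketch of what that might look like (the app name and the index.js entry point are just placeholders for your project):
{
  "name": "my-node-app",
  "version": "1.0.0",
  "scripts": {
    "start": "node index.js"
  }
}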
Build the Docker image
After creating the Dockerfile, you have to build it into an image, so that you can publish it later for other users to run.
docker build -t <name>:<version> .
The above command breaks down as follows (a concrete example follows the breakdown):
docker build
: This part of the command instructs Docker to build an image.
-t <name>:<version>
: This is the -t (or --tag) option, which is used to specify a name and optionally a version tag for the Docker image. Here's what each part means:
<name>
: This should be replaced with the desired name for your Docker image. The name is typically in lowercase and can consist of letters, numbers, hyphens, and underscores. It is used to identify your image.
<version>
: This should be replaced with a version tag for your Docker image. The version tag is optional and is used to differentiate different versions or releases of the same image. It's common to use a version number or a version name (e.g., 1.0, latest, or a date) for this field.
.
: The period (.) at the end of the command specifies the build context, which is the directory where the Dockerfile and any related files for building the image are located. In this case, it tells Docker to use the current directory as the build context.
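For example, with a hypothetical image name and tag for this Node.js project (run from the project root, where the Dockerfile lives):
docker build -t my-node-app:1.0 .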
Run the project with Docker
After the above command, you can see the new image in Docker Desktop, or you can run the command below to list the images on your system.
docker images
Now, copy the Image ID from that list and use it with the command below to run the project in a Docker container.
docker run -p 5000:8080 b4ccb712e301
The command docker run -p 5000:8080 b4ccb712e301 is used to run a Docker container from the image with the image ID b4ccb712e301. Let's break down the command and its components:
docker run
: This is the command to run a Docker container.
-p 5000:8080
: The -p option specifies port mapping, allowing you to map ports on your host system to ports inside the container. In this case, it maps port 5000 on your host to port 8080 inside the container (the port our Dockerfile exposes). This means that if the application inside the container listens on port 8080, you can access it from your host machine on port 5000.
b4ccb712e301
: This is the image ID of the Docker image from which you want to create a container. The image with this ID is used as the base for the container.
So, when you run this command, Docker will create a new container based on the image with ID b4ccb712e301 and map port 5000 on your host to port 8080 inside the container. If the container runs a service or application listening on port 8080, you can reach it from your host machine at http://localhost:5000.
For example, if you have a web application running inside the container on port 8080, you can open it in your web browser at http://localhost:5000.
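To quickly check that the container is reachable, you can hit the mapped host port from a terminal (this assumes your app responds to HTTP requests on its root path):
curl http://localhost:5000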
Keep in mind that the image ID b4ccb712e301 should correspond to a valid Docker image on your system that contains the application or service you want to run. If you reference an image by name that doesn't exist locally, Docker will attempt to pull it from a Docker registry (like Docker Hub) if it's available there.
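Instead of copying the image ID, you can also start the container by the name and tag you used at build time (again the hypothetical my-node-app:1.0 from the build step):
docker run -p 5000:8080 my-node-app:1.0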
With the help of the above commands and files, you will be able to run your project with Docker. There is still a lot you can do with Docker to make your project more scalable, which will be covered in later articles.