As a developer, one of the most satisfying moments is finally getting your web app live on the internet for the world to see! However, turning your locally running code into an accessible web app can be tricky sometimes. I learned this the hard way when I tried to deploy my first Node.js app.
After weeks of late nights and endless debugging, I had a web app that ran flawlessly on my own machine. For once, I felt like a genius 😁. But the happiness was short-lived. When it came time to launch it on a cloud server, things started breaking left and right! After banging my head on the desk troubleshooting deployment issues, I knew there had to be a better way. That's when I discovered Docker, and it ended up being the magical solution I needed to easily deploy my Node app, and many more after it!
In this post, I'll walk through how taking the time to Dockerize my application gave me the keys to rapidly deploying it to the cloud with minimal fuss. I hope my experience will convince you to embrace Docker for your next Node.js project! Let's get started on this journey from localhost to the cloud!
My Application Architecture
First, let me tell you a bit about the simple web application I had built. It was called CatsGram, and it allowed users to post pictures of their cats and leave comments for other fuzzy felines. The app had:
- A frontend written in React that let users upload cat photos + comments
- A backend REST API written in Node.js and Express that handled the data and storage
- A MongoDB database to store the cat profiles and comments
Here is what the React frontend looked like:
```jsx
// Frontend in React
import React from 'react';

const App = () => {
  const handleSubmit = (e) => {
    e.preventDefault();
    // Call API to submit form data
  }

  return (
    <div>
      <h1>CatsGram</h1>
      <form onSubmit={handleSubmit}>
        <input type="file" />
        <button type="submit">Add Photo</button>
      </form>
    </div>
  )
}

export default App;
```
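To make the example a bit more concrete, here is one way the `// Call API` comment could be filled in. This is a sketch rather than my actual code: the helper name, the `photo` field, and the configurable `apiUrl` are assumptions, while the `/cats` endpoint matches the backend shown below.

```javascript
// Hypothetical helper for handleSubmit: POSTs the cat's name (and photo,
// if one was selected) to the backend's /cats endpoint, then returns the
// created profile as JSON. Uses the fetch/FormData globals (browsers, Node 18+).
async function submitCat(apiUrl, name, file) {
  const formData = new FormData();
  formData.append('name', name);
  if (file) formData.append('photo', file);

  const response = await fetch(`${apiUrl}/cats`, {
    method: 'POST',
    body: formData,
  });
  if (!response.ok) throw new Error(`Upload failed: ${response.status}`);
  return response.json();
}
```

Inside `handleSubmit` you would read the selected file from the input and call something like `submitCat('http://localhost:3000', name, file)`.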
And here is part of the Express backend that handled the API calls from the frontend:
```js
// Backend API in Express
const express = require('express');
const mongoose = require('mongoose');
const cors = require('cors');

const app = express();
app.use(cors());
app.use(express.json()); // parse JSON request bodies

// Connect to MongoDB
mongoose.connect('mongodb://localhost/catsgram', { useNewUrlParser: true });

// Cat profile model
const Cat = mongoose.model('Cat', new mongoose.Schema({
  name: String,
  picUrl: String
}));

app.post('/cats', async (req, res) => {
  // Create new cat profile
  const newCat = await Cat.create(req.body);
  res.send(newCat);
});

app.listen(3000);
```
The app worked flawlessly on my local machine, but deploying it to an online server was another story...
The Deployment Headache
I decided to deploy my app to a popular cloud provider and rented a Linux server. After SSH-ing in, I hit my first roadblock - the server was running an older version of Node than the one I used for development. My app promptly crashed with errors about missing modules!
After fumbling with NVM to try and install the right Node version, I finally got the backend API running. But then the React frontend failed to build due to mismatching webpack versions with the create-react-app starter I used.
Each error I fixed seemed to unveil yet another environmental issue between my local machine and the server. Path issues, missing dependencies, environment variables - you name it!
I was tearing my hair out trying to get things working. I finally conceded defeat and turned to my savior... Docker!
Docker to the Rescue!
Docker is a tool that allows you to package applications into standardized units called containers. These containers bundle up the code, dependencies, system libraries, and settings into an isolated executable package.
The key benefit is that this container will run the same way regardless of the underlying environment. No more worrying about compatibility issues across different machines!
Some other awesome benefits of Docker:
- Cross-platform portability - Ship your containers to any Linux, Windows, cloud provider, etc
- Environment consistency - Containers include everything needed to run the app
- Isolation - Apps run in isolated environments without conflicting with other apps
- Speed - Containers start instantly compared to virtual machines
Docker seemed like the perfect solution to my deployment woes. By Dockerizing my app, I could neatly package it up with all its needed dependencies and specs into a standardized container. This container could seamlessly run on my local machine for development, then be deployed to the cloud server without any environment mismatches!
Let's look at how I Dockerized the CatsGram app.
Dockerizing the Backend API
The first step was containerizing my Express backend API. Docker uses special Dockerfile configuration files to build container images. Here is the Dockerfile for my backend:
```dockerfile
# Dockerfile
FROM node:16-alpine

WORKDIR /app

COPY package*.json ./
RUN npm install

COPY . .

CMD ["node", "server.js"]
```
This does the following:
- Starts from a Node.js base image
- Sets the working directory to `/app`
- Copies the `package*.json` files and installs dependencies with `npm install`
- Copies the rest of the backend code into the image
- Specifies the command to run the app: `node server.js`
With this Dockerfile, I could build a container image for my backend:
```shell
$ docker build -t catsgram-api .
```
This built an image tagged `catsgram-api` based on my Dockerfile. I could then run a container from that image:
```shell
$ docker run -p 4000:3000 catsgram-api
```
This started a container and mapped host port 4000 to the container's internal port 3000, making the API accessible from outside. My backend API was now running in an isolated Docker container!
Containerizing the Frontend
For my React frontend, I used a multi-stage Docker build:
```dockerfile
# Stage 1 - Build
FROM node:16 AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build

# Stage 2 - Run
FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html
```
This first installs Node to build the React app, then copies the built artifacts to an Nginx image for the runtime. This gave me a lean production image!
Again I could `docker build` this and run a container to serve my frontend on port 3000.
Defining Services with Docker Compose
At this point, I had two containers - one for the backend API and one for the frontend. To link them together, I used Docker Compose to define the app services:
```yaml
# docker-compose.yml
services:
  backend:
    build: ./backend
    ports:
      - "4000:3000"
  frontend:
    build: ./frontend
    ports:
      - "3000:80"
```
Running `docker-compose up` would now start both containers and wire them together!
Deploying to the Cloud
With Docker, deploying these containers to the cloud was a breeze! I pushed my images up to a registry (for Docker Hub, this means first tagging them with your username, e.g. `docker tag catsgram-api <username>/catsgram-api`):

```shell
$ docker push catsgram-api
$ docker push catsgram-frontend
```
Then on the server I just had to run:

```shell
$ docker pull catsgram-api
$ docker pull catsgram-frontend
$ docker-compose up -d
```
The containers started up just as they did locally and my app was live on the internet! 🎉
Docker is Deployment Magic
No more fussing with dependencies, runtimes, builds, etc across different environments. Docker let me develop my app locally as I normally would, then package everything needed up into portable containers ready for deployment anywhere.
Some of the key benefits I saw:
- Consistent environments - Containers included the exact dependencies and Node runtime needed
- Cross-platform - I could develop on macOS but deploy the same containers to Linux servers
- Lightweight - Containers are much more efficient than VMs
- Modular - Services like frontend and backend were compartmentalized into separate containers
Docker really is a game-changer when it comes to deploying applications. I can now develop apps faster without worrying about environment differences between my machine and servers.
If you're struggling to deploy Node apps, I highly recommend exploring Docker! It will save you those late night "works on my machine" debugging sessions when you'd rather be sleeping.
Let me know if you have any questions! I'm happy to chat more about my experience Dockerizing my first Node app. Wishing you happy coding and smooth deploying!
Top comments (13)
I really liked the way you broke down the process of the front-end, backend and linking them together. I'm pretty new to docker so I didn't quite understand what is docker compose compared to just docker?
Docker Compose is a Docker tool that lets you manage the deployment of more than one Docker container, provided they depend on each other.
It's a bit like package.json in npm: a declarative way to use Docker. With the help of Docker Compose, we don't need to run commands one by one.
I too love Docker, especially for development environments! No more fussing trying to get all our team members set up on different computers. Just spin up a container, and magically they're working in the same environment!
Nice article! Keep up the great work!
How does this work with IPs/domains? For example, are the frontend and backend at different HTTP addresses? And how do you get custom domains linked to the frontend and the backend separately?
Great question! Here's how Docker containers work with IPs, domains, and linking frontends and backends:
By default, Docker containers get their own virtual IP address assigned by Docker's networking. So your frontend and backend containers would have different IP addresses, like 172.17.0.2 for front-end and 172.17.0.3 for back-end.
You don't necessarily need to expose these default IPs publicly. Instead, you can map custom ports on the host machine to the internal container ports.
For example:
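Those mappings might look like this (using the image names from earlier in the post):

```shell
$ docker run -p 8080:80 catsgram-frontend
$ docker run -p 3000:3000 catsgram-api
```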
This would expose the frontend on the host's port 8080 and backend on port 3000.
To link them, the frontend would just need to make requests to host-ip:3000 to hit the backend API.
Now to use custom domains, you'd point the domains to the host IP, and route traffic to the mapped ports.
For example:
- Map frontend.mydomain.com to the host IP, then route traffic from frontend.mydomain.com:80 to host port 8080
- Map api.mydomain.com to the host IP, then route traffic from api.mydomain.com:80 to host port 3000
This way your domains are abstracted from the internal Docker ports/IPs.
You can also use a reverse proxy like Nginx to handle routing requests from custom domains to your Docker containers.
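As a rough sketch of that reverse-proxy approach, an Nginx config along these lines could route each domain to the matching container's host port (domains and ports follow the example above; adjust for your setup):

```
# Nginx reverse proxy (sketch): one server block per domain
server {
    listen 80;
    server_name frontend.mydomain.com;
    location / {
        proxy_pass http://127.0.0.1:8080;
    }
}

server {
    listen 80;
    server_name api.mydomain.com;
    location / {
        proxy_pass http://127.0.0.1:3000;
    }
}
```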
Hope this helps explain how to handle networking and domains with Docker! Let me know if you have any other questions.
Excellent answer to my question really appreciate it.
How will you host your app?
Most server providers have Docker as an option.
When hosting on your own vps, you'd install docker just like you'd do on your machine.
Which server host did you deploy your docker image to?
Render
This is really cool. I have faced the same deployment issues since I first tried to deploy my Express application. Are there recommended resources one can use to learn Docker? Thanks!
Yes, the one from The Net Ninja on YouTube.