I remember the first time I heard "it works on my machine." Three developers, three different environments, one broken deploy. Docker solves exactly this problem.
In this guide you'll go from zero to a running containerized app in 30 minutes.
What is Docker, actually?
Docker packages your app + its dependencies + runtime into a single image. That image runs the same way everywhere — your laptop, a teammate's Mac, AWS EC2.
Think of it like a shipping container: standardized box, works on any ship.
Prerequisites
- Docker Desktop installed (docs.docker.com)
- Basic terminal knowledge
- Any web app (we'll use a simple Node.js example)
Step 1: Create a simple app
// app.js
const http = require('http');

const server = http.createServer((req, res) => {
  res.end('Hello from Docker! 🐳');
});

server.listen(3000, () => console.log('Running on port 3000'));
// package.json
{
  "name": "docker-demo",
  "version": "1.0.0",
  "scripts": { "start": "node app.js" }
}
Step 2: Write a Dockerfile
# Start from official Node image
FROM node:20-alpine
# Set working directory inside container
WORKDIR /app
# Copy dependency files first (layer caching)
COPY package*.json ./
RUN npm install
# Copy the rest of the app
COPY . .
# Expose the port
EXPOSE 3000
# Start command
CMD ["node", "app.js"]
Step 3: Build and run
# Build the image
docker build -t my-app .
# Run a container from it
docker run -p 3000:3000 my-app
Open http://localhost:3000 — you'll see "Hello from Docker! 🐳"
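If nothing shows up, a few stock Docker commands cover most debugging. These use the `my-app` tag from the build step above; replace `<container-id>` with the ID or name that `docker ps` prints.

```shell
# List running containers (add -a to include stopped ones)
docker ps

# Follow a container's logs
docker logs -f <container-id>

# Open a shell inside the running container to poke around
docker exec -it <container-id> sh

# Stop and remove the container when you're done
docker stop <container-id>
docker rm <container-id>
```

Note that the Alpine-based image has `sh` but not `bash` — use `sh` for `docker exec`.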
Step 4: Add Docker Compose
For real apps you need a database too. Docker Compose manages multiple containers:
# docker-compose.yml
version: '3.8'
services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      - DATABASE_URL=postgres://user:password@db:5432/myapp
    depends_on:
      - db
  db:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: myapp
    volumes:
      - postgres_data:/var/lib/postgresql/data
volumes:
  postgres_data:
docker compose up -d
One command — app + database, both running.
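The day-to-day Compose commands mirror the single-container ones. A quick cheat sheet (run these from the directory containing docker-compose.yml):

```shell
# See the state of both services
docker compose ps

# Tail logs from every service (or just one: docker compose logs -f app)
docker compose logs -f

# Stop and remove the containers and network (named volumes survive)
docker compose down

# ...and wipe the database volume too, for a clean slate
docker compose down -v
```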
Key concepts to remember
| Concept | What it is |
|---|---|
| Image | Blueprint (like a class) |
| Container | Running instance (like an object) |
| Dockerfile | Recipe to build an image |
| Docker Compose | Tool to run multiple containers |
| Volume | Persistent storage for containers |
Common mistakes beginners make
1. Copying node_modules into the image
Add a .dockerignore file:
node_modules
.git
.env
2. Running as root
Add this to your Dockerfile before CMD:
RUN addgroup -S appgroup && adduser -S appuser -G appgroup
USER appuser
3. Not using layer caching
Always copy package.json and run npm install BEFORE copying your source code. Docker caches each layer — if your code changes but dependencies don't, it won't reinstall everything.
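To make that concrete, here's the anti-pattern next to the fix — the final image is the same either way, only rebuild speed differs:

```dockerfile
# Slow: any source edit invalidates the COPY layer,
# so npm install reruns on every single build.
COPY . .
RUN npm install

# Fast: the dependency layers only change when package*.json changes,
# so day-to-day code edits reuse the cached npm install.
COPY package*.json ./
RUN npm install
COPY . .
```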
What's next?
Once you're comfortable with basics:
- Multi-stage builds — smaller production images
- Health checks — Docker restarts unhealthy containers
- Docker secrets — secure way to pass credentials
- Kubernetes — when you need to scale beyond one server
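As a taste of the first item, here's a multi-stage sketch. It assumes your app has a hypothetical `npm run build` script that outputs to `dist/` — adjust the paths and commands to your own project:

```dockerfile
# Stage 1: build with the full toolchain
FROM node:20-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build          # hypothetical build script emitting dist/

# Stage 2: ship only what's needed at runtime
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install --omit=dev
COPY --from=build /app/dist ./dist
USER node                  # the official Node image includes a non-root 'node' user
CMD ["node", "dist/app.js"]
```

Everything installed in the build stage (compilers, dev dependencies) is left behind; only the second stage becomes the final image.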
Real talk
Docker has a learning curve. The first Dockerfile feels weird. But after a week you won't be able to imagine developing without it.
No more "works on my machine." No more dependency conflicts. No more manual server setup.
I use Docker daily running a Rails + PostgreSQL + Nginx stack on AWS EC2. If you have questions about specific setups — drop them in the comments.