# Why I Stopped Using Docker for Side Projects
Docker is great for teams. For solo developers running $5 VPSes? It's often overkill. Here's when I use it — and when I don't.
## The Docker Trap
Every "modern development" tutorial starts with:
- Install Docker
- Create Dockerfile
- `docker-compose up`
- Profit?
And yes, Docker solves real problems:
- Consistent environments (works on my machine = works on server)
- Dependency isolation (no version conflicts)
- Easy deployment (one command to run anywhere)
- Scaling (`docker-compose up --scale app=3`)
But it also adds:
- ~200MB+ of disk per image
- Layered build complexity
- Debugging inside containers (fun with `docker exec -it`)
- Volume permission headaches
- Another thing to learn when things break
## When I Use Docker (And Why)

### Scenario 1: Third-Party Services That Need It
```yaml
# docker-compose.yml for services I didn't write
version: '3.8'
services:
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_DB: myapp
      POSTGRES_PASSWORD: ${DB_PASSWORD}
    volumes:
      - pgdata:/var/lib/postgresql/data
    ports:
      - "5432:5432"
  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"

volumes:
  pgdata:
```
Why: I didn't write PostgreSQL or Redis. Docker gives me a clean, disposable instance that I can destroy and recreate in seconds.
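"Destroy and recreate" is literally two commands. Note the `-v` flag, which also removes the named `pgdata` volume; without it, the data survives the teardown:

```shell
docker compose down -v   # stop containers and delete the pgdata volume
docker compose up -d     # fresh Postgres and Redis, seconds later
```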
### Scenario 2: Build Pipelines
```dockerfile
# Dockerfile for CI/CD
FROM node:20-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
RUN npm prune --omit=dev   # drop devDependencies so the runner stays slim

FROM node:20-alpine AS runner
WORKDIR /app
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules
EXPOSE 3000
CMD ["node", "dist/index.js"]
```
Why: Consistent build environment. What passes CI will work in production because they're using the exact same base image.
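One companion file worth adding (not shown above, so treat it as a suggestion): a `.dockerignore`, so that `COPY . .` doesn't drag local clutter into the build context:

```
# .dockerignore — keep the build context small
node_modules
dist
.git
.env
logs
```

Without it, the host's `node_modules` gets copied over the clean `npm ci` install inside the image, and `.env` secrets can end up baked into layers.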
### Scenario 3: Apps Other People Will Run
If I'm open-sourcing something or giving it to a client, Docker makes their life easier:
```shell
# They just run:
docker compose up -d
# Done.
```
No "install Node 18, then set up this env var, then install these system dependencies..."
## When I Skip Docker (And What I Use Instead)

### My Default Setup for Personal Projects
```
project/
├── server.js      # Entry point
├── package.json   # Dependencies
├── .env           # Environment vars
├── guardian.sh    # Auto-restart script
└── logs/          # Log files
```
That's it. No Dockerfile. No docker-compose.
How I deploy:
```shell
# SSH into server
cd /path/to/project
git pull origin main
pm2 restart app   # or just kill + node server.js
```
How I handle dependencies:
```shell
# nvm manages Node versions
nvm install 20
nvm use 20

# npm installs production deps locally
npm install --omit=dev   # the modern spelling of --production

# System deps? apt-get install once, forget about it
```
How I handle process management:
```js
// server.js — simple, no PM2 needed
const app = require('express')(); // or whatever framework you use
const PORT = process.env.PORT || 3000;

const server = app.listen(PORT, () => {
  console.log(`Running on port ${PORT}`);
});

// Graceful shutdown
process.on('SIGTERM', () => {
  server.close(() => process.exit(0));
});
```
Plus a cron-job guardian script that checks every 5 minutes whether the process is alive.
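The guardian itself isn't shown in this post, so here's a minimal sketch of what such a script might look like. The project path and the process pattern are assumptions; adjust both for your setup:

```shell
#!/usr/bin/env bash
# guardian.sh — hypothetical watchdog; run from cron every 5 minutes:
#   */5 * * * * /path/to/project/guardian.sh
APP_DIR="/path/to/project"   # assumption: your project root
PATTERN="node server.js"     # assumption: how the process appears in `ps`

# Exit 0 if some process whose full command line matches $1 is running.
is_alive() {
  pgrep -f "$1" > /dev/null
}

# Only act if the project directory actually exists and the app is down.
if [ -d "$APP_DIR" ] && ! is_alive "$PATTERN"; then
  cd "$APP_DIR"
  nohup node server.js >> logs/app.log 2>&1 &
  echo "$(date) restarted" >> logs/guardian.log
fi
```

`pgrep -f` matches against the whole command line, so the pattern should be specific enough not to catch unrelated processes.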
## Why This Works for Me

### 1. One server = no environment inconsistency
My dev machine doesn't matter. I code on the server directly (via SSH + VS Code Remote). The "environment" IS the production environment.
### 2. Disk space is precious on a $5 VPS
Docker images, layers, volumes, build cache... it adds up fast.
```
$ df -h /
Filesystem      Size  Used  Avail  Use%
/dev/vda2        59G   37G    21G   65%

# Without Docker: 65% used
# With Docker + 3 images: probably 75-80%
```
### 3. Debugging is simpler
```shell
# Without Docker:
node --inspect server.js
# Open Chrome DevTools → done

# With Docker:
docker exec -it container_name sh
# Then figure out how to forward ports, attach the debugger...
# Or expose the inspector port in the image, rebuild, restart...
```
### 4. Startup is faster
```shell
# Without Docker:
node server.js
# → Running in 0.3s

# With Docker:
docker compose up
# → Pulling images... Building... Starting...
# → Running in 5-15s (or longer if building)
```
For a $5 VPS where RAM is limited, those extra seconds and MB matter.
## The Middle Ground: When I Regret Not Using Docker

### Case 1: Python Project Needed Specific System Libs
I built a data processing tool that required libmagic, ffmpeg, and libx264. Installing these on the host was painful:
```shell
apt-get install libmagic1 ffmpeg libx264-dev
# Version conflicts with existing packages
# Need to pin specific versions
# Breaks something else
# Spent 2 hours debugging
```
With Docker, this would have been a `FROM python:3.11` with `RUN apt-get install` and done. Isolated. Clean. Repeatable.
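A sketch of what that Dockerfile might have looked like (the `requirements.txt` and `main.py` names are assumptions, not from the actual project):

```dockerfile
FROM python:3.11-slim

# System deps live inside the image; the host stays untouched
RUN apt-get update && apt-get install -y --no-install-recommends \
        libmagic1 ffmpeg libx264-dev \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "main.py"]
```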
Lesson: If your project needs weird system deps → use Docker.
### Case 2: Wanted to Test on Different Node Versions
I needed to verify my app worked on Node 18, 20, and 22.
Without Docker: install nvm, switch versions, test, switch, test, switch, test. Doable but tedious.
With Docker: three `docker run --rm` invocations against `node:18`, `node:20`, and `node:22`. Clean and parallel.
Lesson: Multi-version testing → Docker wins.
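As a sketch of what those invocations look like, wrapped in a small helper (assumptions: `npm test` is the suite's entry point, and mounting the current directory into `/app` is just the usual pattern, not something from the original post):

```shell
# run_matrix: run the test suite under several Node versions in throwaway containers
run_matrix() {
  for v in "$@"; do
    docker run --rm -v "$PWD":/app -w /app "node:$v" sh -c "npm ci && npm test"
  done
}

# e.g.: run_matrix 18 20 22
```

Each container is discarded after the run (`--rm`), so nothing accumulates on disk between test matrices.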
### Case 3: Gave Code to a Friend
"Here's my cool project! Just clone it and..."
"Oh, I need to install Node 20, and this Python dependency, and set up this environment variable, and..."
With Docker: `docker compose up -d`. Period.
Lesson: Sharing with non-technical people → Docker (or a better README).
## My Decision Framework
Ask yourself these questions:
| Question | If the answer is yes |
|---|---|
| Will others run this? | Use Docker |
| Does it need weird system deps? | Use Docker |
| Do you need multi-version testing? | Use Docker |
| Is this a database/cache service? | Use Docker |
| Are you the only user? | Skip Docker |
| Is disk space tight (<10 GB free)? | Skip Docker |
| Do you debug directly on the server? | Skip Docker |
| Is this your first project? | Skip Docker |
## The Real Takeaway
Docker isn't good or bad. It's a tool with trade-offs:
- Use it when: isolation, reproducibility, and distribution matter
- Skip it when: simplicity, speed, and resource efficiency matter more
For solo developers on budget VPSes, the answer is often "skip it." At least until you hit one of the "use Docker" scenarios.
Start simple. Add complexity only when the pain of simplicity exceeds the pain of complexity.
What's your Docker philosophy? Essential tool or unnecessary overhead? Sound off in the comments.
Follow @armorbreak for more practical takes on developer infrastructure.