Hello, I'm Maneshwar. I'm currently building FreeDevTools online, **one place for all dev tools, cheat codes, and TLDRs**: a free, open-source hub where developers can quickly find and use tools without the hassle of searching all over the internet.
If you've ever screamed at an `nginx.conf`, you're not alone. Configuring traditional web servers often feels like learning a new language every time.
Now, meet Caddy: your friendly, modern, minimal-config web server. It's written in Go, serves HTTPS by default, and it just works.
This post walks you through a real-world Docker setup using Caddy to serve a frontend (running on port `3030`) with automatic HTTPS, compression, and structured logs. No nginx wizardry required.
Why Caddy?
Caddy is built for developers who want to:
- Serve sites over HTTPS without touching certbot
- Use readable, minimal config files
- Reverse proxy without getting a headache
- Add compression, logging, and headers easily
- Avoid surprises in deployment
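To give a sense of how little config that means in practice, here is a minimal sketch of a standalone Caddyfile; the domain and upstream port are placeholders, not part of the setup we build below. It serves one site over automatic HTTPS and proxies everything to a local app.

```caddyfile
# Hypothetical minimal example: HTTPS for app.example.com,
# all traffic proxied to a local service on port 8080
app.example.com {
    reverse_proxy localhost:8080
}
```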
What We're Building
You're running a service (on port `3030`) and want it served securely at `https://frontend.localhost`, with logs and compression baked in.
We're using:
- Caddy (reverse proxy + HTTPS)
- Docker Compose
- A frontend service
- `frontend.localhost` as your dev domain
Project Structure
```
.
├── docker-compose.yaml
├── Caddyfile
├── .env
```
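The `.env` file holds whatever environment your frontend image needs, plus the optional host-port override referenced in the compose file below. A minimal sketch; only `FRONTEND_UI_PORT` is actually read by this setup, anything else is up to your app:

```
# Host port the frontend is published on; compose falls back to 3000 if unset
FRONTEND_UI_PORT=3000
```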
The Caddyfile
This is where Caddy shines. The config is readable even if you've never touched a reverse proxy before.
```caddyfile
frontend.localhost {
    tls internal
    encode zstd gzip

    reverse_proxy frontend-ui:3030 {
        header_up Host {host}
        header_up X-Real-IP {remote}
        header_up X-Forwarded-For {remote}
        header_up X-Forwarded-Proto {scheme}
    }

    log {
        output stdout
        format console
        level INFO
    }
}

http://frontend.localhost {
    redir https://frontend.localhost{uri} permanent
}
```
Breakdown
- `tls internal`: avoids needing Let's Encrypt for local dev; Caddy uses its own local CA.
- `reverse_proxy`: routes traffic to the `frontend-ui` Docker service.
- `header_up`: forwards the original client IP and host, useful for analytics/logging.
- `encode`: enables Zstandard + Gzip compression.
- `log`: pretty logs straight to the container stdout.
- `redir`: ensures HTTP auto-upgrades to HTTPS.
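If you want to check the Caddyfile before wiring it into Docker Compose, you can run Caddy's built-in validator from the same image. A rough sketch, assuming you're in the project root:

```bash
docker run --rm \
  -v "$PWD/Caddyfile:/etc/caddy/Caddyfile:ro" \
  caddy:2-alpine \
  caddy validate --config /etc/caddy/Caddyfile --adapter caddyfile
```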
docker-compose.yaml
```yaml
services:
  frontend-ui:
    image: git.apps.hexmos.com:5050/hexmos/client/main
    restart: unless-stopped
    ports:
      - "${FRONTEND_UI_PORT:-3000}:3030"
    env_file:
      - .env

  caddy:
    image: caddy:2-alpine
    restart: unless-stopped
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile:ro
      - caddy_data:/data
      - caddy_config:/config
    depends_on:
      - frontend-ui

volumes:
  caddy_data:
  caddy_config:
```
Notes
- The frontend (`frontend-ui`) exposes `3030` inside the container; it's mapped to `3000` on your local machine.
- Caddy runs on `80` and `443` as expected.
- Volume mounts persist Caddy's TLS data and config.
- No need to install certbot or mess with `/etc/nginx`.
Local Dev Setup
1. Add `frontend.localhost` to your `/etc/hosts` (using `tee -a` so the append runs with root privileges):

```bash
echo "127.0.0.1 frontend.localhost" | sudo tee -a /etc/hosts
```

2. Spin up the stack:

```bash
docker compose up --build
```

3. Access your app:

Go to `https://frontend.localhost`
You get:
- HTTPS
- Reverse proxy
- Logs
- Compression
- No drama
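Because `tls internal` signs the certificate with Caddy's own local CA, browsers will warn until you trust that root certificate; for terminal smoke tests you can skip verification with `-k`. A rough check (exact output depends on your frontend):

```bash
# Plain HTTP should answer with a permanent redirect to the HTTPS site
curl -sI http://frontend.localhost

# HTTPS should serve the app; -k skips verification of the local CA
curl -skI https://frontend.localhost

# Ask for a compressed response; sufficiently large bodies should come
# back with a Content-Encoding header (zstd or gzip)
curl -sk -H "Accept-Encoding: zstd, gzip" -D - -o /dev/null https://frontend.localhost
```

If you want the browser to trust it too, Caddy keeps its local CA root inside the data volume (typically `/data/caddy/pki/authorities/local/root.crt`); you can copy it out with `docker compose cp caddy:/data/caddy/pki/authorities/local/root.crt .` and add it to your system trust store.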
Bonus: Real HTTPS with Let’s Encrypt
In prod, just swap `tls internal` for `tls you@example.com`.
Caddy will fetch and auto-renew Let's Encrypt certificates, with no extra CLI needed.
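Put together, a production Caddyfile might look roughly like this; `app.example.com` and the email are placeholders for your real domain and ACME contact, and the domain must resolve publicly with ports 80/443 reachable:

```caddyfile
app.example.com {
    # ACME account email; Caddy obtains and renews the certificate automatically
    tls you@example.com

    encode zstd gzip

    reverse_proxy frontend-ui:3030
}
```

The explicit `http://` redirect block from the dev config isn't needed here, because Caddy redirects HTTP to HTTPS automatically for sites it manages certificates for.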
Use Cases
- Serve single-page apps
- Reverse proxy to backend APIs
- Auto-secure staging environments
- Replace nginx for microservices
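For the "reverse proxy to backend APIs" case, a single Caddyfile can front several services at once. A sketch with hypothetical service names and ports:

```caddyfile
app.example.com {
    encode zstd gzip

    # Route /api/* to the backend, stripping the /api prefix
    handle_path /api/* {
        reverse_proxy backend-api:8000
    }

    # Everything else goes to the frontend
    handle {
        reverse_proxy frontend-ui:3030
    }
}
```

`handle_path` strips the matched prefix before proxying, so the backend sees `/users` rather than `/api/users`.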
Wrap-up
Caddy is the kind of tool that makes you wonder why you ever suffered through writing 100-line nginx configs.
It gives you security, simplicity, and clarity in one Go-powered package.
If you’re running small-to-medium services, especially for internal or developer use, Caddy is one of the cleanest choices out there.
Further Reading
I've been building FreeDevTools: a collection of UI/UX-focused tools crafted to simplify workflows, save time, and reduce the friction of searching for tools and materials.
It's online, open-source, and ready for anyone to use. Feedback and contributors are welcome!
👉 Check it out: FreeDevTools
⭐ Star it on GitHub: freedevtools
Let’s make it even better together.