John Still

Stop Fighting Your Server: Use Nginx Like a Pro (with FastAPI, Load Balancing & Caching)

“Code’s ready — but deployment sucks?”
“Server’s running — but the public can’t access it?”
“Traffic spikes — and your app crashes?”

We’ve all been there. But the good news is: DevOps doesn’t have to be a pain, especially when you have Nginx and some modern tools like ServBay to back you up.

I started out editing configs blindly, but after learning Nginx properly (and automating parts of it), my deployments became smoother and more scalable.

In this guide, I’ll walk you through:

  • What Nginx is (and why it’s so damn popular)
  • How to use it to deploy a FastAPI app
  • How to handle traffic spikes with load balancing
  • How to improve performance via caching
  • And how ServBay can help you set up your dev environment in minutes

Let’s go! 👇


🤖 What Exactly Is Nginx?

Think of Nginx as the high-IQ front desk of your app:

  • It serves content directly like a web server.
  • It routes requests smartly like a reverse proxy.
  • It balances load across multiple servers when things get crowded.

As a high-performance open-source tool, Nginx can handle HTTP, TCP, UDP, caching, and even email protocols. That’s why it’s in almost every modern backend architecture today.

If you’re deploying anything beyond a toy project, Nginx is your best friend.
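To make the "web server" role concrete, here's a minimal server block that serves static files directly — the domain and paths are placeholders, not anything from this guide's app:

```nginx
# Minimal static-file server block — domain and root path are placeholders
server {
    listen 80;
    server_name example.com;

    root /var/www/mysite;    # directory holding your static files
    index index.html;

    location / {
        try_files $uri $uri/ =404;   # try the file, then a directory index, then 404
    }
}
```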


🧪 Deploying a FastAPI App with Nginx (Step-by-Step)

Let’s use a simple Todo app from GitHub and walk through a real deployment.

1️⃣ Run the App Locally

git clone https://github.com/HATAKEkakshi/Todo.git
cd Todo
pip install -r requirement.txt
uvicorn main:app --host 0.0.0.0 --port 8000

Access it at:

  • http://localhost:8000 (local)
  • http://<your-public-ip>:8000 (cloud)

Great — now let’s make it production-ready.


2️⃣ Create a systemd Service (Auto-Restart on Reboot)

sudo vim /etc/systemd/system/fastapi.service

Paste this (note: no --reload here — that flag is for development only and has no place in a service; also, running as root is fine for a demo, but use a dedicated user in production):

[Unit]
Description=FastAPI Service
After=network.target

[Service]
User=root
WorkingDirectory=/root/Todo
ExecStart=/usr/bin/python3 -m uvicorn main:app --host 127.0.0.1 --port 8000
Restart=always

[Install]
WantedBy=multi-user.target

Enable and start it:

sudo systemctl daemon-reload
sudo systemctl enable fastapi
sudo systemctl start fastapi

Now your backend is always up, even after a reboot.
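Check on it with systemctl status fastapi, or tail the logs with journalctl -u fastapi -f. And if you later want several uvicorn instances to load balance across, a systemd template unit makes that painless — this fastapi@.service is a sketch (not part of the repo), where %i becomes the port number:

```ini
# /etc/systemd/system/fastapi@.service — start one instance per port via %i
[Unit]
Description=FastAPI Service on port %i
After=network.target

[Service]
User=root
WorkingDirectory=/root/Todo
ExecStart=/usr/bin/python3 -m uvicorn main:app --host 127.0.0.1 --port %i
Restart=always

[Install]
WantedBy=multi-user.target
```

Then sudo systemctl enable --now fastapi@8000 fastapi@8001 gives you two backends ready to be load balanced.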


3️⃣ Use Nginx as a Reverse Proxy (Make It Public)

sudo vim /etc/nginx/sites-enabled/fastapi_nginx

(On Debian/Ubuntu, the cleaner convention is to create the file in sites-available and symlink it into sites-enabled, but writing it here directly works too.)

Paste this:
server {
    listen 80;
    server_name <your-public-ip>;

    access_log /var/log/nginx/fastapi.access.log;
    error_log /var/log/nginx/fastapi.error.log;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}

Test the config and restart:

sudo nginx -t
sudo systemctl restart nginx

Now open http://your-ip/docs — hello, FastAPI Swagger UI 👋


⚙️ Add Load Balancing with Nginx

Handling traffic spikes? Just spin up more backend instances and load balance them:

upstream myapp {
    server 192.168.1.100:8000;
    server 192.168.1.101:8000;
    server 192.168.1.102:8000;
}

server {
    listen 80;
    server_name <your-ip>;

    location / {
        proxy_pass http://myapp;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

Nginx will distribute traffic round-robin style. You can even log the upstream response time with a custom log_format.
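That custom log_format might look something like this — it goes in the http {} block, and the format name upstream_time is arbitrary (all the variables are standard Nginx ones):

```nginx
# Log per-request and per-upstream timing so slow backends stand out
log_format upstream_time '$remote_addr [$time_local] "$request" $status '
                         'upstream=$upstream_addr rt=$request_time urt=$upstream_response_time';

access_log /var/log/nginx/upstream.access.log upstream_time;
```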

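Round-robin is just the default, by the way. The upstream block supports other strategies and passive health checks — a sketch using the same placeholder IPs:

```nginx
upstream myapp {
    least_conn;                                   # pick the backend with the fewest active connections
    server 192.168.1.100:8000 weight=2;           # receives roughly double the traffic
    server 192.168.1.101:8000 max_fails=3 fail_timeout=30s;  # taken out of rotation after 3 failures
    server 192.168.1.102:8000 backup;             # only used when the others are down
}
```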

⚡ Boost Performance with Caching

Caching lets Nginx answer repeat requests without touching your backend. Two gotchas: proxy_cache_path belongs in the http {} block, and without a proxy_cache_valid rule (or explicit Cache-Control headers from the backend) nothing actually gets cached:

proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=custom_cache:10m inactive=60m;

location / {
    proxy_pass http://127.0.0.1:8000;
    proxy_cache custom_cache;
    proxy_cache_valid 200 10m;
    add_header X-Proxy-Cache $upstream_cache_status;
}

You'll get an X-Proxy-Cache: HIT or MISS response header — a great way to verify the cache is working and to debug when it isn't.
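From there you can control what gets cached, for how long, and when to skip the cache entirely — a sketch of commonly combined directives:

```nginx
location / {
    proxy_pass http://127.0.0.1:8000;
    proxy_cache custom_cache;
    proxy_cache_valid 200 302 10m;                 # cache successful responses for 10 minutes
    proxy_cache_valid 404 1m;                      # cache 404s briefly
    proxy_cache_bypass $http_cache_control;        # clients can force a fresh response
    proxy_cache_use_stale error timeout updating;  # serve a stale copy if the backend is struggling
    add_header X-Proxy-Cache $upstream_cache_status;
}
```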


🧩 Bonus: Speed Up Setup with ServBay

Let’s be honest: manually setting up Nginx, backend services, and environments can get tedious.

Enter ServBay — a dev-friendly tool that gives you:

  • ✅ One-click Nginx + PHP + DB environment
  • ✅ Multi-version PHP and CLI support
  • ✅ Domain + SSL mapping out of the box
  • ✅ GUI + command-line freedom

It recently added Windows support, plus system monitoring and Nginx config management — making cross-platform local development a breeze.

I use ServBay to spin up local environments in seconds. Works great with FastAPI, Laravel, WordPress, and pretty much anything.


🧠 TL;DR: Nginx Is Your DevOps Superpower

Mastering Nginx means:

  • Hosting static and dynamic sites ✅
  • Proxying backend services ✅
  • Load balancing traffic ✅
  • Caching for performance ✅

You don’t need to memorize configs — just understand what each block solves. Once you’ve done it a couple of times, you’ll wonder why you ever deployed without it.

And if you want to stop copy-pasting config files forever — definitely give ServBay a try.


🗣 Got thoughts? Drop them in the comments.

🧡 Like this post? Share it with your backend buddies.
