Lesson learned about Vercel vs. Traditional Backend
On my journey to learn Next.js in depth through projects, I tried to architect my Next.js backend the way I would a typical Go backend (dependency injection, long-lived services). Turns out that's not really possible.
I treated Vercel's /api routes like a "mini Express server." This is a fundamental architectural error. While both run Node.js, their underlying infrastructure (Serverless vs. Long-Running) dictates completely different rules.
Then I dug further to understand why platforms like Vercel are different from a conventional backend...
Other services that work similarly include Netlify, Cloudflare Pages / Workers, and AWS Amplify.
1. Process Lifecycle: Ephemeral vs. Persistent
Traditional Backend (Long-Running)
The Concept: The process starts and stays alive until you manually stop it or it crashes.
The "Global Root" is permanent. Memory is allocated once and remains available across thousands of different requests.
Implication: You can rely on the server "remembering" things in its internal RAM for days or weeks.
Vercel (Serverless)
The Concept: The process is Ephemeral. It is "born" when a request arrives and "dies" (is frozen or terminated) the millisecond the response is sent.
The "Global Root" is destroyed frequently.
Implication: Any internal state is wiped. You must treat every request as if the server just performed a "Cold Start."
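A minimal sketch of how this shows up in practice, assuming a Next.js App Router route handler: the module-level `bootedAt` value survives warm invocations of the same instance, but resets every time a new instance cold-starts.

```ts
// app/api/uptime/route.ts
// Module scope runs once per cold start, not once per request.
const bootedAt = Date.now();

export async function GET() {
  // On a warm invocation of the same instance, bootedAt is reused.
  // After a cold start (a fresh instance), it resets to "just now".
  const aliveForMs = Date.now() - bootedAt;
  return Response.json({ bootedAt, aliveForMs });
}
```

On a long-running server this number would keep growing for days; on Vercel it regularly snaps back to zero.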
2. State Management: The Global Variable Trap
Traditional Backend (Long-Running)
Memory: Global variables are Shared. If User A updates a global counter, User B sees the updated value because they are hitting the same process.
Garbage Collection: As long as the process lives, GC ignores global variables because they are "reachable." This allows for in-memory caching or simple counters.
Vercel (Serverless)
Memory: Global variables are Isolated. If 10 users hit your API, Vercel may spin up 10 separate execution environments (let's call each one an instance).
The Trap: User A updates a counter in Instance #1. User B hits Instance #2, which has a completely different Global Root where the counter is still 0.
(There is a bit more nuance here: a global variable can persist while an instance stays warm, but that reuse only happens sequentially, not simultaneously. Once the instance sits idle long enough, it returns to a cold state and all of its globals are destroyed too.)
You cannot rely on in-memory data to keep state across requests. You must use an external store like Redis or PostgreSQL.
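A sketch of both sides of the trap, assuming ioredis and a REDIS_URL environment variable: the in-memory counter drifts per instance, while the Redis counter is shared by every instance.

```ts
// app/api/visits/route.ts
import Redis from "ioredis";

// WRONG: this counter lives in one instance's memory only.
let localVisits = 0;

// Reuse the client across warm invocations of this instance.
const redis = new Redis(process.env.REDIS_URL!);

export async function GET() {
  localVisits++; // each instance counts separately, then forgets on cold start

  // RIGHT: an external store gives every instance the same number.
  const sharedVisits = await redis.incr("visits");

  return Response.json({ localVisits, sharedVisits });
}
```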
3. Concurrency: Thread Pooling vs. Horizontal Scaling
Traditional Backend (Long-Running)
Mechanism: One server handles multiple concurrent requests using a single thread with an Event Loop (Node.js). It manages a "pool" of connections.
High traffic is handled by the server working harder. If it gets overwhelmed, requests queue up.
Vercel (Serverless)
Mechanism: Vercel scales by Multiplication. If traffic spikes, it doesn't make one function work harder; it creates 1,000 copies of that function.
This creates a "Connection Exhaustion" problem. If 1,000 functions spin up, they all try to open a new connection to your database at once, potentially crashing your DB. You must use a Connection Pooler (like Prisma Accelerate or Supabase Pooling).
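A common per-instance mitigation, sketched here with Prisma as one assumption, is to cache the client on globalThis so each instance opens one connection instead of one per invocation. Note this only helps within a single instance; when hundreds of instances spin up at once you still need an actual pooler (Prisma Accelerate, PgBouncer, Supabase pooling) in front of the database.

```ts
// lib/db.ts
import { PrismaClient } from "@prisma/client";

// Cache the client on globalThis so a warm instance reuses one connection
// instead of opening a fresh one on every invocation (and on every hot
// reload in development).
const globalForPrisma = globalThis as unknown as { prisma?: PrismaClient };

export const prisma = globalForPrisma.prisma ?? new PrismaClient();

if (process.env.NODE_ENV !== "production") {
  globalForPrisma.prisma = prisma;
}
```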
4. Execution Timing: Synchronous vs. Background Tasks
Traditional Backend (Long-Running)
You can send a response to the user and continue running code in the background (e.g., res.send(); fireAndForgetEmail();).
The process stays alive, so the email function finishes its work comfortably in the background.
Vercel (Serverless)
The Flow: The moment res.send() is called, the runtime environment is paused or killed.
The Disaster: If you try to run a background task after the response, Vercel will likely "cut the power" mid-execution. Your email might be half-sent or never sent at all.
Every task must be finished before the response, or delegated to a queue service (like Upstash Workflow).
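A sketch of the safe ordering, with sendWelcomeEmail as a hypothetical helper: either await the slow work before returning, or hand it off to a queue and return immediately.

```ts
// app/api/signup/route.ts
import { sendWelcomeEmail } from "@/lib/email"; // hypothetical helper

export async function POST(request: Request) {
  const { email } = await request.json();

  // WRONG on serverless: the instance may be frozen right after the
  // response is sent, so this promise might never finish.
  // sendWelcomeEmail(email);
  // return Response.json({ ok: true });

  // RIGHT: await the work so it completes before the response goes out.
  await sendWelcomeEmail(email);
  return Response.json({ ok: true });
}
```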
5. Persistent Connections: WebSockets vs. HTTP
Traditional Backend (Long-Running)
Connectivity: Supports Stateful connections.
A client can keep a TCP socket open (WebSockets) for real-time chat or live updates. The server process is always there to "hold the other end of the string."
Vercel (Serverless)
Connectivity: Supports Stateless HTTP requests only.
Since the function dies after the response, it cannot "hold" a connection open. If you try to use socket.io, it will fail because the "server" disappears every few seconds.
For real-time features, you must use a third-party "Realtime-as-a-Service" like Pusher or Ably.
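A sketch of the usual workaround, assuming the pusher Node SDK and its env vars: the function stays a plain request/response handler, and Pusher holds the long-lived connections to the browsers.

```ts
// app/api/chat/route.ts
import Pusher from "pusher";

// The serverless function never holds a socket itself; it just asks
// Pusher (which runs long-lived servers) to push the message out.
const pusher = new Pusher({
  appId: process.env.PUSHER_APP_ID!,
  key: process.env.PUSHER_KEY!,
  secret: process.env.PUSHER_SECRET!,
  cluster: process.env.PUSHER_CLUSTER!,
});

export async function POST(request: Request) {
  const { message } = await request.json();
  await pusher.trigger("chat-room", "new-message", { message });
  return Response.json({ delivered: true });
}
```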
This article was written with AI assistance, as a note to myself.