Serverless is good, serverless is cool, serverless is innovative. But how far can we really push it? Let's walk through some of the ups and downs of serverless technology.
When someone hears and "understands" the serverless concept for the first time, the thoughts that come to mind are:
- I can move all my applications to serverless and save cost?
- I can have managed services, yet very cheap?
- This means no more paying for idle servers?
- Does this remove the need for DevOps?
- I can scale automatically without any complex setup?
Not so fast (^_-): you are assuming without knowing what is actually going on. The right questions come from understanding the What, Why, When, How, and Who of serverless.
What is Serverless?
1. Execution Model
Your code runs:
- On demand
- In response to events (HTTP requests, messages, schedules, file uploads, etc.)
- For short-lived executions (typically seconds to minutes)

You don't keep servers running; the platform spins up execution environments when needed.
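The execution model above can be sketched as a minimal handler. The platform invokes a plain function once per event; nothing stays running in between. The event shape here is a simplified assumption, not a real AWS payload:

```python
import json

# Minimal Lambda-style handler: the platform calls this function once per
# event; no process stays alive between invocations.
def handler(event, context):
    # 'event' carries the trigger payload (an HTTP request, a queue message,
    # a schedule tick, a file-upload notification, ...)
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello {name}"}),
    }

# Locally, it is just a function you can call:
print(handler({"name": "serverless"}, None))
```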
2. Cost Model
| You pay for | Great for | Not always cheaper for |
|---|---|---|
| Execution time | Spiky traffic | Constant high-throughput workloads |
| Number of requests/events | Infrequent workloads | Long-running processes |
| Resources consumed (memory, duration) | Event-driven systems | Predictable, steady traffic |
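A back-of-the-envelope calculation makes the table concrete. The prices below are illustrative assumptions, not current AWS pricing; the point is the shape of the curves, not the exact dollars:

```python
# Illustrative unit prices (assumptions, NOT current AWS pricing).
PRICE_PER_REQUEST = 0.20 / 1_000_000   # $ per request
PRICE_PER_GB_SECOND = 0.0000166667     # $ per GB-second of compute
SERVER_PER_HOUR = 0.0416               # $ per hour for a small always-on VM

def lambda_monthly_cost(requests, avg_seconds, memory_gb):
    """Pay-per-use: requests + GB-seconds actually consumed."""
    gb_seconds = requests * avg_seconds * memory_gb
    return requests * PRICE_PER_REQUEST + gb_seconds * PRICE_PER_GB_SECOND

def server_monthly_cost(hours=730):
    """Always-on: billed whether busy or idle."""
    return hours * SERVER_PER_HOUR

# Spiky/infrequent: 100k requests/month, 200 ms each, 512 MB
spiky = lambda_monthly_cost(100_000, 0.2, 0.5)
# Constant high throughput: 50M requests/month, 200 ms each, 512 MB
steady = lambda_monthly_cost(50_000_000, 0.2, 0.5)
vm = server_monthly_cost()
print(f"spiky: ${spiky:.2f}, steady: ${steady:.2f}, vm: ${vm:.2f}")
```

With these assumed prices, the spiky workload costs pennies on serverless, while the steady high-throughput one costs roughly three times the always-on VM: exactly the "not always cheaper" column above.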
3. Scaling Model
| What scaling is | What scaling is not |
|---|---|
| Automatic | You still need to understand concurrency limits |
| Horizontal | You still design for backpressure and failures |
| Event-driven | Bad architecture can still scale expensive problems |
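The "you still need to understand concurrency limits" row can be quantified with Little's law: concurrency ≈ request rate × average duration. A sketch, assuming a default account concurrency quota of 1,000 (check your actual quota):

```python
# Little's law: concurrent executions = requests per second x avg duration.
def required_concurrency(requests_per_second, avg_duration_seconds):
    return requests_per_second * avg_duration_seconds

ACCOUNT_CONCURRENCY_LIMIT = 1000  # assumed default quota

rps = 2_000          # sustained requests per second
duration = 0.8       # average execution time in seconds
needed = required_concurrency(rps, duration)
print(f"need ~{needed:.0f} concurrent executions")
if needed > ACCOUNT_CONCURRENCY_LIMIT:
    # "Automatic" scaling still throttles here without a quota increase.
    print("traffic would be throttled without a quota increase")
```

At 2,000 req/s and 800 ms per call you need ~1,600 concurrent executions: over the assumed limit, so "automatic" scaling alone does not save you.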
4. Operations Reality
Serverless does not remove DevOps; it changes it.
| What you still need | Where the difference lies |
|---|---|
| Monitoring and logging | Less infrastructure ops |
| CI/CD pipelines | More reliance on the platform |
| Security and IAM | More observability work |
| Cost controls | More architecture work |
| Incident response | More attention to service limits and compatibility |
The Right Questions Come After Understanding
Once the excitement fades, the real questions begin:
- What workloads fit serverless?
- Why should I use it instead of containers or VMs?
- When does it save cost and when does it not?
- How do I design, test, monitor, and debug at scale?
- Who owns reliability, security, and cost control?
Serverless is a powerful tool, not a universal solution.
Do I have your attention now? Fine, let's code!
Steps to Migrate
Step 0: The Starting Point (FastAPI App)
```python
# app.py
from fastapi import FastAPI, status

app = FastAPI()

@app.get("/health", status_code=status.HTTP_200_OK)
def health():
    return {"status": "ok"}

@app.post("/items", status_code=status.HTTP_201_CREATED)
def create_item():
    return {"message": "item created"}

@app.put("/items/{item_id}", status_code=status.HTTP_200_OK)
def update_item(item_id: int):
    return {"message": f"item {item_id} updated"}
```
Running the code:
```shell
uvicorn app:app --reload
```
Step 1: Understand the Serverless Mapping (Mental Model)
Before touching code, map the moving parts: the uvicorn process becomes a Lambda execution environment, the HTTP server becomes API Gateway, and FastAPI's routing stays exactly where it was: inside your app.
Step 2: Add Mangum (ASGI → Lambda Adapter)
Install dependencies:
```shell
pip install fastapi mangum
```
Update the app.py file:
```python
from fastapi import FastAPI, status
from mangum import Mangum

app = FastAPI()

@app.get("/health", status_code=status.HTTP_200_OK)
def health():
    return {"status": "ok"}

@app.post("/items", status_code=status.HTTP_201_CREATED)
def create_item():
    return {"message": "item created"}

@app.put("/items/{item_id}", status_code=status.HTTP_200_OK)
def update_item(item_id: int):
    return {"message": f"item {item_id} updated"}

# Lambda entry point
handler = Mangum(app)
```
The `handler` is what AWS Lambda will invoke.
Step 3: Create requirements.txt
```
fastapi
mangum
```
Step 4: Package the Application for Lambda
Create a build directory:
```shell
mkdir package
pip install -r requirements.txt -t package
cp app.py package/
```
Zip it:
```shell
cd package
zip -r app.zip .
```
Step 5: Create the Lambda Function
Go to:
- AWS Lambda
- Create function
- Runtime: Python 3.10+
- Architecture: x86_64
- Upload app.zip
- Set Handler: app.handler
- Memory: 512 MB (good default)
- Timeout: 10–15 seconds
Step 6: Create API Gateway (HTTP API)
- Go to API Gateway
- Create HTTP API
- Integration:
- Type: Lambda
- Choose your Lambda function
- Routes:
- ANY /{proxy+}
- Deploy
This single route lets FastAPI handle all routing internally.
Step 7: Test your endpoint
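Hit the invoke URL that API Gateway shows after deployment. The API ID and region below are placeholders; substitute your own:

```shell
# Replace the hypothetical API ID and region with your HTTP API invoke URL
curl https://abc123.execute-api.us-east-1.amazonaws.com/health

# POST and PUT routes are proxied through the same ANY /{proxy+} route:
curl -X POST https://abc123.execute-api.us-east-1.amazonaws.com/items
curl -X PUT  https://abc123.execute-api.us-east-1.amazonaws.com/items/1
```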
It's a wrap!