Introduction
FastAPI and Docker are a powerful combination for building modern, scalable, and portable APIs. This guide walks you through creating a production-ready FastAPI application with a PostgreSQL database, containerized using Docker and Docker Compose. We'll cover everything from setup to advanced configurations, including security, monitoring, testing, and deployment considerations.
What is FastAPI?
FastAPI is a high-performance Python web framework for building APIs. It leverages Python type hints for automatic request/response validation and generates interactive Swagger UI documentation out of the box. Built on Starlette and Pydantic, it handles requests asynchronously and typically outperforms traditional synchronous frameworks such as Flask in throughput benchmarks.
What is Docker?
Docker is a containerization platform that packages applications and their dependencies into lightweight, portable containers. Containers ensure consistency across development, testing, and production environments, eliminating dependency mismatches.
Why Combine FastAPI and Docker?
- Portability: Run your app anywhere Docker is installed.
- Scalability: Easily scale containers horizontally.
- Isolation: Keep dependencies isolated and reproducible.
- Production-Readiness: Simplify deployment with multi-service orchestration.
Step 1: Prerequisites
Before starting, ensure you have:
- Python 3.12+ installed locally for development.
- Docker and Docker Compose installed (see installation instructions below).
- Basic familiarity with Python, SQL, and terminal commands.
Installing FastAPI
pip install "fastapi[all]" sqlalchemy psycopg2-binary alembic
- fastapi[all]: includes uvicorn and other optional dependencies.
- sqlalchemy: ORM for database interactions.
- psycopg2-binary: PostgreSQL driver.
- alembic: database migration tool.
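The same packages belong in requirements.txt so the Docker build in Step 4 can install them. A minimal, unpinned sketch (pin exact versions for reproducible builds):
# requirements.txt
fastapi[all]
sqlalchemy
psycopg2-binary
alembic
python-dotenv  # imported directly in app/config.py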
Installing Docker
- Windows/macOS: Install Docker Desktop.
- Linux:
sudo apt update && sudo apt install docker.io docker-compose -y
sudo systemctl enable docker --now
Verify:
docker --version && docker-compose --version
Step 2: Project Setup
Project Structure
A well-organized structure is key for maintainability:
fastapi_project/
├── app/
│ ├── __init__.py
│ ├── main.py # Entry point
│ ├── dependencies.py # Dependency injection
│ ├── routers/ # API route modules
│ │ ├── __init__.py
│ │ ├── users.py
│ │ └── items.py
│ ├── models/ # SQLAlchemy models
│ │ ├── __init__.py
│ │ ├── user.py
│ │ └── item.py
│ ├── schemas/ # Pydantic schemas
│ │ ├── __init__.py
│ │ ├── user.py
│ │ └── item.py
│ ├── db.py # Database configuration
│ ├── config.py # App configuration
│ ├── utils/ # Helper functions (e.g., logging)
│ │ ├── __init__.py
│ │ └── logger.py
│ └── tests/ # Unit tests
│ ├── __init__.py
│ └── test_main.py
├── Dockerfile # Docker configuration for FastAPI
├── docker-compose.yml # Multi-container orchestration
├── requirements.txt # Python dependencies
├── .dockerignore # Exclude unnecessary files
├── .env # Environment variables
├── alembic.ini # Alembic configuration
├── migrations/ # Database migrations
├── scripts/
│ ├── run.sh # Startup script
│ └── wait-for-db.sh # Wait for DB readiness
├── README.md # Project documentation
└── .gitignore # Git ignore file
Step 3: Writing the FastAPI Application
app/config.py
Store configuration settings:
import os
from dotenv import load_dotenv
load_dotenv()
class Config:
    DATABASE_URL = os.getenv("DATABASE_URL", "postgresql://postgres:password@db:5432/fastapi_db")
    SECRET_KEY = os.getenv("SECRET_KEY", "your-secret-key")
    LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO")
    ENVIRONMENT = os.getenv("ENVIRONMENT", "development")
app/db.py
Database setup with connection pooling:
from sqlalchemy import create_engine
from sqlalchemy.orm import declarative_base, sessionmaker
from app.config import Config

engine = create_engine(
    Config.DATABASE_URL,
    pool_size=10,      # Max persistent connections in the pool
    max_overflow=20,   # Extra connections allowed beyond pool_size
    pool_timeout=30,   # Seconds to wait for a connection before raising
)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
app/models/user.py
Define a simple User model:
from sqlalchemy import Column, Integer, String
from app.db import Base
class User(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True, index=True)
    username = Column(String, unique=True, index=True)
    email = Column(String, unique=True, index=True)
app/schemas/user.py
Pydantic schema for validation:
from pydantic import BaseModel, ConfigDict

class UserCreate(BaseModel):
    username: str
    email: str

class UserResponse(BaseModel):
    model_config = ConfigDict(from_attributes=True)  # Pydantic v2 replacement for orm_mode

    id: int
    username: str
    email: str
app/routers/users.py
API endpoints:
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session
from app.models.user import User
from app.schemas.user import UserCreate, UserResponse
from app.db import get_db

router = APIRouter(prefix="/users", tags=["users"])

@router.post("/", response_model=UserResponse)
def create_user(user: UserCreate, db: Session = Depends(get_db)):
    db_user = User(username=user.username, email=user.email)
    db.add(db_user)
    db.commit()
    db.refresh(db_user)
    return db_user

@router.get("/{user_id}", response_model=UserResponse)
def get_user(user_id: int, db: Session = Depends(get_db)):
    user = db.query(User).filter(User.id == user_id).first()
    if not user:
        raise HTTPException(status_code=404, detail="User not found")
    return user
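Once the Compose stack from Step 5 is up, you can exercise these endpoints from the host (illustrative payloads):
curl -X POST http://localhost:8000/users/ \
  -H "Content-Type: application/json" \
  -d '{"username": "alice", "email": "alice@example.com"}'
curl http://localhost:8000/users/1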
app/main.py
Main application:
from fastapi import FastAPI
from app.routers import users
from app.db import Base, engine
from app.utils.logger import setup_logging

app = FastAPI(title="FastAPI Dockerized API", version="1.0.0")
setup_logging()  # Initialize logging

# Create tables on startup (convenient for development; in production prefer the Alembic migrations from Step 7)
Base.metadata.create_all(bind=engine)

app.include_router(users.router)

@app.get("/")
def read_root():
    return {"message": "FastAPI running inside Docker with PostgreSQL!"}
app/utils/logger.py
Add logging for better debugging:
import logging
from app.config import Config

def setup_logging():
    logging.basicConfig(
        level=getattr(logging, Config.LOG_LEVEL),
        format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
    )
Step 4: Dockerfile Configuration
Dockerfile
A multi-stage build for efficiency:
# Build stage
FROM python:3.12-slim AS builder
WORKDIR /app
COPY requirements.txt .
RUN pip install --user --no-cache-dir -r requirements.txt

# Final stage
FROM python:3.12-slim
WORKDIR /app
# pg_isready (used by scripts/wait-for-db.sh) ships with the PostgreSQL client tools
RUN apt-get update && apt-get install -y --no-install-recommends postgresql-client && rm -rf /var/lib/apt/lists/*
COPY --from=builder /root/.local /root/.local
COPY . .
ENV PATH=/root/.local/bin:$PATH
EXPOSE 8000
CMD ["./scripts/run.sh"]
scripts/run.sh
Startup script with DB wait:
#!/bin/bash
set -e

./scripts/wait-for-db.sh db 5432
uvicorn app.main:app --host 0.0.0.0 --port 8000 --workers 4
scripts/wait-for-db.sh
Wait for PostgreSQL to be ready:
#!/bin/bash
host="$1"
port="$2"

until pg_isready -h "$host" -p "$port"; do
  echo "Waiting for database at $host:$port..."
  sleep 1
done
echo "Database is ready!"
Make scripts executable:
chmod +x scripts/run.sh scripts/wait-for-db.sh
Step 5: Docker Compose Setup
docker-compose.yml
version: '3.8'

services:
  fastapi:
    build: .
    ports:
      - "8000:8000"
    environment:
      - DATABASE_URL=postgresql://postgres:password@db:5432/fastapi_db
      - ENVIRONMENT=production
      - LOG_LEVEL=INFO
    depends_on:
      db:
        condition: service_healthy
    volumes:
      - ./app:/app/app  # Mount source for development (pair with uvicorn --reload)
    restart: unless-stopped

  db:
    image: postgres:15
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password
      POSTGRES_DB: fastapi_db
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "postgres"]
      interval: 5s
      timeout: 5s
      retries: 5

volumes:
  postgres_data:
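The bind mount above only picks up code changes if uvicorn runs with --reload. One common pattern, sketched below, is a docker-compose.override.yml that Compose merges automatically during development (the file name is standard; the values are illustrative):
# docker-compose.override.yml -- development-only overrides (a sketch)
services:
  fastapi:
    command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
    environment:
      - ENVIRONMENT=development
      - LOG_LEVEL=DEBUG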
Step 6: Environment Variables
.env
DATABASE_URL=postgresql://postgres:password@db:5432/fastapi_db
SECRET_KEY=your-very-secure-secret-key
LOG_LEVEL=INFO
ENVIRONMENT=production
.dockerignore
__pycache__
*.pyc
*.pyo
.env
.git
.gitignore
README.md
tests/
*.log
Keep migrations/ out of this list: the Alembic commands in Step 7 run inside the container and need the migrations directory baked into the image.
Step 7: Database Migrations with Alembic
Initialize Alembic
alembic init migrations
alembic.ini
Update sqlalchemy.url:
sqlalchemy.url = postgresql://postgres:password@db:5432/fastapi_db
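For --autogenerate to detect the models, also point target_metadata in migrations/env.py at the project's metadata (an excerpt; adjust the imports to your layout):
# migrations/env.py (excerpt)
from app.db import Base
from app.models import user, item  # noqa: F401  # imported so the tables register on Base.metadata

target_metadata = Base.metadata
If you want migrations generated with --autogenerate to appear on the host rather than only inside the container, consider also bind-mounting ./migrations into the fastapi service during development.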
Generate Migration
docker-compose exec fastapi alembic revision --autogenerate -m "Initial Migration"
Apply Migration
docker-compose exec fastapi alembic upgrade head
Step 8: Testing
app/tests/test_main.py
Basic test with pytest:
from fastapi.testclient import TestClient
from app.main import app

client = TestClient(app)

def test_read_root():
    response = client.get("/")
    assert response.status_code == 200
    assert response.json() == {"message": "FastAPI running inside Docker with PostgreSQL!"}
Install pytest:
pip install pytest
Run tests:
pytest
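For endpoints that hit the database, a common pattern is overriding the get_db dependency with a throwaway in-memory SQLite session, sketched below (test_users.py is an illustrative name; note that importing app.main still runs create_all against DATABASE_URL, so run the suite somewhere that database is reachable, e.g. inside the fastapi container):
# app/tests/test_users.py -- a sketch using an in-memory SQLite DB via dependency override
from fastapi.testclient import TestClient
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.pool import StaticPool

from app.db import Base, get_db
from app.main import app

# Single shared in-memory SQLite connection for the whole test session
engine = create_engine(
    "sqlite://",
    connect_args={"check_same_thread": False},
    poolclass=StaticPool,
)
TestingSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base.metadata.create_all(bind=engine)

def override_get_db():
    db = TestingSessionLocal()
    try:
        yield db
    finally:
        db.close()

app.dependency_overrides[get_db] = override_get_db
client = TestClient(app)

def test_create_and_get_user():
    response = client.post("/users/", json={"username": "alice", "email": "alice@example.com"})
    assert response.status_code == 200
    user_id = response.json()["id"]

    response = client.get(f"/users/{user_id}")
    assert response.status_code == 200
    assert response.json()["username"] == "alice"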
Step 9: Security Enhancements
- HTTPS: Use a reverse proxy like Nginx or Traefik with SSL (see the sketch at the end of this step).
- Secrets Management: Store sensitive data in a vault (e.g., HashiCorp Vault).
- Input Validation: Rely on Pydantic schemas to prevent injection attacks.
- Rate Limiting: Add slowapi:
pip install slowapi
Example:
from fastapi import Request
from slowapi import Limiter, _rate_limit_exceeded_handler
from slowapi.errors import RateLimitExceeded
from slowapi.util import get_remote_address

limiter = Limiter(key_func=get_remote_address)
app.state.limiter = limiter
app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)

@app.get("/limited")
@limiter.limit("5/minute")
def limited_endpoint(request: Request):  # slowapi requires the Request parameter
    return {"message": "This is rate-limited!"}
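For the HTTPS bullet above, a minimal sketch of fronting the API with Nginx as an extra service under services: in docker-compose.yml (the nginx.conf contents and certificate paths are assumptions you would fill in):
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro  # assumed config that proxies to http://fastapi:8000
      - ./certs:/etc/nginx/certs:ro            # assumed path for TLS certificates
    depends_on:
      - fastapi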
Step 10: Monitoring and Logging
- Prometheus Metrics: Integrate prometheus-fastapi-instrumentator.
- Structured Logging: Use loguru for JSON logs:
pip install prometheus-fastapi-instrumentator loguru
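A minimal wiring sketch for both in app/main.py (assuming the packages above are installed):
# app/main.py (additions) -- a sketch for metrics and structured logs
from prometheus_fastapi_instrumentator import Instrumentator
from loguru import logger

Instrumentator().instrument(app).expose(app)  # exposes a /metrics endpoint for Prometheus to scrape
logger.add("app.log", serialize=True)         # serialize=True writes each record as a JSON line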
Step 11: CI/CD Integration
Example GitHub Actions workflow (.github/workflows/deploy.yml):
name: Deploy
on:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Build and Push Docker Image
        run: |
          docker build -t my-fastapi-app .
          docker tag my-fastapi-app myrepo/my-fastapi-app:latest
          echo "${{ secrets.DOCKER_PASSWORD }}" | docker login -u "${{ secrets.DOCKER_USERNAME }}" --password-stdin
          docker push myrepo/my-fastapi-app:latest
Conclusion
You’ve now built a production-ready FastAPI application with:
- Dockerized services (FastAPI + PostgreSQL).
- Database migrations with Alembic.
- Security, logging, and testing.
- Scalability and CI/CD readiness.
Run it:
docker-compose up --build -d
Visit http://localhost:8000/docs to explore the interactive Swagger UI 🚀