
Allen Elzayn


Building Streaky: Zero-Cost Production Architecture (Part 4)

How I built a production app that handles 1000+ users for $0/month using free tiers.

Part 4: Running Production on Free Tiers

In Part 1, I shared the journey from sequential to distributed processing. In Part 2, I explained the Rust VPS proxy. In Part 3, I dove deep into the distributed queue system.

Now, let's talk about the most satisfying part: running a production app that handles 1000+ users for $0/month.


The Challenge

Building a production app is one thing. Running it sustainably is another.

Requirements:

  • Handle current load (10 users/day)
  • Scale to 1000+ users without code changes
  • 99.9% uptime
  • Fast response times (< 5 seconds)
  • Secure (encryption, authentication)
  • Zero cost (or as close as possible)

The constraint: I can't afford $50-100/month for infrastructure, and my app must survive entirely on free-tier magic, caffeine, and sheer willpower.


The Stack

After evaluating options, I settled on this stack:

Frontend: Next.js on Vercel (Free tier)
Backend: Cloudflare Workers + D1 (Free tier)
Proxy: Rust on Koyeb (Free tier)
Total cost: $0/month

Let's break down each component.


Component 1: Frontend (Vercel)

Tech: Next.js 15, React 19, Tailwind CSS, shadcn/ui

Deployment: Vercel

Free tier limits:

  • 100 GB bandwidth/month
  • Unlimited deployments
  • Automatic HTTPS
  • Edge network (global CDN)
  • Serverless functions (100 GB-hours)

Current usage:

  • Bandwidth: ~2 GB/month (mostly static assets)
  • Deployments: ~20/month (development + production)
  • Functions: ~1 GB-hour/month (NextAuth.js)

Headroom: 50x capacity

Why Vercel?

  • Best Next.js experience (they built it)
  • Zero config deployment (git push = deploy)
  • Automatic preview deployments
  • Edge network (fast globally)
  • Generous free tier

Configuration:

// package.json
{
  "name": "frontend",
  "version": "0.1.0",
  "scripts": {
    "dev": "next dev --turbopack",
    "build": "next build",
    "start": "next start"
  },
  "dependencies": {
    "next": "15.5.5",
    "react": "19.1.0",
    "next-auth": "^5.0.0-beta.29",
    "tailwindcss": "^3.4.0"
  }
}

Deployment:

# Connect GitHub repo to Vercel
# Every push to main = automatic deployment
git push origin main

Cost: $0/month


Component 2: Backend API (Cloudflare Workers)

Tech: Hono framework, TypeScript

Deployment: Cloudflare Workers

Free tier limits:

  • 100,000 requests/day
  • 10ms CPU time per request
  • 128 MB memory per request
  • Unlimited bandwidth

Current usage:

  • Requests: ~50/day (cron + API calls)
  • CPU time: ~3 seconds/day (distributed across requests)
  • Memory: ~20 MB per request

Headroom: 2000x capacity

Why Cloudflare Workers?

  • Edge network (deployed globally)
  • Fast cold starts (< 10ms)
  • No servers to manage
  • Generous free tier
  • Built-in cron triggers

Configuration (wrangler.toml):

name = "streaky"
main = "src/index.ts"
compatibility_date = "2025-10-11"

# Observability
[observability]
enabled = true

# D1 Database Binding
[[d1_databases]]
binding = "DB"
database_name = "streaky-db"
database_id = "your-database-id"

# Analytics Engine Binding
[[analytics_engine_datasets]]
binding = "ANALYTICS"
dataset = "streaky_metrics"

# Cron Triggers - Daily at 12:00 UTC
[triggers]
crons = ["0 12 * * *"]

# Service Bindings
[[services]]
binding = "SELF"
service = "streaky"

# Environment Variables
[vars]
VPS_URL = "https://your-vps-url.koyeb.app"
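
For reference, here is a minimal sketch of how the Worker entry point can wire Hono to both the HTTP path and the cron trigger declared above. Route names and the enqueueDailyChecks helper are illustrative, not Streaky's actual code:

// src/index.ts (sketch; routes and helpers are illustrative)
import { Hono } from 'hono';

type Env = {
  DB: D1Database;
  ANALYTICS: AnalyticsEngineDataset;
  SELF: Fetcher;
  VPS_URL: string;
};

const app = new Hono<{ Bindings: Env }>();

// Regular API traffic is handled by Hono
app.get('/health', (c) => c.json({ ok: true }));

export default {
  fetch: app.fetch,

  // Invoked by the cron trigger from wrangler.toml ("0 12 * * *")
  async scheduled(_ctrl: ScheduledController, env: Env, ctx: ExecutionContext) {
    ctx.waitUntil(enqueueDailyChecks(env));
  },
};

// Sketch of the cron path: load active users and queue them for processing
async function enqueueDailyChecks(env: Env): Promise<void> {
  const { results: activeUsers } = await env.DB
    .prepare('SELECT id FROM users WHERE is_active = 1')
    .all<{ id: string }>();
  // ...batch-insert activeUsers into cron_queue, then fan out via the SELF binding
}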

Deployment:

cd web/backend
npx wrangler deploy

Cost: $0/month


Component 3: Database (Cloudflare D1)

Tech: SQLite (via D1)

Free tier limits:

  • 5 GB storage
  • 5 million reads/day
  • 100,000 writes/day

Current usage:

  • Storage: ~50 MB (users, notifications, queue)
  • Reads: ~100/day (user queries, queue checks)
  • Writes: ~50/day (queue updates, notifications)

Headroom: 2000x capacity

Why D1?

  • SQLite (familiar, powerful)
  • Integrated with Workers (no network latency)
  • Generous free tier
  • Automatic backups
  • No connection pooling needed

Schema:

-- Users table
CREATE TABLE users (
  id TEXT PRIMARY KEY,
  github_username TEXT NOT NULL UNIQUE,
  github_pat TEXT,
  discord_webhook TEXT,
  telegram_token TEXT,
  telegram_chat_id TEXT,
  is_active INTEGER DEFAULT 1,
  created_at TEXT DEFAULT (datetime('now')),
  updated_at TEXT DEFAULT (datetime('now'))
);

-- Notifications table
CREATE TABLE notifications (
  id TEXT PRIMARY KEY,
  user_id TEXT NOT NULL,
  channel TEXT NOT NULL,
  status TEXT NOT NULL,
  error_message TEXT,
  sent_at TEXT DEFAULT (datetime('now')),
  FOREIGN KEY (user_id) REFERENCES users(id)
);

-- Cron queue table
CREATE TABLE cron_queue (
  id TEXT PRIMARY KEY,
  user_id TEXT NOT NULL,
  batch_id TEXT NOT NULL,
  status TEXT NOT NULL,
  created_at TEXT DEFAULT (datetime('now')),
  started_at TEXT,
  completed_at TEXT,
  error_message TEXT,
  retry_count INTEGER DEFAULT 0
);

-- Indexes
CREATE INDEX idx_cron_queue_status ON cron_queue(status);
CREATE INDEX idx_cron_queue_batch ON cron_queue(batch_id);
CREATE INDEX idx_notifications_user ON notifications(user_id);

Management:

# Create database
npx wrangler d1 create streaky-db

# Run migrations
npx wrangler d1 execute streaky-db --file=schema.sql

# Query database
npx wrangler d1 execute streaky-db --command="SELECT * FROM users"

Cost: $0/month


Component 4: Notification Proxy (Koyeb)

Tech: Rust, Axum framework

Deployment: Koyeb (Docker)

Free tier limits:

  • 512 MB RAM
  • 0.1 vCPU
  • 100 GB bandwidth/month
  • 2.5 GB disk

Current usage:

  • RAM: ~20 MB (idle), ~40 MB (peak)
  • CPU: < 1% (idle), ~5% (peak)
  • Bandwidth: ~500 MB/month
  • Disk: 85 MB (Docker image)

Headroom: 10x capacity (RAM is the bottleneck)

Why Koyeb?

  • Generous free tier (512 MB RAM)
  • Docker support
  • Automatic HTTPS
  • Global edge network

Dockerfile:

# Build stage
FROM rust:1.83-slim AS builder

WORKDIR /app

# Install dependencies
RUN apt-get update && apt-get install -y \
    pkg-config \
    libssl-dev \
    && rm -rf /var/lib/apt/lists/*

# Copy manifests
COPY Cargo.toml Cargo.lock ./

# Copy source code
COPY src ./src

# Build release binary
RUN cargo build --release

# Runtime stage
FROM debian:bookworm-slim

WORKDIR /app

# Install runtime dependencies
RUN apt-get update && apt-get install -y \
    ca-certificates \
    libssl3 \
    && rm -rf /var/lib/apt/lists/*

# Copy binary from builder
COPY --from=builder /app/target/release/streaky-server /app/

# Create non-root user
RUN useradd -r -s /bin/false appuser && \
    chown appuser:appuser /app/streaky-server
USER appuser

# Expose port
EXPOSE 8000

# Run application
CMD ["./streaky-server"]

Deployment:

  1. Push to GitHub
  2. Connect Koyeb to repo
  3. Configure build (Dockerfile path: server/Dockerfile)
  4. Set environment variables (ENCRYPTION_KEY, VPS_SECRET)
  5. Deploy

Cost: $0/month


Architecture Diagram

┌─────────────────────────────────────────────────────────────┐
│                    User Browser                             │
└────────────────────────┬────────────────────────────────────┘
                         │
                         │ HTTPS
                         ▼
┌─────────────────────────────────────────────────────────────┐
│              Vercel (Next.js Frontend)                      │
│  • Static pages (HTML, CSS, JS)                            │
│  • NextAuth.js (GitHub OAuth)                              │
│  • Edge network (global CDN)                               │
│  Cost: $0/month                                            │
└────────────────────────┬────────────────────────────────────┘
                         │
                         │ HTTPS API calls
                         ▼
┌─────────────────────────────────────────────────────────────┐
│         Cloudflare Workers (Backend API)                    │
│  • Hono framework                                           │
│  • D1 database (SQLite)                                    │
│  • Service Bindings (distributed cron)                     │
│  • Analytics Engine                                         │
│  Cost: $0/month                                            │
└────────────────────────┬────────────────────────────────────┘
                         │
                         │ HTTPS + Auth Header
                         │ (Encrypted credentials)
                         ▼
┌─────────────────────────────────────────────────────────────┐
│              Koyeb (Rust VPS Proxy)                         │
│  • Axum web framework                                       │
│  • AES-256-GCM decryption                                  │
│  • Discord/Telegram API calls                              │
│  • Clean IP (no rate limiting)                             │
│  Cost: $0/month                                            │
└────────────────────────┬────────────────────────────────────┘
                         │
                         │ HTTPS
                         ▼
┌─────────────────────────────────────────────────────────────┐
│         Discord / Telegram APIs                             │
│  • Receive notifications                                    │
│  • No rate limiting (clean IP)                             │
└─────────────────────────────────────────────────────────────┘

Total Cost: $0/month
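
The hop from the Worker to the Koyeb proxy in the diagram is a plain HTTPS call carrying an auth header and the already-encrypted credentials. Here is a sketch of the Worker side; the endpoint path, the body shape, and a VPS_SECRET binding are assumptions, not Streaky's exact contract:

async function forwardToProxy(env: Env, encryptedWebhook: string, message: string) {
  const res = await fetch(`${env.VPS_URL}/notify`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${env.VPS_SECRET}`,
    },
    // The Rust proxy decrypts the webhook (AES-256-GCM) and calls Discord/Telegram
    body: JSON.stringify({ webhook: encryptedWebhook, message }),
  });
  if (!res.ok) throw new Error(`Proxy responded with ${res.status}`);
}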

Cost Breakdown

Current Usage (10 users/day)

Service              Free Tier Limit     Current Usage    Headroom
Vercel               100 GB bandwidth    2 GB             50x
Cloudflare Workers   100k req/day        50 req/day       2000x
Cloudflare D1        100k writes/day     50 writes/day    2000x
Koyeb                512 MB RAM          40 MB            12x

Total cost: $0/month

Projected Usage (1000 users/day)

Service              Free Tier Limit     Projected Usage   Still Free?
Vercel               100 GB bandwidth    20 GB             Yes (5x headroom)
Cloudflare Workers   100k req/day        5k req/day        Yes (20x headroom)
Cloudflare D1        100k writes/day     5k writes/day     Yes (20x headroom)
Koyeb                512 MB RAM          200 MB            Yes (2.5x headroom)

Total cost: Still $0/month

When Would I Need to Pay?

Vercel:

  • Pro tier ($20/month) needed at ~5000 users/day, when bandwidth passes the 100 GB free limit

Cloudflare Workers:

  • Paid tier ($5/month, includes 10M requests) needed at ~20,000 users/day, past the 100k requests/day free limit

Cloudflare D1:

  • Paid tier ($5/month) needed at ~20,000 users/day, past the 100k writes/day free limit

Koyeb:

  • Paid tier ($7/month) needed at ~5000 users/day (512 MB RAM limit)

First bottleneck: Koyeb RAM at ~5000 users/day

Solution: Upgrade Koyeb to $7/month (1 GB RAM) or optimize Rust memory usage


Performance Metrics

Response Times

Frontend (Vercel):

  • Time to First Byte (TTFB): ~50ms
  • First Contentful Paint (FCP): ~200ms
  • Largest Contentful Paint (LCP): ~500ms

Backend (Cloudflare Workers):

  • API response time: ~100ms (cold start)
  • API response time: ~10ms (warm)
  • Database query time: ~5ms

Notification Proxy (Koyeb):

  • Cold start: ~10 seconds (VPS sleeping)
  • Warm: ~3.6 seconds (VPS active)
  • Processing time: ~100ms (decrypt + forward)

Reliability

Uptime:

  • Vercel: 99.99% (SLA)
  • Cloudflare: 99.99% (SLA)
  • Koyeb: 99.9% (observed)

Error rates:

  • Frontend: < 0.1%
  • Backend: < 0.1%
  • Notifications: 0% (100% success rate)

Scalability

Current load:

  • 10 users/day
  • 50 requests/day
  • 50 database writes/day

Theoretical capacity (free tier):

  • 5000 users/day (Koyeb RAM bottleneck)
  • 100,000 requests/day (Cloudflare Workers)
  • 100,000 writes/day (D1)

Headroom: 500x current load


Optimization Strategies

1. Minimize Database Writes

Problem: D1 free tier = 100k writes/day

Solution:

  • Batch queue inserts (one transaction for N users; see the sketch below)
  • Cache GitHub API responses (reduce redundant queries)
  • Cleanup old data (delete after 7 days)

Result: 2 writes per user (queue + notification) instead of 5+
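
Here is roughly what the batched queue insert looks like with D1's batch(), which runs a list of prepared statements as one implicit transaction (a single round trip instead of N). Table names match the schema earlier; batchId and userIds are assumed inputs:

// Sketch: queue N users with one batched statement list
async function enqueueBatch(db: D1Database, batchId: string, userIds: string[]) {
  const insert = db.prepare(
    'INSERT INTO cron_queue (id, user_id, batch_id, status) VALUES (?, ?, ?, ?)'
  );
  // db.batch() executes all bound statements in one implicit transaction
  await db.batch(
    userIds.map((userId) =>
      insert.bind(crypto.randomUUID(), userId, batchId, 'pending')
    )
  );
}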

2. Optimize Rust Memory Usage

Problem: Koyeb free tier = 512 MB RAM

Solution:

  • Use Rust (20 MB idle vs Node.js 50 MB)
  • Stateless design (no in-memory cache)
  • Small Docker image (85 MB vs 200+ MB for Node.js)

Result: 10x more capacity on same RAM

3. Edge Caching

Problem: Repeated API calls for same data

Solution:

  • Cloudflare CDN caches static assets
  • Vercel Edge Network caches pages
  • SWR (stale-while-revalidate) on frontend

Result: 90% cache hit rate, faster load times
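
On the frontend, the SWR part is a small hook. A sketch, assuming a /api/streak endpoint (the endpoint name is illustrative):

// Serve cached data instantly, revalidate in the background
import useSWR from 'swr';

const fetcher = (url: string) => fetch(url).then((res) => res.json());

export function useStreak() {
  const { data, error, isLoading } = useSWR('/api/streak', fetcher, {
    revalidateOnFocus: false,
    dedupingInterval: 60_000, // collapse duplicate requests within one minute
  });
  return { streak: data, error, isLoading };
}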

4. Distributed Processing

Problem: Single Worker CPU limit (30 seconds)

Solution:

  • Service Bindings (N Workers for N users; see the sketch below)
  • Each Worker gets fresh CPU budget
  • Parallel processing

Result: 10x faster processing, no TLE errors
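
The fan-out itself is one sub-request per user against the SELF Service Binding from wrangler.toml; each sub-request becomes its own Worker invocation with a fresh CPU budget. A sketch (the /process-user route is illustrative):

async function fanOut(env: Env, userIds: string[]): Promise<void> {
  await Promise.all(
    userIds.map((userId) =>
      // Routed to this same Worker through the SELF binding, not over the public internet
      env.SELF.fetch('https://internal/process-user', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ userId }),
      })
    )
  );
}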


Monitoring & Observability

Cloudflare Analytics

Built-in metrics:

  • Request count
  • Error rate
  • Response time (p50, p95, p99)
  • CPU time usage
  • Memory usage

Access: metrics in the Cloudflare dashboard; live logs via wrangler:

# Stream live logs from the Worker
npx wrangler tail streaky

# Same logs, human-readable formatting
npx wrangler tail streaky --format=pretty

Vercel Analytics

Built-in metrics:

  • Page views
  • Unique visitors
  • Core Web Vitals (LCP, FID, CLS)
  • Deployment status

Access: Vercel dashboard

Custom Logging

Implementation:

// Log to Analytics Engine (writeDataPoint is fire-and-forget, no await needed)
env.ANALYTICS.writeDataPoint({
  blobs: ['user_processed'],
  doubles: [processingTime],
  indexes: [userId],
});

// Query Analytics Engine via its SQL API (it's a separate store, not a D1 table).
// ACCOUNT_ID and API_TOKEN are placeholders for your Cloudflare account ID and an API token.
const response = await fetch(
  `https://api.cloudflare.com/client/v4/accounts/${ACCOUNT_ID}/analytics_engine/sql`,
  {
    method: 'POST',
    headers: { Authorization: `Bearer ${API_TOKEN}` },
    body: `
      SELECT blob1 AS event,
             AVG(double1) AS avg_time,
             COUNT() AS count
      FROM streaky_metrics
      WHERE timestamp > NOW() - INTERVAL '7' DAY
      GROUP BY blob1
    `,
  }
);
const results = await response.json();

Security Considerations

1. Encryption

All sensitive data encrypted:

  • GitHub PAT (AES-256-GCM)
  • Discord webhooks (AES-256-GCM)
  • Telegram tokens (AES-256-GCM)

Key storage:

  • Cloudflare Secrets (not in code)
  • Koyeb environment variables (not in image)
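
For a rough idea of what this layer looks like inside a Worker, here is a sketch using the Web Crypto API, assuming ENCRYPTION_KEY is a base64-encoded 32-byte secret (set with wrangler secret put); helper names are illustrative:

// AES-256-GCM with the Web Crypto API available in Workers
async function importKey(b64Key: string): Promise<CryptoKey> {
  const raw = Uint8Array.from(atob(b64Key), (c) => c.charCodeAt(0));
  return crypto.subtle.importKey('raw', raw, 'AES-GCM', false, ['encrypt', 'decrypt']);
}

async function encrypt(plaintext: string, key: CryptoKey): Promise<string> {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh 96-bit nonce per message
  const ct = await crypto.subtle.encrypt(
    { name: 'AES-GCM', iv },
    key,
    new TextEncoder().encode(plaintext)
  );
  // Store the nonce next to the ciphertext (both base64) so the proxy can decrypt
  return `${btoa(String.fromCharCode(...iv))}.${btoa(String.fromCharCode(...new Uint8Array(ct)))}`;
}

async function decrypt(payload: string, key: CryptoKey): Promise<string> {
  const [ivB64, ctB64] = payload.split('.');
  const iv = Uint8Array.from(atob(ivB64), (c) => c.charCodeAt(0));
  const ct = Uint8Array.from(atob(ctB64), (c) => c.charCodeAt(0));
  const pt = await crypto.subtle.decrypt({ name: 'AES-GCM', iv }, key, ct);
  return new TextDecoder().decode(pt);
}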

2. Authentication

Frontend:

  • NextAuth.js (GitHub OAuth)
  • JWT tokens (signed, verified)
  • HTTP-only cookies

Backend:

  • JWT verification
  • API secret headers (X-Cron-Secret; see the sketch below)
  • Rate limiting (60 req/min)
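
The secret-header check is a small Hono middleware. A sketch, assuming an internal route prefix and a CRON_SECRET binding (both illustrative):

// Reject internal requests that do not carry the shared secret
app.use('/internal/*', async (c, next) => {
  const provided = c.req.header('X-Cron-Secret');
  if (!provided || provided !== c.env.CRON_SECRET) {
    return c.json({ error: 'unauthorized' }, 401);
  }
  await next();
});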

3. Network Security

HTTPS everywhere:

  • Vercel: Automatic HTTPS
  • Cloudflare: Automatic HTTPS
  • Koyeb: Automatic HTTPS

CORS:

  • Strict allowlist (only frontend domain)
  • No wildcard origins

Security headers:

  • X-Content-Type-Options: nosniff
  • X-Frame-Options: DENY
  • Strict-Transport-Security: max-age=31536000
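
In Hono, the CORS allowlist and these headers are each a middleware call. A sketch, using the production origin as an example allowlist entry:

import { cors } from 'hono/cors';
import { secureHeaders } from 'hono/secure-headers';

// Strict CORS allowlist: only the frontend origin, no wildcards
app.use('*', cors({ origin: 'https://streakyy.vercel.app' }));

// Security headers; X-Content-Type-Options: nosniff is applied by default
app.use(
  '*',
  secureHeaders({
    xFrameOptions: 'DENY',
    strictTransportSecurity: 'max-age=31536000',
  })
);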

Lessons Learned

1. Free Tiers Are Generous

Modern platforms offer incredible free tiers:

  • Vercel: 100 GB bandwidth
  • Cloudflare: 100k requests/day
  • Koyeb: 512 MB VPS

You can build production apps for $0.

2. Architecture Matters

Smart architecture maximizes free tier capacity:

  • Distributed processing (avoid CPU limits)
  • Edge caching (reduce requests)
  • Stateless design (minimize memory)

3. Rust Is Worth It

For resource-constrained environments:

  • 10x less memory than Node.js
  • 5x smaller binary
  • Blazing fast

4. Monitoring Is Essential

Even on free tiers:

  • Cloudflare Analytics (built-in)
  • Vercel Analytics (built-in)
  • Custom logging (Analytics Engine)

5. Plan for Scale

Design for 100x current load:

  • Identify bottlenecks early
  • Know when you'll need to pay
  • Have upgrade path ready

When to Upgrade

Signals you need paid tier:

  1. Hitting rate limits (100k req/day on Workers)
  2. Memory errors (512 MB on Koyeb)
  3. Slow response times (need more resources)
  4. Storage limits (5 GB on D1)

Upgrade path:

  1. Koyeb: $7/month (1 GB RAM) - First bottleneck at ~5000 users/day
  2. Cloudflare Workers: $5/month (includes 10M requests) - Needed at ~20,000 users/day
  3. Vercel: $20/month (Pro) - Needed beyond ~5000 users/day (100 GB bandwidth limit)
  4. D1: $5/month - Needed at ~20,000 users/day

Total cost at 5000 users: $7/month (just Koyeb)


Conclusion

Building a production app for $0/month is possible with:

  • Smart architecture (distributed, stateless, cached)
  • Right tech stack (Rust, Workers, D1)
  • Generous free tiers (Vercel, Cloudflare, Koyeb)

Current status:

  • 10 users/day
  • $0/month cost
  • 99.9% uptime
  • 100% notification success rate

Capacity:

  • Can handle 5000 users/day on free tier
  • 500x current load
  • First paid tier at $7/month

Key takeaway: Free tiers are powerful. Use them wisely.


Try It Out

Live App: streakyy.vercel.app

GitHub: github.com/0xReLogic/Streaky



Series Complete

This concludes the 4-part series on building Streaky:

  • Part 1: The journey from sequential to distributed processing
  • Part 2: Solving IP blocking with Rust VPS
  • Part 3: Distributed queue system with Service Bindings
  • Part 4: Zero-cost production architecture

Thanks for following along! If you found this helpful, give it a reaction and follow for more content.


Let's Connect

Building on free tiers? Have questions about the stack? Drop a comment!

GitHub: @0xReLogic

Project: Streaky
