Darian Vance

Posted on • Originally published at wp.me

Solved: I open-sourced my 6-year-old self-hostable Vercel alternative

🚀 Executive Summary

TL;DR: Modern web deployment often faces challenges like vendor lock-in, unpredictable costs, and strict compliance needs when relying on managed platforms like Vercel. Building or adopting a self-hostable Vercel alternative provides ultimate control over infrastructure, optimizes costs, and ensures compliance, albeit with increased operational overhead.

🎯 Key Takeaways

  • Managed PaaS solutions like Vercel offer excellent developer experience and global CDNs but can lead to vendor lock-in, unpredictable scaling costs, and limitations for strict compliance or customization.
  • Open-source deployment platforms such as Coolify or CapRover provide a self-hostable middle ground, replicating PaaS features on owned infrastructure, reducing costs while still requiring server management.
  • Architecting a custom self-managed edge deployment platform grants ultimate control over every layer, enabling deep cost optimization and tailored compliance, but demands significant DevOps expertise and operational burden.

This post explores how to break free from proprietary platforms, diving deep into building and managing your own robust, high-performance web deployment infrastructure, inspired by a battle-tested, self-hostable Vercel alternative.

Symptoms: The Pain Points of Modern Deployment

As IT professionals, we often find ourselves caught between the convenience of managed services and the desire for control, cost efficiency, and compliance. While platforms like Vercel and Netlify offer unparalleled developer experience and global reach, relying solely on them can introduce several challenges:

Vendor Lock-in and Customization Limitations

Becoming deeply integrated with a specific platform’s ecosystem can make migration challenging. Your build processes, serverless functions, and edge configurations become tied to their unique APIs and deployment mechanisms. This can stifle innovation if your needs diverge from the platform’s standard offerings.

Unpredictable Costs at Scale

While free tiers are generous, scaling applications on managed platforms can lead to significant and sometimes unpredictable costs, especially concerning bandwidth (egress), build minutes, or serverless function invocations beyond typical allowances. For high-traffic applications or those with specific compute requirements, these costs can quickly accumulate.

Compliance and Data Residency Requirements

Certain industries (e.g., healthcare, finance) or geographies have strict data residency and compliance regulations (GDPR, HIPAA). Using a global managed service might not always meet these stringent requirements, necessitating deployments within specific geopolitical boundaries or on private infrastructure.

Performance and Edge Network Control

While managed platforms offer excellent global CDNs, there might be specific regions or niche network configurations where you desire more granular control over caching, routing, or edge functions. Relying on a third party for your entire edge strategy limits your ability to fine-tune for very specific use cases.

Lack of Infrastructure Ownership and Transparency

For some organizations, having complete ownership of the underlying infrastructure, from hardware to OS to network, is non-negotiable. Managed platforms abstract away much of this, which is a benefit for many, but a blocker for others who require full visibility and control over their deployment pipeline and runtime environment.
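To make the earlier cost point concrete, here is a back-of-the-envelope estimate; every number below is an assumption chosen for illustration, not a quote from any provider:

```javascript
// Illustrative egress cost estimate; all rates and allowances are assumptions.
const monthlyEgressGB = 5000;     // 5 TB of traffic per month
const includedGB = 1000;          // bandwidth assumed included in a managed plan
const overageRatePerGB = 0.15;    // assumed managed-PaaS overage rate (USD)

const managedOverage = Math.max(0, monthlyEgressGB - includedGB) * overageRatePerGB;
console.log(managedOverage); // 600 (USD per month, on top of the base subscription)
```

At that traffic level, a handful of VPS instances with generous bandwidth allowances often costs a fraction of the overage alone, which is exactly the gap the self-hosted options below target.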

If these symptoms resonate, it’s time to explore alternatives that grant you greater autonomy and control over your deployment infrastructure.

Solution 1: Leveraging a Cloud-Managed PaaS (Vercel/Netlify)

For many teams, the convenience, speed, and global reach of managed platforms like Vercel and Netlify are hard to beat. They excel at automated Git-based deployments, serverless functions, and integrated global CDNs, making them ideal for static sites, JAMstack applications, and serverless backends.

Pros

  • Developer Experience: Git-push deployments, automatic SSL, custom domains, CLI tools.
  • Global CDN: Instant global distribution and caching for optimal performance.
  • Serverless Functions: Integrated serverless compute with easy setup.
  • Collaboration: Built-in team features, preview deployments, environment management.
  • Maintenance-Free: No infrastructure to manage, patch, or scale yourself.

Cons

  • Cost at Scale: Can become expensive for high traffic, bandwidth, or extensive build times.
  • Vendor Lock-in: Build and function configurations are specific to the platform.
  • Limited Customization: Less control over the underlying runtime, network, or hardware.
  • Compliance: May not meet all strict data residency or compliance requirements.

Example: Deploying a Next.js Application with Vercel

Deploying to Vercel is often as simple as connecting your Git repository. For more complex setups, you can define build commands, output directories, and serverless function configurations in a vercel.json file.

{
  "rewrites": [
    { "source": "/api/(.*)", "destination": "/api" },
    { "source": "/legacy/(.*)", "destination": "https://legacy.example.com/$1" }
  ],
  "headers": [
    {
      "source": "/_next/static/(.*)",
      "headers": [{ "key": "Cache-Control", "value": "public, max-age=31536000, immutable" }]
    }
  ],
  "functions": {
    "api/my-function.ts": {
      "runtime": "node_16.x",
      "memory": 256,
      "maxDuration": 10
    }
  },
  "git": {
    "deploymentEnabled": {
      "github": true,
      "gitlab": true,
      "bitbucket": true
    },
    "postDeployHook": "npx playwright test --project=chromium"
  }
}

After configuring, a simple git push to your connected repository will trigger an automatic deployment.

Solution 2: Open-Source Deployment Platforms (e.g., Coolify, CapRover)

A middle ground between fully managed services and architecting everything yourself is utilizing existing open-source deployment platforms. These tools aim to replicate the PaaS experience on your own infrastructure, often leveraging Docker and Git for automated deployments.

Pros

  • Self-Hostable: Full control over where your applications run.
  • Cost-Effective: Pay only for your servers, not for PaaS fees.
  • Ease of Use (relative): Simpler setup than a completely custom solution.
  • Community Support: Benefit from an active open-source community.

Cons

  • Operational Overhead: Still requires server management, backups, and monitoring.
  • Limited “Edge” Capabilities: May not offer the same global CDN or advanced edge functions out-of-the-box as Vercel.
  • Learning Curve: Requires familiarity with Docker, server administration, and the platform’s specific CLI/UI.
  • Feature Parity: May not have every feature found in commercial PaaS solutions.

Example: Deploying with Coolify

Coolify is an open-source, self-hostable Heroku/Netlify/Vercel alternative. It allows you to deploy various services (applications, databases, static sites) directly from Git repositories.

Installation (on your VPS/server):

# Ensure Docker is installed
curl -fsSL https://get.docker.com -o get-docker.sh
sh get-docker.sh

# Install Coolify
wget -qO - https://get.coollabs.io/coolify/install.sh | bash

Deployment via UI/CLI:

Once installed, you access Coolify via its web UI. You connect your Git repository, specify the build pack (e.g., Node.js, Static site), and Coolify handles the rest, including automatic SSL and domain management.

# Example CLI command (conceptual, Coolify primarily uses UI)
# Assuming you've configured a project via the UI
# coolify deploy --project-id=my-app-id --git-ref=main

This approach significantly reduces the manual effort compared to building everything from scratch, while still keeping your applications on your infrastructure.

Solution 3: Architecting Your Own Self-Managed Edge Deployment Platform (The “6-Year-Old Alternative” Approach)

This solution dives into the philosophy of the “6-year-old self-hostable Vercel alternative” – a robust, custom-built system designed for full control, cost efficiency, and specific compliance needs. This is about understanding the core components and processes that enable a Vercel-like experience on your own terms.

Pros

  • Ultimate Control: Complete ownership of every layer, from infrastructure to runtime.
  • Cost Optimization: Significant savings at scale by optimizing resource usage.
  • Compliance & Security: Design for specific regulatory requirements and integrate custom security measures.
  • Customization: Tailor every aspect of the build, deploy, and runtime environment.
  • Performance Tuning: Fine-tune every aspect of the network and caching for your specific use cases.

Cons

  • High Operational Overhead: Requires significant expertise in DevOps, system administration, and network engineering.
  • Initial Setup Time: Complex and time-consuming to build and configure.
  • Maintenance Burden: Responsible for all upgrades, patches, monitoring, and scaling.
  • No Built-in CDN: Requires integrating with external CDN providers (e.g., Cloudflare, Fastly) for global edge presence.

Architectural Overview

A self-managed edge deployment platform typically involves several key components working in concert:

  1. Git Repository & Webhooks: Source code management with triggers for new commits.
  2. Build Server: A dedicated server or container orchestrator (e.g., Kubernetes, Docker Swarm) that pulls code, runs builds, and creates deployable artifacts.
  3. Artifact Storage: An object storage solution (e.g., MinIO, S3-compatible) for static assets and build outputs.
  4. Deployment Agent/Orchestrator: A mechanism to distribute artifacts to edge nodes or target servers.
  5. Reverse Proxy & CDN Integration: An edge proxy (e.g., NGINX, Caddy, HAProxy) handling routing, SSL termination, caching, and serving content, ideally integrated with a global CDN.
  6. Serverless Function Runtime (Optional but Recommended): A platform like OpenFaaS or Knative for running ephemeral serverless code.
  7. Dashboard/CLI (Optional): A custom interface for managing deployments, domains, and environment variables.

Example: Building a Simplified Self-Managed Edge Deployment Pipeline

Let’s outline a simplified version focusing on Git-triggered static site deployment with basic serverless function capabilities, using common open-source tools.

1. Git Webhook Receiver (e.g., using a simple Node.js app or a tool like Webhookd)

On your build server, set up an endpoint that listens for Git push events.

// webhook-receiver.js (Simplified Node.js example)
const http = require('http');
const { exec } = require('child_process');

http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/webhook') {
    let body = '';
    req.on('data', chunk => {
      body += chunk.toString(); // assuming JSON payload
    });
    req.on('end', () => {
      console.log('Received webhook:', body);
      // Validate webhook payload (e.g., check secret, verify ref)
      exec('./deploy.sh', (err, stdout, stderr) => {
        if (err) {
          console.error(`exec error: ${err}`);
          return res.writeHead(500).end('Deployment failed');
        }
        console.log(`stdout: ${stdout}`);
        console.error(`stderr: ${stderr}`);
        res.writeHead(200).end('Deployment initiated');
      });
    });
  } else {
    res.writeHead(404).end('Not Found');
  }
}).listen(8000, () => {
  console.log('Webhook receiver listening on port 8000');
});

2. Deployment Script (deploy.sh)

This script handles pulling the latest code, building the application, and pushing assets to storage.

#!/bin/bash
set -euo pipefail  # abort on errors, unset variables, and pipeline failures

REPO_DIR="/var/www/my-app-repo"
BUILD_DIR="/tmp/my-app-build"
STATIC_BUCKET="s3://my-static-assets-bucket" # Using MinIO or S3
APP_NAME="my-project-123" # Identifier for the current deployment

echo "--- Starting deployment for $APP_NAME ---"

# 1. Update repository
cd "$REPO_DIR" || exit 1
git pull origin main

# 2. Build the application (example for Next.js/React)
echo "Building application..."
npm install
npm run build # This typically creates a 'dist' or 'out' directory

# 3. Deploy static assets to object storage
# For static sites, sync 'out' or 'dist' folder
if [ -d "$REPO_DIR/out" ]; then
    echo "Syncing static assets to object storage..."
    /usr/local/bin/mc cp --recursive $REPO_DIR/out/ $STATIC_BUCKET/$APP_NAME/
    echo "Static assets deployed to $STATIC_BUCKET/$APP_NAME/"
elif [ -d "$REPO_DIR/dist" ]; then
    echo "Syncing static assets to object storage..."
    /usr/local/bin/mc cp --recursive $REPO_DIR/dist/ $STATIC_BUCKET/$APP_NAME/
    echo "Static assets deployed to $STATIC_BUCKET/$APP_NAME/"
else
    echo "No 'out' or 'dist' directory found. Assuming server-rendered or custom build."
    # Handle specific framework builds or container images here
fi

# 4. Trigger cache purge on CDN (if integrated)
# curl -X POST "https://api.cloudflare.com/client/v4/zones/$CLOUDFLARE_ZONE_ID/purge_cache" \
#      -H "X-Auth-Email: $CLOUDFLARE_EMAIL" \
#      -H "X-Auth-Key: $CLOUDFLARE_API_KEY" \
#      -H "Content-Type: application/json" \
#      --data '{"purge_everything":true}'

echo "--- Deployment finished for $APP_NAME ---"

Note: The mc command line client for MinIO is used here as an S3-compatible tool.
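The bucket layout used above ($STATIC_BUCKET/$APP_NAME/) can be taken a step further with immutable, per-commit deployments. The helpers below are an illustrative sketch; the releases/ prefix and current.json pointer naming are assumptions, not part of the script above:

```javascript
// Sketch: immutable deployments keyed by commit SHA. Each build uploads
// under its own prefix; a small pointer object records which SHA a domain
// currently serves, so rollback is rewriting the pointer, not re-uploading.
function deploymentPrefix(app, commitSha) {
  return `${app}/releases/${commitSha}/`;
}

function currentPointer(app, commitSha) {
  return {
    key: `${app}/current.json`, // object the proxy reads to find the live release
    body: JSON.stringify({ sha: commitSha, deployedAt: new Date().toISOString() }),
  };
}
```

This is also what makes Vercel-style preview deployments cheap to replicate: every commit already has a live, addressable prefix.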

3. Reverse Proxy & Routing (e.g., Caddy with dynamic configuration)

Caddy can dynamically serve content from your object storage or proxy to serverless functions. For dynamic domain management, you could use a small service that regenerates Caddy configs or leverages Caddy’s API.

# Caddyfile example
# This Caddyfile would need to be dynamically updated or reloaded
# when new deployments occur or new domains are added.

# Example for a static site served from object storage
my-app.example.com {
    reverse_proxy /api/* localhost:3001 # Proxy API calls to a serverless function endpoint
    # For static assets, we'd typically have a small service proxy to MinIO/S3
    # or directly serve if mounted locally.
    # A more robust solution involves a small golang/nodejs server
    # proxying to S3/MinIO to handle paths correctly.

    # This is a simplification. In reality, you'd likely use a dedicated
    # S3 proxy service or Cloudflare Workers for true edge functionality.
    # For local static files:
    # root * /var/www/my-app-repo/out
    # file_server
}

# Example for a serverless function endpoint
:3001 {
    reverse_proxy localhost:8080 # Points to your OpenFaaS gateway or custom function runtime
}
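Where the paragraph above mentions leveraging Caddy's API instead of regenerating Caddyfiles, here is a sketch of building the JSON config that Caddy's admin endpoint accepts. The server name main and the ports are placeholders; applying the config would be a POST of this JSON to http://localhost:2019/load on the default admin endpoint:

```javascript
// Sketch: build a Caddy JSON config for one site that proxies to an
// upstream. Caddy's admin API can hot-load this without a restart, which
// is what makes per-deployment dynamic routing possible.
function buildCaddyConfig(domain, upstream) {
  return {
    apps: {
      http: {
        servers: {
          main: { // server name is arbitrary
            listen: [':443'],
            routes: [
              {
                match: [{ host: [domain] }],
                handle: [
                  { handler: 'reverse_proxy', upstreams: [{ dial: upstream }] },
                ],
              },
            ],
          },
        },
      },
    },
  };
}
```

A small deployment agent could call this after each release to map a new preview domain onto its function endpoint or static-asset proxy.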

4. Serverless Functions (e.g., OpenFaaS)

For serverless capabilities, you can deploy a platform like OpenFaaS on your Kubernetes cluster or a dedicated server.

# Example: Deploying a simple Node.js function with OpenFaaS
# 1. Install faas-cli
# curl -SLsf https://cli.openfaas.com | sh

# 2. Login to your OpenFaaS gateway
# faas-cli login --gateway=http://localhost:8080 --username admin --password your-password

# 3. Create a new function
# faas-cli new --lang node16 my-node-function

# 4. Define your function handler in ./my-node-function/handler.js:
#
#    module.exports = async (event, context) => {
#      return context.status(200).succeed({
#        status: 'Function executed!',
#        input: event.body
#      });
#    };

# 5. Build and deploy
# faas-cli up -f my-node-function.yml

This example demonstrates the modularity of building your own platform. Each component can be swapped out or enhanced based on your specific needs, providing unparalleled flexibility.

Comparison: Managed PaaS vs. Self-Managed Edge Platform

To help decide which path is right for your organization, here’s a comparison:

| Feature | Cloud-Managed PaaS (e.g., Vercel) | Self-Managed Edge Platform (DIY) |
| --- | --- | --- |
| Developer Experience | Excellent (zero-config, Git-push, CLI) | Custom, requires initial setup (Git hooks, scripts) |
| Operational Overhead | Minimal (platform handles infra, scaling, monitoring) | High (responsible for all infra, scaling, updates, security) |
| Cost Structure | Subscription/usage-based, can scale unpredictably | Fixed server costs + internal labor, highly predictable at scale |
| Control & Customization | Limited (vendor-specific features/runtimes) | Full control (any tech stack, custom build steps, infra) |
| Performance (CDN) | Integrated global CDN with edge functions | Requires integrating with external CDN (e.g., Cloudflare) |
| Compliance & Data Residency | Varies by provider, may have regional options but less granular | Full control over data location and compliance stack |
| Security | Managed by provider (shared responsibility model) | Fully owned by your team, requiring dedicated effort |
| Time to Market | Very fast for typical use cases | Slower initial setup, faster for subsequent custom deployments |

Conclusion

The decision between a managed PaaS, an open-source deployment platform, or a completely self-architected solution hinges on your organization’s specific needs, budget, compliance requirements, and available engineering talent.

While cloud-managed platforms offer undeniable convenience, the “6-year-old self-hostable Vercel alternative” represents a powerful philosophy: reclaiming ownership and tailoring your deployment infrastructure to perfectly fit your unique demands. It’s a path that demands greater investment in DevOps expertise but rewards with unparalleled control, cost efficiency, and the satisfaction of complete autonomy. For IT professionals dealing with the symptoms outlined, this self-managed approach can be a game-changer, transforming limitations into opportunities for innovation and control.


👉 Read the original article on TechResolve.blog