## The Problem

I needed a reliable way to download videos from multiple platforms: YouTube, TikTok, Instagram, Reddit, X, and more. Existing tools were ad-infested, unreliable, or offered no API for automation.

So I built BlackHole — a video downloading platform with a clean UI and a REST API for developers.
## The Tech Stack

**Frontend:**

- Next.js 15 with React 19 and TypeScript
- Tailwind CSS v4 for styling
- Clerk for authentication
- Convex for the database (real-time, serverless)

**Backend (self-hosted on a Hetzner VPS):**

- yt-dlp — handles YouTube downloads with cookie management
- Cobalt — handles TikTok, Instagram, Reddit, X, Facebook, Pinterest, and Bluesky
- Nginx with SSL termination
- Docker Compose for orchestration

**Payments:**

- Mollie (European payment provider — supports iDEAL, cards, and PayPal)
## Architecture: Why Two Download Engines?

This was the first big design decision. YouTube is hard: its anti-bot measures evolve weekly. Use a single tool for everything, and when YouTube breaks (and it will), your entire platform goes down.

My solution: split the workload.
```text
User Request
│
├── YouTube URL?     → yt-dlp service (port 8000)
│                        └── Cookie manager + JS solver
│
└── Other platform?  → Cobalt service (port 9000)
                         └── Residential proxy for geo-restricted content
```
yt-dlp runs with a custom cookie manager that rotates browser cookies and handles YouTube's consent pages. Cobalt handles everything else — it's faster and more reliable for non-YouTube platforms.
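The routing itself is small — it's just a hostname check in front of the two services. A minimal sketch (function name, host list, and service URLs are illustrative, not the actual implementation):

```python
from urllib.parse import urlparse

# Hosts routed to the yt-dlp service; everything else goes to Cobalt.
YOUTUBE_HOSTS = {"youtube.com", "www.youtube.com", "m.youtube.com", "youtu.be"}

YTDLP_SERVICE = "http://localhost:8000"   # yt-dlp + cookie manager
COBALT_SERVICE = "http://localhost:9000"  # Cobalt for all other platforms


def pick_engine(url: str) -> str:
    """Return the base URL of the download service for a given video URL."""
    host = (urlparse(url).hostname or "").lower()
    if host in YOUTUBE_HOSTS:
        return YTDLP_SERVICE
    return COBALT_SERVICE
```

Keeping this dispatch in one place also means adding a new platform is a one-line change on the Cobalt side.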
## The Cookie Problem (YouTube)
YouTube aggressively fights automated downloads. Here's what I learned:
- Cookies expire fast — YouTube session cookies last hours, not days
- IP reputation matters — datacenter IPs get rate-limited instantly
- JavaScript challenges — YouTube serves JS challenges that headless browsers struggle with
My solution was a unified cookie manager:
```python
import time

COOKIE_TTL = 3600  # seconds before cookies are considered stale

# Simplified cookie manager concept
class CookieManager:
    def __init__(self):
        self.cookies = self.load_cookies()
        self.last_refresh = time.time()

    def get_cookies(self):
        # Refresh lazily: only once the TTL has elapsed
        if time.time() - self.last_refresh > COOKIE_TTL:
            self.refresh_cookies()
        return self.cookies

    def refresh_cookies(self):
        # Extract fresh cookies from a live browser session
        self.cookies = extract_browser_cookies()
        self.last_refresh = time.time()
        self.notify_health_status()  # Telegram alerts
```
The manager sends health alerts to Telegram — if cookies go stale or downloads start failing, I know within 10 minutes.
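Telegram is a good zero-infrastructure alert channel — the Bot API is a single HTTP call per message. A sketch of what the alert side can look like (environment variable names and message format are my own; the bot token comes from @BotFather):

```python
import os
import urllib.parse
import urllib.request

BOT_TOKEN = os.environ.get("TELEGRAM_BOT_TOKEN", "")  # issued by @BotFather
CHAT_ID = os.environ.get("TELEGRAM_CHAT_ID", "")      # your private chat/channel


def format_alert(service: str, status: str, detail: str) -> str:
    """Build a one-line alert message for a failing service."""
    return f"[{service}] {status}: {detail}"


def send_alert(text: str) -> None:
    """Send a message via the Telegram Bot API sendMessage method."""
    url = f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage"
    data = urllib.parse.urlencode({"chat_id": CHAT_ID, "text": text}).encode()
    urllib.request.urlopen(url, data=data, timeout=10)
```

Because it's a plain POST, the same helper works from a cron job, a health-check script, or the cookie manager itself.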
## Rate Limiting: Don't Get Abused
Free tier users get 5 downloads/day at 720p. But I also needed to prevent abuse from bots hammering the API.
```typescript
// IP-based rate limiting middleware
const RATE_LIMIT = 10; // requests per minute per IP
const WINDOW_MS = 60_000;

const rateLimiter = new Map<string, { count: number; resetAt: number }>();

export function checkRateLimit(ip: string): boolean {
  const now = Date.now();
  const entry = rateLimiter.get(ip);
  if (!entry || now > entry.resetAt) {
    // First request in a fresh window
    rateLimiter.set(ip, { count: 1, resetAt: now + WINDOW_MS });
    return true;
  }
  if (entry.count >= RATE_LIMIT) return false;
  entry.count++;
  return true;
}
```
## The API (For Developers)
The Business tier includes a REST API with HMAC-SHA256 signed webhooks:
```bash
curl -X POST https://api.blhole.com/v1/download \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "url": "https://youtube.com/watch?v=dQw4w9WgXcQ",
    "quality": "1080p",
    "format": "mp4",
    "webhook_url": "https://your-app.com/webhook"
  }'
```
Webhooks are signed so you can verify they're from BlackHole. Verify against the raw request body, before any JSON parsing, or the signatures won't match:
```javascript
const crypto = require('crypto');

function verifyWebhook(payload, signature, secret) {
  const expected = crypto
    .createHmac('sha256', secret)
    .update(payload)
    .digest('hex');
  // timingSafeEqual throws if the buffers differ in length,
  // so check that first instead of letting it throw
  const sigBuf = Buffer.from(signature);
  const expBuf = Buffer.from(expected);
  if (sigBuf.length !== expBuf.length) return false;
  return crypto.timingSafeEqual(sigBuf, expBuf);
}
```
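On the sending side, the signature is just the hex HMAC-SHA256 of the raw body, computed with the same shared secret. A Python sketch of how that kind of signing works (the header name is illustrative — check the API docs for the real one):

```python
import hashlib
import hmac


def sign_payload(payload: bytes, secret: str) -> str:
    """Compute the hex HMAC-SHA256 signature for a webhook body."""
    return hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()


# The sender would attach this alongside the POST, e.g. in an
# X-Signature header, and the receiver recomputes it to verify.
```

The important property: anyone can see the payload, but only someone holding the secret can produce a matching signature.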
## Deployment: Hetzner VPS + Vercel
The frontend runs on Vercel (free tier handles it fine). The download services run on a Hetzner VPS in Helsinki:
```yaml
# docker-compose.yml (simplified)
services:
  ytdlp:
    build: ./ytdlp-service
    ports:
      - "8000:8000"
    volumes:
      - ./cookies:/app/cookies

  cobalt:
    image: ghcr.io/imputnet/cobalt:latest
    ports:
      - "9000:9000"
    environment:
      - HTTP_PROXY=${PROXY_URL} # For geo-restricted content

  nginx:
    image: nginx:alpine
    ports:
      - "443:443"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf
      - /etc/letsencrypt:/etc/letsencrypt
```
Total VPS cost: ~€10/month for a server that handles hundreds of downloads daily.
## Monitoring: Know Before Users Complain
I set up a monitoring stack that checks every platform every hour:
- Download a test video from each supported platform
- Verify file integrity (size > 0, valid container)
- Alert via Telegram if any platform fails
- Track success rates over time
This has saved me multiple times — I've caught YouTube cookie expirations, Cobalt updates that broke TikTok, and VPS disk space issues before any user reported them.
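The file-integrity step from the checklist above can be surprisingly cheap: a non-empty file plus a known container signature catches most failure modes (error pages saved as `.mp4`, truncated downloads). A sketch — the magic-byte checks are simplified, and a real probe would use something like `ffprobe`:

```python
import os


def looks_like_video(path: str) -> bool:
    """Cheap integrity check: non-empty file with a known container signature."""
    if not os.path.exists(path) or os.path.getsize(path) == 0:
        return False
    with open(path, "rb") as f:
        head = f.read(12)
    # MP4/MOV: 'ftyp' at byte offset 4; WebM/MKV: EBML magic at offset 0
    return head[4:8] == b"ftyp" or head[:4] == b"\x1a\x45\xdf\xa3"
```

Run hourly against one test download per platform, pipe failures into the Telegram alert, and most outages surface before any user hits them.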
## Pricing Strategy
After researching lifetime deal pricing extensively:
| Plan | Price | Limits |
|---|---|---|
| Free | $0 | 5 downloads/day, 720p, 20 min max |
| Pro | $29/year or $149 lifetime | 200 downloads/day, 4K, 60 min max |
| Business | Custom | API access, webhooks, batch downloads |
Key lesson: Don't price lifetime deals too low on products with ongoing server costs. I initially had the lifetime at $69 — way too cheap when each user costs bandwidth and compute forever.
## Results
- Launched on Product Hunt (February 2026)
- Listed on G2, DEV Community, SaaSHub, Fazier, and more
- Handles YouTube, TikTok, Instagram, Reddit, X, Facebook, Pinterest, and Bluesky
- REST API serving developer integrations
## What I'd Do Differently
- Start with the API first — the developer audience is more willing to pay
- Don't fight YouTube alone — yt-dlp community updates are essential
- Monitoring from day one — not after the first outage
- Price higher — my initial $69 lifetime was leaving money on the table
## Try It Out
🔗 BlackHole — Download videos from 8+ platforms
The free tier gives you 5 downloads/day — enough to test everything. If you're building something that needs video downloads, the API might save you weeks of dealing with platform quirks.
What's your experience with video downloading tools? Have you built anything similar? Drop a comment — I'd love to hear about your approach.