Tagg
Zero-Cost API Infrastructure: Running a DaaS Business on an Idle Server

I Had a Server Doing Nothing 22 Hours a Day

I run quantitative trading bots — one for the Korean stock market, one for the US market, one for crypto. They sit on AWS and GCP, watching for signals, executing trades, then going back to sleep.

Most of the time, these servers are idle. And AWS isn't free forever — once the 12-month free tier expired, I was looking at ~$10/month for a t2.micro doing almost nothing for most of the day.

So I asked myself: what if the idle server could pay for itself?


The Idea: Data-as-a-Service on RapidAPI

I'm not a professional developer. My background is in IT architecture — I understand systems, infrastructure, and data flows. But I'd never built an API from scratch before.

What changed was AI-assisted development. With vibe coding (writing in natural language, letting AI generate the code), I could suddenly build things that would have taken me months to learn the traditional way.

I needed a product that:

  1. Runs on the same server as my trading bots without interference
  2. Requires minimal maintenance — no daily babysitting
  3. Has near-zero marginal cost — data I can get for free
  4. Generates recurring revenue — subscriptions, not one-time sales

That pointed me to DaaS (Data-as-a-Service): take publicly available but hard-to-access data, clean it up, and serve it through a REST API on RapidAPI.


Why Korean Data?

Here's something most developers outside Korea don't realize: Korean government data is locked behind walls that have nothing to do with technology.

  • Government websites are Korean-only
  • Authentication requires Korean phone numbers or certificates
  • Data is scattered across multiple agencies in incompatible formats
  • Documentation is in Korean PDFs, not API specs

If you're a developer in Romania or Brazil trying to check whether a cosmetic ingredient is legal in Korea, you're stuck. Google Translate won't help you navigate a government portal that requires Korean authentication.

This is a moat. Not a technical one — a linguistic and bureaucratic one. And it applies to dozens of Korean datasets that foreign businesses need.

I chose cosmetic ingredients because K-Beauty is a global trend, and regulatory compliance data is something businesses will pay for.


The Stack: Deliberately Boring

I intentionally picked the simplest tools that would work:

| Component | Choice | Why |
|---|---|---|
| Language | Python | Already using it for trading bots |
| Framework | FastAPI | Fast, auto-generated docs, type validation |
| Database | SQLite | Zero config, read-only, fast for ~22K records |
| Server | Gunicorn + Uvicorn | Production-ready, 2 workers |
| Container | Docker | Same environment everywhere |
| Hosting | AWS EC2 (existing) | Already paying for it |
| Distribution | RapidAPI | Handles auth, billing, marketing |
| Monitoring | Telegram alerts | 10 alert types, instant notifications |

Why SQLite? My data is essentially static — updated monthly. SQLite in read-only mode gives me sub-50ms response times with zero database administration. No connection pools, no migration scripts, no separate database server. The entire database is a single file I can back up with cp.

Why RapidAPI? They take 20% of revenue, which stings. But they handle payment processing, API key management, rate limiting, and put my API in front of millions of developers. For a solo operation, that trade-off makes sense — especially at the start.


Cost Structure: Actually Zero

| Item | Monthly Cost |
|---|---|
| AWS EC2 | $0 (shared with trading bots) |
| Domain | $0 (not purchased yet) |
| SSL | $0 (RapidAPI handles it) |
| Database | $0 (SQLite file) |
| CDN | $0 (not needed) |
| Monitoring | $0 (Telegram bot API is free) |
| **Total fixed cost** | **$0** |

The only cost is RapidAPI's 20% commission — but that only kicks in when I'm making money. Zero risk.

If one PRO subscriber signs up at $29/month, I keep ~$23 after commission. That alone covers the AWS bill once the free tier ends.


Resource Sharing: Bots + API on One Server

This was the part I was most worried about. Trading bots need to execute quickly when signals fire. Would the API interfere?

In practice, it's fine:

  • Trading bots spike CPU for a few seconds during market hours, then sleep
  • API handles occasional requests throughout the day
  • Docker isolates the API from the bot processes
  • SQLite read-only means no disk write contention

The key insight: most API businesses at the early stage get very few requests. I'm not serving 10,000 requests per second — I'm serving maybe 10 per day. The server barely notices.

If traffic ever grows to the point where contention becomes real, that's a champagne problem. I'll spin up a dedicated instance and the API revenue will easily cover it.


What I Built

K-Beauty Cosmetic Ingredients API — 21,796 ingredients with regulatory data across 10 countries.

Four pricing tiers:

| Tier | Price | Requests/Month |
|---|---|---|
| BASIC | Free | 100 |
| PRO | $29 | 2,000 |
| ULTRA | $79 | 5,000 |
| MEGA | $199 | 15,000 |

The free tier exists for discovery. The paid tiers add more data fields, partial text search, and country-specific regulation data.


Infrastructure That Runs Itself

Since I can't babysit the server, I automated everything I could:

Telegram Alerts (10 types):

  • Server start/stop
  • 500 errors (instant notification)
  • Authentication failures
  • Rate limit violations
  • New subscriber / cancellation
  • Daily stats and weekly revenue (planned)
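Each alert is ultimately just an HTTP POST to the Telegram Bot API's `sendMessage` method. A stripped-down sketch of that pattern — the wrapper function and message format are illustrative, and you'd supply your own bot token and chat ID:

```python
import json
import urllib.request

TELEGRAM_API = "https://api.telegram.org/bot{token}/sendMessage"

def format_alert(level: str, message: str) -> str:
    """Build the alert text, e.g. '[ERROR] 500 on GET /search'."""
    return f"[{level.upper()}] {message}"

def send_telegram_alert(token: str, chat_id: str, level: str, message: str) -> None:
    """Fire a one-shot alert at a Telegram chat.

    sendMessage is a real Bot API method; this thin wrapper is a sketch.
    """
    payload = json.dumps({
        "chat_id": chat_id,
        "text": format_alert(level, message),
    }).encode()
    req = urllib.request.Request(
        TELEGRAM_API.format(token=token),
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=5).close()
```

No SDK, no queue, no dashboard — a five-second HTTP call is plenty for a service getting a handful of requests a day.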

Docker restart policy:

```
--restart=always
```

If the container crashes at 3 AM, Docker brings it back up. I find out in the morning from the Telegram alert, but the API was down for maybe 5 seconds.

Server management script:

```shell
bash deploy.sh
# 1) Status check
# 2) Restart
# 3) Full redeploy (stop → build → run)
# 4) Live logs
# 5) Stop
# 6) Rollback to previous DB
```

One script, six options. No remembering Docker commands.

Data updates: Monthly. Download two Excel files from the government open data portal, run a build script, deploy. Total time: about 15 minutes of hands-on work.
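The monthly rebuild boils down to: parse the spreadsheets, write a fresh SQLite file, swap it in. Here's a simplified sketch of the rebuild step, with inlined rows standing in for the Excel parsing and a schema far smaller than the real one:

```python
import sqlite3

def build_database(db_path: str, rows: list[tuple[str, str]]) -> int:
    """Rebuild the ingredients DB from scratch and return the row count.

    Writing into a fresh file (then swapping it in at deploy time) means
    the live, read-only database is untouched until the new one is ready.
    """
    conn = sqlite3.connect(db_path)
    conn.execute("DROP TABLE IF EXISTS ingredients")
    conn.execute("CREATE TABLE ingredients (inci_name TEXT, status TEXT)")
    conn.executemany("INSERT INTO ingredients VALUES (?, ?)", rows)
    conn.commit()
    count = conn.execute("SELECT COUNT(*) FROM ingredients").fetchone()[0]
    conn.close()
    return count

# In production these rows come from two government Excel files.
sample = [("Niacinamide", "allowed"), ("Hydroquinone", "restricted")]
print(build_database("ingredients_next.db", sample))
```

Because the output is one file, "deploy" is just copying it over the old one and restarting the container — which is what the rollback option in `deploy.sh` reverses.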


Security on a Budget

Just because it's a side project doesn't mean I can skip security:

  • RapidAPI Proxy Secret — blocks direct access to the server
  • Per-tier rate limiting — prevents abuse (BASIC: 10/min, MEGA: 40/min)
  • Input validation — min/max length, type checking, SQL wildcard escaping
  • Docker non-root user — container runs as apiuser, not root
  • SQLite read-only mode — even if someone gets in, they can't modify data
  • HSTS headers — forces HTTPS
  • Security headers — nosniff, frame deny, XSS protection

Total cost of all this security: $0. It's all code-level.
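As a concrete example of what "code-level" means, here's roughly what the search-input guard looks like — the function name, limits, and escape scheme are illustrative, not my exact production code:

```python
def sanitize_query(q: str, min_len: int = 2, max_len: int = 100) -> str:
    """Validate a search term and escape SQL LIKE wildcards.

    Raises ValueError on bad input; escapes % and _ so user input can
    never widen a LIKE pattern. Pair the result with a parameterized
    query using LIKE ? ESCAPE '\\' — never string concatenation.
    """
    if not isinstance(q, str):
        raise ValueError("query must be a string")
    q = q.strip()
    if not (min_len <= len(q) <= max_len):
        raise ValueError(f"query length must be {min_len}-{max_len} characters")
    return q.replace("\\", "\\\\").replace("%", "\\%").replace("_", "\\_")
```

Combined with parameterized queries and a read-only database, this closes off the obvious injection paths without a single paid service.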


Lessons After One Month

1. The hardest part isn't building — it's marketing

The API works. The docs are good. The data is solid. But discovery is the bottleneck. RapidAPI's marketplace helps, but it's not magic. I've written blog posts, set up GitHub examples, and optimized my listing. It's a slow grind.

2. Government data is messy but valuable

Cleaning and normalizing data from multiple government sources took far longer than building the API. But that mess is exactly what creates value — if it were clean and easy to access, someone would have done it already.

3. Vibe coding works for real products

I built this entire system — scraper, database builder, API server, deployment pipeline, monitoring — using AI-assisted development. Not as a toy project, but as a production service handling real requests. The key is having enough IT knowledge to architect the system, even if you can't write every line of code from memory.

4. Start with zero fixed costs

If your side project costs $0 to run, you can wait indefinitely for product-market fit. No pressure to monetize immediately. No "burning runway." The server is already there, the tools are free, and RapidAPI only charges when you earn. This patience is a competitive advantage.


Revenue So Far

Let's be honest: close to zero. One free-tier user from Romania tested the API. No paid subscribers yet.

But that's okay. The API is live, the infrastructure runs itself, and my total investment is time — no money. I'll keep improving the product, writing about it, and waiting for the right customers to find it.

The trading bots haven't made me rich either. But between DaaS subscriptions and quantitative trading, I'm building multiple income streams that don't require me to be physically present. That's the goal.


The Framework: Applicable Beyond K-Beauty

The approach works for any closed-ecosystem data:

  1. Find data that's publicly available but practically inaccessible (language barriers, authentication, format issues)
  2. Clean it and serve it through a standard REST API
  3. Host it on infrastructure you're already paying for
  4. Distribute through a marketplace that handles billing
  5. Automate monitoring so you don't have to watch it

Korean business registration data, customs clearance codes, pharmaceutical approvals — there are dozens of datasets locked behind Korean-language government portals that international businesses need.

Each one is a potential API product. Same stack, same server, same pattern.


Try It

If you're curious about the API itself:

🔗 K-Beauty Cosmetic Ingredients API on RapidAPI

📂 GitHub - Example Code

If you're thinking about turning your own idle server into a side business, I'd love to hear your ideas. Drop a comment below.
