Tomorrow, the 13th and final Korean data scraper monetizes.
Musinsa — Korea's largest fashion marketplace — completes the set. Rankings, prices, brand categories, trend data. Everything that moves Korean fashion retail, now accessible via API. It deployed on March 9th. Tomorrow, March 25th, its Pay-Per-Event monetization kicks in.
After 19 days, the ecosystem is finally complete.
## The timeline, compressed
| Date | Event |
|---|---|
| March 6 | First scraper deploys |
| March 9 | 13th scraper (Musinsa) deploys |
| March 13–15 | First 12 scrapers hit Pay-Per-Event |
| March 24 | 7,633 total runs, 77 users |
| March 25 | Musinsa PPE launches. Portfolio complete. |
## Where the numbers stand
As of this morning, the portfolio sits at 7,633 total runs across 13 actors, with 77 registered users. Estimated revenue: $91–106.
The top actors by run count:
- naver-news-scraper: 4,998 runs — the workhorse, approaching 5K
- naver-place-search: 771 runs, 18 users — highest user count
- naver-blog-search: 602 runs, 12 users — still active overnight
- naver-blog-reviews: 596 runs
- naver-place-reviews: 381 runs, 13 users
Musinsa sits at 42 runs and 4 users. Small, for now. That's pre-monetization. I've noticed that actors tend to attract more users once PPE is active — partly because Apify's "Pricing" filter becomes relevant, partly because the actor appears in more search results once it has a visible price model.
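For anyone unfamiliar with Apify's Pay-Per-Event model, here is a rough sketch of how revenue accrues once PPE is active. The event names and prices below are hypothetical placeholders, not any actor's real price card:

```python
# Hypothetical PPE price card — illustrative values only,
# not the actual prices on these actors.
PRICE_CARD = {
    "actor-start": 0.001,   # charged once per run
    "result-item": 0.0001,  # charged per scraped record
}

def estimate_revenue(runs: int, items_per_run: int, prices: dict[str, float]) -> float:
    """Estimate gross PPE revenue for one actor: a flat start fee
    per run plus a per-item fee for every record returned."""
    per_run = prices["actor-start"] + items_per_run * prices["result-item"]
    return round(runs * per_run, 2)

# e.g. ~5,000 runs averaging 100 items each under the sketch prices above
print(estimate_revenue(4_998, 100, PRICE_CARD))
```

The point of the model: revenue scales with what users actually consume, which is why run counts and items-per-run matter more than raw user counts.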
## The 10,000 run milestone
At current trajectory — roughly 45 runs/hour during Korean business hours — the portfolio should cross 10,000 total runs sometime tomorrow. That's a coincidence I didn't engineer: the last scraper going live on the same day the total crosses a round number.
Both milestones on the same day.
## What completion actually means
When you build something in pieces, completion doesn't always feel complete. There's usually a gap somewhere — a scraper that's missing, a data source that should be covered.
With 13 actors covering:
- Naver (search, place, blog, news, KiN, webtoon, photos)
- Melon (K-pop charts)
- Daangn (local marketplace)
- Bunjang (C2C marketplace)
- YES24 (bookstore)
- Musinsa (fashion)
...I've covered most major Korean data sources that don't have official APIs. The gaps that remain are harder: Coupang requires heavy anti-bot handling, Kakao services have more complex auth flows, Korean government data has bureaucratic access requirements. Not impossible. Just a different level of effort.
For now: the map is complete.
## What didn't work (and what I learned)
Building 13 scrapers isn't just 13× the work of one. Each site has different:
- Rate limiting behavior: Naver is aggressive; Musinsa is relatively open
- Structure stability: Some sites restructure their HTML every few weeks; others haven't changed in years
- Data volume: naver-news handles hundreds of articles; melon-chart is a small, clean dataset
- User expectations: B2B automation users (enterprise pipelines) vs. individual developers have very different reliability needs
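Differences like these tend to end up as per-site configuration rather than branches scattered through scraper code. A minimal sketch of that idea — the numbers are illustrative, not the actors' real settings:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SiteProfile:
    """Per-site scraping settings. All values are illustrative."""
    max_requests_per_min: int   # how aggressively the site rate-limits
    selector_ttl_days: int      # roughly how often its HTML changes
    typical_items_per_run: int  # rough data volume per run

PROFILES = {
    "naver":   SiteProfile(max_requests_per_min=30,  selector_ttl_days=21,  typical_items_per_run=300),
    "musinsa": SiteProfile(max_requests_per_min=120, selector_ttl_days=90,  typical_items_per_run=60),
    "melon":   SiteProfile(max_requests_per_min=60,  selector_ttl_days=180, typical_items_per_run=100),
}

def request_delay_seconds(site: str) -> float:
    """Minimum delay between requests implied by a site's rate limit."""
    return 60 / PROFILES[site].max_requests_per_min

print(request_delay_seconds("naver"))  # the most conservative pacing of the three
```

Keeping these knobs in one table makes the maintenance surface visible: when a site restructures or tightens its limits, one profile changes instead of one scraper's internals.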
The hardest part wasn't the technical work. It was keeping 13 actors current while simultaneously building new ones. The maintenance surface grows with every deployment.
## Distribution is the real work now
Completing the scraper set doesn't complete the project. The bottleneck shifted from "build more" to "help developers find what exists."
Channels I'm currently expanding:
- RapidAPI — Cloudflare Workers proxy deployed for 3 actors; waiting on provider registration
- n8n community nodes — 3 packages built, blocked on npm account CAPTCHA
- korean-data-mcp — MCP server wrapping all 13 actors; waiting on PyPI registration
- Dev.to — 22 posts published (this is #23)
- Indie Hackers — First post live, seeing early engagement
The scrapers are there. The infrastructure to route developers to them is the current work.
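While those channels are pending, every actor is already reachable directly over Apify's public HTTP API. A standard-library sketch of the call — the actor name, input fields, and token are placeholders:

```python
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.apify.com/v2"

def run_sync_url(actor: str, token: str) -> str:
    """Build the run-sync endpoint URL that returns dataset items directly.
    The Apify API uses '~' in place of '/' in actor IDs."""
    path = f"/acts/{actor.replace('/', '~')}/run-sync-get-dataset-items"
    return f"{API_BASE}{path}?{urllib.parse.urlencode({'token': token})}"

def call_actor(actor: str, token: str, run_input: dict) -> list:
    """POST the run input; the response body is the scraped items as JSON."""
    req = urllib.request.Request(
        run_sync_url(actor, token),
        data=json.dumps(run_input).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Hypothetical usage — actor name and input schema are placeholders:
# items = call_actor("someuser/naver-news-scraper", "APIFY_TOKEN", {"query": "패션"})
```

This is essentially what the RapidAPI proxy and the MCP server wrap: one HTTP call per actor, with the channel handling auth and discovery.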
## Tomorrow
One more actor goes live. One more data source joins the earnings pool. The total crosses 10,000 runs.
After 19 days: the set is complete, the revenue is real, and the distribution work begins in earnest.
The map is finished. The territory keeps moving.