DEV Community

Session zero


Saturday Morning Data: What Weekend Traffic Reveals About Your Hidden Users

It's 6 AM on a Saturday. Most of my scrapers are quiet.

But one isn't.


The Numbers (D+15, March 28)

Total runs: 10,721

Actor                 Total Runs   Weekend Pattern
naver-news-scraper    7,539        Near zero Friday night
naver-place-search    1,087        +177 Friday night alone
naver-blog-search     726          Low
naver-blog-reviews    601          Moderate
Others                768          Minimal

naver-place-search ran more between midnight and 6 AM on a Saturday than naver-news ran all day Sunday.


Two Hidden Users in the Same Traffic

I never surveyed my users. I don't know their names. But I know their schedules.

naver-news users work office hours.

They're running scheduled pipelines — media monitoring, brand intelligence, market research. The kind of automation that runs Monday to Friday, gets reviewed by a human on Tuesday morning, and completely shuts down over the weekend.

These are B2B users. They have a boss. The boss takes weekends off.

naver-place-search users don't care what day it is.

Someone was querying Korean place data at 3 AM on a Saturday. Three possibilities:

  1. A consumer app that runs 24/7 (restaurant recommendation engine, travel planner)
  2. Someone in a timezone 9+ hours away from Korea who considers this their working hours
  3. An automated scraper with no sleep schedule

Whichever it is, this isn't someone with a Monday standup.


What This Means for Pricing Strategy

The day/night pattern I wrote about before suggested enterprise demand. This weekend pattern adds nuance:

Not all actors serve the same market.

  • naver-news: Likely enterprise/B2B. Monday-Friday demand. Could support premium pricing, SLA, usage limits.
  • naver-place-search: Mixed. Consumer apps or globally distributed teams. Different pricing tolerance, different support expectations.

If I'd built one product for both user types, I'd have optimized for the wrong one.


The Insight I Didn't Plan For

I built 13 Korean data scrapers. I expected people to use them for data extraction.

What I didn't expect was that usage patterns would become my market research.

I don't need a survey. I don't need user interviews. The traffic pattern IS the interview:

  • When do they run?
  • Do they pause on weekends?
  • Do they run in bursts or steady streams?
  • Which actors do they combine?

Every API call is a data point about who's using it and why.
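The questions above can be answered with a few lines over the run log. A minimal sketch, assuming you can export run history as (actor, timestamp) pairs — the log format and the sample rows are made up for illustration:

```python
from collections import Counter
from datetime import datetime

# Hypothetical run log: (actor_name, ISO timestamp) pairs.
# The format and these sample rows are assumptions, not real export data.
runs = [
    ("naver-news-scraper", "2025-03-24T09:15:00"),  # Monday, office hours
    ("naver-news-scraper", "2025-03-25T10:02:00"),  # Tuesday
    ("naver-place-search", "2025-03-29T03:11:00"),  # Saturday, 3 AM
    ("naver-place-search", "2025-03-29T04:45:00"),  # Saturday, pre-dawn
]

def weekend_share(rows, actor):
    """Fraction of an actor's runs that land on Saturday/Sunday."""
    days = [datetime.fromisoformat(ts).weekday() for a, ts in rows if a == actor]
    if not days:
        return 0.0
    return sum(d >= 5 for d in days) / len(days)  # weekday() >= 5 is Sat/Sun

def hourly_profile(rows, actor):
    """Run counts per hour of day: bursts vs. steady streams."""
    return Counter(datetime.fromisoformat(ts).hour for a, ts in rows if a == actor)

print(weekend_share(runs, "naver-news-scraper"))   # 0.0 -> pauses on weekends
print(weekend_share(runs, "naver-place-search"))   # 1.0 -> runs straight through
```

A weekend share near zero is the B2B signature; a flat hourly profile around the clock points at consumer apps or offshore teams.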


Current State (D+15)

  • Total runs: 10,721 and climbing (slowly — it's Saturday)
  • Revenue estimate: $108-128
  • Plateau insight: Weekends show the ceiling of organic distribution.
  • Next threshold: Weekday traffic resumes Sunday night (Korea time). Real test: does Monday bring another spike?

What I'm Watching Monday

The previous two Mondays both showed 2-2.3x the weekend rate. If that holds, the baseline is stable.

If it doesn't, something changed in the user base. And that's equally interesting data.
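The Monday check is a one-liner. A sketch of the ratio test, with placeholder daily counts — the real numbers come from the run log, not from here:

```python
# Check Monday's runs against the 2-2.3x weekend baseline.
# The daily counts below are made-up placeholders, not real data.
def monday_ratio(weekend_counts, monday_count):
    """Ratio of Monday's runs to the average weekend day's runs."""
    avg_weekend = sum(weekend_counts) / len(weekend_counts)
    return monday_count / avg_weekend

ratio = monday_ratio(weekend_counts=[310, 290], monday_count=660)
if 2.0 <= ratio <= 2.3:
    print(f"baseline stable: {ratio:.2f}x")   # -> baseline stable: 2.20x
else:
    print(f"something changed: {ratio:.2f}x")
```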


Day 15 of building Korean data scrapers in public. 13 actors, 10,721 runs.

Previous: After 10,000: Why I Stopped Building and Started Marketing
