Session zero

The Invisible User Who Runs at 2AM

Sunday night. 11,350 total runs.

Most of my scrapers had gone quiet — the usual weekend slowdown. naver-news: flat. naver-place-search: flat. melon-chart: flat. Everything following the predictable weekend dip I'd measured over two weeks.

Except one.

naver-place-reviews had added 20 runs. In 4-5 hours. At 2AM on a Sunday.


What 20 Runs at 2AM Actually Means

It's a small number. But consider the context:

  • Most scrapers were at 0-1 runs overnight
  • naver-place-reviews was running at ~4/h while others were near-zero
  • Sunday night is the quietest window in the entire week

This isn't someone clicking "run" in the Apify console. This is a scheduled job. Someone set up automation, pointed it at Korean business reviews, and walked away. The job runs whether they're awake or not.

I've never seen who they are. They've never seen this side. It works anyway.


Why Naver Place Reviews Is Built for Batch Jobs

Naver Place is the dominant local business platform in Korea — equivalent to Google Maps + Yelp combined for a Korean audience. Every restaurant, clinic, barbershop, and retail chain has a Naver Place listing with customer reviews.

Businesses tracking their reputation across multiple locations, and agencies running brand monitoring for clients, can't check 200 locations manually every day. They schedule it.

Common batch patterns I've observed:

1. Multi-location monitoring
A retail chain with 50+ stores wants weekly review summaries. They'll scrape 50 locations Sunday night, process Monday morning. By the time the team comes in, the digest is ready.

2. Competitor tracking
Restaurant groups monitoring competitor sentiment. Run every Sunday, compare week-over-week trends. Look for spikes in negative reviews — an early signal before something goes public.

3. Franchise compliance
Franchise HQ monitoring franchisee performance. Reviews reveal customer experience better than any internal metric. Batch on weekends when traffic is low, report on Mondays.
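The "spike in negative reviews" check from the competitor-tracking pattern is easy to sketch. A minimal version, assuming each scraped review carries a 1-5 `rating` field (the field name is hypothetical, not the Actor's actual output schema):

```javascript
// Share of reviews rated 2 stars or below.
function negativeShare(reviews) {
  if (reviews.length === 0) return 0;
  const negative = reviews.filter((r) => r.rating <= 2).length;
  return negative / reviews.length;
}

// Flag a location when the negative share jumps more than 10 points
// week-over-week — the "early signal" before something goes public.
function flagSpike(thisWeek, lastWeek, threshold = 0.1) {
  return negativeShare(thisWeek) - negativeShare(lastWeek) > threshold;
}
```

Run it against last week's dataset and this week's, and the Monday report writes itself.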


The Technical Pattern

For anyone building these batch workflows, here's what works well with the naver-place-reviews Actor:

// Schedule via Apify cron or external trigger
const input = {
  placeIds: [
    "1234567890",  // location IDs from Naver Place URLs
    "0987654321",
    // ... up to hundreds at once
  ],
  maxReviewsPerPlace: 50,
  sortBy: "newest"
};
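Those `placeIds` come out of Naver Place URLs. A small helper for pulling them, assuming the URL embeds the numeric ID after a `/place/` segment (the URL shape is my assumption, not something Naver guarantees):

```javascript
// Extract the numeric place ID from a Naver Place URL,
// e.g. https://map.naver.com/p/entry/place/1234567890
function extractPlaceId(url) {
  const match = url.match(/\/place\/(\d+)/);
  return match ? match[1] : null;
}
```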

The Actor handles:

  • JavaScript rendering (Naver uses dynamic pages)
  • Automatic rate limiting
  • Review pagination

For Sunday-night batch jobs, the sweet spot is 50-100 locations per run. Set it to trigger at midnight KST, collect results into your storage, run analysis Monday morning.
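If the location list outgrows a single run, splitting it into batches is trivial. A sketch (the helper is mine, not part of the Actor), with 50 per batch matching the sweet spot above:

```javascript
// Split a long list of place IDs into per-run batches of 50.
function batchPlaceIds(placeIds, batchSize = 50) {
  const batches = [];
  for (let i = 0; i < placeIds.length; i += batchSize) {
    batches.push(placeIds.slice(i, i + batchSize));
  }
  return batches;
}
```

For the schedule itself, midnight KST is 15:00 UTC the previous day, so a cron expression like `0 15 * * 6` (Saturday 15:00 UTC) fires at midnight Sunday, Korean time.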


The Asymmetry I Didn't Expect

When I built these scrapers, I imagined the users would be developers — people who'd try the Actor, look at the output, decide if it fit their use case.

The 2AM user doesn't fit that model. They've already decided. They've already integrated. They come back every week, reliably, while most users sleep through the weekend.

They're not evaluating. They're depending.

That's a different kind of user than I planned for. And probably the kind that actually matters for sustainable revenue.


If you're building Korean business intelligence workflows, the naver-place-reviews Actor is designed for exactly this pattern — bulk collection, batch scheduling, structured output ready for downstream analysis.

The invisible users already figured that out.
