I have 48 actors running on Apify. Same platform, same developer, same tech stack. Two of those actors tell completely different stories about how software makes money.
My LinkedIn Employee Scraper has 353 users. It runs thousands of times per month. It charges $0.005 per profile scraped. Total monthly revenue from all those users and all those runs? About $9.
My Google Maps Lead Intel actor has 2 users. Two. They run it about 22 times per month between them, paying roughly $25 per run. Monthly revenue? Around $540.
That is 60x the total revenue from two buyers instead of 353 — more than 10,000x the revenue per user. Same platform. Same developer. Same billing system.
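The arithmetic behind those numbers, using the approximate figures above:

```python
# Back-of-envelope revenue math from the two actors (figures are the
# rough monthly numbers quoted above, not exact accounting).
linkedin_users, linkedin_revenue = 353, 9.0   # $0.005/profile, thousands of runs
maps_users, maps_revenue = 2, 540.0           # ~22 runs/month at ~$25/run

per_user_linkedin = linkedin_revenue / linkedin_users  # ~$0.025 per user per month
per_user_maps = maps_revenue / maps_users              # $270 per user per month

print(round(maps_revenue / linkedin_revenue))          # 60x total revenue
print(round(per_user_maps / per_user_linkedin))        # 10590x revenue per user
```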
What Makes Google Maps Worth $25 a Run
The LinkedIn scraper returns raw data. Names, titles, company info. It does one thing and does it well, but developers treat it like a commodity. They plug it into their own pipelines and expect it to cost almost nothing. At $0.005 per profile, it basically does.
Google Maps Lead Intel returns something different. For every business it finds, you get validated email addresses, a lead score based on 12 online presence signals, Google Ads detection, website tech stack analysis, social media profiles, and review sentiment. It is not scraping. It is intelligence.
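To make "intelligence" concrete, here is roughly the shape of one enriched record. The field names are my own illustration, not the actor's actual output schema:

```python
from dataclasses import dataclass

# Illustrative shape of one enriched lead. Field names are invented
# for this sketch; the real actor's schema may differ.
@dataclass
class EnrichedLead:
    business_name: str
    validated_email: str            # passed a deliverability check
    lead_score: int                 # 0-100, from 12 online-presence signals
    runs_google_ads: bool           # Google Ads detection
    tech_stack: list[str]           # e.g. ["WordPress", "WooCommerce"]
    social_profiles: dict[str, str] # platform -> profile URL
    review_sentiment: float         # -1.0 (negative) to 1.0 (positive)
```

An agency can forward a list of these to a client as-is; a raw scrape of names and addresses cannot be.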
The two users paying $25 per run are lead generation agencies. One serves appointment-setting clients across 15 metro areas. The other runs local SEO audits. For both of them, a single $25 run replaces 3 to 4 hours of manual research that would cost $200+ if done by a VA.
The Buyer Problem
Here is what I missed for months: the LinkedIn scraper attracts developers. Developers are price sensitive. They can build their own scraper given enough time, so they benchmark your tool against their hourly rate. If your scraper costs more than 20 minutes of their time is worth, they will build it themselves.
The Google Maps actor attracts agencies. Agency buyers think in terms of client value, not engineering time. If their client pays $1,500/month for lead gen services and your tool costs $25 per market, that is a rounding error in their margin. They do not negotiate. They do not churn. They run it more as they sign more clients.
Same platform. Totally different buyer psychology.
What Actually Changed
The technical shift was not dramatic. I stopped returning raw JSON blobs and started returning enriched, scored, validated output. Specifically:
- Raw Google Maps results became leads with quality scores
- Guessed emails became validated emails with deliverability checks
- Basic business info became competitive intelligence with ad spend signals
- Flat data became actionable reports that agencies could forward to clients
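The scoring step above can be sketched in a few lines. The signal names and weights here are invented for illustration (the post only says there are 12 online-presence signals), not the actor's real scoring model:

```python
# Hypothetical lead-scoring step: raw Google Maps record in, scored lead out.
# 12 example signals whose weights sum to 100; the real signals differ.
SIGNALS = {
    "has_website": 15, "email_validated": 20, "runs_ads": 15,
    "active_social": 10, "recent_reviews": 10, "responds_to_reviews": 10,
    "https_site": 5, "mobile_friendly": 5, "claimed_listing": 5,
    "photos_uploaded": 2, "hours_listed": 2, "positive_sentiment": 1,
}

def score_lead(raw: dict) -> dict:
    """Attach per-signal booleans and a 0-100 quality score to a raw record."""
    signals = {name: bool(raw.get(name)) for name in SIGNALS}
    score = sum(weight for name, weight in SIGNALS.items() if signals[name])
    return {**raw, "signals": signals, "lead_score": score}

lead = score_lead({"name": "Acme Plumbing",
                   "has_website": True, "email_validated": True})
print(lead["lead_score"])  # 35
```

The point is not the exact weights. It is that the output now carries a judgment ("this lead is worth calling"), which is what the agency buyer is actually paying for.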
The pricing shift followed naturally. When your output saves someone 4 hours of work and costs them $25, you are not competing on data volume. You are competing on time saved and decision quality.
The Numbers I Wish I Knew Earlier
353 users at $0.005/profile = roughly $9/month. Those users submit support tickets, request features, and compare you to 6 other LinkedIn scrapers in the Apify Store.
2 users at $25/run = roughly $540/month. Those users send you "thank you" messages and ask if you can build them something custom.
If I could go back and rebuild my portfolio from scratch, I would build fewer tools and make each one solve a complete problem for a specific buyer. Not "scrape this website" but "find me qualified leads in this market with contact info I can trust."
The Takeaway
Stop counting users. Start counting revenue per user. Build for the buyer who measures your tool against the cost of the alternative, not against the cost of building it themselves. Package intelligence, not data.
The developer who needs 10,000 LinkedIn profiles will always shop on price. The agency owner who needs 200 qualified leads by Friday will pay whatever gets it done.
I know which buyer I am building for now.
Built in Nairobi. 48 actors in production. Questions? Drop them below.