## The Problem: Local Business Data Is a Rip-Off
If you've ever tried to build a cold outreach list for a local vertical — plumbers in Austin, dentists in Miami, HVAC companies in Chicago — you've hit the same wall:
- Apollo.io: ~$49/month, and their local business coverage is thin
- ZoomInfo: enterprise pricing, minimum commits
- Manual Google Maps scraping: breaks every few weeks, constant proxy management
- Outsourcing to a VA: slow, inconsistent formatting, still expensive
The real issue is that Google Maps has the freshest data. Businesses update their own listings. But scraping Google Maps directly means fighting bot detection, managing proxies, and dealing with a brittle browser automation setup.
There's a cleaner way.
## The Solution: Apify's Managed Google Maps Actor
Apify runs managed web scraping infrastructure. Instead of maintaining your own scraper, you call an API, pass your search parameters, and get back structured JSON.
The actor we're using: `lanky_quantifier/google-maps-scraper`
It handles:
- Google's bot detection automatically
- Proxy rotation (no configuration needed)
- Structured output: name, address, phone, email, rating, reviews, website, coordinates
## 5-Step Setup

### Step 1: Create a Free Apify Account

Go to apify.com and sign up. The free tier gives you $5/month in compute credits, enough to pull around 1,000 business leads.
### Step 2: Get Your API Token

In the Apify dashboard, go to Settings → Integrations and copy your personal API token.
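Hardcoding the token works for a quick test, but a safer pattern is to read it from an environment variable so it never ends up in source control. A minimal sketch (the variable name `APIFY_TOKEN` is our own choice, not something Apify requires):

```python
import os

def load_apify_token() -> str:
    """Read the Apify API token from the environment instead of
    hardcoding it in the script."""
    token = os.environ.get("APIFY_TOKEN", "")
    if not token:
        raise RuntimeError("Set the APIFY_TOKEN environment variable first.")
    return token
```

Set it once in your shell with `export APIFY_TOKEN="your-token"` and every script on the machine can pick it up.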
### Step 3: Install the Apify Python SDK

```bash
pip install apify-client
```
### Step 4: Write the Scraper

Save the following as `scraper.py`:

```python
from apify_client import ApifyClient
import json

APIFY_TOKEN = "YOUR_APIFY_API_TOKEN"  # Replace with your token
ACTOR_ID = "lanky_quantifier/google-maps-scraper"

client = ApifyClient(APIFY_TOKEN)

# Configure your search
run_input = {
    "searchQuery": "plumbers in Austin, TX",
    "maxResults": 100,
    "includeEmails": True,
    "includePhones": True,
    "includeRatings": True,
    "language": "en",
}

# Run the actor and wait for it to finish
print("Starting actor run...")
run = client.actor(ACTOR_ID).call(run_input=run_input)

# Fetch results from the default dataset
print("Run finished. Fetching results...")
items = client.dataset(run["defaultDatasetId"]).iterate_items()

leads = []
for item in items:
    leads.append({
        "name": item.get("businessName", ""),
        "address": item.get("address", ""),
        "phone": item.get("phone", ""),
        "email": item.get("email", ""),
        "rating": item.get("rating", ""),
        "reviews": item.get("reviewsCount", 0),
        "website": item.get("website", ""),
        "category": item.get("category", ""),
    })

print(f"Pulled {len(leads)} leads")

# Save to JSON
with open("leads.json", "w") as f:
    json.dump(leads, f, indent=2)

print("Saved to leads.json")
```
### Step 5: Run It

```bash
python scraper.py
```

Output:

```text
Starting actor run...
Run finished. Fetching results...
Pulled 100 leads
Saved to leads.json
```
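Not every listing exposes an email address, so before loading `leads.json` into an outreach tool you may want to keep only the contactable records. A small sketch, assuming the `email` key written by the script above:

```python
def email_ready(leads: list[dict]) -> list[dict]:
    """Keep only leads that have a non-empty email address."""
    return [lead for lead in leads if lead.get("email")]
```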
## Sample Output

```json
[
  {
    "name": "Lone Star Plumbing Solutions",
    "address": "1842 S Congress Ave, Austin, TX 78704",
    "phone": "(512) 555-0193",
    "email": "info@lonestarplumbing.com",
    "rating": 4.7,
    "reviews": 312,
    "website": "https://lonestarplumbing.com",
    "category": "Plumber"
  },
  {
    "name": "Capitol City Drain & Pipe",
    "address": "3301 Guadalupe St, Austin, TX 78705",
    "phone": "(512) 555-0247",
    "email": "contact@capitolcitydrain.com",
    "rating": 4.4,
    "reviews": 178,
    "website": "https://capitolcitydrain.com",
    "category": "Plumber"
  }
]
```
Each record includes: `businessName`, `address`, `phone`, `email`, `rating`, `reviewsCount`, `website`, `category`, `latitude`, `longitude`, `openingHours`.
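Since each record carries coordinates, you can also rank leads by proximity to a target location, say, to prioritize the closest prospects first. A rough sketch using the standard haversine formula (the downtown-Austin coordinates are illustrative, and it assumes the `latitude`/`longitude` fields mentioned above):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two (lat, lon) points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3956 * asin(sqrt(a))  # 3956 = Earth's radius in miles

# Illustrative usage: sort leads by distance from downtown Austin.
DOWNTOWN_AUSTIN = (30.2672, -97.7431)
# leads.sort(key=lambda l: haversine_miles(*DOWNTOWN_AUSTIN, l["latitude"], l["longitude"]))
```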
## Export to CSV (Optional)

Add this after fetching leads:

```python
import csv

with open("leads.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=leads[0].keys())
    writer.writeheader()
    writer.writerows(leads)

print("Saved to leads.csv")
```
Now you can open it in Google Sheets, import into your CRM, or feed into an outreach tool directly.
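One gotcha before the data hits your CRM: Google Maps results can contain the same business under slightly different listings. A simple de-duplication pass, keyed on name and phone (assumes the `name`/`phone` keys from the scraper script):

```python
def dedupe_leads(leads: list[dict]) -> list[dict]:
    """Drop duplicate businesses, keeping the first record seen
    for each (normalized name, phone) pair."""
    seen: set[tuple[str, str]] = set()
    unique = []
    for lead in leads:
        key = (lead.get("name", "").strip().lower(), lead.get("phone", ""))
        if key not in seen:
            seen.add(key)
            unique.append(lead)
    return unique
```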
## Cost Breakdown
| Volume | Apify Cost |
|---|---|
| 100 leads | ~$0.40 |
| 1,000 leads | ~$4.00 |
| 10,000 leads | ~$20.00 |
Compare that to Apollo.io at $49/month for leads that may not even cover your local vertical. The data is also fresher — businesses update their own Google Maps listings. You're pulling what they currently publish, not a database snapshot from last quarter.
## Use Cases
- Cold outreach agencies — build niche prospect lists for any city/vertical on demand
- Local SEO consultants — find businesses with low ratings or no website (high-value prospects)
- Real estate investors — pull property managers, contractors, or brokers in target markets
- Franchise teams — map competitor density before opening new locations
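The local-SEO angle is easy to automate: flag businesses with a weak rating or no website at all. A sketch, assuming the keys written by the scraper script (`rating` can come back as an empty string when a listing has no reviews, and the 4.0 cutoff is an arbitrary choice):

```python
def seo_prospects(leads: list[dict], max_rating: float = 4.0) -> list[dict]:
    """Return leads with no website, or a numeric rating below max_rating."""
    prospects = []
    for lead in leads:
        rating = lead.get("rating")
        weak_rating = isinstance(rating, (int, float)) and rating < max_rating
        if not lead.get("website") or weak_rating:
            prospects.append(lead)
    return prospects
```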
## What You DON'T Need
- No proxy setup
- No browser automation (Playwright, Puppeteer, Selenium)
- No VPS or always-on server
- No scraper maintenance when Google updates its layout
The actor handles all of that. You write a Python script, call an API, get structured data.
## Try It Now
Actor URL: https://apify.com/lanky_quantifier/google-maps-scraper
The free tier includes $5 in credits — enough to test a real query and see the output format before committing to anything.
If you're building local business tooling, lead gen workflows, or need a one-time data pull, this is the fastest path from zero to structured data.
Have questions about the setup or want to see a specific use case? Drop a comment below.