How to Scrape Google Maps Business Data with Python

Google Maps is the largest business directory on the planet. It holds structured data on millions of businesses: names, addresses, phone numbers, websites, reviews, ratings, opening hours, and GPS coordinates.

For anyone doing lead generation, market research, or competitive analysis, this data is incredibly valuable. But Google doesn't offer an export button. So how do you actually get it into a spreadsheet or database?

Let's talk about what Google Maps data enables — and the most cost-effective way to extract it.


5 Use Cases for Google Maps Business Data

1. Lead Generation for Local Services

Need a list of every plumber in Phoenix? Every dentist in Chicago? Every restaurant in Austin?

Google Maps has it. With structured extraction, you can:

  • Build targeted outreach lists by industry and location
  • Filter by rating (e.g., only businesses rated 4.0+)
  • Get direct phone numbers and websites for outreach
  • Identify businesses without websites — they need your web design service

A marketing agency pulling 5,000 local businesses per city can build a qualified lead pipeline in hours instead of weeks of manual research.
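The "no website" angle is a few lines of filtering once you have the records. A minimal sketch, where the businesses and the `prospects` qualification rule are hypothetical stand-ins shaped like typical scraper output:

```python
# Hypothetical records shaped like the scraper's output; values are illustrative.
businesses = [
    {"name": "A-1 Plumbing", "rating": 4.3, "phone": "+1-602-555-0101", "website": "https://a1plumbing.example"},
    {"name": "Desert Drains", "rating": 4.8, "phone": "+1-602-555-0102", "website": None},
    {"name": "Valley Pipes", "rating": 3.4, "phone": "+1-602-555-0103", "website": None},
]

# Outreach list: well-rated businesses with a phone number but no website --
# prime prospects for a web design pitch.
prospects = [
    b for b in businesses
    if b["rating"] >= 4.0 and b["phone"] and not b["website"]
]
print([b["name"] for b in prospects])  # ['Desert Drains']
```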

2. Competitive Density Mapping

Opening a new coffee shop? A gym? A laundromat? Before you sign a lease, you need to know how saturated the area is.

Google Maps data lets you:

  • Count competitors within any radius
  • Map business density by neighborhood
  • Identify underserved areas with demand but few providers
  • Track competitor ratings — areas with low-rated competitors are opportunities
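Counting competitors within a radius needs nothing beyond the listings' GPS coordinates and the haversine formula. A sketch with made-up shops and coordinates:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical scraped listings: (name, lat, lon, rating)
competitors = [
    ("Shop A", 30.2672, -97.7431, 4.6),
    ("Shop B", 30.2701, -97.7281, 3.2),
    ("Shop C", 30.4000, -97.7000, 4.9),
]

site = (30.2672, -97.7431)  # candidate location under evaluation
radius_km = 2.0
nearby = [c for c in competitors if haversine_km(site[0], site[1], c[1], c[2]) <= radius_km]
low_rated = [c for c in nearby if c[3] < 4.0]
print(f"{len(nearby)} competitors within {radius_km} km, {len(low_rated)} rated below 4.0")
```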

3. Location Intelligence for a New Business

Retail and restaurant chains spend millions on location analysis. With Google Maps data, you can do a first-pass analysis for practically nothing:

  • Map foot-traffic proxies (review volume correlates with visits)
  • Analyze the business mix around potential locations
  • Compare review volumes across neighborhoods to estimate demand
  • Cross-reference with demographics data for a complete picture

4. Review Monitoring and Reputation Management

For multi-location businesses, tracking reviews across all your Google Maps listings manually doesn't scale.

Automated extraction lets you:

  • Monitor average ratings across all locations weekly
  • Get alerted when any location drops below a threshold
  • Track competitor review trends alongside your own
  • Identify which locations need reputation management attention
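The threshold alert is a one-liner once the ratings are in hand. A sketch over a hypothetical weekly snapshot (names and numbers are illustrative, and `locations_needing_attention` is a helper introduced here):

```python
# Hypothetical weekly snapshot of your locations' ratings,
# e.g. pulled from the scraper's dataset.
locations = [
    {"name": "Downtown", "rating": 4.5, "reviewCount": 812},
    {"name": "Airport", "rating": 3.7, "reviewCount": 240},
    {"name": "Northside", "rating": 4.1, "reviewCount": 455},
]

THRESHOLD = 4.0

def locations_needing_attention(snapshot, threshold=THRESHOLD):
    """Return locations whose average rating sits below the threshold."""
    return [loc for loc in snapshot if loc["rating"] < threshold]

for loc in locations_needing_attention(locations):
    print(f"ALERT: {loc['name']} dropped to {loc['rating']}")
```

Run this weekly (cron, GitHub Actions, etc.) and pipe the alerts into email or Slack.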

5. Data Enrichment for Sales Teams

Your CRM has company names but is missing phone numbers, addresses, and websites? Google Maps can fill the gaps:

  • Match company names to Google Maps listings
  • Pull verified contact information
  • Add review ratings as a lead scoring signal
  • Enrich with business categories for better segmentation
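Matching CRM names to listing names is fuzzy by nature; the stdlib's difflib is enough for a first pass. A sketch, where `best_match`, `min_score`, and the second listing are illustrative (the Franklin Barbecue record mirrors the JSON shown later in this post):

```python
from difflib import SequenceMatcher

def best_match(company, listings, min_score=0.8):
    """Match a CRM company name to the closest Google Maps listing name."""
    def score(a, b):
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()
    candidate = max(listings, key=lambda l: score(company, l["name"]))
    return candidate if score(company, candidate["name"]) >= min_score else None

# Illustrative listings shaped like the scraper's output.
listings = [
    {"name": "Franklin Barbecue", "phone": "+1-512-653-1187", "website": "https://franklinbbq.com"},
    {"name": "Example Barbecue Co", "phone": "+1-512-555-0100", "website": "https://example.com"},
]

match = best_match("Franklin Barbeque", listings)  # tolerates the misspelling
if match:
    print(match["phone"])
```

For production matching you'd also want to compare addresses, but a name-similarity cutoff already filters out most false positives.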

Why Other Methods Are Expensive or Unreliable

Google Places API: Accurate but Costly

Google's official Places API works well — at a price:

| API Call | Cost per 1,000 calls |
| --- | --- |
| Text Search | $32.00 |
| Place Details | $17.00 |
| Nearby Search | $32.00 |

Need details (phone, website, reviews) for each result? That's $32 + $17 = $49 per 1,000 businesses. Extracting 10,000 restaurants across a city costs ~$490. For ongoing monitoring, costs compound fast.
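The per-listing arithmetic generalizes into a quick cost estimator (prices as published in the table above; the function name is mine):

```python
TEXT_SEARCH_PER_1000 = 32.00    # USD, Google's published Text Search price
PLACE_DETAILS_PER_1000 = 17.00  # USD, Google's published Place Details price

def places_api_cost(listings):
    """Cost of one Text Search plus one Place Details call per listing."""
    return listings / 1000 * (TEXT_SEARCH_PER_1000 + PLACE_DETAILS_PER_1000)

print(places_api_cost(10_000))  # 490.0
```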

DIY Scrapers: Cheap but Fragile

Building your own scraper with Playwright or Puppeteer seems tempting. But Google Maps is one of the hardest sites to scrape reliably:

  • CSS class names rotate frequently (selectors that worked yesterday break today)
  • CAPTCHAs appear after a few dozen requests
  • IP blocks kick in quickly, even with proxy rotation
  • Infinite scroll pagination is hard to control programmatically
  • You only get ~20 results per scroll load

Maintaining a Google Maps scraper is effectively a part-time job. Teams report spending 10-15 hours/month just keeping their scrapers functional.


The Practical Approach: Managed Extraction

Instead of paying $49/1,000 for the API or fighting Google's anti-bot systems, use a managed scraper that handles the complexity:

```python
from apify_client import ApifyClient

client = ApifyClient("YOUR_APIFY_TOKEN")

run_input = {
    "searchStrings": ["restaurants in Austin TX"],
    "maxResults": 500,
    "language": "en",
    "countryCode": "us",
}

run = client.actor("cryptosignals/google-maps-scraper").call(run_input=run_input)

for biz in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(f"{biz['name']} | {biz.get('rating', 'N/A')}★ | {biz.get('phone', 'N/A')}")
```

What you get back — clean, structured JSON for every listing:

```json
{
  "name": "Franklin Barbecue",
  "address": "900 E 11th St, Austin, TX 78702",
  "phone": "+1-512-653-1187",
  "website": "https://franklinbbq.com",
  "rating": 4.7,
  "reviewCount": 12847,
  "category": "Barbecue restaurant",
  "latitude": 30.2701,
  "longitude": -97.7281,
  "openingHours": "Tue-Sun 11AM-3PM",
  "priceLevel": "$$",
  "placeUrl": "https://maps.google.com/..."
}
```

No proxy management. No CAPTCHA solving. No broken selectors.
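Records shaped like this flatten cleanly to CSV with the standard library alone. A sketch (the first record mirrors the JSON above; the second is made up for illustration):

```python
import csv
import io

# Records shaped like the scraper's JSON output.
records = [
    {"name": "Franklin Barbecue", "address": "900 E 11th St, Austin, TX 78702",
     "phone": "+1-512-653-1187", "rating": 4.7, "reviewCount": 12847},
    {"name": "Example Smokehouse", "address": "123 Main St, Austin, TX",
     "phone": "", "rating": 4.2, "reviewCount": 310},
]

fieldnames = ["name", "address", "phone", "rating", "reviewCount"]
buf = io.StringIO()  # swap in open("leads.csv", "w", newline="") to write a file
writer = csv.DictWriter(buf, fieldnames=fieldnames, extrasaction="ignore")
writer.writeheader()
writer.writerows(records)
csv_text = buf.getvalue()
print(csv_text)
```

`DictWriter` quotes the comma-containing addresses automatically, so the file opens cleanly in Excel or Google Sheets.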


Real Example: Building a Lead List for a Marketing Agency

Here's how an agency might build a qualified lead pipeline:

```python
from apify_client import ApifyClient

client = ApifyClient("YOUR_APIFY_TOKEN")

CITIES = ["Austin TX", "San Antonio TX", "Houston TX"]
BUSINESS_TYPE = "plumbers"

all_leads = []
for city in CITIES:
    run = client.actor("cryptosignals/google-maps-scraper").call(
        run_input={
            "searchStrings": [f"{BUSINESS_TYPE} in {city}"],
            "maxResults": 200,
        }
    )
    for biz in client.dataset(run["defaultDatasetId"]).iterate_items():
        # Filter: has phone, has website, rated 3.5+
        if biz.get("phone") and biz.get("website") and (biz.get("rating", 0) >= 3.5):
            all_leads.append(biz)

print(f"Qualified leads: {len(all_leads)}")
for lead in all_leads[:10]:
    print(f"  {lead['name']} | {lead.get('phone')} | {lead.get('rating')}")
```

The scraper handles extraction and anti-bot complexity. Your code handles qualification and outreach logic. That's the right separation.


Cost Comparison

Extracting 10,000 business listings with full details:

| Method | Cost | Reliability | Maintenance |
| --- | --- | --- | --- |
| Google Places API | ~$490 | High | None |
| DIY Playwright scraper | ~$20 (proxies) | Low (breaks weekly) | 10-15 hrs/month |
| Apify managed actor | ~$5-10 | High | None |

The managed approach is 50-100x cheaper than the official API and doesn't require you to maintain any code.


Getting Started

  1. Sign up on Apify — free tier available
  2. Install the Python client: pip install apify-client
  3. Get your API token from Settings → Integrations
  4. Run the Google Maps Scraper with your search query
  5. Download results as JSON, CSV, or Excel

The free tier gives you enough credits to test with a few hundred results before committing.


Building something with local business data? Drop a comment — I'd love to hear what use cases people are working on.
