DEV Community

agenthustler

Google Maps Scraping: Extract Places, Reviews, and Business Hours in 2026

Google Maps is one of the richest sources of local business data on the web. Whether you're building a local SEO tool, doing competitive analysis, or aggregating business directories, extracting places, reviews, and business hours from Google Maps can provide tremendous value.

In this guide, we'll walk through practical approaches to scraping Google Maps data in 2026, including working Python code and tips for handling common challenges.

Why Scrape Google Maps?

Local businesses rely on Google Maps visibility. Extracting this data enables:

  • Local SEO audits — track how businesses rank for specific queries in different locations
  • Lead generation — build lists of businesses by category and region
  • Competitive intelligence — monitor competitor reviews, ratings, and hours
  • Market research — understand business density and trends in specific areas

Setting Up Your Environment

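The examples in this post use Playwright's synchronous Python API. A minimal setup, assuming Python 3.8+ and the `playwright` package as published on PyPI, looks like this:

```shell
# Install the Playwright Python bindings
pip install playwright

# Download the Chromium build that Playwright drives
playwright install chromium
```

After this, `from playwright.sync_api import sync_playwright` is available and the snippets below can run against a launched Chromium instance.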

Extracting Business Details

Once you have place URLs, you can extract detailed information:

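Here is a sketch of what that extraction can look like with Playwright's sync API. Every CSS selector below (`h1.DUwDvf`, `div.t39EBf`, the `data-item-id` attributes, and so on) is a guess based on Google Maps markup at the time of writing; Google rotates its class names frequently, so verify each one in DevTools before relying on it. The helper names (`extract_place_details`, `parse_rating`, `parse_hours`) are my own, not part of any library:

```python
def extract_place_details(page):
    """Extract core business fields from a loaded Google Maps place page.

    `page` is a Playwright Page positioned on a place URL. All selectors
    here are illustrative and likely stale; re-verify them before use.
    """
    def text_of(selector):
        el = page.query_selector(selector)
        return el.inner_text().strip() if el else None

    def attr_of(selector, name):
        el = page.query_selector(selector)
        return el.get_attribute(name) if el else None

    return {
        "name": text_of("h1.DUwDvf"),
        "category": text_of("button.DkEaL"),
        "address": text_of('button[data-item-id="address"]'),
        "phone": text_of('button[data-item-id^="phone"]'),
        "website": text_of('a[data-item-id="authority"]'),
        "rating": parse_rating(text_of('div.F7nice span[aria-hidden="true"]')),
        "hours": parse_hours(attr_of("div.t39EBf", "aria-label")),
    }

def parse_rating(raw):
    """'4.6' or locale-comma '4,6' -> 4.6; unparsable or missing -> None."""
    if not raw:
        return None
    try:
        return float(raw.replace(",", "."))
    except ValueError:
        return None

def parse_hours(label):
    """Split an opening-hours aria-label such as
    'Monday, 9 AM to 5 PM; Tuesday, Closed' into a {day: hours} dict."""
    if not label:
        return {}
    hours = {}
    for part in label.split(";"):
        day, _, times = part.strip().partition(", ")
        if day:
            hours[day] = times or None
    return hours
```

The two parsers are deliberately pure functions, so they can be unit-tested without a browser; only `extract_place_details` needs a live page.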

Extracting Reviews

Reviews contain valuable sentiment data. Here's how to collect them:

import time

def extract_reviews(page, max_reviews=50):
    """Extract reviews from a Google Maps place page.

    Note: the class-based selectors below (m6QErb, jftiEf, ...) reflect
    Google Maps markup at the time of writing; Google rotates these class
    names regularly, so re-verify them before use.
    """
    # Click on reviews tab
    reviews_tab = page.query_selector('button[aria-label*="Reviews"]')
    if reviews_tab:
        reviews_tab.click()
        time.sleep(2)

    # Scroll to load reviews
    review_panel = page.query_selector('div.m6QErb.DxyBCb')
    if review_panel:
        for _ in range(max_reviews // 5):
            review_panel.evaluate('el => el.scrollTop = el.scrollHeight')
            time.sleep(1)

    reviews = []
    review_elements = page.query_selector_all('div.jftiEf')

    for el in review_elements[:max_reviews]:
        review = {}
        author = el.query_selector('div.d4r55')
        review['author'] = author.inner_text() if author else None

        rating = el.query_selector('span.kvMYJc')
        review['rating'] = rating.get_attribute('aria-label') if rating else None

        text = el.query_selector('span.wiI7pd')
        review['text'] = text.inner_text() if text else None

        date = el.query_selector('span.rsqaWe')
        review['date'] = date.inner_text() if date else None

        reviews.append(review)

    return reviews

Handling Anti-Scraping Measures

Google Maps actively fights scraping. Here are strategies that work in 2026:

  1. Use residential proxies — Datacenter IPs get blocked fast. Services like ThorData offer rotating residential proxies that mimic real users.

  2. Randomize request timing — Add variable delays of 2-8 seconds between requests.

  3. Rotate user agents — Switch between different browser fingerprints.

  4. Use headless browsers wisely — Playwright with stealth plugins avoids basic detection.

import random

# Shortened for readability; in production, use full, current
# user-agent strings that match real browser releases.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36",
]

def get_random_delay():
    return random.uniform(2, 8)

def get_random_ua():
    return random.choice(USER_AGENTS)
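To wire these helpers into Playwright, one approach is to build fresh browser-context options per session so that no two runs share a fingerprint. This is a sketch: `make_context_kwargs` and the viewport ranges are my own naming and choices, not a Playwright API, though `user_agent`, `viewport`, and `locale` are real `browser.new_context()` parameters:

```python
import random

# Shortened user-agent examples; use full, current strings in production.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36",
]

def make_context_kwargs():
    """Options for browser.new_context(): a random user agent plus a
    slightly randomized viewport, so sessions don't share one fingerprint."""
    return {
        "user_agent": random.choice(USER_AGENTS),
        "viewport": {
            "width": random.randint(1280, 1920),
            "height": random.randint(720, 1080),
        },
        "locale": "en-US",
    }

# Usage with Playwright (not executed here):
#   with sync_playwright() as p:
#       browser = p.chromium.launch(headless=True)
#       context = browser.new_context(**make_context_kwargs())
#       page = context.new_page()
```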

Alternative: Using Review Scraping Tools

If you need structured review data without building your own scraper, check out ready-made solutions like the G2 Reviews Scraper on Apify, which handles pagination, rate limiting, and data formatting automatically for business review platforms.

Saving Data to CSV

import csv

def save_to_csv(places, filename="google_maps_data.csv"):
    if not places:
        return

    keys = places[0].keys()
    with open(filename, 'w', newline='', encoding='utf-8') as f:
        writer = csv.DictWriter(f, fieldnames=keys)
        writer.writeheader()
        writer.writerows(places)

    print(f"Saved {len(places)} places to {filename}")

Legal and Ethical Considerations

Always check Google's Terms of Service before scraping. Consider:

  • Using the official Google Places API for production applications
  • Respecting robots.txt directives
  • Rate limiting your requests to avoid overloading servers
  • Not scraping personal information without consent

Conclusion

Google Maps scraping is powerful for local SEO, lead generation, and market research. While the techniques work in 2026, always consider using official APIs when available and ensure your scraping activities comply with applicable laws and terms of service.

For proxy infrastructure that keeps your scrapers running, ThorData's residential proxies provide reliable rotating IPs specifically suited for map data extraction.
