Google Maps Scraper

Extract business data from Google Maps with precision 📍🏢 Scrape names, addresses, phone numbers, ratings, reviews, websites, and more from any location search. Perfect for lead generation, local SEO, competitor research, and market analysis. Scale your data collection effortlessly 🚀

Pricing: $19.99/month + usage
Rating: 0.0 (0)
Developer: ScrapeBase (Maintained by Community)
Actor stats: 0 bookmarked · 3 total users · 1 monthly active user · last modified 13 days ago

Google Maps Scraper

Google Maps Scraper is a production-ready Google Maps data extractor that collects structured business details from public Google Maps search results. It solves the pain of manual copy-paste and incomplete lead lists by automating extraction of names, addresses, ratings, phones, hours, and more — making it a powerful Google Maps business listings scraper and Google Places scraper for lead generation, local SEO, and market analysis. Built for marketers, developers, data analysts, and researchers, it scales from quick lookups to city-wide scans with reliability and precision.

What data / output can you get?

Below are real fields returned in the dataset and final maps.json output. Examples reflect typical values you’ll see:

| Data type | Description | Example value |
| --- | --- | --- |
| name | Place or business name | "Sample Coffee" |
| website | Cleaned external website URL | "https://samplecoffee.com" |
| avg_rating | Average Google rating (numeric) | 4.6 |
| total_reviews | Total number of Google reviews (numeric) | 128 |
| street_address | Street line | "123 Main St" |
| city | City | "New York" |
| state | State/region | "NY" |
| zip | Postal/ZIP code | "10001" |
| country_code | Two-letter country code | "US" |
| full_address | Concatenated address components | "123 Main St New York NY 10001 US" |
| tags | Categories/labels captured from Maps | ["Coffee shop"] |
| notes | Additional place notes if present | null |
| place_id | Unique Google Place ID | "abcd1234" |
| phone | Phone number (if available) | "+1 212-555-0100" |
| lat | Latitude | 40.75 |
| long | Longitude | -73.99 |
| hours | Normalized opening hours by day (array of objects) | [{"day":"Monday","hours":"8 AM–6 PM",...}] |
| success | Extraction success flag | true |

Bonus data:

  • Optional per-place review capture adds reviews (array) and review_count to the final maps.json and to dedicated dataset items when enabled.
  • Results are written live to the Apify Dataset during the run and persisted as maps.json in the Key-Value Store.
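
For downstream processing, the address fields compose into full_address exactly as the table shows. A minimal sketch (the helper name is ours, not part of the actor):

```python
def build_full_address(item):
    """Join address components in the documented order, skipping missing ones.

    Illustrative helper only -- not part of the actor; field names match the
    dataset schema above.
    """
    keys = ("street_address", "city", "state", "zip", "country_code")
    return " ".join(item[k] for k in keys if item.get(k))

place = {
    "street_address": "123 Main St",
    "city": "New York",
    "state": "NY",
    "zip": "10001",
    "country_code": "US",
}
print(build_full_address(place))  # → 123 Main St New York NY 10001 US
```

Missing components are simply skipped, which matches how sparse records appear in the output.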

Key features

  • 🛡️ Automatic proxy fallback (direct → datacenter → residential, sticky)
    Resilient proxy ladder with clear logging. Starts direct, falls back to datacenter, then switches to residential proxies with stickiness for the rest of the run when needed.

  • 🗺️ Grid-based viewport coverage
    Expands geographic coverage by scanning across a generated grid per location, maximizing discovery of businesses within the area.

  • 🧩 Bulk inputs: locations, keywords, and direct URLs
    Run city/region + keyword combinations and/or process direct Google Maps search URLs at scale.

  • 🔁 Live dataset writes + final merged maps.json
    Avoid data loss with incremental writes to the Dataset and a final, sorted list saved as maps.json in the Key-Value Store.

  • 🗂️ Deduplication by place_id
    Ensures each business appears once, even when discovered from multiple grid points or sources.

  • Optional reviews capture (best-effort)
    Enable per-place review extraction to turn this into a focused Google Maps reviews scraper for qualitative insights.

  • 🔤 Sorting options (rating, review_count, name, relevance)
    Final results can be sorted to match your workflow’s needs.

  • 🧪 Built in Python, Apify-native
    A reliable Google Maps scraper Python implementation optimized for Apify infrastructure and automation.

  • ⚙️ Transparent logs and progress updates
    Detailed logging of grid points, proxy events, and review progress for observability and debugging.
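
To picture the grid-based coverage described above: each location's viewport is divided into a lattice of scan points. A simplified sketch, under the assumption of a square gridSize × gridSize lattice (the actor's actual traversal may differ):

```python
def grid_points(south, west, north, east, grid_size=15):
    """Yield (lat, lng) centers of a grid_size x grid_size lattice over a
    viewport -- a simplified model of the actor's grid sweep, not its code."""
    lat_step = (north - south) / grid_size
    lng_step = (east - west) / grid_size
    for i in range(grid_size):
        for j in range(grid_size):
            yield (south + (i + 0.5) * lat_step, west + (j + 0.5) * lng_step)

# A 3x3 sweep over part of Manhattan yields 9 scan centers.
points = list(grid_points(40.70, -74.02, 40.80, -73.93, grid_size=3))
print(len(points))  # → 9
```

Scanning many small sub-viewports surfaces listings that a single zoomed-out search would truncate.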

How to use Google Maps Scraper - step by step

  1. Sign in to your Apify account and open the Google Maps Scraper actor.
  2. Add input:
    • locations: a list of city/region names (e.g., “New York”).
    • keywords: a list of search terms (e.g., “coffee shops”). This field is required.
    • urls: optional list of direct Google Maps search URLs.
    • maxResults: set the global cap for places to collect.
    • proxyConfiguration: leave blank to start direct; the actor falls back to other proxy modes automatically as needed.
  3. Start the run. The actor resolves a viewport for each location/URL and sweeps it using a grid strategy.
  4. Monitor logs for grid progress, deduplication counts, and any proxy mode changes.
  5. (Optional) Advanced via API: you can pass sortOrder, maxComments, and gridSize to control sorting, review capture depth, and grid resolution.
  6. Review results:
    • Live items are added to the Apify Dataset during the run.
    • The final merged list is saved to the Key-Value Store as maps.json.

Pro tip: Chain this Google Maps leads scraper into your enrichment workflow by consuming maps.json or the Dataset in subsequent Apify runs.
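
As a concrete example of that chaining, the sketch below filters a locally downloaded copy of maps.json down to contactable leads (the function name and filter criteria are ours, not part of the actor):

```python
import json

def phone_and_website_leads(path):
    """Keep only places with both a phone and a website -- an illustrative
    downstream filter over a downloaded maps.json, not part of the actor."""
    with open(path, encoding="utf-8") as f:
        places = json.load(f)
    return [p for p in places if p.get("phone") and p.get("website")]
```

The same filter works unchanged on items read from the Dataset, since they share the field names documented above.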

Use cases

| Use case | Description |
| --- | --- |
| Local SEO + competitor mapping | Track category density, ratings, and review counts across neighborhoods to prioritize optimizations and spot gaps. |
| Sales prospecting & lead gen | Build targeted lists of businesses with phones, websites, and locations — a practical Google Maps leads scraper workflow. |
| Market analysis by city/region | Compare categories across multiple cities using grid-based extraction for consistent geographic coverage. |
| Data enrichment pipelines | Merge place_id-based records with your CRM or BI data for deeper segmentation and outreach. |
| Academic & geospatial research | Quantify retail presence, services distribution, and urban patterns using structured latitude/longitude and address fields. |
| URL-driven campaigns | Feed direct Maps search URLs to run reproducible queries for campaign tracking or audits. |
| Review intelligence (opt-in) | Capture top reviews per place to add qualitative insights to quantitative metrics. |

Why choose Google Maps Scraper?

This Google Business Profile scraper is purpose-built for precision, scale, and resilience on Apify infrastructure.

  • ✅ Accurate structured output with address components, ratings, phones, coordinates, and hours
  • 🌍 Grid-based coverage for broad, consistent discovery
  • 🧱 Resilient proxy ladder with residential stickiness after fallback
  • 🧰 Developer-friendly: Python implementation and transparent logs
  • 🔁 Live dataset writes plus final sorted maps.json for safe, repeatable workflows
  • 🧼 Clean deduplication by place_id across sources and grid points
  • 🆚 Built for reliability vs. brittle browser extensions and ad-hoc scripts

In short, it’s a production-grade Google Maps scraping tool that balances scale, stability, and clean output.

Is it legal to scrape Google Maps?

Yes — when used responsibly. This actor collects data from public Google Maps pages and does not access private or authenticated content.

Guidelines:

  • Only extract publicly available information.
  • Comply with local laws and regulations (e.g., GDPR, CCPA) applicable to your use case.
  • Review and respect the platform’s terms; you are responsible for how you use the data.
  • Consult your legal team for edge cases or commercial reuse questions.

Input parameters & output format

Example JSON input

{
  "locations": ["New York"],
  "keywords": ["coffee shops"],
  "urls": [],
  "maxResults": 20,
  "proxyConfiguration": { "useApifyProxy": false }
}

Parameters (from the actor input schema):

  • locations (array)
    • Description: List of location names (e.g., New York, Florida).
    • Required: No
    • Default: Not set (example prefill: ["New York"])
  • keywords (array)
    • Description: Search keywords or user-specified terms (supports bulk).
    • Required: Yes
    • Default: Not set (example prefill: ["coffee shops"])
  • urls (array)
    • Description: Direct Google Maps search URLs (optional, supports bulk).
    • Required: No
    • Default: Not set
  • maxResults (integer)
    • Description: Maximum number of places to return (cap across all searches).
    • Required: No
    • Default: 20 (min: 1, max: 10000)
  • proxyConfiguration (object)
    • Description: Default is direct (no proxy). Actor auto-falls back to datacenter → residential if blocked.
    • Required: No
    • Default: {"useApifyProxy": false}
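
Before launching a run programmatically, it can help to sanity-check the input against the constraints above. A hedged client-side sketch (Apify enforces the real schema server-side; this helper is ours):

```python
def validate_input(run_input):
    """Mirror the documented schema constraints: keywords is required and
    maxResults must be between 1 and 10000. Illustrative only."""
    errors = []
    if not run_input.get("keywords"):
        errors.append("keywords is required")
    max_results = run_input.get("maxResults", 20)
    if not 1 <= max_results <= 10000:
        errors.append("maxResults must be between 1 and 10000")
    return errors

print(validate_input({"keywords": ["coffee shops"]}))  # → []
```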

Advanced options (supported via input JSON, not in the UI schema):

  • sortOrder (string): "relevance" | "rating" | "review_count" | "distance" | "name". Default: "relevance".
  • maxComments (integer): Number of reviews to fetch per place; 0 disables reviews. Default: 0.
  • gridSize (integer): Grid dimension for viewport coverage. Default: 15.
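
The sortOrder options are also easy to reproduce client-side if you post-process the output. A sketch (ours, not the actor's code; "distance" is omitted because it needs a reference coordinate):

```python
def sort_places(places, sort_order="relevance"):
    """Client-side mirror of the sortOrder options. Illustrative only: the
    actor sorts the final maps.json itself; "distance" is skipped here
    because it requires a reference point."""
    if sort_order == "rating":
        return sorted(places, key=lambda p: p.get("avg_rating") or 0, reverse=True)
    if sort_order == "review_count":
        return sorted(places, key=lambda p: p.get("total_reviews") or 0, reverse=True)
    if sort_order == "name":
        return sorted(places, key=lambda p: (p.get("name") or "").lower())
    return list(places)  # "relevance": keep discovery order
```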

Example JSON output (place item pushed to Dataset and stored in maps.json)

{
  "name": "Sample Coffee",
  "website": "https://samplecoffee.com",
  "avg_rating": 4.6,
  "total_reviews": 128,
  "street_address": "123 Main St",
  "city": "New York",
  "state": "NY",
  "zip": "10001",
  "country_code": "US",
  "full_address": "123 Main St New York NY 10001 US",
  "tags": ["Coffee shop"],
  "notes": null,
  "place_id": "abcd1234",
  "phone": "+1 212-555-0100",
  "lat": 40.75,
  "long": -73.99,
  "hours": [],
  "success": true
}

When reviews are enabled (maxComments > 0), the actor also pushes review records per place to the Dataset and augments the corresponding place object in maps.json:

{
  "place_id": "abcd1234",
  "reviews": [
    {
      "author_name": "Jane Doe",
      "author_url": "https://maps.google.com/?cid=...",
      "rating": 5,
      "text": "Great coffee and friendly staff!",
      "time": "1711920000",
      "relative_time": "2 months ago",
      "author_reviews_count": 42,
      "author_photo": "https://lh3.googleusercontent.com/...",
      "likes": 3
    }
  ],
  "review_count": 1
}
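
The augmentation step amounts to a merge on place_id. A minimal sketch of that merge (ours, not the actor's internal code):

```python
def attach_reviews(places, review_records):
    """Embed review records into their place objects by place_id -- a sketch
    of how the final maps.json gains "reviews"/"review_count" fields."""
    by_id = {p["place_id"]: p for p in places}
    for record in review_records:
        place = by_id.get(record["place_id"])
        if place is not None:
            place["reviews"] = record.get("reviews", [])
            place["review_count"] = record.get("review_count", len(place["reviews"]))
    return places
```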

Notes:

  • The final maps.json contains the sorted array of place objects and, if enabled, embedded “reviews” and “review_count” fields per place.
  • Some fields may be missing or null when not present on the public Google Maps page (e.g., website, phone, hours).

You can also use this actor as:
  • 🔎 Google Maps contact info extractor — Focus on phones and websites from public listings
  • 🧭 Google My Business scraper — Capture structured profiles for local SEO audits
  • 📝 Google Maps reviews scraper — Enrich place data with qualitative review content

FAQ

Is there a free trial or starter allowance?

Yes. The actor is offered at $19.99/month plus usage and includes 120 trial minutes in the Apify Store, so you can evaluate performance before upgrading.

Do I need to use the official Google API?

No. This is a Google Maps scraper without API usage. It collects public data from the web experience with built-in proxy handling.

Does it capture reviews?

Yes, optionally. Set maxComments > 0 in the input to enable per-place reviews. Reviews are best-effort and depend on availability and access.

How many results can I fetch per run?

Up to maxResults across all searches in a run (default 20, maximum 10,000). The actor stops early when the cap is reached.

What fields do you extract?

The actor extracts name, website, avg_rating, total_reviews, address components, tags, notes, place_id, phone, lat/long, hours, success, and — when enabled — reviews and review_count.

How does proxy fallback work?

The actor starts with a direct connection. If blocked, it automatically falls back to a datacenter proxy, and then to a residential proxy with sticky mode for the remainder of the run. All transitions are logged.
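
The ladder behaves like the sketch below: each mode is tried in order, and once the run escalates it never drops back. The class, exception, and fetch callable are hypothetical stand-ins, not the actor's internals:

```python
class Blocked(Exception):
    """Raised by the (hypothetical) fetcher when a request is blocked."""

class ProxyLadder:
    """Direct -> datacenter -> residential fallback with stickiness; a sketch
    of the behavior described above, not the actor's real implementation."""

    MODES = ("direct", "datacenter", "residential")

    def __init__(self):
        self._index = 0  # start direct; never moves backwards (sticky)

    def get(self, fetch, url):
        while self._index < len(self.MODES):
            mode = self.MODES[self._index]
            try:
                return mode, fetch(url, mode=mode)
            except Blocked:
                self._index += 1  # escalate for this and all later requests
        raise Blocked("all proxy modes exhausted")
```

Stickiness means that after the first block-driven escalation, subsequent requests go straight to the surviving mode instead of retrying cheaper ones.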

Can I sort results?

Yes. You can sort the final output by relevance, rating, review_count, distance, or name using the sortOrder input (advanced option via JSON).

Is this better than a Chrome extension?

For reliability and scale, yes. Unlike fragile browser extensions, this Google Maps scraping tool runs on Apify infrastructure with automatic proxy fallback, live dataset writes, and final maps.json for reproducible workflows.

Closing CTA / Final thoughts

Google Maps Scraper is built for fast, reliable extraction of structured business data from public Google Maps results. With grid-based coverage, automatic proxy fallback, deduplication by place_id, and optional review capture, it’s ideal for marketers, developers, analysts, and researchers.

Run it from the Apify Console for quick lists, or pass advanced JSON inputs to tailor sorting, grid size, and review depth. Developers can plug the Dataset and maps.json into automation or enrichment pipelines for downstream analysis.

Start extracting smarter local insights, build clean lead lists, and power your Google Maps web scraping service with a robust, production-ready workflow.