Google Maps Scraper (Minimal)
A fast, minimal Google Maps scraping actor built for clean, reliable place data. Give it search terms or URLs, set an optional location or custom area, and get a simple, consistent dataset with the fields you actually need.
Why this actor
Most Maps scrapers return bloated, noisy payloads. This actor focuses on the essentials: title, address, country code, rating, review count, and optional phone/website. It is designed for scale, speed, and predictable output.
What it does
- Searches Google Maps by keywords, URLs, or Place IDs
- Optionally focuses on a specific location or custom GeoJSON area (see the input sketch after this list)
- Extracts place details with a minimal schema
- Handles retries, backoff, and session rotation for stability
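The custom-area option is easiest to see as a concrete input. Below is a minimal sketch; the field names customGeolocation and startUrls are assumptions borrowed from common Apify Maps-scraper inputs, so check this actor's input schema for the exact names it accepts.

```python
# Sketch of a run input that restricts a search to a custom area.
# "customGeolocation" and "startUrls" are assumed field names; verify
# them against this actor's input schema before use.
run_input = {
    "searchStringsArray": ["pharmacy"],
    # GeoJSON Polygon with [longitude, latitude] pairs; the first and
    # last point are identical to close the ring.
    "customGeolocation": {
        "type": "Polygon",
        "coordinates": [[
            [-74.02, 40.70],
            [-73.93, 40.70],
            [-73.93, 40.78],
            [-74.02, 40.78],
            [-74.02, 40.70],
        ]],
    },
    "maxCrawledPlacesPerSearch": 100,
    # Direct search or place URLs can be passed instead of search terms, e.g.:
    # "startUrls": [{"url": "https://www.google.com/maps/search/coffee/"}],
}
```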
Key features
- Minimal, production-friendly output
- Supports multiple search terms in one run
- Works with search URLs or direct place URLs
- Custom geolocation (polygon/multipolygon)
- Proxy and session pool support
Example input
{"searchStringsArray": ["restaurant", "cafe"],"locationQuery": "New York, USA","maxCrawledPlacesPerSearch": 50,"language": "en"}
Output schema (minimal)
Each place record contains only the following fields:
- title
- totalScore
- reviewsCount
- address
- countryCode
- phone (when available)
- website (when available)
- url
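For downstream processing it can help to pin the record shape down in code. The following is a rough Python typing sketch of the schema above; the concrete types are assumptions inferred from the field names rather than taken from the actor's documentation.

```python
from typing import TypedDict

class PlaceRecord(TypedDict, total=False):
    """Minimal place record; all types are assumed from the field names."""
    title: str
    totalScore: float   # average star rating, assumed to be a float
    reviewsCount: int
    address: str
    countryCode: str    # assumed two-letter code, e.g. "US"
    phone: str          # present only when available
    website: str        # present only when available
    url: str            # Google Maps URL of the place
```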
Typical use cases
- Local lead generation
- Location-aware market research
- Store and venue directories
- Competitive intelligence
- Lightweight data enrichment
Notes
- Without geolocation, the Google Maps UI limits results to roughly 120 per search.
- Use proxies for better reliability and higher throughput.
- Increase navigationTimeoutSecs or lower maxConcurrency if you see timeouts (see the input sketch below).
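The notes above map onto a few input fields. A sketch of a more defensive configuration might look like this; navigationTimeoutSecs and maxConcurrency come straight from the notes, while the proxyConfiguration shape follows the usual Apify convention and should be checked against this actor's input schema.

```python
# Defensive settings for flaky networks or heavier runs.
run_input = {
    "searchStringsArray": ["hotel"],
    "locationQuery": "Berlin, Germany",
    "maxCrawledPlacesPerSearch": 200,
    # Route traffic through Apify Proxy; this field shape is an assumption
    # based on the common Apify convention, not this actor's documented schema.
    "proxyConfiguration": {"useApifyProxy": True},
    # Raise the navigation timeout and lower concurrency if runs hit timeouts.
    "navigationTimeoutSecs": 120,
    "maxConcurrency": 5,
}
```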