Google Maps Photos Scraper — Batch places

Pricing

from $3.00 / 1,000 photo pages

Scrape Google Maps photos at scale. Batch input: pass one or many place_ids / data_ids in a single run — no need to configure a separate task per place. Each photo is pushed as its own dataset record stamped with the source place_id. Cheapest photos-only actor on Apify.

Developer

Scrape Badger

Maintained by Community


What does Google Maps Photos Scraper do?

Scrape photos from Google Maps for many places in a single Apify run. This is a photos-only micro-actor that accepts batched place_ids or data_ids.

Why use Google Maps Photos Scraper?

  • Batch input. Paste many place_ids once — no separate task per place.
  • Dedicated photos actor. Every other Apify Maps actor bundles photos with the full place crawl; this one is purpose-built.
  • Cheapest photos-only actor on Apify. $0.30 / 1k photos.
  • Per-place pagination budget. max_pages_per_place (1-50) caps spend.
  • Single-purpose UI. No mode dropdown, no fields you'd never use.

What data can Google Maps Photos Scraper extract?

| Field | Type | Description |
| --- | --- | --- |
| place_id | string | Source place ID |
| url | string | Full-resolution image URL |
| thumbnail | string | Small thumbnail URL |
| width / height | number | Image dimensions |
| author | string | Uploader display name |
| author_url | string | Uploader Google profile |
| uploaded_at | string | ISO 8601 upload timestamp |

How to scrape Google Maps

  1. Click Try for free.
  2. Paste one or many Google Place IDs into Place IDs.
  3. Optional: set gl, hl.
  4. Set max_pages_per_place — each page ≈ 10 photos.
  5. Click Start — photos stream into the dataset, tagged with source place_id.

How much will it cost?

$0.003 per photo page (each page ≈ 10 photos, so ≈ $0.30 per 1,000 photos). Batching 100 places at max_pages_per_place: 3 bills at most 300 photo pages = $0.90.
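The arithmetic above can be sketched as a small budgeting helper. The constant comes from the pricing stated here; the function name and worst-case assumption (every place paginates to the cap) are illustrative, not part of the actor:

```python
PRICE_PER_PAGE = 0.003  # USD per photo page (~10 photos per page)

def estimate_cost(num_places: int, max_pages_per_place: int) -> float:
    """Worst-case run cost: assumes every place paginates up to the cap."""
    return round(num_places * max_pages_per_place * PRICE_PER_PAGE, 2)

print(estimate_cost(100, 3))  # the 100-place example above → 0.9
```

Places with fewer photos stop paginating early, so actual cost can only come in under this estimate.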

Competitor benchmark

| Actor | Author | Price | Notes |
| --- | --- | --- | --- |
| compass/crawler-google-places | Compass | ~$7 / 1k places (photos bundled) | No dedicated photos actor |
| apify/google-maps-scraper | Apify | ~$9 / 1k places | Photos not a dedicated output |
| lukas_krivka/google-maps-with-contact-details | Lukas Krivka | ~$5 / 1k places | Photos not the use case |
| scrape-badger/google-maps-photos-scraper | ScrapeBadger | $0.30 / 1k photos | The only dedicated photos-only actor on Apify |

Input

Configure the run in the Input tab above, or pass a JSON object matching the fields below when calling the Actor via the Apify API.

| Field | Required | Description |
| --- | --- | --- |
| place_ids | ✅ (or data_ids) | One place ID per line, or comma-separated. |
| data_ids | ✅ (or place_ids) | Alternative to place_ids. |
| gl / hl | – | Country and language codes. |
| max_pages_per_place | – | 1-50, default 5. |
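For API callers, the fields above map to a plain JSON object. A minimal sketch in Python, assuming place_ids takes newline-separated IDs as the table describes; both Place IDs and the gl/hl values here are arbitrary example values:

```python
import json

# Hypothetical run input for this actor: two example Place IDs,
# a locale, and a tight pagination cap.
run_input = {
    "place_ids": "ChIJ_3Su08fj5UYRkFfNoiuWQUk\nChIJN1t_tDeuEmsRUsoyG83frY4",
    "gl": "us",
    "hl": "en",
    "max_pages_per_place": 3,
}
print(json.dumps(run_input, indent=2))
```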

Output

Every successful run streams records into the run's dataset. Download as JSON, CSV, XML, Excel, or HTML from the Dataset tab; consume programmatically via the Apify API or webhooks.

Example record:

{
  "place_id": "ChIJ_3Su08fj5UYRkFfNoiuWQUk",
  "url": "https://lh3.googleusercontent.com/…=w4032-h3024",
  "thumbnail": "https://lh3.googleusercontent.com/…=s100",
  "width": 4032,
  "height": 3024,
  "author": "Jane Doe",
  "uploaded_at": "2026-03-10T15:00:00Z"
}

Tips / Advanced options

  • Use with ML pipelines. Photo URLs stay valid for a long while, but not forever: download once and cache the binary.
  • Dedupe on url. Google sometimes returns the same image at different resolutions; dedupe by stripping the =w…-h… query string.
  • Pipe from google-maps-scraper. Search Places → pipe the place_id column here for bulk photo extraction.
  • Thumbnail vs. full image. thumbnail is a 100-px square. Use for UI; use url for full resolution.
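The dedupe tip above can be sketched as a small helper. The regex assumes the size suffix always follows the final "=" in the URL, matching the =w…-h… and =s… forms in the example record; treat it as a sketch, since Google's URL format is not guaranteed stable:

```python
import re

def canonical_url(url: str) -> str:
    """Strip the trailing =w{W}-h{H} / =s{N} size suffix so different
    resolutions of the same image map to one key."""
    return re.sub(r"=[ws]\d+(?:-h\d+)?[^/]*$", "", url)

def dedupe(records: list[dict]) -> list[dict]:
    """Keep the first record per canonical URL, preserving dataset order."""
    seen: set[str] = set()
    out = []
    for rec in records:
        key = canonical_url(rec["url"])
        if key not in seen:
            seen.add(key)
            out.append(rec)
    return out
```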

FAQ, Disclaimers, Support

Do these URLs expire?

They're served by Google's CDN and stay valid for months. For production, download + rehost.

Why no reviews / posts fields?

Those are separate endpoints. Use google-maps-reviews-scraper and google-maps-scraper (List Posts mode).

Can I filter by upload date?

Not at the Google level; a newest-first pagination parameter isn't exposed. Post-filter on uploaded_at instead.
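Since uploaded_at is ISO 8601, the post-filter is a few lines. A sketch (the .replace("Z", "+00:00") shim is only needed on Python versions before 3.11, where fromisoformat rejects a trailing Z):

```python
from datetime import datetime

def uploaded_after(records: list[dict], cutoff_iso: str) -> list[dict]:
    """Keep records whose uploaded_at is at or after the cutoff."""
    def parse(ts: str) -> datetime:
        # fromisoformat() rejects a trailing 'Z' before Python 3.11
        return datetime.fromisoformat(ts.replace("Z", "+00:00"))
    cutoff = parse(cutoff_iso)
    return [r for r in records if parse(r["uploaded_at"]) >= cutoff]
```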

What's the max batch size?

≈ 500-1,000 places fit in a 256 MB run; chunk larger batches across multiple runs.
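Chunking a long ID list into run-sized batches is a one-liner. The default of 500 mirrors the lower bound above and is an assumption, not a hard actor limit:

```python
def chunk_ids(place_ids: list[str], size: int = 500) -> list[list[str]]:
    """Split a long Place ID list into batches, one actor run each."""
    return [place_ids[i:i + size] for i in range(0, len(place_ids), size)]
```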

Disclaimer

This Actor scrapes public Google data only. You're responsible for compliance with Google's Terms of Service and any applicable data-protection laws (GDPR, CCPA, etc.) in your jurisdiction. ScrapeBadger does not store the scraped results — they are delivered directly to your Apify dataset.

Support

Something not working? Open a ticket in the Issues tab above — we triage within one business day. Full API reference: docs.scrapebadger.com.

Powered by

ScrapeBadger — Google-optimised residential proxy pool + browser-farm fallback, 99.7% uptime, unmetered bandwidth. No CAPTCHAs reach you.