Free Google Maps Images Scraper — Bulk Photo URLs from Any Place

A Google Maps photo scraper that returns every image URL Google has on file for any business listing — full direct image URLs, no thumbnail proxies, no lh3-googleusercontent.com redirects to resolve. Pass any canonical Google Maps place URL and get back up to 200 photos per place at $0.0008 each. No Google Places API photo references (which expire after 1 hour and bill $7-$70 per 1000), no headless browser, no manual right-click-and-save loop.

What you get

  • Direct image URLs at lh5.googleusercontent.com/p/<id>=w<width>-h<height> — pasteable into a browser, downloadable with curl, no expiring tokens
  • Image dimensions when Google embeds them in the URL (most images come in w408-h408, w1080-h720, or =s10000 variants you can resize on the fly by editing the suffix)
  • Place name and canonical URL echoed back so you can join photos back to their place rows
  • photos_count_returned — the actual count delivered for this run, useful for budget reconciliation
  • Photo ordering matches Google's relevance sort — the first photos in the array are the ones Google deems most representative of the place
  • Mix of merchant-uploaded photos (the cover shot, interior, exterior, menu, etc.) and customer-uploaded photos (food shots, room photos, signage)

Why scrape Google Maps photos

Google Places API "Place Photo" requests bill $7 per 1000 for the basic tier and $70 per 1000 for maxHeightPx > 1600 — and every photo URL it returns is a photoreference token that expires after one hour and resolves to a single fixed-size image you can't easily re-derive at a different resolution. For real estate listing tools, AI computer-vision training pipelines, hotel-content aggregators, food-delivery apps, and travel inspiration sites that need durable, high-res image URLs at scale, the official Photos API is both expensive and operationally awkward (the 1-hour token expiry forces re-fetching every time you render the same photo).

This actor solves that by hitting Google Maps' internal /maps/preview/place AJAX endpoint with curl_cffi (Chrome TLS impersonation) plus fresh anonymous Google session cookies from a managed minter pool. One call returns the embedded photo URLs Google's web client renders for the place panel — typically 30-150 photos per popular venue — for $0.0008 per image, or $0.80 per 1000. That's roughly 9× cheaper than Google Places API (Basic) and 88× cheaper than the high-res tier. The URLs are direct lh5.googleusercontent.com/p/... paths, which Google serves without expiry as long as the photo remains attached to the listing.

Concrete buyer math: a real-estate listings site that wants 30 photos per property × 5,000 properties pays 150,000 × $0.0008 = $120 once for a permanent image set — versus Google Photos API at $1,050 (basic) or $10,500 (high-res) plus the operational overhead of re-fetching every URL hourly. A computer-vision team training a "what does the inside of a coffee shop look like" classifier on 10,000 venues × 20 photos = 200,000 images pays $160.
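
That arithmetic is easy to sanity-check in a few lines of Python; the per-photo prices below are the figures quoted above, folded into a tiny cost helper:

```python
PRICE_PER_PHOTO = 0.0008   # this actor, pay-per-event
PLACES_BASIC = 7 / 1000    # Google Places API Photo, basic tier, per photo
PLACES_HIRES = 70 / 1000   # Google Places API Photo, high-res tier, per photo

def run_cost(places: int, photos_per_place: int, price_per_photo: float) -> float:
    """Total cost in USD for a full scraping run."""
    return places * photos_per_place * price_per_photo

# Real-estate example: 5,000 properties x 30 photos each
print(round(run_cost(5_000, 30, PRICE_PER_PHOTO), 2))  # 120.0
print(round(run_cost(5_000, 30, PLACES_BASIC), 2))     # 1050.0
print(round(run_cost(5_000, 30, PLACES_HIRES), 2))     # 10500.0
```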

Input

| Field | Default | Description |
| --- | --- | --- |
| `place_urls` | required | Array of canonical Google Maps place URLs (`https://www.google.com/maps/place/<Name>/@<lat>,<lng>,17z/`) |
| `max_photos_per_place` | 30 | Cap photos per place URL (1-200) |
| `use_cookies` | true | Use a fresh anonymous Google session — bypasses the EU consent banner and reduces CAPTCHA at scale |

The actor extracts the FID, lat/lng, and place ID from the URL itself — you don't need to pass any extra metadata. Both short URLs (/maps/place/<Name>/) and long ones with /data=!4m... parameters are accepted.

Output

```json
{
  "place_name": "Eiffel Tower",
  "place_url": "https://www.google.com/maps/place/Eiffel+Tower/@48.8583701,2.2922926,17z/",
  "photos_count_returned": 30,
  "photos": [
    {
      "url": "https://lh5.googleusercontent.com/p/AF1QipPZ8j2Yk_h8...=w408-h540-k-no",
      "width": 408,
      "height": 540
    },
    {
      "url": "https://lh5.googleusercontent.com/p/AF1QipNgQrL2x4M9Z...=w1080-h720-k-no",
      "width": 1080,
      "height": 720
    },
    {
      "url": "https://lh5.googleusercontent.com/p/AF1QipMzC8t5rAv...=s10000",
      "width": null,
      "height": null
    }
  ]
}
```

The actor returns one record per place (not one per photo), with the photos bundled into a photos[] array. The =w408-h540 suffix on each URL is editable — change it to =w1920-h1080 or =s10000 (max-dim) to fetch the same image at a different resolution from the same googleusercontent host.

Use cases

Real-estate marketplace seeding listings with hero photos. You launch a Lisbon short-term rental aggregator and need 20 hero photos per apartment for 3,000 listings. Pull each apartment's Google Maps URL, run the actor with max_photos_per_place=20, and you have 60,000 direct image URLs for $48. Resize on the fly by editing the =w408 suffix to =w1080 — no re-scraping, no re-billing.

Food-delivery app populating restaurant gallery cards. Your app shows 8 photos per restaurant on the menu screen. Across 12,000 restaurants in 50 cities, that's 96,000 photos for $76.80. Through Places API: $672 (basic) or $6,720 (high-res), plus you'd need to re-fetch every URL hourly because of the 1-hour token expiry.

Computer-vision team training scene classifiers. You're training a model that classifies "is this a hotel lobby" vs. "restaurant interior" vs. "retail storefront". Pull 1,000 places per category × 30 photos = 30,000 photos per category, at $24 per category. Cheaper than one engineer-hour of API debugging.

Travel content site building "Top 50 attractions in Tokyo" galleries. You need a curated gallery of 30 photos per attraction across 50 attractions per article × 100 articles = 150,000 photos for $120. Each URL is durable (no expiry), so you can cache and CDN them indefinitely.

How it compares

| Actor | Price per 1000 photos | URL durability | Resize-on-the-fly | Max per place |
| --- | --- | --- | --- | --- |
| This actor (`s-r/free-google-maps-images-scraper`) | $0.80 | Durable (no expiry) | Yes (edit the `=w...` suffix) | 200 |
| `slash-scrape/advanced-google-maps-photos-scraper` (#3 on the "google maps photo scraper" SERP) | Per-result, similar tier | Durable | Yes | Varies |
| Outscraper Google Maps Photos (#1 on the SERP) | Flat-rate subscription | Durable | Yes | Up to 1000 |
| Google Places API Photo (basic, ≤1600px) | $7 + 1-hour expiry | 1-hour token expiry | Partial (size fixed at request time) | Unlimited (per-request billing) |
| Google Places API Photo (high-res, >1600px) | $70 + 1-hour expiry | 1-hour token expiry | Partial | Unlimited |

The competitive edge here is the price floor combined with URL durability — Google's lh5.googleusercontent.com/p/<id> URLs work indefinitely as long as the photo is still attached to the listing, so you can cache them in your own CDN without re-scraping. Google's official Photo API tokens expire in 1 hour, which forces every render to round-trip through the API.

Pricing

This actor uses Apify's pay-per-event monetization at $0.0008 per photo returned, i.e. $0.80 per 1000 photos. You only pay for results you receive: no actor-start fee, no per-compute-unit charges, no charge for empty queries.

Limits and gotchas

  • Maximum 200 photos per place per run — most popular venues have 100-300 photos; for the long tail (the Eiffel Tower has thousands), 200 covers the most-relevant set Google's panel renders by default
  • Photo dimensions are only available when Google embeds them in the URL suffix (~85% of cases); for the rest, the width/height fields are null and you can probe by editing =s10000 to get the max available
  • The order of photos matches Google's relevance sort — the first photos are merchant-uploaded covers; deeper indices are mostly customer-uploaded
  • Apify residential proxy is not required — the cookie pool plus curl_cffi's Chrome TLS fingerprint is enough for sustained throughput
  • Cold-start time is ~3-5 seconds for the first place; subsequent places in the same run reuse the session pool
  • The place_url must be the canonical Maps URL with the FID encoded; shortlinks like g.co/... need to be resolved upstream
  • We return URLs only — the actor does not download the image bytes, so you're not paying Apify storage fees for the photo content. Download is on your side via plain curl or httpx
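
Since download is on your side, here is a minimal sketch using only the standard library. `plan_downloads` is a hypothetical helper (not part of the actor) that maps one output record to (url, local path) pairs; the fetch itself is a plain unauthenticated GET:

```python
import pathlib
import urllib.request

def plan_downloads(record: dict, out_dir: str = "photos") -> list[tuple[str, pathlib.Path]]:
    """Map one actor output record to (url, destination path) pairs."""
    dest = pathlib.Path(out_dir) / record["place_name"].replace(" ", "_")
    return [(photo["url"], dest / f"{i:04d}.jpg")
            for i, photo in enumerate(record["photos"])]

def download_place_photos(record: dict, out_dir: str = "photos") -> None:
    """Fetch every photo in a record to disk; no token or API key needed."""
    for url, path in plan_downloads(record, out_dir):
        path.parent.mkdir(parents=True, exist_ok=True)
        urllib.request.urlretrieve(url, path)

# Usage: call download_place_photos(record) for each record
# in the dataset exported from your run.
```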

FAQ

How do I download photos from Google Maps in bulk? Pass any canonical Google Maps place URL to this actor. It returns up to 200 direct image URLs per place at lh5.googleusercontent.com/p/<id>=w<width>-h<height> paths — pasteable into a browser, downloadable with curl <url> -o photo.jpg, no Google Photos API key needed.

Can I use Google Maps photos commercially? Photos on Google Maps are uploaded by contributors who retain rights in their images under Google's Maps user-content terms. Republishing a photo in a paid product is ultimately the photographer's call, so check the photo's contributor attribution and secure permission before commercial reuse. This actor returns the URL only; it does not transfer copyright.

What's the cost to scrape 10,000 Google Maps photos? 10,000 × $0.0008 = $8. For comparison: Google Places API Photo (basic) costs $70 for the same 10,000 photos, and the URLs expire after 1 hour each.

Can I get high-resolution images? Yes. Each URL ends in =w408-h540-k-no or similar — change the w and h numbers (or use =s10000 for max-dim) and Google's lh5.googleusercontent.com host serves the same photo at the new resolution. No re-scraping needed.

Will Google rate-limit my scraping? At default settings (fresh session cookies, no proxy) the actor sustains ~10 places/second without throttling. For sustained >50 places/sec throughput, supply a residential proxy URL via the PROXY_URL env var.