Google Maps Scraper

compass/crawler-google-places
Extract data from hundreds of Google Maps locations and businesses. Get Google Maps data including reviews, images, contact info, opening hours, location, popular times, prices & more. Export scraped data, run the scraper via API, schedule and monitor runs, or integrate with other tools.


My crawler keeps on adding more requests

Open

conservative_hamster opened this issue
a month ago

The crawler keeps on adding more requests, so it looks like it will scrape indefinitely. I'm trying to scrape all of New York (the state) and have around 18k results now, but new results come in more and more slowly while costs keep rising.

It says "Crawled 19888/7388 pages", but the last number keeps growing every hour.

What to do?


Hi, thanks for your question. The crawler created 15k requests at the beginning to perform the search based on your polygon and "deeper city scrape" (example of a search request). Then, for each place found, the Actor creates an additional request to scrape the place detail page (example of a detail page request). So in the end the crawler should perform around 36k requests.
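To make the arithmetic concrete, here is a quick sketch in Python. The 15k and 18k figures come from this thread; the one-detail-request-per-place rule is the behaviour described above:

```python
# Rough request arithmetic for this run, using the figures above.
search_requests = 15_000   # created up front from the polygon + deeper city scrape
places_found = 18_000      # results scraped so far

# Each found place costs one extra request for its detail page,
# so the total grows with the number of unique places discovered.
detail_requests = places_found
total_requests = search_requests + detail_requests
print(total_requests)  # 33000 so far, heading towards the ~36k estimate
```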

You can continue the run, but the cost per result will increase. At this point, most of the places that the crawler gets from Google Maps are duplicates (places that it has already found) and so they are discarded.
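The rising cost per result can be pictured with a simple seen-set, which is roughly what deduplication amounts to. This is an illustrative sketch, not the Actor's actual code:

```python
def crawl_batches(batches):
    """Count how many results in each batch of place IDs are new (not duplicates)."""
    seen = set()          # place IDs already scraped
    new_per_batch = []
    for batch in batches:
        new = [pid for pid in batch if pid not in seen]
        seen.update(new)
        new_per_batch.append(len(new))
    return new_per_batch

# Later search areas overlap earlier ones, so new results taper off
# while the request cost per batch stays the same.
print(crawl_batches([[1, 2, 3], [3, 4], [4, 5, 1], [5, 1, 2]]))
# -> [3, 1, 1, 0]
```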

The "Crawled 19888/7388 pages" is probably a bug caused by the restart of the crawler, we'll take a look at it.


wearable_acacia

25 days ago

As a new user, I have a similar question.

I'm crawling "Thüringen, Deutschland" with deeper city scrape and it came up with 22k requests. I want to crawl Germany in full, ideally in parallel, but a 98% result is fine for me.

Can I somehow limit the number of requests for each run, say to 5k? I'm afraid of running out of budget and paying most of it for finding duplicates towards the end of the run while I'm asleep.

Any advice for a noob?


Hi, unfortunately at the moment there's no way to limit the number of requests other than turning off deeper city scrape. We are, however, working on an optimization so that the Actor "knows" it should stop when it is finding only duplicates.

You may want to check out Google Maps Extractor - it's a faster, pay-per-result (PPR) version of this one, although it has some limitations (e.g. it doesn't scrape reviews or images).


wearable_acacia

25 days ago

Thanks, something like that would be great. Or, more generally, a $ limit per run could solve that too. I found the expected number of results per region and I'm using that +5% now as a result limit. We'll see.
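A result limit like this can be set in the Actor input when starting a run via the API. Below is a minimal sketch using the `apify-client` Python package; the input field names (`searchStringsArray`, `locationQuery`, `maxCrawledPlaces`) are my assumption of this Actor's input schema, so check them against the current schema before relying on this:

```python
# pip install apify-client
# from apify_client import ApifyClient

EXPECTED_RESULTS = 22_000  # expected places for the region, looked up beforehand

run_input = {
    "searchStringsArray": ["restaurant"],          # example search term
    "locationQuery": "Thüringen, Deutschland",
    # Stop once the result count reaches expected results + 5%,
    # instead of burning budget on duplicates overnight.
    "maxCrawledPlaces": int(EXPECTED_RESULTS * 1.05),
}
print(run_input["maxCrawledPlaces"])  # 23100

# client = ApifyClient("<APIFY_TOKEN>")
# run = client.actor("compass/crawler-google-places").call(run_input=run_input)
```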

Developer
Maintained by Apify
Actor metrics
  • 4.3k monthly users
  • 95.4% runs succeeded
  • 1.7 days response time
  • Created in Nov 2018
  • Modified about 19 hours ago