Google Maps Reviews: Reliable, Faster, Cheaper


Developed by

Agents

Maintained by Community

Extract valuable business reviews from Google Maps at high speed. Our cost-effective scraper delivers comprehensive review data, ratings, and customer sentiment to fuel your business intelligence. Monitor brand reputation, analyze competitors, and make data-driven decisions with unmatched reliability.

Rating: 0.0 (0)

Pricing: Pay per event

Total users: 63

Monthly users: 51

Runs succeeded: >99%

Issues response: 12 hours

Last modified: 12 hours ago

Occasional delay which is critical in my project

Open

james4u opened this issue 5 days ago

Hello team, I am currently using the Node.js SDK to integrate the Apify service. I fetch 300–600 reviews at once and sometimes get a huge delay. Why does this happen, and what can I do (or what can you do) to remove this occasional delay?

james4u

5 days ago

120s is the timeout in my case. Sometimes it takes less than 70s to grab 500 reviews, and sometimes 165s to grab 351 reviews. I am not sure whether the review count affects response time.
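One way to cope with runs that occasionally exceed a deadline is a client-side timeout guard, so a slow call fails fast and can be retried instead of hanging. This is only a sketch of the pattern, assuming the SDK call returns a promise; `fetchReviews` here is a hypothetical stand-in for the reporter's actual call, not part of any real API.

```javascript
// Sketch: race a long-running call against a client-side deadline.
// `fetchReviews` is a hypothetical stand-in for the SDK call.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`timed out after ${ms} ms`)), ms);
  });
  // Whichever settles first wins; always clear the pending timer afterwards.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Stand-in for a review fetch that normally finishes well under the limit.
const fetchReviews = () =>
  new Promise((resolve) => setTimeout(() => resolve([{ rating: 5 }]), 50));

withTimeout(fetchReviews(), 120000).then((reviews) => {
  console.log(`fetched ${reviews.length} review(s)`); // prints "fetched 1 review(s)"
});
```

A rejected timeout only abandons the client-side wait; the underlying run may still be executing on the server, so pair this with whatever run-abort mechanism the service offers if cost matters.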

Agents (agents)

4 days ago

Hey there,

Thank you for reaching out. Could you please share your Run ID so that the team can investigate in detail?

Best

james4u

3 days ago

where can I get Run ID?

james4u

3 days ago

More seriously, I am getting the following error:

Agents (agents)

3 days ago

Hello,

You can find all your previous runs under the Runs tab, and each run has a Run ID. Please share the ID of the problematic run with us so we can investigate.

Cheers

Agents (agents)

3 days ago

Hello,

One more thing: EAI_AGAIN is a DNS lookup timeout error, which means it is a network connectivity or proxy-related error, so it does not appear to be an issue with the scraper itself.
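Since EAI_AGAIN is a transient network-level failure, a common client-side remedy is to retry the call with exponential backoff. The sketch below assumes the failure surfaces as a Node.js error object carrying `code: 'EAI_AGAIN'` (the usual shape for `getaddrinfo` failures); the `flaky` function is a hypothetical stand-in that simulates two DNS failures before succeeding.

```javascript
// Sketch: retry a call when it fails with a transient DNS error (EAI_AGAIN).
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function retryOnDnsError(fn, { attempts = 3, baseDelayMs = 500 } = {}) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      // Only retry transient DNS failures; rethrow anything else,
      // and rethrow on the final attempt.
      if (err.code !== 'EAI_AGAIN' || i === attempts - 1) throw err;
      await sleep(baseDelayMs * 2 ** i); // 500 ms, 1 s, 2 s, ...
    }
  }
}

// Hypothetical stand-in: fails with EAI_AGAIN twice, then succeeds.
let calls = 0;
const flaky = async () => {
  calls += 1;
  if (calls < 3) {
    throw Object.assign(new Error('getaddrinfo EAI_AGAIN'), { code: 'EAI_AGAIN' });
  }
  return 'connected';
};

retryOnDnsError(flaky, { baseDelayMs: 10 }).then((r) => console.log(r)); // prints "connected"
```

Checking `err.code` rather than the message text keeps the retry narrow: unrelated errors (bad input, auth failures) still surface immediately instead of being masked by retries.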

Best