Google Search Results (SERP) Scraper

Pricing

$0.50 / 1,000 results


Developed by

ScraperLink


Maintained by Community

🔥 Only $0.50 per 1,000 search results pages 🔥 **CHEAPEST** Google Search Results (SERP) Scraper with real-time SERP data and support for multiple countries.

5.0 (1)


Total users: 511

Monthly users: 288

Runs succeeded: >99%

Issues response: 12 hours

Last modified: 22 days ago


Lower Number of Search Results?

Closed

matm opened this issue
a month ago

I realized my calls to this actor are pulling too much data at once, which is leading to RAM problems on my machine. Is it possible to request a batch of just 1 or 3 results from the Google SERP?

The minimum I could see is 10.

scraperlink

Hi @matm — thanks for raising this!

Just to clarify, the actor currently fetches full Google SERP pages, which typically include 10 results per page (with the option to fetch more, but not less). There’s no built-in way to fetch just 1 or 3 results, since that’s how Google structures the results it returns.

Also worth noting: billing is based on 1,000 search result pages, not individual results. So whether the page includes 3 or 10 results, it still counts as just one page — and all the processing is handled on our end, not yours.

If you're running into local memory issues, it might help to know that the results are returned as plain JSON — so unless you're running massive batches or holding everything in memory at once, it generally shouldn’t be a heavy load.

That said, if you’re working with large result sets and need to trim them down before further processing, I’d recommend slicing the results on your end as a quick workaround. If you’d like help optimizing that part of your script, feel free to share a snippet — happy to take a quick look!
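For instance, assuming the actor returns a JSON object with a top-level "results" list (the exact field names in your response may differ — this is just an illustration, not the documented schema), trimming a page down before further processing is a one-liner:

```python
import json

# Hypothetical example: `raw` stands in for the JSON string returned by the
# actor. The "results" key is an assumption for illustration purposes.
raw = '{"results": [{"title": "A"}, {"title": "B"}, {"title": "C"}, {"title": "D"}]}'

page = json.loads(raw)
top_three = page["results"][:3]  # keep only the first 3 results from the page
print(len(top_three))  # → 3
```

Since the full page has already been fetched and billed as one page, slicing like this costs nothing extra; it only reduces what your own script has to hold in memory afterward.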

Let me know how it goes.


matm

a month ago

Thanks for the prompt response. I found my way around this now as I'm getting used to this actor. I'm indeed working with a large number of datasets in sheets and calling this API multiple times.

Best,

scraperlink

Hi @matm — glad to hear!

Since you’re working with large datasets and calling the API multiple times, one approach that might help is separating the data fetching from the processing. For example, you could make your API calls in batches, store the raw results locally (e.g., in JSON files or a lightweight database), and then run your main script separately to process those results. That way, you're not holding everything in memory at once, and it’ll be easier to debug or resume if something goes wrong.
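To sketch that two-stage pattern: the script below fetches in a first pass, caches each raw page to its own JSON file, and then processes the files one at a time in a second pass. Note that `fetch_serp_page` is a hypothetical stand-in for your actual actor call, and the one-file-per-query layout is just one way to cache the raw pages:

```python
import json
from pathlib import Path

def fetch_serp_page(query: str) -> dict:
    # Stand-in for the real API call to the actor (hypothetical).
    return {"query": query, "results": [{"title": f"Result for {query}"}]}

def fetch_stage(queries, out_dir="serp_cache"):
    """Fetch each query and write the raw JSON to disk, one file per query."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for i, q in enumerate(queries):
        page = fetch_serp_page(q)
        (out / f"page_{i}.json").write_text(json.dumps(page))

def process_stage(out_dir="serp_cache"):
    """Load cached pages one at a time so memory use stays flat."""
    titles = []
    for path in sorted(Path(out_dir).glob("page_*.json")):
        page = json.loads(path.read_text())
        titles.extend(r["title"] for r in page["results"])
    return titles

fetch_stage(["apify", "serp scraping"])
print(process_stage())  # → ['Result for apify', 'Result for serp scraping']
```

Because the raw pages persist on disk, a crash or bad run in the processing step doesn't force you to re-fetch (and re-pay for) anything — you just rerun the second stage.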

I'll go ahead and close out this issue for now.