
Web Scraper
Pricing
Pay per usage

Crawls arbitrary websites using a web browser and extracts structured data from web pages using a provided JavaScript function. The Actor supports both recursive crawling and lists of URLs, and automatically manages concurrency for maximum performance.
Rating: 4.4 (22)
762
Total users: 85K
Monthly users: 4.5K
Runs succeeded: >99%
Issues response: 31 days
Last modified: 15 hours ago
The parameter "Max result records (optional)" stops the run at N-1
Closed
I have tried many times, and I think there is a bug. When I set this parameter to 10 (meaning: I want the job to stop once the number of saved results reaches 10), the job stops at 9 and I get only 9 items in the dataset!
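For reference, the run input looked roughly like this (the key name is my guess at what the "Max result records (optional)" UI field maps to; the start URL is just an example):

```javascript
// Approximate run input; maxResultRecords is my assumed key for the
// "Max result records (optional)" field in the UI.
const input = {
    startUrls: [{ url: 'https://example.com' }],
    maxResultRecords: 10, // I expect the run to stop after 10 saved items
};
```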
Hello RedabenhAKO and thank you for your input!
You're right: this is a bug. The "Max result records" option also counts cases where the Page Function does not return any dataset items. In your case, this happens in the handleStart() branch of the function, which is why the Actor stops at 9 instead of 10.
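To illustrate, a Page Function along these lines reproduces the behavior (the handleStart/detail branching and the field names are assumptions based on your description, not your exact code):

```javascript
// Hypothetical sketch of a Page Function like the one described.
// The start branch returns nothing (undefined), yet the run still
// counts that return toward the "Max result records" limit.
async function pageFunction(context) {
    const { request } = context;

    if (request.userData.label === 'START') {
        // Detail pages would be enqueued here; no dataset item is
        // returned from this branch...
        return; // ...but this undefined return is still counted.
    }

    // Detail pages return a real dataset item.
    return {
        url: request.url,
        title: context.jQuery('title').text(),
    };
}
```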
We've already created a tracking issue for this (GitHub Issue #353), and I'm currently collecting feedback from my team. I'll let you know if there are any developments.
I appreciate your patience, and thanks again for reporting this!
RedabenhAKO
Thanks for your answer
Hello, and apologies for the delayed answer.
Unfortunately, I'll have to close this issue as wontfix. Even an undefined Page Function return creates a dataset record with #debug and #error keys. See my GitHub PR for more details. Technically, those count as dataset items (because they really are dataset items), so the Max result records limit is actually correct.
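If those extra records are a problem downstream, one workaround is to filter them out after the run. A minimal sketch (the exact record shapes are assumptions based on the #debug and #error keys mentioned above):

```javascript
// Hypothetical dataset contents: real results mixed with the
// debug/error record that an undefined Page Function return produces.
const items = [
    { url: 'https://example.com/1', title: 'Item 1' },
    { '#debug': { requestId: 'abc' }, '#error': true },
    { url: 'https://example.com/2', title: 'Item 2' },
];

// Keep only records that are not marked as debug/error output.
const results = items.filter(
    (item) => !('#error' in item) && !('#debug' in item),
);

console.log(results.length); // 2
```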
My apologies for the wait. I'll close this issue now, but feel free to ask additional questions if you have any. Cheers!