
Reddit Scraper Lite
This Actor is paid per event

A pay-per-result, unlimited Reddit web scraper that crawls posts, comments, communities, and users without logging in. Limit the scrape by number of posts or items and export all data as a dataset in multiple formats.
Why do I get some fake filler results?
To reach the minimum of 10 results, I understand that the scraper pads the output with fake objects like:
{ "dataType": "fake", "body": "Filler result to achive minimum 10 items for Pay per Result run" }
I am curious why this is the case?

This is to cover the initialization cost of the scraper since it takes some time for the first results to be returned.
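In the meantime, downstream code can simply drop those filler objects before using the data. Here is a minimal sketch, assuming the apify-client Python package and a dataset ID from a finished run; the token and dataset ID placeholders are hypothetical:

from apify_client import ApifyClient

client = ApifyClient("<YOUR_APIFY_TOKEN>")  # hypothetical token placeholder

def real_items(dataset_id: str) -> list[dict]:
    # Fetch all dataset items and skip the padding objects,
    # identified by the "dataType": "fake" field shown above.
    items = client.dataset(dataset_id).list_items().items
    return [item for item in items if item.get("dataType") != "fake"]

posts = real_items("<DATASET_ID>")  # hypothetical dataset ID placeholder
print(f"{len(posts)} real items after dropping filler results")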
dario.agosta
That's understandable, but (friendly advice) talk to Apify about a pricing structure with a set minimum charge per run instead. Getting rows of
{ "dataType": "fake", "body": "Filler result to achive minimum 10 items for Pay per Result run" },
honestly does not look good, and in the interest of transparency I think it would hardly fly with the marketplace.

I agree 100%, dario.

Implemented a new payment model that fixes this issue.
Actor Metrics
436 monthly users
92 bookmarks
92% runs succeeded
22 hours response time
Created in Jun 2020
Modified 3 days ago