
Reddit Scraper
1 day trial then $45.00/month - No credit card required now

Unlimited Reddit web scraper that crawls posts, comments, communities, and users without login. Limit scraping by the number of posts or items, and export all data as a dataset in multiple formats.
Getting multiple duplicates for the same comment
I have scraped 20k items, mostly comments. During data cleaning, I noticed that only 1k of these items are unique; the other 19k are duplicates of those unique items. Some comments appeared more than 100 times! How can I avoid this issue?
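Until the duplication is fixed on the actor's side, duplicates can be filtered out during data cleaning by keying on each item's unique identifier. A minimal sketch, assuming each scraped item is a dict with an `id` field (the actual field name may differ in your dataset, so check your export's schema):

```python
def deduplicate(items, key="id"):
    """Keep only the first occurrence of each item, keyed on `key`."""
    seen = set()
    unique = []
    for item in items:
        k = item.get(key)
        if k not in seen:
            seen.add(k)
            unique.append(item)
    return unique

# Example: three items where the first comment appears twice.
items = [
    {"id": "c1", "body": "first comment"},
    {"id": "c2", "body": "second comment"},
    {"id": "c1", "body": "first comment"},
]
print(len(deduplicate(items)))  # 2
```

This preserves the original order of items and keeps the first copy of each duplicate, which is usually what you want when later copies are identical.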

Can you share your run ID so I can take a look at what is happening?
ons_kharrat
Hey! My run ID is V9An8uU7tUqWgw58R. I have also run the actor again to get more data; I will let you know if I face any issues with the new results.
ons_kharrat
Hey, I have run into another issue with my new scrape: it is collecting comments but not the posts those comments belong to. Is there a way to avoid this?

I have fixed the duplication issue.
Actor Metrics
360 monthly users
82 bookmarks
>99% runs succeeded
4.4 days response time
Created in Feb 2022
Modified 2 days ago