Reddit Scraper
1 day trial then $45.00/month - No credit card required now
Unlimited Reddit web scraper to crawl posts, comments, communities, and users without login. Limit web scraping by number of posts or items and extract all data in a dataset in multiple formats.
I'm trying to scrape the following url:
https://www.reddit.com/r/autism/comments/164ggvt/i_havent_told_my_daughter_that_she_has_autism/
It's not pulling all of the data; it tops out at 257 of the approximately 540 comments.
Also, I tried it three times to no avail.
Can you share the Run ID with me?
I have found the issue and I am working on a solution.
Can you try it again?
Hi Gustavo, sorry for the delay, and thanks for the speedy response. The issue now is that it returned 1001 comments; I think it's separating them, but it is still incomplete. Also, it's saying that I have run out of credits; it's a big task, and trying it four times has, I think, used a lot of credits. I'm looking for an Excel spreadsheet. I'll try to send you the run IDs.
No. 1: 9ecrcz45rta8ut8Ix
No. 2: o8BvmrPQUySKjITLu
No. 3: qny0H1HtL0EPQhAyI
No. 4: Qp1PAe7Ehe5ikXTSS
Thanks :)
Just to note that https://www.reddit.com/r/autism/comments/164ggvt/i_havent_told_my_daughter_that_she_has_autism/ has a total of 540 comments
You are limiting the maximum number of results to 1000 via the maxItems property in your input. You should increase it to be able to get more results.
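For reference, a minimal sketch of what raising that cap might look like. Only the maxItems property is confirmed by the conversation above; the startUrls key, the actor ID, and the apify-client call are assumptions, so check the actor's input schema before relying on them.

```python
# Sketch of a run input with a higher maxItems cap.
# "startUrls" and the actor/token placeholders below are assumptions;
# only "maxItems" is confirmed by the conversation above.
run_input = {
    "startUrls": [
        {"url": "https://www.reddit.com/r/autism/comments/164ggvt/"
                "i_havent_told_my_daughter_that_she_has_autism/"}
    ],
    "maxItems": 5000,  # raised well above the thread's ~540 comments
}

# Starting the run via the apify-client Python package would then look
# roughly like this (commented out because it needs a real token/actor):
# from apify_client import ApifyClient
# client = ApifyClient("<APIFY_TOKEN>")
# run = client.actor("<ACTOR_ID>").call(run_input=run_input)
```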
But there are only 540 comments. Can you try to run it and see if it returns the correct results on your end? Apologies, I'm not sure how troubleshooting works; I'm new to this, and yours is the only tool I've used. Thank you so much, by the way, it's been great!
Or is there any way you could attach the Excel file output in this chat? It's for a piece of research, one of 11 (the others were successful).
In this file from the most recent attempt, the data is not in order and there are 5000 rows instead of 540; looking at the cells, I can see that rows are being repeated.
My run got 530 results
Can you share the run ID with 5000 rows?
xguPeBuAm0c9T7VZR
Here you go
Could you send me your excel spreadsheet for the 530, though the page says 540? Thanks a mil
I ran it two times with the same number of results.
Can you confirm that you received the spreadsheet? The Apify page seems to have a bug and it is not displaying the attached file to me.
From the logs of this run (xguPeBuAm0c9T7VZR), it seems the actor is storing results so fast that it hit Apify's rate limit when it tried to store more than 200 results per second. That has never happened to me, which is very weird. This may be a bug on Apify's side; if it is something related to the actor, I should be able to replicate the issue here.
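When a storage API enforces a per-second write limit like this, the usual mitigation is to retry the write with exponential backoff. A minimal sketch of that pattern, not the actor's actual code (the error type and timings are illustrative):

```python
import random
import time

def push_with_backoff(push_batch, items, max_retries=5):
    """Try to store a batch; on a rate-limit error, wait and retry
    with exponential backoff plus a little jitter."""
    for attempt in range(max_retries):
        try:
            return push_batch(items)
        except RuntimeError:  # stand-in for a 429 rate-limit response
            time.sleep((2 ** attempt) * 0.01 + random.random() * 0.01)
    raise RuntimeError("still rate limited after retries")

# Fake storage backend that rejects the first two attempts.
calls = {"n": 0}
def flaky(items):
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("429 Too Many Requests")
    return len(items)

print(push_with_backoff(flaky, [1, 2, 3]))  # -> 3
```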
Hi, no I didn't receive it, I noticed a similar bug when I tried to send to you.
my email address is: ismaelihayden@gmail.com
Thank you so much!
Hello, I wonder, has there been any movement on this? I didn't receive an email, in case you sent it.
Thanks a mil!
Ismael
I have sent it with the subject "Reddit actor results". Can you check your spam folder?
I have found the issue with your run and created a fix for it. But it still returns 530 results. Regarding the order, since the actor performs parallel requests to increase performance, the order is not guaranteed.
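Since parallel fetching scrambles the order, the exported rows can be re-sorted afterwards on a timestamp column. A minimal sketch, assuming each row has a "createdAt" field in ISO 8601 format (the field name is an assumption):

```python
# Unordered rows as they might come out of a parallel scrape.
comments = [
    {"id": "c3", "createdAt": "2023-08-29T12:05:00Z"},
    {"id": "c1", "createdAt": "2023-08-29T11:58:00Z"},
    {"id": "c2", "createdAt": "2023-08-29T12:01:00Z"},
]

# ISO 8601 timestamps sort correctly as plain strings.
ordered = sorted(comments, key=lambda c: c["createdAt"])
print([c["id"] for c in ordered])  # -> ['c1', 'c2', 'c3']
```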
Ok, that's understood; so long as everything's there, I can go through and see. So will it work if I run it, or can you send me an Excel file? Best wishes, Ismael
You can run it now.
- 287 monthly users
- 99.1% runs succeeded
- 0.5 days response time
- Created in Feb 2022
- Modified 4 days ago