
🏯 Tweet Scraper V2 ($0.4 / 1K tweets) - X / Twitter Scraper
⚡️ Lightning-fast search, URL, list, and profile scraping, with customizable filters. At $0.40 per 1000 tweets, and 30-80 tweets per second, it is ideal for researchers, entrepreneurs, and businesses! Get comprehensive insights from Twitter (X) now!
Rating: 2.3 (75)
Pricing: from $0.40 / 1,000 tweets
Total users: 18K
Monthly users: 2.3K
Runs succeeded: >99%
Issues response: 5.2 hours
Last modified: 17 hours ago
Sudden spike in charges due to API schema change
Closed
We had a scheduled pipeline in place that was budgeted at roughly $8 per day, with jobs running at 15-minute intervals and crawl depth controlled by the maxTweetsPerQuery parameter.
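For reference, that budget can be roughly reconstructed from the numbers above (a sketch only; the 16-handle query count and one-query-per-handle accounting are assumptions, not confirmed billing details):

```python
# Rough budget reconstruction for the scheduled pipeline (assumptions:
# one query per Twitter handle, 16 handles, each query capped by
# maxTweetsPerQuery, billed at the actor's $0.40 per 1,000 tweets).
RUNS_PER_DAY = 24 * 60 // 15       # jobs every 15 minutes -> 96 runs/day
QUERIES_PER_RUN = 16               # assumed: one query per handle
MAX_TWEETS_PER_QUERY = 10
PRICE_PER_TWEET = 0.40 / 1000

daily_cost = RUNS_PER_DAY * QUERIES_PER_RUN * MAX_TWEETS_PER_QUERY * PRICE_PER_TWEET
print(f"${daily_cost:.2f} per day")  # same order of magnitude as the ~$8/day budget
```

With the per-query cap removed, daily_cost is no longer bounded by MAX_TWEETS_PER_QUERY, which is how a 40x result spike translates directly into a 40x charge spike.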
We used this pipeline without any issues for months until last Friday, July 19, when the version update from 0.341 to 0.342 removed this parameter, suddenly making our jobs crawl far deeper than intended. As a result, we were charged in excess of $1,000 in a single day.
We were not notified of this change in advance and were caught by surprise; the resulting charges were completely outside our design and our use case.
I reached out to Apify about this issue, and they directed us to you. Since this change was prompted by your side and was completely unintended on ours, we would like to request a reimbursement for the charges incurred on July 19, 2024.
Thank you,

Hey there,
We are very sorry for the inconvenience. However, the maxTweetsPerQuery parameter was not working, and there was an issue with the maxItems parameter. While we were fixing the issue with maxItems, we removed the maxTweetsPerQuery parameter, since it was not working at all. Are you sure nothing has changed on your end, or could there be another issue?
Also, to control your spend, the preferred method is to use Apify's "Maximum charged results" feature. That way, even if the scraper returns more results, you will pay only what you set, and it will be our job to stop the Actor after that number of results. I know it is not possible to divide this limit among the queries of a single run; however, if you want to make sure you spend what you planned, you should use it. You can divide your queries into several runs; that won't be a problem for our Actor.
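The divide-your-queries-into-several-runs suggestion could look something like this (a hedged sketch: split_runs is a hypothetical helper, and the per-run handle count is arbitrary):

```python
def split_runs(handles, per_run=4):
    """Chunk a twitterHandles list so each Apify run covers `per_run` handles.

    Each chunk becomes the input of its own run, so a per-run
    "Maximum charged results" cap bounds the spend of every chunk.
    """
    return [handles[i:i + per_run] for i in range(0, len(handles), per_run)]

handles = ["YahooFinance", "TipRanks", "Benzinga", "business", "markets"]
batches = split_runs(handles, per_run=2)
# Each batch would be passed as the twitterHandles of a separate capped run.
```

This trades one large run for several small ones, but it restores a hard upper bound on total spend per scheduled job.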
I forwarded this to our engineering team in case I am missing something and will let you know once they get back to me. You can also contact me via Discord to get faster responses.
Cheers!
agentsmyth
Hello,
There was absolutely no change on our end; the spike coincided exactly with the version shift from 0.341 to 0.342, and no one on our side was changing any code at 5 a.m. on a Friday. So it seems the removal of the maxTweetsPerQuery field, or some other change, led to unintended behavior, as you can see from the screenshot I shared (a sudden 40x spike in results).
I can update our code to reflect the schema change, but our account has hit its spending maximum due to this sudden spike in charges, and until this issue is resolved we will be unable to use the endpoint.
Please do let me know once the engineering team has gone through what happened. For your convenience, I will share the run results that made the same request, right before and after the version shift:
https://console.apify.com/organization/EbTTDq2McXPwsYQbn/actors/runs/uCKV1r2BFYCOBhn6m#output
https://console.apify.com/organization/EbTTDq2McXPwsYQbn/actors/runs/IRlYeCPHkvVnqebyJ#output
Thank you,
agentsmyth
This is the param set that we were using (via the Apify python SDK):
{
    "includeSearchTerms": False,
    "maxTweetsPerQuery": 10,
    "onlyImage": False,
    "onlyQuote": False,
    "onlyTwitterBlue": False,
    "onlyVerifiedUsers": False,
    "onlyVideo": False,
    "sort": "Latest",
    "start": date_str,
    "tweetLanguage": "en",
    "twitterHandles": [
        "YahooFinance", "TipRanks", "Benzinga", "business",
        "markets", "WSJbusiness", "WSJmarkets", "SeekingAlpha",
        "MarketCurrents", "financialjuice", "DeItaone", "PiQSuite",
        "FirstSquawk", "Trade_The_News", "Newsquawk", "unusual_whales",
    ],
}
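For anyone hitting the same schema change, a minimal migration sketch for that input (an assumption on my part: that maxItems, the parameter the maintainers say they were fixing, caps the total results of a run rather than each query individually):

```python
# Hedged migration sketch after v0.342 removed maxTweetsPerQuery.
# Assumption: maxItems bounds the run's total results, so the old per-query
# cap times the number of handles approximates the previous behavior.
params = {
    "maxTweetsPerQuery": 10,  # no longer honored after v0.342
    "sort": "Latest",
    "twitterHandles": ["YahooFinance", "TipRanks"],  # trimmed for brevity
}

old_cap = params.pop("maxTweetsPerQuery", None)          # drop the removed parameter
params["maxItems"] = old_cap * len(params["twitterHandles"])  # per-query cap * handles
```

Whether maxItems actually enforces this bound per run is exactly what the thread is disputing, so a platform-side "Maximum charged results" limit is still worth setting as a backstop.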

Our engineers told me that this parameter was very problematic: it was not working at all and was breaking our scraper very frequently, so they removed it. We are very sorry to hear about what you've experienced. Can you please reach out to me via Discord to discuss how we can help you?
agentsmyth
Okay, I sent you a DM (to the apidojo handle).

This issue was resolved over Discord.