
Extended GPT Scraper
Extract data from any website and feed it into GPT via the OpenAI API. Use ChatGPT to proofread content, analyze sentiment, summarize reviews, extract contact details, and much more.
Rating: 4.6 (4)
Monthly users: 72
Runs succeeded: 98%
Response time: 1.4 days
Last modified: 3 months ago
Another problem!
Hi! Another problem! I am uploading a list of URLs and need to process exactly the pages I uploaded.
I have set the crawling depth to 0, but while the task is running I see the number of pages to be processed increasing.
What setting is wrong?
Thank you

Hi, thanks for opening this issue!
A max crawling depth of 0 means to crawl infinitely, while a value of 1 already means to go one page deep. This is very unintuitive, so we will discuss it in our team and try to change it or make it clearer :)
In the meantime, to keep the crawler from enqueueing sub-pages, simply set the linkSelector or includeUrlGlobs to nothing and the crawler will have nothing to enqueue.
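Based on that advice, an Actor input along these lines should process only the uploaded pages (a sketch; linkSelector and includeUrlGlobs come from the reply above, while the startUrls field name and the example URLs are assumptions to check against the Actor's input schema):

```json
{
  "startUrls": [
    { "url": "https://example.com/page-1" },
    { "url": "https://example.com/page-2" }
  ],
  "linkSelector": "",
  "includeUrlGlobs": []
}
```

With the link selector empty and no URL globs to match, the crawler visits only the start URLs and never enqueues links found on them.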
I will keep you updated here, thanks!
cooperative_bureau
Thank you!
Pricing
Pricing model: Pay per usage
This Actor is paid per platform usage. The Actor is free to use, and you only pay for the Apify platform usage.