
Extended GPT Scraper
Extract data from any website and feed it into GPT via the OpenAI API. Use ChatGPT to proofread content, analyze sentiment, summarize reviews, extract contact details, and much more.
Another problem!
Hi! Another problem! I am uploading a list of URLs and need to process exactly the pages I uploaded.
I have set the crawling depth to 0.
But while the task is running, I can see the number of pages to be processed increasing.
What am I setting wrong?
Thank you

Hi, thanks for opening this issue!
A max crawling depth of 0 means crawling infinitely, while a value of 1 already means going one page deep. We realize this is unintuitive, so we will discuss it with the team and try to change it or at least make it clearer :)
In the meantime, to avoid enqueueing sub-pages, simply set linkSelector or includeUrlGlobs to nothing and the crawler will not have anything to enqueue.
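For example, here is a minimal sketch of such a run using the Apify Python client. The token, actor ID, and input fields other than linkSelector and includeUrlGlobs (e.g. startUrls, instructions) are placeholders/assumptions; check them against the actor's input schema:

```python
# Minimal sketch, assuming the Apify Python client (pip install apify-client).
# The token, actor ID, and most field names below are placeholders.
from apify_client import ApifyClient

client = ApifyClient("<YOUR_APIFY_TOKEN>")

run_input = {
    # Only the uploaded/listed pages should be processed.
    "startUrls": [
        {"url": "https://example.com/page-1"},
        {"url": "https://example.com/page-2"},
    ],
    # Empty selector and globs: the crawler has nothing to enqueue,
    # so no sub-pages are added to the request queue.
    "linkSelector": "",
    "includeUrlGlobs": [],
    "instructions": "Summarize the page content.",
}

# Start the actor run and wait for it to finish.
run = client.actor("<ACTOR_ID>").call(run_input=run_input)
print("Results are in dataset:", run["defaultDatasetId"])
```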
I will keep you updated here, thanks!
cooperative_bureau
Thank you!