Extended GPT Scraper
No credit card required
Extract data from any website and feed it into GPT via the OpenAI API. Use ChatGPT to proofread content, analyze sentiment, summarize reviews, extract contact details, and much more.
Hi! Another problem! I am uploading a list and I need to process exactly the pages I uploaded.
I have set the processing depth to 0.
But when the task is running, I see that the number of pages to be processed is increasing.
What am I setting wrong?
Thank you
Hi, thanks for opening this issue!
A max crawling depth of 0 means to crawl infinitely, and unfortunately a value of 1 already means to go one page deep, so there is currently no value that means "crawl only the start URLs". It's very unintuitive, so we will discuss it in our team and try to change it or make it clearer :)
In the meantime, to not enqueue sub-pages, simply set the linkSelector or includeUrlGlobs to nothing and the crawler will not have anything to enqueue.
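For example, the workaround above might look like this in the Actor input (a minimal sketch; the exact field names and URL list are assumptions based on this thread, not confirmed from the Actor's input schema):

```json
{
  "startUrls": [
    { "url": "https://example.com/page-1" },
    { "url": "https://example.com/page-2" }
  ],
  "linkSelector": "",
  "includeUrlGlobs": []
}
```

With linkSelector empty and includeUrlGlobs empty, the crawler finds no links to enqueue, so only the uploaded start URLs are processed regardless of the depth setting.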
I will keep you updated here, thanks!
Thank you!
- 83 monthly users
- 37 stars
- 99.7% runs succeeded
- 4 days response time
- Created in Jun 2023
- Modified 23 days ago