Website Content Crawler

Developed and maintained by Apify

Crawl websites and extract text content to feed AI models, LLM applications, vector databases, or RAG pipelines. The Actor supports rich formatting using Markdown, cleans the HTML, downloads files, and integrates well with 🦜🔗 LangChain, LlamaIndex, and the wider LLM ecosystem.

Rating: 4.6 (38)
Pricing: Pay per usage


Total users: 49.4k
Monthly users: 6.9k
Runs succeeded: >99%
Issue response: 3.8 days
Last modified: 7 days ago

number of saved lines

Open

kocsi opened this issue a month ago

I would like to set it to save 100 lines from each input URL. Currently, the Actor stops as soon as it reaches the first 100 lines, so if I enter, say, 10 URLs and the first one already yields 100 lines of data, the remaining URLs are never processed.

jiri.spilka

Hi,

Thank you for using Website Content Crawler.

I’m not sure I fully understand your question, so let me try to clarify a couple of points:

  • Regarding content: The crawler will save all the text content that is present on the website.
  • Regarding limiting the number of results: You have set "maxResults": 100, which means the crawler will save content from up to 100 URLs in total, across all start URLs. If you need fewer or more results, change this setting (see the sketch after this list).
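For reference, here is a minimal sketch of starting the crawler with the "maxResults" input via the Apify Python client; the token, the example start URL, and the printed field are placeholders to replace with your own values, and the rest of your input may differ:

```python
from apify_client import ApifyClient

# Placeholder token - replace with your own Apify API token.
client = ApifyClient("<YOUR_APIFY_TOKEN>")

# Start Website Content Crawler with a list of start URLs and a total cap
# of 100 saved results (the cap applies across all start URLs, not per URL).
run = client.actor("apify/website-content-crawler").call(
    run_input={
        "startUrls": [{"url": "https://example.com/page-1"}],  # add your other URLs here
        "maxResults": 100,
    }
)

print("Run finished, dataset ID:", run["defaultDatasetId"])
```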

If you could please clarify your question a bit more, I’d be happy to assist you further.

Thank you, Jiri

kocsi, a month ago

I would like to run 10 URLs and have it fetch 100 lines of data from each URL. Right now, if I start it and it finds 100 lines of data on the first URL, it saves those and stops.

jakub.kopecky

Hi,

Sorry for the late response; we've been busy lately. The Website Content Crawler does not support this: it always saves the whole content of each page. However, you can truncate the extracted text yourself by exporting the dataset from the Storage tab of the Actor run (to JSON, CSV, etc.) or by accessing the results through the API or an Apify client (https://docs.apify.com/platform/storage/dataset); a sketch of the client approach is shown below.
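As a rough illustration of the second option, here is a minimal sketch that reads a run's dataset with the Apify Python client and keeps only the first 100 lines of text per result. The token and dataset ID are placeholders, and the "url" and "text" field names are assumptions about the output items that you should adapt to your own run:

```python
from apify_client import ApifyClient

# Placeholder token and dataset ID - replace with your own values.
client = ApifyClient("<YOUR_APIFY_TOKEN>")
items = client.dataset("<DATASET_ID>").list_items().items

# Keep only the first 100 lines of extracted text for each crawled URL.
for item in items:
    text = item.get("text", "")  # assumed field holding the extracted page text
    truncated = "\n".join(text.splitlines()[:100])
    print(item.get("url"), "->", len(truncated.splitlines()), "lines kept")
```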

Jakub