Website Content Crawler

apify/website-content-crawler
Crawl websites and extract text content to feed AI models, LLM applications, vector databases, or RAG pipelines. The Actor supports rich formatting using Markdown, cleans the HTML, downloads files, and integrates well with 🦜🔗LangChain, LlamaIndex, and the wider LLM ecosystem.
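As a quick orientation, here is a minimal sketch of running the Actor through the apify-client package and reading the extracted content, assuming an APIFY_TOKEN environment variable and an example start URL:

```typescript
// Minimal sketch: run Website Content Crawler via the Apify API client and
// read the extracted content. Assumes APIFY_TOKEN is set; the start URL is
// only an example.
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: process.env.APIFY_TOKEN });

const run = await client.actor('apify/website-content-crawler').call({
    startUrls: [{ url: 'https://docs.apify.com/' }],
});

// Each dataset item includes the page URL and the cleaned text/Markdown,
// ready to be chunked and pushed into a vector database or RAG pipeline.
const { items } = await client.dataset(run.defaultDatasetId).listItems();
for (const item of items) {
    console.log(item.url);
}
```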

Treat hash URLs as separate pages to crawl

Open

civic-roundtable opened this issue 2 months ago

Is there any way to configure the crawler to treat hash URLs as unique?

I am trying to crawl this microsite: https://www.azdhs.gov/preparedness/epidemiology-disease-control/extreme-weather/heat-safety/extreme-heat-preparedness/index.php

It has three child pages with completely separate content, but unfortunately they are all hash URLs.

The sitemap.xml is not up to date on this website, and canonicalUrl metadata is not set correctly.

What have I tried so far?

  1. Using includeUrlGlobs to explicitly include hash links (e.g. the glob @(#?)*), but that does not work: the log says no links were found.
  2. Explicitly setting these 4 startUrls, but the run de-duplicates them and only crawls the root page. (A sketch of both attempts follows this list.)
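For reference, this is roughly what those two attempts look like as run input, sketched with the apify-client package. It assumes the Actor's startUrls and includeUrlGlobs input fields and an APIFY_TOKEN environment variable; the #fragment names and the glob are only placeholders, since the actual child-page URLs aren't listed above.

```typescript
// Sketch of the two attempts described above, using the Apify API client.
// Assumes APIFY_TOKEN is set. The #fragment names are placeholders for the
// microsite's real child pages, and the glob is only illustrative.
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: process.env.APIFY_TOKEN });
const baseUrl =
    'https://www.azdhs.gov/preparedness/epidemiology-disease-control/extreme-weather/heat-safety/extreme-heat-preparedness/index.php';

// Attempt 1: include hash links via a glob - the log reports no links found.
await client.actor('apify/website-content-crawler').call({
    startUrls: [{ url: baseUrl }],
    includeUrlGlobs: [{ glob: `${baseUrl}#*` }],
});

// Attempt 2: list the hash URLs explicitly - the run de-duplicates them down
// to the root page because the fragment is ignored when checking uniqueness.
await client.actor('apify/website-content-crawler').call({
    startUrls: [
        { url: baseUrl },
        { url: `${baseUrl}#child-1` }, // placeholder fragments
        { url: `${baseUrl}#child-2` },
        { url: `${baseUrl}#child-3` },
    ],
});
```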

Thank you for the help

civic-roundtable

2 months ago

My current workaround is to create a separate task for each child page. When a hash URL is the only URL in startUrls, it is crawled correctly. (A sketch of this follows the list below.)

This works, but is not ideal because:

  1. I need to manually list all the pages, which defeats a key benefit of a dynamic crawl
  2. I need to wait for multiple tasks to finish and then merge their results, instead of having everything in one dataset
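In case it helps anyone else, here is a sketch of the workaround with the apify-client package. Ad-hoc Actor calls stand in for the saved tasks, APIFY_TOKEN is assumed, and the #fragment names are placeholders.

```typescript
// Sketch of the workaround: one run per hash URL, each as the sole start URL,
// then merge the resulting datasets. Assumes APIFY_TOKEN is set; fragments
// are placeholders for the real child pages.
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: process.env.APIFY_TOKEN });
const baseUrl =
    'https://www.azdhs.gov/preparedness/epidemiology-disease-control/extreme-weather/heat-safety/extreme-heat-preparedness/index.php';
const fragments = ['', '#child-1', '#child-2', '#child-3']; // placeholders

const merged: unknown[] = [];
for (const fragment of fragments) {
    const run = await client.actor('apify/website-content-crawler').call({
        startUrls: [{ url: `${baseUrl}${fragment}` }],
    });
    const { items } = await client.dataset(run.defaultDatasetId).listItems();
    merged.push(...items);
}
console.log(`Merged ${merged.length} items from ${fragments.length} runs.`);
```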

Any ideas for how to treat hashes as unique URLs?

civic-roundtable

2 months ago

Update: it looks like Web Scraper already has this setting: "URL #fragments identify unique pages".

Can we make that setting available in this Actor too?
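For reference, the Crawlee library underneath Apify's crawlers exposes this behaviour per request as keepUrlFragment, which keeps the #fragment when computing the request's unique key. A minimal sketch of that mechanism (not a claim about how this Actor would implement the setting):

```typescript
// Sketch: Crawlee's keepUrlFragment keeps the #fragment in the request's
// unique key, so hash URLs are treated as distinct pages. This illustrates
// the mechanism only; it is not Website Content Crawler's implementation.
import { CheerioCrawler } from 'crawlee';

const crawler = new CheerioCrawler({
    async requestHandler({ request, $ }) {
        console.log(`${request.url} -> ${$('title').text()}`);
    },
});

await crawler.run([
    { url: 'https://example.com/index.php#child-1', keepUrlFragment: true },
    { url: 'https://example.com/index.php#child-2', keepUrlFragment: true },
]);
```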

jindrich.bar

Hello and thank you for your interest in this Actor!

Thanks for the detailed report! Treating hash URLs as unique is a good feature idea. We'll discuss it with our team and determine the best way to incorporate it into the Actor.

In the meantime, your workaround of running separate tasks is the best approach, even though it's not ideal.

Appreciate your patience and suggestion! I'll keep you posted about the progress on this. Cheers!

civic-roundtable

2 months ago

Ok, thank you!

Developer
Maintained by Apify
Actor metrics
  • 2.8k monthly users
  • 434 stars
  • 99.9% runs succeeded
  • 2.9 days response time
  • Created in Mar 2023
  • Modified 3 days ago