Robots.txt Auditor & Sitemap Finder
Pricing
from $1.00 / 1,000 dataset items
Scan robots.txt files in bulk to extract sitemap URLs and verify crawler directives for technical SEO compliance.
Rating: 0.0 (0 reviews)
Developer: Andok (maintained by Community)
Actor stats
- Bookmarked: 0
- Total users: 2
- Monthly active users: 1
- Last modified: 2 days ago
Robots.txt Auditor & Sitemap Extractor (Bulk)
Bulk fetch and parse robots.txt files for a list of URLs or domains.
What it does
For each input URL or domain, the actor resolves the site origin, appends /robots.txt, and fetches the file.
It then extracts:
- HTTP status code (e.g. 200, 404)
- Content length of the file
- All `Sitemap:` directives (deduplicated)
- All unique `User-agent:` blocks mentioned in the file
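The fetch-and-parse step above can be sketched in Python using only the standard library. This is a minimal illustration of the described behavior, not the actor's actual code; the function names and output field names are assumptions.

```python
import urllib.error
import urllib.parse
import urllib.request


def parse_robots(body):
    """Extract Sitemap: directives and User-agent: names from robots.txt text.

    Illustrative sketch only -- not the actor's implementation.
    """
    sitemaps, agents = [], []
    for raw in body.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments
        key, _, value = line.partition(":")
        key, value = key.strip().lower(), value.strip()
        if key == "sitemap" and value and value not in sitemaps:
            sitemaps.append(value)           # deduplicated
        elif key == "user-agent" and value and value not in agents:
            agents.append(value)             # unique user-agent blocks
    return sitemaps, agents


def audit(url, timeout=15):
    """Fetch /robots.txt for a URL or bare domain and summarize it."""
    if "://" not in url:
        url = "https://" + url               # assume HTTPS for bare domains
    parts = urllib.parse.urlsplit(url)
    robots_url = f"{parts.scheme}://{parts.netloc}/robots.txt"
    try:
        with urllib.request.urlopen(robots_url, timeout=timeout) as resp:
            status, body = resp.status, resp.read().decode("utf-8", "replace")
    except urllib.error.HTTPError as e:
        status, body = e.code, ""            # e.g. 404: no robots.txt
    sitemaps, agents = parse_robots(body)
    return {
        "url": robots_url,                   # field names are illustrative
        "statusCode": status,
        "contentLength": len(body),
        "sitemaps": sitemaps,
        "userAgents": agents,
    }
```

Because directive names in robots.txt are case-insensitive, the sketch lowercases keys before matching, while sitemap URLs and user-agent values are kept as written.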
Typical uses
- SEO audits: a fast way to discover sitemaps across a large list of domains.
- Intelligence: detect which crawlers a site explicitly blocks or allows.
Input
- `urls` (required): list of URLs or domains to check.
- `timeoutSeconds` (default: 15)
- `concurrency` (default: 10)
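A minimal input object for these fields might look like the following; the values are illustrative:

```json
{
  "urls": ["example.com", "https://www.wikipedia.org"],
  "timeoutSeconds": 15,
  "concurrency": 10
}
```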
Output
Writes one dataset item per input domain.
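Based on the fields listed under "What it does", a dataset item could look roughly like this; the exact field names are assumptions, not the actor's documented output schema:

```json
{
  "url": "https://example.com/robots.txt",
  "statusCode": 200,
  "contentLength": 1024,
  "sitemaps": ["https://example.com/sitemap.xml"],
  "userAgents": ["*", "GPTBot"]
}
```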
Monetization + safety
This actor is designed for Pay-Per-Event billing (one dataset item = one unit of work) and respects the per-run maximum charge limit.