Robots.txt Parser

Under maintenance
Parse and analyze robots.txt files. Check crawl rules, sitemaps, and bot permissions for any website.

Pricing

Pay per usage

Rating

0.0 (0)

Developer

Donny Nguyen

Maintained by Community

Actor stats

Bookmarked: 0

Total users: 1

Monthly active users: 0

Last modified: 4 days ago

Robots.txt Parser - Analyze Crawl Directives for SEO

Parse and analyze the robots.txt file of any website. Extract allow/disallow rules per user agent, find sitemap URLs, and check crawl-delay settings.

Key Features

  • Full robots.txt parsing with multi-user-agent support (see the parsing sketch after this list)
  • Sitemap URL extraction
  • Crawl-delay detection
  • Optional path checking (is a given path allowed?)
  • Bulk analysis of multiple sites in one run
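
To make these directives concrete, here is a minimal sketch, in Python, of the kind of information a robots.txt file exposes: user-agent groups, sitemap URLs, and crawl delays. It is illustrative only and not the actor's implementation.

import urllib.request

# Illustrative only: fetch a robots.txt and pull out the directives the
# actor reports (user agents, sitemap URLs, crawl delays).
with urllib.request.urlopen("https://github.com/robots.txt") as resp:
    lines = resp.read().decode("utf-8", errors="replace").splitlines()

user_agents, sitemaps, crawl_delays = set(), [], {}
current_agent = None
for raw in lines:
    line = raw.split("#", 1)[0].strip()          # drop comments
    if ":" not in line:
        continue
    field, value = (part.strip() for part in line.split(":", 1))
    field = field.lower()
    if field == "user-agent":
        current_agent = value
        user_agents.add(value)
    elif field == "sitemap":
        sitemaps.append(value)
    elif field == "crawl-delay" and current_agent is not None:
        crawl_delays[current_agent] = value

print(sorted(user_agents), sitemaps, crawl_delays)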

Cost

Typical runs cost less than $0.01 per site.

Input Example

{
  "urls": ["google.com", "github.com"],
  "checkPaths": ["/admin", "/api"]
}
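
For programmatic runs, a call through the Apify Python client might look like the sketch below. The API token and actor ID are placeholders, and the printed fields follow the output example in the next section.

from apify_client import ApifyClient

# Placeholder token and actor ID; substitute your own values.
client = ApifyClient("<YOUR_APIFY_TOKEN>")

run_input = {
    "urls": ["google.com", "github.com"],
    "checkPaths": ["/admin", "/api"],
}

# Start the actor and wait for the run to finish.
run = client.actor("<ACTOR_ID>").call(run_input=run_input)

# Each analyzed site is written as one record to the run's default dataset.
for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item["domain"], item.get("sitemaps", []))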

Output Example

{
  "domain": "google.com",
  "exists": true,
  "userAgents": ["*", "Googlebot"],
  "sitemaps": ["https://www.google.com/sitemap.xml"],
  "totalRuleCount": 245
}
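
The checkPaths option answers the same allow/disallow question that standard robots.txt matching defines. As a rough local equivalent (not the actor's internal code), Python's built-in urllib.robotparser gives the idea:

from urllib.robotparser import RobotFileParser

# Illustrative only: check a couple of paths against a live robots.txt,
# similar in spirit to the actor's "checkPaths" option.
rp = RobotFileParser()
rp.set_url("https://www.google.com/robots.txt")
rp.read()

for path in ["/admin", "/api"]:
    allowed = rp.can_fetch("*", "https://www.google.com" + path)
    print(path, "allowed" if allowed else "disallowed")

# Crawl-delay for a given user agent, if the file declares one.
print(rp.crawl_delay("Googlebot"))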

Use Cases

  • SEO technical audits: verify crawl directives
  • Competitor research: see what competitors block from crawlers
  • Bot development: check if your crawler is allowed
  • Sitemap discovery: find all sitemaps referenced in robots.txt

Built By

Donny Dev - Blockchain & automation developer