Robots.txt Validator
Pricing: pay per usage
Actor ID: predictable_function/my-actor-3
Input: a list of website base URLs whose robots.txt files will be validated
Rating: 5.0 (1 review)
Developer: riya rawat
Actor stats: 0 bookmarked, 66 total users, 2 monthly active users
Last modified: 4 months ago
Categories: SEO tools
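What a robots.txt validator of this kind does can be sketched with Python's standard-library parser. This is a minimal illustration under assumed inputs, not this Actor's implementation; the robots.txt body and URLs are hypothetical:

```python
from urllib import robotparser

# Hypothetical robots.txt body; a validator would fetch <base URL>/robots.txt
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Test URLs against the parsed directive groups for the chosen user agent
print(rp.can_fetch("*", "https://example.com/docs/page"))     # allowed
print(rp.can_fetch("*", "https://example.com/private/data"))  # blocked
print(rp.site_maps())  # sitemap URLs declared in the file
```

A bulk validator would repeat this per base URL, fetching each host's /robots.txt over the network instead of parsing an inline string.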
andok/robotstxt-auditor
Scan robots.txt files in bulk to extract sitemap URLs and verify crawler directives for technical SEO compliance.
Andok
scrappy_garden/robots-txt-validator
Validate robots.txt for one or more websites: fetches /robots.txt per host, parses directive groups (User-agent/Allow/Disallow/Crawl-delay/Sitemap), reports common errors and warnings, and can test URLs against the chosen User-Agent.
Bikram Adhikari
taroyamada/robotstxt-ai-checker
A robots.txt parser that audits AI-crawler block rules (GPTBot, ClaudeBot, anthropic-ai, PerplexityBot) across thousands of websites in one run. Returns per-bot allow/disallow disposition and crawl-delay.
太郎 山田
alizarin_refrigerator-owner/robots-txt-checker
The Robots.txt Checker provides comprehensive analysis of your robots.txt file: syntax validation; CMS detection (identifies WordPress, Shopify, Drupal, and 6+ other CMS platforms); best-practice checks; companion file checks (sitemap.xml, llms.txt, security.txt); and AI recommendations with CMS-specific suggestions.
The Howlers
pink_comic/robots-txt-validator
Analyze robots.txt files for any domain. Extract crawl rules, sitemaps, blocked paths, and crawl-delay settings. Validate configuration and identify SEO issues in bulk.
Ava Torres
zerobreak/robots-txt-analyzer
A robots.txt analyzer that fetches and parses crawl rules from any website in bulk, so SEO teams and developers can audit blocked paths, user agents, and sitemap locations across hundreds of domains without manual work.
ZeroBreak
automation-lab/robots-txt-generator
Generate valid robots.txt files from structured rules. Apply presets (block AI bots, SEO-friendly), add custom per-bot rules, sitemaps, and crawl-delay. Zero-proxy, instant output.
Stas Persiianenko
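Generating a robots.txt file from structured rules, as this Actor describes, can be sketched in a few lines of Python. The rule schema below is a hypothetical illustration, not this Actor's actual input format:

```python
# Hypothetical structured rules, loosely mirroring "per-bot rules" plus sitemaps
rules = [
    {"user_agent": "GPTBot", "disallow": ["/"]},
    {"user_agent": "*", "allow": ["/admin/public/"], "disallow": ["/admin/"]},
]
sitemaps = ["https://example.com/sitemap.xml"]

def build_robots_txt(rules, sitemaps):
    """Serialize rule dicts into robots.txt directive groups."""
    lines = []
    for rule in rules:
        lines.append(f"User-agent: {rule['user_agent']}")
        for path in rule.get("allow", []):
            lines.append(f"Allow: {path}")
        for path in rule.get("disallow", []):
            lines.append(f"Disallow: {path}")
        lines.append("")  # blank line separates directive groups
    lines += [f"Sitemap: {url}" for url in sitemaps]
    return "\n".join(lines) + "\n"

print(build_robots_txt(rules, sitemaps))
```

Presets such as "block AI bots" would simply expand to a predefined list of rule dicts before serialization.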
datawinder/robots-txt-monitor
Stateful robots.txt monitoring with baseline awareness and severity-classified alerts. Detects meaningful policy changes over time, not noisy diffs.
Datawinder
zerobreak/indexability-audit
Indexability audit tool that checks robots.txt, meta robots tags, X-Robots-Tag headers, and canonical URLs for any list of pages, so SEO teams know which ones Google can actually crawl and index.
tom_the_builder/sitemap-robots-delta-monitor
Monitor sitemap.xml and robots.txt for URL inventory changes and return new, changed, or removed URLs in normalized JSON.
Danil Iarmolchik