🤖 robots.txt AI Bot Checker

Pricing

from $11.00 / 1,000 results


Audit robots.txt files across thousands of websites to detect specific crawl policies, disallowed paths, and user-agents for GPTBot and ClaudeBot.


Rating: 0.0 (0)

Developer: 太郎 山田 (Maintained by Community)

Actor stats

  • Bookmarked: 0
  • Total users: 2
  • Monthly active users: 1
  • Last modified: 8 days ago


robots.txt AI Checker | GPTBot, ClaudeBot & AI Crawl Rules

Track how publishers and arbitrary websites handle AI crawlers, LLM training bots, and search indexing tools with this specialized web scraper. As data collection for generative AI models becomes increasingly contentious, tracking policy shifts across target domains is critical for maintaining compliance and understanding the evolving web ecosystem. This tool automatically fetches and parses robots.txt files, extracting detailed bot policies for agents such as GPTBot, ClaudeBot, and Google-Extended.

Users rely on this scraper to audit thousands of URLs effortlessly, replacing manual website checks with automated, scheduled runs. Set up daily or weekly monitoring workflows to detect immediately when a domain updates its scraping rules or imposes new restrictions on AI data collection. The system extracts structured data directly from the raw text, returning concrete details about which bots are explicitly allowed or disallowed. Output fields include the exact user-agent string, the restricted directory paths, and crawl-delay directives. By identifying exactly what changed since your last run, you can build web datasets responsibly, respect publisher boundaries, and integrate compliance checks directly into your broader data engineering pipelines.
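As a rough sketch of the kind of per-agent check this actor automates at scale, Python's standard-library robots.txt parser can evaluate the same rules. This is illustrative only, not the actor's implementation; the sample robots.txt and domain below are made up:

```python
from urllib.robotparser import RobotFileParser

# AI crawler user-agent tokens the actor reports on (per the description above).
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "Google-Extended"]

def ai_policies(robots_txt, base="https://example.com"):
    """Map each AI crawler to whether it may fetch the site root."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return {agent: rp.can_fetch(agent, base + "/") for agent in AI_CRAWLERS}

# Hypothetical robots.txt: GPTBot is fully blocked, everyone else is
# restricted only under /private/.
sample = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /private/
"""

print(ai_policies(sample))
```

Here GPTBot is disallowed at the root, while ClaudeBot and Google-Extended fall through to the wildcard group, which only restricts /private/.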

Store Quickstart

  • Start with store-input.example.json. It uses demoMode=true so the first Store run is safe, cheap, and easy to understand.
  • If the compact output is useful, switch to store-input.templates.json and pick one of:
      • Demo Quickstart for a trial run
      • Production Monitor for recurring dataset snapshots
      • Webhook Alert for policy-change notifications
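For orientation, a demo-safe input in the spirit of store-input.example.json could look like the following (the exact contents of the bundled file are an assumption; the field names come from the Input section below):

```json
{
  "domains": ["openai.com"],
  "delivery": "dataset",
  "dryRun": false,
  "demoMode": true
}
```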

Key Features

  • 🛡️ Compliance-first: audit which AI crawlers each domain allows, partially blocks, or fully blocks
  • 🔒 Non-invasive: fetches only the public robots.txt file; no crawling of page content
  • 📊 Structured output: per-domain summaries plus per-crawler policies, disallowed paths, and crawl-delay directives
  • 📡 Delta-alerting: snapshot-based change detection flags new restrictions since your last run, with optional webhook delivery
  • 📋 Bulk audits: up to 500 domains per run with configurable concurrency

Use Cases

  • Developers: Automate recurring data fetches without building custom scrapers
  • Data teams: Pipe structured output into analytics warehouses
  • Ops teams: Monitor changes via webhook alerts
  • Product managers: Track competitor/market signals without engineering time

Input

  • domains (array, default: prefilled): List of domains to analyze robots.txt for AI crawler policies. Max 500.
  • delivery (string, default: "dataset"): How to deliver results. "dataset" saves to an Apify Dataset; "webhook" sends to a URL. In demoMode, delivery is always "dataset".
  • webhookUrl (string): Webhook URL to send results to (only used when delivery is "webhook"). Works with Slack, Discord, or any HTTP endpoint.
  • snapshotKey (string, default: "robotstxt-snapshots"): Key name for storing snapshots (used for change detection between runs).
  • concurrency (integer, default: 5): Maximum number of parallel requests. Higher is faster but may trigger rate limits.
  • dryRun (boolean, default: false): If true, runs without saving results or sending webhooks. Useful for testing.
  • demoMode (boolean, default: false): If true, checks only 1 domain, returns compact policy fields, and disables webhook/snapshot writes.

Input Example

{
  "domains": [
    "google.com",
    "github.com",
    "nytimes.com",
    "openai.com"
  ],
  "delivery": "dataset",
  "snapshotKey": "robotstxt-snapshots",
  "concurrency": 5,
  "dryRun": false,
  "demoMode": false
}

Output

  • meta (object)
  • results (array)
  • results[].domain (string)
  • results[].status (string)
  • results[].summary (object)
  • results[].aiPolicies (array)
  • results[].changes (array)
  • results[].checkedAt (timestamp)
  • results[].demoApplied (boolean)
  • results[].detailsMasked (boolean)
  • results[].error (null)

Output Example

{
  "meta": {
    "generatedAt": "2026-02-22T17:50:20.909Z",
    "totals": {
      "total": 1,
      "requestedDomains": 2,
      "processedDomains": 1,
      "withRobotsTxt": 1,
      "noRobotsTxt": 0,
      "invalidDomains": 0,
      "blockingAi": 0,
      "errors": 0
    },
    "demoApplied": true,
    "limits": {
      "maxDomains": 1,
      "compactPolicies": true,
      "webhookEnabled": false,
      "snapshotWriteEnabled": false
    },
    "upgradeHint": "Demo mode checks 1 domain, disables webhook delivery, and returns a compact policy view. Set demoMode=false to unlock bulk checks and full policy details."
  },
  "results": [
    {
      "domain": "openai.com",
      "status": "ok",
      "summary": {
        "totalCrawlers": 16,
        "blocked": 0,
        "partialBlock": 16,
        "allowed": 0,
        "changed": 0
      },
      "aiPolicies": [
        {
          "crawler": "GPTBot",
          "company": "OpenAI",
          "blocked": false,
          "partialBlock": true,
          "allowed": false
        }
      ]
    }
  ]
}

API Usage

Run this actor programmatically using the Apify API. Replace YOUR_API_TOKEN with your token from Apify Console → Settings → Integrations.

cURL

curl -X POST "https://api.apify.com/v2/acts/taroyamada~robotstxt-ai-checker/run-sync-get-dataset-items?token=YOUR_API_TOKEN" \
-H "Content-Type: application/json" \
-d '{ "domains": [ "google.com", "github.com", "nytimes.com", "openai.com" ], "delivery": "dataset", "snapshotKey": "robotstxt-snapshots", "concurrency": 5, "dryRun": false, "demoMode": false }'

Python

from apify_client import ApifyClient

client = ApifyClient("YOUR_API_TOKEN")
run = client.actor("taroyamada/robotstxt-ai-checker").call(run_input={
    "domains": ["google.com", "github.com", "nytimes.com", "openai.com"],
    "delivery": "dataset",
    "snapshotKey": "robotstxt-snapshots",
    "concurrency": 5,
    "dryRun": False,   # Python booleans, not JSON literals
    "demoMode": False,
})
for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item)

JavaScript / Node.js

import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'YOUR_API_TOKEN' });
const run = await client.actor('taroyamada/robotstxt-ai-checker').call({
    domains: ['google.com', 'github.com', 'nytimes.com', 'openai.com'],
    delivery: 'dataset',
    snapshotKey: 'robotstxt-snapshots',
    concurrency: 5,
    dryRun: false,
    demoMode: false,
});
const { items } = await client.dataset(run.defaultDatasetId).listItems();
console.log(items);

Tips & Limitations

  • Schedule daily or weekly runs to catch policy changes on the domains you depend on.
  • Use webhook delivery to pipe change alerts into Slack, Discord, or any HTTP endpoint.
  • Start with demoMode=true or dryRun=true to validate your input before a full billed run.
  • Keep the same snapshotKey across runs so change detection can compare against the previous snapshot.
  • Lower concurrency if target sites rate-limit you; up to 500 domains are supported per run.
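If the actor runs inside a scheduled or CI job, one way to gate on detected policy changes is to turn the changes field into an exit code. This is a sketch under assumed conventions; only the field names come from the Output schema:

```python
def gate(items):
    """Return a CI-style exit code: 1 if any domain's robots.txt policy changed."""
    changed = [r["domain"] for r in items if r.get("changes")]
    if changed:
        print("robots.txt policy changed for:", ", ".join(changed))
    return 1 if changed else 0

# With no recorded changes the gate passes (exit code 0).
print(gate([{"domain": "example.com", "changes": []}]))
```

In a real pipeline you would fetch items from the dataset (as in the API examples above) and call sys.exit(gate(items)).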

FAQ

Is running this against a third-party site legal?

Fetching a site's public robots.txt file is a passive, read-only request that crawlers make routinely and is generally permitted, but follow your own compliance policies and the terms of the sites you monitor.

How often should I scan?

Weekly for production domains; daily if you have high config-change velocity.

Can I export to a compliance tool?

Use webhook delivery or the Dataset API; the JSON output maps well to Drata, Vanta, and OneTrust import templates.

Is this a penetration test?

No. This actor only fetches and parses each domain's public robots.txt file. There is no exploitation, fuzzing, or authentication bypass.

Does this qualify as a SOC2 control?

This actor produces evidence artifacts suitable for SOC2 CC7.1 (continuous monitoring). It is not itself a SOC2 certification.

Cost

Pay Per Event:

  • actor-start: $0.01 (flat fee per run)
  • dataset-item: $0.003 per output item

Example: 1,000 items = $0.01 + (1,000 × $0.003) = $3.01

No subscription required — you only pay for what you use.

⭐ Was this helpful?

If this actor saved you time, please leave a ★ rating on Apify Store. It takes 10 seconds and helps other developers discover it.

Bug report or feature request? Open an issue on the Issues tab of this actor.