🔍 Privacy Policy Scraper

Extract compliance data from arbitrary websites to identify GDPR/CCPA violations. Generate audit reports to find leads lacking proper cookie banners.

Pricing: Pay per usage

Rating: 0.0 (0)

Developer: 太郎 山田 (Maintained by Community)

Actor stats:

  • Bookmarked: 0
  • Total users: 2
  • Monthly active users: 1
  • Last modified: a day ago

Privacy & Cookie Compliance Scanner | GDPR / CCPA Banner Audit

Scan public privacy pages and cookie banners for GDPR/CCPA compliance signals. Returns one clean compliance summary row per site with banner detection, consent framework identification, policy freshness, and recommended actions.

Store Quickstart

Run this actor with your target input. Results appear in the Apify Dataset and can be piped to webhooks for real-time delivery. Use dryRun to validate before committing to a schedule.
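The dry-run-first workflow above can be sketched in a few lines. The helper below and its defaults are illustrative, not part of the actor:

```python
# Illustrative helper: build a quickstart input with dryRun enabled so the
# first run validates configuration without committing to a schedule.
def make_dry_run_input(homepage_urls, delivery="dataset"):
    return {
        "sites": [{"homepageUrl": url} for url in homepage_urls],
        "delivery": delivery,
        "dryRun": True,  # validate only; set to False for a real scan
    }

payload = make_dry_run_input(["https://vercel.com"])
```

Once the dry run passes, reuse the same payload with `"dryRun": False` in a scheduled run.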

Key Features

  • 🛡️ Compliance-first — Produces audit-ready reports mapping findings to standards (WCAG, GDPR, SOC2)
  • 🔒 Non-invasive scanning — Uses only observable public signals — no intrusive probing
  • 📊 Severity-scored output — Each finding rated for criticality with remediation guidance
  • 📡 Delta-alerting — Flag new findings since last run via webhook delivery
  • 📋 Evidence export — Raw headers/responses captured for compliance documentation
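The delta-alerting feature amounts to a set difference over finding identities. The key fields used here (siteUrl, type) come from the alert schema below, but the helper itself is a hypothetical sketch, not the actor's internal logic:

```python
# Hypothetical sketch of delta-alerting: keep only alerts that did not
# appear in the previous run, keyed by (siteUrl, type).
def new_alerts(previous, current):
    seen = {(a["siteUrl"], a["type"]) for a in previous}
    return [a for a in current if (a["siteUrl"], a["type"]) not in seen]

prev = [{"siteUrl": "https://a.example", "type": "banner_missing"}]
curr = [
    {"siteUrl": "https://a.example", "type": "banner_missing"},
    {"siteUrl": "https://a.example", "type": "policy_stale"},
]
delta = new_alerts(prev, curr)  # only the policy_stale alert is new
```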

Use Cases

| Who | Why |
| --- | --- |
| Developers | Automate recurring data fetches without building custom scrapers |
| Data teams | Pipe structured output into analytics warehouses |
| Ops teams | Monitor changes via webhook alerts |
| Product managers | Track competitor/market signals without engineering time |

Input

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| sites | array | prefilled | List of sites to scan. Each entry requires a homepageUrl; privacyPolicyUrl and cookiePolicyUrl are auto-discovered if omitted. |
| delivery | string | "dataset" | Starter path: dataset keeps the first run low-friction. Advanced path: webhook sends the same payload to your endpoint for real-time delivery. |
| webhookUrl | string | | Advanced delivery only: required when delivery is webhook. Must be a valid http(s) URL. |
| snapshotKey | string | "privacy-cookie-compliance-snapshots" | Keep this stable when moving from the quickstart to recurring compliance monitoring so policy drift stays comparable from run to run. |
| concurrency | integer | 2 | Parallel site checks. Keep at 1-2 for quickstart runs; increase for larger compliance portfolios. |
| batchDelayMs | integer | 500 | Pause between batches to keep scans polite and avoid rate limiting. |
| requestTimeoutSecs | integer | 20 | Per-request timeout for fetching homepage, privacy policy, and cookie policy pages. |
| followRedirects | boolean | true | Follow HTTP redirects before scanning pages so canonical URLs are evaluated correctly. |

Input Example

{
  "sites": [
    {
      "homepageUrl": "https://vercel.com",
      "privacyPolicyUrl": "https://vercel.com/legal/privacy-policy",
      "cookiePolicyUrl": "",
      "region": "EU",
      "consentMode": ""
    }
  ],
  "delivery": "dataset",
  "snapshotKey": "privacy-cookie-compliance-snapshots",
  "concurrency": 2,
  "batchDelayMs": 500,
  "requestTimeoutSecs": 20,
  "followRedirects": true,
  "dryRun": false
}

Output

| Field | Type | Description |
| --- | --- | --- |
| meta | object | Run-level summary: totals, severity counts, and executive summary |
| alerts | array | Findings raised by the scan, one entry per issue |
| results | array | One compliance summary row per scanned site |
| alerts[].siteUrl | string (url) | Site the alert applies to |
| alerts[].severity | string | One of critical, high, watch, info |
| alerts[].type | string | Machine-readable alert category |
| alerts[].message | string | Human-readable description of the finding |
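To make the severity-scored output actionable, one might tally alerts by severity before deciding on follow-up. A minimal sketch against the alert fields above (the sample payload is illustrative):

```python
from collections import Counter

# Count alerts per severity level from the actor's output payload.
def severity_counts(output):
    return Counter(a["severity"] for a in output.get("alerts", []))

output = {"alerts": [
    {"siteUrl": "https://a.example", "severity": "high",
     "type": "banner_missing", "message": "No cookie banner detected"},
    {"siteUrl": "https://b.example", "severity": "watch",
     "type": "policy_stale", "message": "Policy older than 12 months"},
    {"siteUrl": "https://b.example", "severity": "high",
     "type": "no_consent_framework", "message": "No consent framework found"},
]}
counts = severity_counts(output)  # Counter({'high': 2, 'watch': 1})
```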

Output Example

{
  "meta": {
    "generatedAt": "2026-06-01T12:00:00.000Z",
    "totals": {
      "total": 2,
      "initial": 0,
      "ok": 1,
      "changed": 1,
      "error": 0,
      "compliant": 1,
      "partial": 1,
      "non_compliant": 1,
      "unknown": 0
    },
    "severityCounts": {
      "critical": 0,
      "high": 1,
      "watch": 1,
      "info": 0
    },
    "alertCount": 3,
    "executiveSummary": {
      "overallStatus": "attention_needed",
      "brief": "2 of 2 site(s) need compliance attention. Top issue: Cookie banner presence changed since last run.",
      "totals": {
        "total": 2,
        "initial": 0,
        "ok": 1,
        "changed": 1,
        "error": 0,
        "compliant": 1,
        "partial": 1,
        "non_compliant": 1,
        "unknown": 0
      },
      "severityCounts": {
        "critical": 0,
        "high": 1,
        "watch": 1,
        "info": 0
      }
    }
  }
}

API Usage

Run this actor programmatically using the Apify API. Replace YOUR_API_TOKEN with your token from Apify Console → Settings → Integrations.

cURL

curl -X POST "https://api.apify.com/v2/acts/taroyamada~privacy-cookie-compliance-scanner/run-sync-get-dataset-items?token=YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "sites": [
      {
        "homepageUrl": "https://vercel.com",
        "privacyPolicyUrl": "https://vercel.com/legal/privacy-policy",
        "cookiePolicyUrl": "",
        "region": "EU",
        "consentMode": ""
      }
    ],
    "delivery": "dataset",
    "snapshotKey": "privacy-cookie-compliance-snapshots",
    "concurrency": 2,
    "batchDelayMs": 500,
    "requestTimeoutSecs": 20,
    "followRedirects": true,
    "dryRun": false
  }'

Python

from apify_client import ApifyClient

client = ApifyClient("YOUR_API_TOKEN")
run = client.actor("taroyamada/privacy-cookie-compliance-scanner").call(run_input={
    "sites": [
        {
            "homepageUrl": "https://vercel.com",
            "privacyPolicyUrl": "https://vercel.com/legal/privacy-policy",
            "cookiePolicyUrl": "",
            "region": "EU",
            "consentMode": ""
        }
    ],
    "delivery": "dataset",
    "snapshotKey": "privacy-cookie-compliance-snapshots",
    "concurrency": 2,
    "batchDelayMs": 500,
    "requestTimeoutSecs": 20,
    "followRedirects": True,
    "dryRun": False
})
for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item)

JavaScript / Node.js

import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'YOUR_API_TOKEN' });
const run = await client.actor('taroyamada/privacy-cookie-compliance-scanner').call({
    "sites": [
        {
            "homepageUrl": "https://vercel.com",
            "privacyPolicyUrl": "https://vercel.com/legal/privacy-policy",
            "cookiePolicyUrl": "",
            "region": "EU",
            "consentMode": ""
        }
    ],
    "delivery": "dataset",
    "snapshotKey": "privacy-cookie-compliance-snapshots",
    "concurrency": 2,
    "batchDelayMs": 500,
    "requestTimeoutSecs": 20,
    "followRedirects": true,
    "dryRun": false
});
const { items } = await client.dataset(run.defaultDatasetId).listItems();
console.log(items);

Tips & Limitations

  • Schedule weekly runs against your production domains to catch config drift.
  • Use webhook delivery to pipe findings into your SIEM (Splunk, Datadog, Elastic).
  • For CI integration, block releases on critical severity findings using exit codes.
  • Combine with ssl-certificate-monitor for layered cert + headers coverage.
  • Findings include links to official remediation docs — share with dev teams via the webhook payload.
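The CI-gating tip above can be sketched as a small gate script. The exit-code convention and severity ranking are assumptions for illustration, not actor behavior:

```python
# Hypothetical CI gate: return a non-zero exit code when any finding meets
# the blocking threshold, so a CI step can fail the release.
SEVERITY_RANK = {"info": 0, "watch": 1, "high": 2, "critical": 3}

def gate(alerts, block_at="critical"):
    threshold = SEVERITY_RANK[block_at]
    blocking = [a for a in alerts
                if SEVERITY_RANK.get(a["severity"], 0) >= threshold]
    return 1 if blocking else 0

exit_code = gate([{"severity": "high"}, {"severity": "critical"}])  # 1: block the release
```

In a pipeline you would pass `exit_code` to `sys.exit()` after fetching the alerts from the dataset.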

FAQ

Is running this against a third-party site legal?

Passive public-header scanning is generally permitted, but follow your own compliance policies. Only scan sites you have authorization for.

How often should I scan?

Weekly for production domains; daily if you have high config-change velocity.

Can I export to a compliance tool?

Use webhook delivery or Dataset API — formats map well to Drata, Vanta, OneTrust import templates.
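As an illustrative bridge to such tools, alert rows can be flattened to CSV. The column set below mirrors the alert schema in this README; it is not an official import template for any vendor:

```python
import csv
import io

# Flatten alert rows into CSV text for upload into a compliance tool.
def alerts_to_csv(alerts):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["siteUrl", "severity", "type", "message"])
    writer.writeheader()
    writer.writerows(alerts)
    return buf.getvalue()

csv_text = alerts_to_csv([
    {"siteUrl": "https://a.example", "severity": "high",
     "type": "banner_missing", "message": "No cookie banner detected"},
])
```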

Is this a penetration test?

No — this actor performs passive compliance scanning only. No exploitation, fuzzing, or auth bypass.

Does this qualify as a SOC2 control?

This actor produces evidence artifacts suitable for SOC2 CC7.1 (continuous monitoring). It is not itself a SOC2 certification.

Cost

Pay Per Event:

  • actor-start: $0.01 (flat fee per run)
  • dataset-item: $0.003 per output item

Example: 1,000 items = $0.01 + (1,000 × $0.003) = $3.01
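The arithmetic above generalizes to a one-line estimator (prices taken from the event list; Decimal avoids float rounding in money math):

```python
from decimal import Decimal

# Estimate run cost: flat actor-start fee plus a per-item charge.
def estimate_cost(item_count, start_fee=Decimal("0.01"), per_item=Decimal("0.003")):
    return start_fee + item_count * per_item

cost = estimate_cost(1000)  # Decimal('3.010'), i.e. $3.01
```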

No subscription required — you only pay for what you use.