Documentation Drift Tracker
Extract updates from software changelogs and API pages by running automated checks to track documentation drift and feed scraped results to AI tools.

Pricing

from $9.00 / 1,000 results

Rating: 0.0 (0 reviews)

Developer: 太郎 山田 (Maintained by Community)

Actor stats

  • Bookmarked: 0
  • Total users: 1
  • Monthly active users: 1
  • Last modified: 7 hours ago

Docs & Changelog Drift Monitor API

Monitor release notes, changelog pages, migration guides, and key docs pages with one summary-first target row per monitored repo, SDK, or product.

Store Quickstart

  • Start with store-input.example.json for a reliable first run across two public targets.
  • If the output fits your workflow, switch to store-input.templates.json and choose one of:
    • Quickstart (2 targets, dataset) for first success
    • Recurring Product Docs Watchlist for recurring docs and migration monitoring
    • Webhook Docs Drift Digest when Slack, ticketing, or internal automation should receive only summary-first action items

Key Features

  • 🛠️ Developer-focused — CLI-ready JSON output for piping into build/CI tooling
  • Fast parallel scanning — Concurrent fetches with backoff for high-throughput audits
  • 📊 Changelog-aware — Detects version bumps, new releases, and deprecations between runs
  • 🔔 Alert integrations — Webhook delivery to Slack/PagerDuty/Opsgenie for on-call visibility
  • 🔒 Zero-credentials — Uses only public data — no package-registry API keys required

Use Cases

Who               Why
Developers        Automate recurring data fetches without building custom scrapers
Data teams        Pipe structured output into analytics warehouses
Ops teams         Monitor changes via webhook alerts
Product managers  Track competitor/market signals without engineering time

Input

  • targets (array, default: prefilled): List of products or repos to monitor. Each item supports id, name, repo, criticality, owner, tags, releaseNotesUrl, chan…
  • requestTimeoutSeconds (integer, default: 30): Maximum time to wait for one public source request before the actor marks the surface as failed for this run.
  • userAgent (string, optional): Custom User-Agent string for public HTTP requests. Leave empty to use the actor's default identifier.
  • maxChars (integer, default: 35000): Upper bound for extracted text per monitored surface before hashing and diff generation.
  • delivery (string, default: "dataset"): Choose whether summary-first target rows should be written to the dataset, posted to a webhook payload, or reserved for…
  • datasetMode (string, default: "changes_only"): Controls which target rows are persisted: only action-needed rows, only changed rows, or every monitored target.
  • webhookUrl (string, optional): Webhook destination for summary payload delivery when delivery is set to webhook.
  • notifyOnNoChange (boolean, default: false): If true, webhook delivery still fires even when no target crosses the change threshold in this run.
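A minimal sketch of how the delivery, datasetMode, and notifyOnNoChange fields interact, assuming the behavior described above (the helper name and exact decision logic are illustrative, not the actor's internal implementation):

```python
def should_post_webhook(delivery: str, changed_targets: int, notify_on_no_change: bool) -> bool:
    """Decide whether a run should POST its summary payload to the webhook.

    Hypothetical helper: webhook delivery only applies when delivery is
    "webhook", and a no-change run fires only if notifyOnNoChange is true.
    """
    if delivery != "webhook":
        return False
    return changed_targets > 0 or notify_on_no_change
```

With the defaults (delivery "dataset", notifyOnNoChange false), a webhook is never posted; switching delivery to "webhook" posts only on change unless notifyOnNoChange is enabled.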

Input Example

{
  "targets": [
    {
      "id": "nextjs",
      "name": "Next.js",
      "repo": "vercel/next.js",
      "criticality": "high",
      "owner": "Frontend Platform",
      "tags": ["framework", "docs"],
      "releaseNotesUrl": "https://github.com/vercel/next.js/releases.atom",
      "migrationGuideUrl": "https://nextjs.org/docs/app/guides/upgrading/version-16",
      "docsPages": [
        {
          "id": "nextjs-caching",
          "name": "Caching Guide",
          "url": "https://nextjs.org/docs/app/guides/caching"
        }
      ]
    }
  ],
  "delivery": "dataset",
  "datasetMode": "changes_only",
  "snapshotKey": "docs-changelog-drift-nextjs",
  "diffMode": "line_summary",
  "summaryMaxLines": 12,
  "concurrency": 2
}

Output

  • meta (object)
  • recurringDigest (object)
  • actionNeeded (array)
  • results (array)
  • results[].targetId (string)
  • results[].targetName (string)
  • results[].repo (string)
  • results[].repoUrl (string, URL)
  • results[].criticality (string)
  • results[].owner (string)
  • results[].tags (array)
  • results[].status (string)
  • results[].severity (string)
  • results[].reason (string)
  • results[].executiveSummary (string)
  • results[].recommendedActions (array)
  • results[].signals (array)
  • results[].latestMarkers (array)
  • results[].targetSummary (object)
  • results[].changes (array)
  • results[].surfaces (array)
  • results[].checkedAt (timestamp)
  • results[].error (null)
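A small sketch of consuming these rows downstream, assuming a "status" vocabulary that includes an "action_needed" value (the actual status strings are defined by the actor and may differ):

```python
def action_needed_rows(results: list[dict]) -> list[dict]:
    """Filter output rows down to those flagged as needing action.

    Hypothetical consumer-side helper: keys match the output schema above,
    but the "action_needed" status value is an assumption.
    """
    return [row for row in results if row.get("status") == "action_needed"]
```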

Output Example

{
  "meta": {
    "generatedAt": "2026-04-04T14:33:02.189Z",
    "now": "2026-04-04T14:33:02.182Z",
    "input": {
      "targetCount": 2,
      "surfaceCount": 6,
      "delivery": "dataset",
      "datasetMode": "changes_only",
      "diffMode": "line_summary",
      "summaryMaxLines": 12,
      "concurrency": 2,
      "batchDelayMs": 0,
      "dryRun": false
    },
    "snapshot": {
      "key": "docs-changelog-drift-monitor-local",
      "loadedFrom": "local",
      "savedTo": "local"
    },
    "warnings": [
      "surface(nextjs-release-notes): includePatterns is empty, full extracted page text will be monitored",
      "surface(nextjs-migration-guide): includePatterns is empty, full extracted page text will be monitored",
      "surface(nextjs-caching): includePatterns is empty, full extracted page text will be monitored",
      "surface(nextjs-routing): includePatterns is empty, full extracted page text will be monitored",
      "surface(crawlee-changelog): includePatterns is empty, full extracted page text will be monitored",
      "surface(crawlee-upgrade): includePatterns is empty, full extracted page text will be monitored"
    ],
    "totals": {
      "targets": 2,
      "monitoredSurfaces": 6,
      "changedTargets": 0,
      "initialTargets": 2,
      "unchangedTargets": 0,
      "partialTargets": 0,
      "errorTargets": 0,
      "actionNeededTargets": 0,
      "changedSurfaces": 0,
      "initialSurfaces": 6,
      "unchangedSurfaces": 0
    }
  }
}

API Usage

Run this actor programmatically using the Apify API. Replace YOUR_API_TOKEN with your token from Apify Console → Settings → Integrations.

cURL

curl -X POST "https://api.apify.com/v2/acts/taroyamada~docs-changelog-drift-monitor/run-sync-get-dataset-items?token=YOUR_API_TOKEN" \
-H "Content-Type: application/json" \
-d '{ "targets": [ { "id": "nextjs", "name": "Next.js", "repo": "vercel/next.js", "criticality": "high", "owner": "Frontend Platform", "tags": ["framework", "docs"], "releaseNotesUrl": "https://github.com/vercel/next.js/releases.atom", "migrationGuideUrl": "https://nextjs.org/docs/app/guides/upgrading/version-16", "docsPages": [ { "id": "nextjs-caching", "name": "Caching Guide", "url": "https://nextjs.org/docs/app/guides/caching" } ] } ], "delivery": "dataset", "datasetMode": "changes_only", "snapshotKey": "docs-changelog-drift-nextjs", "diffMode": "line_summary", "summaryMaxLines": 12, "concurrency": 2 }'

Python

from apify_client import ApifyClient

client = ApifyClient("YOUR_API_TOKEN")

run = client.actor("taroyamada/docs-changelog-drift-monitor").call(run_input={
    "targets": [
        {
            "id": "nextjs",
            "name": "Next.js",
            "repo": "vercel/next.js",
            "criticality": "high",
            "owner": "Frontend Platform",
            "tags": ["framework", "docs"],
            "releaseNotesUrl": "https://github.com/vercel/next.js/releases.atom",
            "migrationGuideUrl": "https://nextjs.org/docs/app/guides/upgrading/version-16",
            "docsPages": [
                {
                    "id": "nextjs-caching",
                    "name": "Caching Guide",
                    "url": "https://nextjs.org/docs/app/guides/caching"
                }
            ]
        }
    ],
    "delivery": "dataset",
    "datasetMode": "changes_only",
    "snapshotKey": "docs-changelog-drift-nextjs",
    "diffMode": "line_summary",
    "summaryMaxLines": 12,
    "concurrency": 2
})

for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item)

JavaScript / Node.js

import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'YOUR_API_TOKEN' });

const run = await client.actor('taroyamada/docs-changelog-drift-monitor').call({
    targets: [
        {
            id: 'nextjs',
            name: 'Next.js',
            repo: 'vercel/next.js',
            criticality: 'high',
            owner: 'Frontend Platform',
            tags: ['framework', 'docs'],
            releaseNotesUrl: 'https://github.com/vercel/next.js/releases.atom',
            migrationGuideUrl: 'https://nextjs.org/docs/app/guides/upgrading/version-16',
            docsPages: [
                {
                    id: 'nextjs-caching',
                    name: 'Caching Guide',
                    url: 'https://nextjs.org/docs/app/guides/caching',
                },
            ],
        },
    ],
    delivery: 'dataset',
    datasetMode: 'changes_only',
    snapshotKey: 'docs-changelog-drift-nextjs',
    diffMode: 'line_summary',
    summaryMaxLines: 12,
    concurrency: 2,
});

const { items } = await client.dataset(run.defaultDatasetId).listItems();
console.log(items);

Tips & Limitations

  • Run nightly as part of your supply-chain monitoring to catch upstream changelog and docs changes early.
  • Pair with oss-vulnerability-monitor for CVE coverage layered on top of version tracking.
  • For monorepos, run per-package rather than recursing — easier to triage alerts by team owner.
  • Use snapshotKey to persist between runs and only alert on diffs.
  • Webhook delivery supports JSON payloads — pipe into your existing on-call routing.
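The snapshotKey tip above boils down to: hash each surface's extracted text, compare against the hash stored from the previous run, and only report a change when they differ. A minimal sketch of that idea, using an in-memory dict as a stand-in for the actor's persisted snapshot store:

```python
import hashlib

def detect_drift(snapshot: dict, surface_id: str, text: str) -> bool:
    """Return True when a surface's extracted text changed since last run.

    `snapshot` maps surface ids to content hashes (a simplified stand-in
    for the actor's snapshotKey-backed store). The first run records a
    baseline and reports no change; later runs diff against it.
    """
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
    previous = snapshot.get(surface_id)
    snapshot[surface_id] = digest
    return previous is not None and previous != digest
```

This is why the first run over a new snapshotKey classifies every target as "initial" rather than "changed": there is no baseline to diff against yet.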

FAQ

Is my build slowed down?

This actor runs on Apify infrastructure, not your CI runners. No impact on build times.

What's the freshness of data?

Depends on the source registry — typically 5–60 minutes behind upstream.

Can I filter by package ecosystem?

Yes — most DevOps actors accept an ecosystem or package-manager filter in their input schema.

Does this work with private registries?

No — this actor targets public registries (npm, PyPI, crates.io, etc.). Private registries require credential handling that's out of scope.

Can I integrate with GitHub Actions?

Yes — call this actor via Apify API inside a workflow job, parse the JSON output, and fail the build on threshold violations.
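A sketch of the gating step from that workflow, assuming you fail the job based on the `severity` field from the output schema (the severity ordering and threshold logic here are illustrative assumptions, not part of the actor):

```python
def ci_gate(items: list[dict], fail_at: str = "high") -> int:
    """Return a CI exit code from actor output rows.

    Exits nonzero when any row's severity reaches the threshold.
    The severity ranking below is an assumed ordering for illustration.
    """
    order = {"low": 0, "medium": 1, "high": 2, "critical": 3}
    threshold = order[fail_at]
    violations = [r for r in items if order.get(r.get("severity", "low"), 0) >= threshold]
    return 1 if violations else 0
```

In a GitHub Actions step you would fetch the dataset items via the API, feed them to a gate like this, and use its return value as the process exit code.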


Cost

Pay Per Event:

  • actor-start: $0.01 (flat fee per run)
  • dataset-item: $0.003 per output item

Example: 1,000 items = $0.01 + (1,000 × $0.003) = $3.01
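The example above can be checked with a one-line cost function using the listed rates:

```python
def run_cost(items: int, actor_start: float = 0.01, per_item: float = 0.003) -> float:
    """Pay-per-event cost of one run: flat start fee plus a per-item fee."""
    return round(actor_start + items * per_item, 2)
```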

No subscription required — you only pay for what you use.