🛡️ Open Source CVE Scraper
Pricing
from $9.00 / 1,000 results
Scrape GitHub Security Advisories and OSV for package vulnerabilities. Extract CVSS scores, affected version ranges, and patch details for integration into your tooling.
Developer: 太郎 山田
Actor stats: 0 bookmarked · 1 total user · 1 monthly active user
Last modified: 7 hours ago
🔒 OSS Vulnerability Monitor
Monitor open-source packages for known security vulnerabilities using OSV and GitHub Security Advisories. One clean summary row per package — severity-first, no brittle scraping. Supports npm, PyPI, Go, Maven, Cargo, and more.
Store Quickstart
Run this actor with your target input. Results appear in the Apify Dataset and can be piped to webhooks for real-time delivery. Use `dryRun` to validate your input before committing to a schedule.
Key Features
- Severity-first output — CRITICAL → HIGH → MEDIUM → LOW ranking per package
- CVSS v3.1 base scores — computed directly from CVSS vector strings (no NVD key needed)
- Affected version ranges — SemVer ranges showing exactly which versions are impacted
- Fix guidance — `fixedIn` version(s) per vulnerability, aggregated into `remediationSummary`
- Multi-ecosystem — npm, PyPI, Go modules, Maven, Cargo, NuGet, RubyGems, and more
- Mixed-ecosystem input — scan packages from different ecosystems in one run
Use Cases
| Who | Why |
|---|---|
| Developers | Automate recurring data fetches without building custom scrapers |
| Data teams | Pipe structured output into analytics warehouses |
| Ops teams | Monitor changes via webhook alerts |
| Product managers | Track competitor/market signals without engineering time |
Input
| Field | Type | Default | Description |
|---|---|---|---|
| packages | array | prefilled | Packages to scan. Each entry is either a plain string (package name, defaults to the ecosystem below) or an object with {name, …} |
| ecosystem | string | "npm" | Ecosystem used for plain-string package names. OSV-supported values: npm, PyPI, Go, Maven, NuGet, Cargo, RubyGems, Packagist, … |
| minSeverity | string | "ALL" | Only include vulnerabilities at or above this severity level. CRITICAL > HIGH > MEDIUM > LOW. |
| maxVulnsPerPackage | integer | 20 | Cap the number of individual vulnerability records returned per package (0 = unlimited). |
| concurrency | integer | 5 | Number of parallel OSV API requests |
| timeoutMs | integer | 15000 | Per-request timeout in milliseconds |
| delivery | string | "dataset" | Where to send results: Apify dataset or webhook |
| webhookUrl | string | — | Webhook URL to POST results to (if delivery=webhook) |
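The `minSeverity` option keeps a vulnerability only when its severity is at or above the threshold. A rough sketch of that behavior (the names here are illustrative, not the actor's internals):

```python
# Severity ordering used for the minSeverity filter.
SEVERITY_ORDER = {"LOW": 1, "MEDIUM": 2, "HIGH": 3, "CRITICAL": 4}

def passes_min_severity(vuln_severity: str, min_severity: str) -> bool:
    """True when vuln_severity is at or above min_severity.
    minSeverity="ALL" disables filtering entirely."""
    if min_severity == "ALL":
        return True
    # Unknown severities rank below LOW and are filtered out.
    return SEVERITY_ORDER.get(vuln_severity, 0) >= SEVERITY_ORDER[min_severity]
```

With `minSeverity: "HIGH"`, only HIGH and CRITICAL records survive.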
Input Example
```json
{
  "packages": ["express", "lodash", "axios"],
  "ecosystem": "npm",
  "minSeverity": "ALL",
  "maxVulnsPerPackage": 20,
  "concurrency": 5,
  "timeoutMs": 15000,
  "delivery": "dataset",
  "dryRun": false
}
```
Output
| Field | Type | Description |
|---|---|---|
| meta | object | Run metadata: generatedAt, ecosystem, minSeverity, and aggregate totals |
| results | array | One summary row per scanned package |
| results[].package | string | Package name |
| results[].ecosystem | string | Package ecosystem (e.g. npm, PyPI) |
| results[].version | string \| null | Specific version scanned, or null when all versions were checked |
| results[].totalVulns | number | Total vulnerabilities found |
| results[].criticalCount | number | Count of CRITICAL vulnerabilities |
| results[].highCount | number | Count of HIGH vulnerabilities |
| results[].mediumCount | number | Count of MEDIUM vulnerabilities |
| results[].lowCount | number | Count of LOW vulnerabilities |
| results[].unknownCount | number | Count of vulnerabilities without a severity rating |
| results[].topSeverity | string | Highest severity present for the package |
| results[].topCvssScore | number | Highest CVSS base score found |
| results[].remediationSummary | string | Aggregated upgrade guidance |
| results[].vulnerabilities | array | Individual vulnerability records |
| results[].scannedAt | string (ISO 8601) | When the package was scanned |
| results[].error | string \| null | Error message if the scan failed, else null |
Output Example
```json
{
  "meta": {
    "generatedAt": "2026-06-15T12:00:00.000Z",
    "ecosystem": "npm",
    "minSeverity": "ALL",
    "totals": {
      "scanned": 3,
      "errors": 0,
      "withVulns": 2,
      "clean": 1,
      "criticalPackages": 1,
      "highPackages": 1,
      "mediumPackages": 1,
      "lowPackages": 0,
      "totalVulnerabilities": 7
    }
  },
  "results": [
    {
      "package": "lodash",
      "ecosystem": "npm",
      "version": null,
      "totalVulns": 5,
      "criticalCount": 1,
      "highCount": 2,
      "mediumCount": 2,
      "lowCount": 0,
      "unknownCount": 0,
      "topSeverity": "CRITICAL",
      "topCvssScore": 9.1,
      "remediationSummary": "Upgrade to: 4.17.12, 4.17.21",
      "vulnerabilities": [
        {
          "id": "GHSA-jf85-cpcp-j695",
          "cveId": "CVE-2019-10744",
          "ghsaId": "GHSA-jf85-cpcp-j695",
          "summary": "Prototype Pollution in lodash",
          "severity": "CRITICAL",
          "cvssScore": 9.1,
          "cvssVector": "9.1",
          …
        }
      ]
    }
  ]
}
```
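Downstream code can triage on the summary fields alone, without touching individual vulnerability records. A small sketch, assuming rows shaped like the example above (`risky_packages` is a hypothetical helper, not part of the actor):

```python
def risky_packages(results: list[dict], min_score: float = 7.0) -> list[str]:
    """Names of packages whose topCvssScore meets the threshold, worst first.
    Packages with no score (None) are treated as not risky."""
    hits = [r for r in results if (r.get("topCvssScore") or 0) >= min_score]
    hits.sort(key=lambda r: r["topCvssScore"], reverse=True)
    return [r["package"] for r in hits]
```

Applied to the example output, only `lodash` (9.1) would clear the default 7.0 threshold.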
API Usage
Run this actor programmatically using the Apify API. Replace YOUR_API_TOKEN with your token from Apify Console → Settings → Integrations.
cURL
```bash
curl -X POST "https://api.apify.com/v2/acts/taroyamada~oss-vulnerability-monitor/run-sync-get-dataset-items?token=YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "packages": ["express", "lodash", "axios"],
    "ecosystem": "npm",
    "minSeverity": "ALL",
    "maxVulnsPerPackage": 20,
    "concurrency": 5,
    "timeoutMs": 15000,
    "delivery": "dataset",
    "dryRun": false
  }'
```
Python
```python
from apify_client import ApifyClient

client = ApifyClient("YOUR_API_TOKEN")

run = client.actor("taroyamada/oss-vulnerability-monitor").call(run_input={
    "packages": ["express", "lodash", "axios"],
    "ecosystem": "npm",
    "minSeverity": "ALL",
    "maxVulnsPerPackage": 20,
    "concurrency": 5,
    "timeoutMs": 15000,
    "delivery": "dataset",
    "dryRun": False,  # note: Python's False, not JSON's false
})

for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item)
```
JavaScript / Node.js
```javascript
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'YOUR_API_TOKEN' });

const run = await client.actor('taroyamada/oss-vulnerability-monitor').call({
  packages: ['express', 'lodash', 'axios'],
  ecosystem: 'npm',
  minSeverity: 'ALL',
  maxVulnsPerPackage: 20,
  concurrency: 5,
  timeoutMs: 15000,
  delivery: 'dataset',
  dryRun: false,
});

const { items } = await client.dataset(run.defaultDatasetId).listItems();
console.log(items);
```
Tips & Limitations
- Run nightly as part of your supply-chain monitoring to catch new vulnerabilities early.
- Pair with `oss-vulnerability-monitor` for CVE coverage layered on top of version tracking.
- For monorepos, run per-package rather than recursing — easier to triage alerts by team owner.
- Use `snapshotKey` to persist state between runs and only alert on diffs.
- Webhook delivery supports JSON payloads — pipe into your existing on-call routing.
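The snapshot-diff tip can be sketched as follows, assuming you persist a map of package name → previously seen vulnerability IDs between runs (e.g. in an Apify key-value store); the helper name is illustrative, not part of the actor:

```python
def new_vuln_ids(previous: dict[str, set], current_results: list[dict]) -> dict[str, set]:
    """Compare each package's current vulnerability IDs against a saved
    snapshot and return only the IDs not seen before (alert-worthy diffs)."""
    diffs = {}
    for row in current_results:
        ids = {v["id"] for v in row.get("vulnerabilities", [])}
        fresh = ids - previous.get(row["package"], set())
        if fresh:
            diffs[row["package"]] = fresh
    return diffs
```

Alert only when the returned dict is non-empty, then save the merged ID sets as the next snapshot.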
FAQ
Is my build slowed down?
This actor runs on Apify infrastructure, not your CI runners. No impact on build times.
What's the freshness of data?
Depends on the source registry — typically 5–60 minutes behind upstream.
Can I filter by package ecosystem?
Yes — set the top-level `ecosystem` field for plain-string package names, or pass per-package objects to mix ecosystems in a single run.
Does this work with private registries?
No — this actor targets public registries (npm, PyPI, crates.io, etc.). Private registries require credential handling that's out of scope.
Can I integrate with GitHub Actions?
Yes — call this actor via Apify API inside a workflow job, parse the JSON output, and fail the build on threshold violations.
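A hedged sketch of such a threshold gate, assuming the output shape shown in the Output Example above (the `gate` helper and its threshold parameters are illustrative, not a published interface):

```python
def gate(report: dict, max_critical: int = 0, max_high: int = 0) -> int:
    """Exit-code-style CI gate: return 1 when the run's aggregate totals
    exceed the allowed thresholds, else 0."""
    totals = report["meta"]["totals"]
    if totals.get("criticalPackages", 0) > max_critical:
        return 1
    if totals.get("highPackages", 0) > max_high:
        return 1
    return 0
```

In a workflow step, feed the actor's JSON output into this function and pass its result to `sys.exit()` to fail the build.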
Related Actors
DevOps & Tech Intel cluster — explore related Apify tools:
- 🌐 DNS Propagation Checker — Check DNS propagation across 8 global resolvers (Google, Cloudflare, Quad9, OpenDNS).
- 🔍 Subdomain Finder — Discover subdomains for any domain using Certificate Transparency logs (crt.
- 🧹 CSV Data Cleaner — Clean CSV data: trim whitespace, remove empty rows, deduplicate by columns, sort.
- 📦 NPM Package Analyzer — Analyze npm packages: download stats, dependencies, licenses, deprecation status.
- 💬 Reddit Scraper — Scrape Reddit posts and comments from any subreddit via official JSON API.
- GitHub Release & Changelog Monitor API — Track GitHub releases, tags, release notes, and changelog drift over time with one summary-first repository row per repo.
- Docs & Changelog Drift Monitor API — Monitor release notes, changelog pages, migration guides, and key docs pages with one summary-first target row per monitored repo, SDK, or product.
- Tech Events Calendar API | Conferences + CFP — Aggregate tech conferences and CFPs across multiple sources into a deduplicated event calendar for DevRel and recruiting workflows.
Cost
Pay Per Event:
- actor-start: $0.01 (flat fee per run)
- dataset-item: $0.003 per output item
Example: 1,000 items = $0.01 + (1,000 × $0.003) = $3.01
No subscription required — you only pay for what you use.
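The pricing arithmetic above reduces to a one-line formula (constants taken from the event prices listed; the function name is illustrative):

```python
def run_cost(items: int, actor_start: float = 0.01, per_item: float = 0.003) -> float:
    """Pay-per-event run cost: flat start fee plus a per-dataset-item charge."""
    return round(actor_start + items * per_item, 2)
```

A 1,000-item run therefore costs $3.01, matching the worked example.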