📦 NPM Package Scraper
Extract NPM package details, dependencies, and license data. Track weekly downloads and deprecation flags using the official registry API.
Pricing: Pay per event
Developer: 太郎 山田
Last modified: 2 days ago
NPM Package Intelligence API | Downloads, Dependencies & Licenses
Analyze public npm packages with official npm registry and downloads APIs — no npmjs.com HTML scraping, login, or API key required. Ideal for dependency shortlisting, license audits, maintainer checks, and release-cadence research.
Store Quickstart
- Start with 3–5 exact package names in `packages` for the cleanest first run.
- Add `searchTerm` only when you need discovery, and keep `searchSize` around 10–20 until you know the category you want.
- Use `dryRun: true` to validate webhook or dataset delivery before a larger audit.
- After the first useful run, move the account to the recurring watchlist template, then use the webhook handoff template for release alerts or downstream actions.
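The quickstart refers to `searchTerm`, `searchSize`, and `dryRun`, which do not appear in the Input table below; treating those field names as given, a discovery-style dry run might look like this:

```json
{
  "searchTerm": "markdown parser",
  "searchSize": 15,
  "dryRun": true
}
```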
Key Features
- 📦 Full package metadata — version, description, author, homepage, repo
- 📥 Weekly download stats — Usage signal for popularity
- 🔗 Dependencies tree — Direct deps with version specs
- ⚖️ License info — MIT, Apache-2.0, GPL-3.0 detection
- ⚠️ Deprecation detection — Flags deprecated packages
- 🔑 No API key needed — Uses official npm registry
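As a sketch of where these fields come from, the npm registry serves a package document ("packument") per package; the sample below is heavily abbreviated, but the field paths match the registry's real shape:

```python
# Sketch: deriving the advertised fields from an npm registry packument.
# The document below is abbreviated sample data; real registry responses
# carry many more keys.
packument = {
    "name": "left-pad",
    "dist-tags": {"latest": "1.3.0"},
    "versions": {
        "1.3.0": {
            "description": "String left pad",
            "license": "WTFPL",
            "deprecated": "use String.prototype.padStart()",
            "dependencies": {},
        }
    },
}

latest = packument["dist-tags"]["latest"]
meta = packument["versions"][latest]
summary = {
    "name": packument["name"],
    "version": latest,
    "description": meta.get("description"),
    "license": meta.get("license"),
    "dependencies": meta.get("dependencies", {}),
    "deprecated": bool(meta.get("deprecated")),  # registry stores a message string
}
print(summary)
```

Note that the registry marks deprecation with a free-text message on the version, which is why it is coerced to a boolean flag here.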
Use Cases
| Who | Why |
|---|---|
| DevOps teams | Dependency security audits |
| Engineering leaders | Track tech stack across projects |
| OSS maintainers | Competitor package analysis |
| License compliance | Verify all deps are commercially usable |
| Recruiters/founders | Research packages/maintainers for hiring |
Input
| Field | Type | Default | Description |
|---|---|---|---|
| packages | string[] | (required) | npm package names (max 100) |
| includeDownloads | boolean | true | Weekly download stats |
| includeDeprecated | boolean | false | Include deprecated warnings |
Input Example
```json
{
  "packages": ["express", "react", "axios"],
  "includeDownloads": true,
  "includeDeprecated": false
}
```
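Since `packages` accepts at most 100 names per run, larger dependency lists need to be split into batches client-side; a minimal sketch:

```python
# Split a large dependency list into run-sized batches, matching the
# 100-package cap on the `packages` input field.
def chunk(names, size=100):
    for start in range(0, len(names), size):
        yield names[start:start + size]

all_deps = [f"pkg-{n}" for n in range(250)]  # hypothetical package names
batches = list(chunk(all_deps))
print([len(batch) for batch in batches])  # [100, 100, 50]
```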
Output
| Field | Type | Description |
|---|---|---|
| name | string | Package name |
| version | string | Latest version |
| description | string | Package description |
| author | string | Package author |
| license | string | SPDX license identifier |
| dependencies | object | Direct dependencies with version specs |
| downloads | object | Download stats by period (if includeDownloads) |
| downloads.lastWeek | integer | Downloads in the last 7 days |
| downloads.lastMonth | integer | Downloads in the last 30 days |
| deprecated | boolean | Whether the latest version is deprecated |
| maintainers | string[] | Maintainer usernames |
| repository | string | Source repo URL |
Output Example
```json
{
  "name": "express",
  "version": "4.21.2",
  "description": "Fast, unopinionated web framework",
  "author": "TJ Holowaychuk",
  "license": "MIT",
  "dependencies": {"body-parser": "~1.20.0"},
  "weeklyDownloads": 32000000,
  "deprecated": false
}
```
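For the license-audit use case, results in this shape can be checked against an allowlist in a few lines; the allowlist and sample items below are illustrative assumptions, not part of the actor's output:

```python
# Flag dataset items whose license falls outside an allowlist, or which
# are deprecated. Item shape follows the Output Example above.
ALLOWED = {"MIT", "Apache-2.0", "BSD-2-Clause", "BSD-3-Clause", "ISC"}

items = [
    {"name": "express", "license": "MIT", "deprecated": False},
    {"name": "sample-gpl-lib", "license": "GPL-3.0", "deprecated": False},
]

flagged = [
    item["name"]
    for item in items
    if item.get("license") not in ALLOWED or item.get("deprecated")
]
print(flagged)  # packages needing manual review
```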
API Usage
Run this actor programmatically using the Apify API. Replace YOUR_API_TOKEN with your token from Apify Console → Settings → Integrations.
cURL
```bash
curl -X POST "https://api.apify.com/v2/acts/taroyamada~npm-package-intelligence/run-sync-get-dataset-items?token=YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"packages": ["express", "react", "axios"], "includeDownloads": true, "includeDeprecated": false}'
```
Python
```python
from apify_client import ApifyClient

client = ApifyClient("YOUR_API_TOKEN")

run = client.actor("taroyamada/npm-package-intelligence").call(run_input={
    "packages": ["express", "react", "axios"],
    "includeDownloads": True,
    "includeDeprecated": False,
})

for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item)
```
JavaScript / Node.js
```javascript
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'YOUR_API_TOKEN' });

const run = await client.actor('taroyamada/npm-package-intelligence').call({
    packages: ['express', 'react', 'axios'],
    includeDownloads: true,
    includeDeprecated: false,
});

const { items } = await client.dataset(run.defaultDatasetId).listItems();
console.log(items);
```
Tips & Limitations
- Audit your dependency list monthly to catch new CVEs and abandoned packages.
- Track competitor packages' download trends for market intel.
- Combine with GitHub data for full open-source intelligence.
- Use in CI/CD to fail builds when vulnerabilities are introduced.
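The CI/CD tip above can be sketched as a small gate script; the inline items stand in for the actor's dataset output, and `request` is just sample data:

```python
# Collect deprecated packages from actor results and fail the build
# when any are found.
items = [
    {"name": "express", "deprecated": False},
    {"name": "request", "deprecated": True},
]

deprecated = [item["name"] for item in items if item.get("deprecated")]
if deprecated:
    print("Deprecated dependencies:", ", ".join(deprecated))
    # In a real pipeline: raise SystemExit(1) here to fail the build.
```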
FAQ
Where do download stats come from?
npm's official downloads API (api.npmjs.org). Weekly counts are a reliable popularity signal.
Can I get security vulnerability data?
Not directly. Pair with npm audit or Snyk for security scanning.
Does it work with private packages?
No — only public npm registry. Private packages require authentication.
Deprecated packages?
Flagged via the 'deprecated' field. Useful for migration planning.
Related Actors
Pair this actor with other flagship intelligence APIs in the same portfolio:
- PyPI Package Intelligence API — audit Python packages with release history, dependency declarations, and optional OSV signals.
- Docker Hub Image Intelligence API — inspect public container repositories, tags, pulls, and star signals for supply-chain research.
- Shopify Store Intelligence API — add public storefront and catalog context when evaluating ecommerce stacks built on these packages.
Pricing & Cost Control
Apify Store pricing is usage-based, so total cost mainly follows how many packages you process and whether you also run discovery via `searchTerm`. Check the Store pricing card for the current per-event rates.
- Start with a short `packages` list or keep `searchSize` small for discovery runs.
- Turn `includeDownloads` on only when popularity signals matter.
- Use `dryRun: true` to validate the input before larger audits.
- Prefer dataset delivery while iterating; switch to webhooks once the payload shape is stable.
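A back-of-envelope cost model under per-event pricing can help size an audit; the rate below is a placeholder, not the actual Store price:

```python
# Rough monthly cost estimate under usage-based, per-event pricing.
PRICE_PER_EVENT = 0.001   # USD per package processed (hypothetical rate)
packages_per_run = 80     # size of the audited dependency list
runs_per_month = 30       # e.g. one audit per day

monthly_cost = PRICE_PER_EVENT * packages_per_run * runs_per_month
print(f"~${monthly_cost:.2f}/month")  # ~$2.40/month
```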