Product Hunt Scraper — Full Data + Maker Profiles


Scrapes Product Hunt daily/weekly/monthly TOP products with full maker profiles, social links, gallery, topics, and launch dates.

Pricing: Pay per usage

Developer: Algirdas Kolesnikovas (Maintained by Community)
Product Hunt Scraper — Full Data + Maker Profiles

Commercial‑grade Apify Actor that scrapes Product Hunt TOP products (daily/weekly/monthly or custom URL) with:

  • Full product data: name, tagline, description, topics, launch date, upvotes, comments
  • Media: thumbnail + gallery screenshots
  • Maker profiles: headline, Twitter, LinkedIn, website, founder flag
  • Monitoring script to detect when selectors break

Publish Description (Apify Store)

Track Product Hunt opportunities beyond simple scraping.
This Actor collects Product Hunt products and produces strategic outputs:

  • Product dataset (votes, topics, makers, links, metadata)
  • MAKER_NETWORK graph (who builds with whom, super makers)
  • NICHE_ANALYSIS (high-demand / low-supply topic opportunities)
  • Optional DIGEST_HTML weekly digest template (for email workflows)

Designed for founder research, market intelligence, maker discovery, and content/digest automation.

Credentials / Security

  • Recommended: productHuntApiToken (Product Hunt API v2 token)
    • Without it, the Actor falls back to browser scraping, which is slower and can be blocked by 403 responses or proxy limits.
  • Optional for digest email: sendgridApiKey + digestEmail
  • Never store or commit real credentials in repository files.
  • Revoke any key/token that was ever shared in chat or screenshots.

Data Output (per product)

Each dataset item has this shape:

```json
{
  "id": "notion-ai-2",
  "slug": "notion-ai-2",
  "rank": 1,
  "name": "Notion AI",
  "tagline": "Artificial intelligence built right into your workspace",
  "description": "Full product description...",
  "upvotes": 1842,
  "commentsCount": 143,
  "launchDate": "2023-02-22",
  "topics": ["Productivity", "Artificial Intelligence"],
  "productHuntUrl": "https://www.producthunt.com/posts/notion-ai-2",
  "websiteUrl": "https://notion.so",
  "websiteUrlRaw": "https://notion.so/?ref=producthunt",
  "thumbnailUrl": "https://ph-files.imgix.net/abc123.png",
  "gallery": ["https://ph-files.imgix.net/screenshot1.png"],
  "makers": [
    {
      "name": "Ivan Zhao",
      "headline": "Co-founder & CEO at Notion",
      "productHuntProfile": "https://www.producthunt.com/@ivanzhao",
      "twitterUrl": "https://twitter.com/ivanhzhao",
      "linkedinUrl": null,
      "websiteUrl": "https://notion.so",
      "isFounder": true
    }
  ],
  "meta": {
    "scrapedAt": "2025-06-15T08:00:00.000Z",
    "scrapeMode": "full",
    "sourceDate": "2025-06-15",
    "actorRunId": "abc123xyz"
  }
}
```
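Items of this shape can be aggregated directly for downstream analysis. As an illustrative sketch (not part of the Actor itself), here is a hypothetical post-processing helper that totals upvotes per topic, using the `topics` and `upvotes` fields from the example above:

```javascript
// Count total upvotes per topic across scraped dataset items.
// Illustrative only — assumes items shaped like the example above.
function upvotesByTopic(items) {
  const totals = {};
  for (const item of items) {
    for (const topic of item.topics ?? []) {
      totals[topic] = (totals[topic] ?? 0) + (item.upvotes ?? 0);
    }
  }
  return totals;
}

// Example usage with two minimal items:
const sample = [
  { topics: ['Productivity', 'AI'], upvotes: 100 },
  { topics: ['AI'], upvotes: 50 },
];
console.log(upvotesByTopic(sample)); // { Productivity: 100, AI: 150 }
```

The same pattern extends to comments, launch dates, or maker counts; only the field names from the dataset shape are assumed.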

Input (Apify UI)

Defined in .actor/INPUT_SCHEMA.json:

  • mode: "daily" | "weekly" | "monthly" | "custom_url" (default: daily)
  • maxProducts: integer, max 500 (default: 50)
  • scrapeMakerProfiles: boolean (default: true)
  • productHuntApiToken: API token for fast/robust GraphQL mode
  • proxyConfig: Apify proxy config (default: residential)
  • customUrl: used only when mode = "custom_url"
  • digestEmail: target email for weekly digest output
  • sendgridApiKey: SendGrid API key for digest delivery
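Putting those fields together, a weekly run with maker profiles and digest delivery might use an input like the following (token values and the email address are placeholders, and the proxy shape follows Apify's standard proxy-configuration format):

```json
{
  "mode": "weekly",
  "maxProducts": 100,
  "scrapeMakerProfiles": true,
  "productHuntApiToken": "<YOUR_PH_API_TOKEN>",
  "proxyConfig": { "useApifyProxy": true, "apifyProxyGroups": ["RESIDENTIAL"] },
  "digestEmail": "you@example.com",
  "sendgridApiKey": "<YOUR_SENDGRID_KEY>"
}
```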

How to run locally

```bash
cd producthunt-scraper
npm install

# Simple test run
apify run --input='{
  "mode": "daily",
  "maxProducts": 5,
  "scrapeMakerProfiles": false
}'
```

Monitoring

  • monitoring/healthcheck.js runs the scraper with a small sample and validates results using src/validators.js.
  • It writes HEALTH_REPORT_YYYY-MM-DD to the default Key-Value Store and optionally sends a Slack alert if SLACK_WEBHOOK_URL is set.
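The contents of src/validators.js are not shown here, but a minimal item validator in that spirit might check required fields and types like this (illustrative sketch only; the real implementation may differ):

```javascript
// Illustrative sketch of a dataset-item validator in the spirit of
// src/validators.js — field names come from the dataset shape above.
function validateItem(item) {
  const errors = [];
  for (const field of ['id', 'name', 'productHuntUrl']) {
    if (typeof item[field] !== 'string' || item[field].length === 0) {
      errors.push(`missing or empty string field: ${field}`);
    }
  }
  if (!Number.isInteger(item.upvotes) || item.upvotes < 0) {
    errors.push('upvotes must be a non-negative integer');
  }
  if (!Array.isArray(item.makers)) {
    errors.push('makers must be an array');
  }
  return { valid: errors.length === 0, errors };
}
```

A healthcheck can then scrape a small sample, run each item through such a validator, and alert when the error rate crosses a threshold — which is how broken selectors typically surface.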

Extra KV Outputs

  • MAKER_NETWORK: maker collaboration graph + super maker stats
  • NICHE_ANALYSIS: niche/saturated/trending topic breakdown
  • DIGEST_HTML: generated weekly digest template (when digest inputs are set)
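Conceptually, the MAKER_NETWORK graph can be derived from the product dataset alone, by pairing makers who appear on the same launch. A simplified sketch of the idea (the Actor's actual output format may differ):

```javascript
// Simplified sketch of deriving a maker collaboration graph from
// dataset items — counts how often each pair of makers launches together.
function buildMakerNetwork(items) {
  const edges = {}; // "nameA|nameB" -> number of shared launches
  for (const item of items) {
    const names = (item.makers ?? []).map((m) => m.name).sort();
    for (let i = 0; i < names.length; i++) {
      for (let j = i + 1; j < names.length; j++) {
        const key = `${names[i]}|${names[j]}`;
        edges[key] = (edges[key] ?? 0) + 1;
      }
    }
  }
  return edges;
}
```

"Super makers" then fall out of the same data as the nodes with the highest degree or launch count.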