Pain Point Discovery

Pricing: from $29.00 / 1,000 results
Developer: Sumit Sharma (Maintained by Community)
Actor stats: 0 bookmarks · 4 total users · 2 monthly active users · last modified 4 days ago

Crawls 7 platforms to find user pain points, using intelligent scoring and trend detection to help founders make data-driven product decisions.

🎯 Pain Point Discovery Actor

Automatically discover customer pain points, complaints, and feedback from Reddit, Hacker News, Stack Overflow, GitHub, Product Hunt, Indie Hackers, and Y Combinator.

Find exactly what your target customers are struggling with, what they hate about existing solutions, and what features they wish existed - all in one automated workflow.


📖 What This Actor Does

This Actor crawls 7 major platforms where people discuss their problems, frustrations, and needs. It uses AI-powered signal intelligence to:

  1. Scrape discussions from Reddit, Hacker News, GitHub Issues, Stack Overflow, Product Hunt, Indie Hackers, and YC
  2. Detect pain points using flexible keyword matching (like Google search - finds relevant discussions even with slight wording differences)
  3. Score & rank pain points by urgency, engagement, and business value
  4. Classify signals into types: complaints, feature requests, help requests, workarounds, etc.
  5. Enrich data with sentiment analysis, category classification, and trend detection
  6. Export results to CSV, JSON, and interactive HTML dashboard with filters

Use Cases:

  • 🚀 Founders: Find product-market fit by discovering real pain points
  • 💡 Product Managers: Identify feature requests and user frustrations
  • 📊 Market Researchers: Understand customer needs across industries
  • 🎨 Designers: Discover UX pain points in existing tools
  • 💼 Sales Teams: Find prospects actively complaining about competitors

✨ Key Features

🧠 Intelligent Pain Point Detection

  • SEO-style keyword matching: Finds discussions even if wording differs slightly
  • Multi-word flexibility: "struggling with social media" matches "I struggle to manage social media"
  • Pain language detection: Auto-detects frustration words (frustrated, broken, hate, slow, etc.)
  • Smart scoring algorithm: Ranks by engagement (upvotes, comments), recency, and keyword relevance
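
The detection and scoring approach above can be sketched in a few lines. This is a simplified illustration of the idea only (the stop-word list, crude stemming, and all weights are invented here), not the Actor's actual algorithm:

```javascript
// Illustrative sketch of flexible keyword matching + scoring.
// Not the Actor's real implementation; all helpers and weights are invented.
const STOP_WORDS = new Set(['with', 'to', 'the', 'a', 'an', 'for', 'my', 'is']);

// A multi-word keyword matches when every significant word (crudely stemmed
// to its first 5 letters) appears in the text, so "struggling with social
// media" also matches "I struggle to manage social media".
function matchesKeyword(text, keyword) {
  const haystack = text.toLowerCase();
  return keyword
    .toLowerCase()
    .split(/\s+/)
    .filter(w => !STOP_WORDS.has(w))
    .every(word => haystack.includes(word.slice(0, 5)));
}

const PAIN_WORDS = ['frustrated', 'broken', 'hate', 'slow', 'struggling'];

// Rank by engagement (upvotes, comments), recency, and pain language.
function scorePainPoint(post) {
  let score = post.upvotes * 0.5 + post.comments;
  const ageDays = (Date.now() - post.createdAt) / 86400000;
  if (ageDays < 7) score += 10; // recency bonus for fresh discussions
  const text = post.text.toLowerCase();
  score += PAIN_WORDS.filter(w => text.includes(w)).length * 5;
  return Math.round(score);
}
```

The key design point is that matching is per-word rather than exact-phrase, which is why "Niche + Signal" keyword combinations work better than single generic words.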

📊 Signal Intelligence

  • 7 Signal Types: Complaint, Feedback, Help Request, Feature Request, Workaround, Comparison, Emotional
  • Urgency Levels: Critical, High, Medium, Low
  • Persona Detection: Identifies who is experiencing the pain (developers, marketers, founders, etc.)
  • Business Value Classification: Estimates revenue opportunity (high-value, medium-value, low-value)

🌐 Multi-Platform Coverage

  • Reddit: Subreddit communities (r/startups, r/SaaS, r/entrepreneur, etc.)
  • Hacker News: Tech discussions and "Show HN" posts
  • GitHub Issues: Real bugs and feature requests from developers
  • Stack Overflow: Technical questions and problems
  • Product Hunt: Product reviews and feedback
  • Indie Hackers: Founder struggles and bootstrapping challenges
  • Y Combinator: Startup pain points and market gaps

💾 Export Options

  • CSV: Spreadsheet-ready data for Excel/Google Sheets
  • JSON: Structured data for APIs and databases
  • HTML Dashboard: Interactive filterable table with charts
  • Clickable URLs: Every pain point links to original source

🎁 Benefits You'll Get

| Benefit | Description |
|---|---|
| ⚡ Save 100+ Hours | Automates manual Reddit/HN/forum browsing |
| 🎯 Find Hidden Opportunities | Discovers pain points you'd never find manually |
| 📊 Validated Ideas | See real people discussing real problems (with upvotes as validation) |
| 🚀 Beat Competitors | Find gaps in existing solutions before your competitors do |
| 💰 Prioritize by Value | Focus on high-urgency, high-business-value pain points first |
| 📈 Track Trends | See which problems are getting worse over time |
| 🔗 Direct Source Links | Verify every pain point with clickable URLs to original posts |

🚀 How to Use Effectively

Quick Start (3 Steps)

  1. Choose Your Sources

    • Start with Reddit only for broad coverage
    • Add Hacker News for tech/startup pain points
    • Add GitHub if targeting developers
  2. Add Keywords (see "Best Keywords" section below)

    • Use Niche + Signal combinations: "salesforce frustrated", "excel too slow"
    • NOT single words like "problem" or "issue"
  3. Run & Export

    • Default settings work for most use cases
    • Results appear in CSV, JSON, and HTML dashboard
    • Click URLs to verify pain points are real

πŸ” How to Search Effectively (READ THIS FIRST!)

⚠️ IMPORTANT: Don't Use Generic Single Words!

❌ WRONG (Gets 0 Results):

struggling
frustrated
problem
issue
slow
expensive

Why? These are too generic. Platforms block them or return millions of irrelevant results.


✅ RIGHT: Use "Niche + Problem" Formula

The Secret: Combine a specific tool/topic with a pain signal

[Tool/Topic] + [Pain Word]

🎯 Perfect Examples:

SaaS & Business Tools:

ai agent roi problem
salesforce too expensive
hubspot missing features
notion too slow
jira too complicated
slack distracting
excel too slow
crm doesn't work

Industry-Specific:

ai tools implementation difficult
machine learning frustrated
chatbot doesn't work
automation roi problem
email marketing expensive
inventory management tedious

Competitor Research:

salesforce alternatives
notion competitor
shopify struggling
monday.com overwhelming

💡 Why This Works:

| Keyword Type | Results | Quality |
|---|---|---|
| "problem" (generic) | 0 results | N/A - Too generic |
| "ai agent roi problem" | 20-50 results | ✅ High - Specific pain points |
| "salesforce frustrated" | 30-80 results | ✅ High - Real complaints |

📋 Quick Start Template:

Copy and customize these proven combinations:

{
  "keywords": [
    "YOUR_NICHE struggling with",
    "YOUR_TOOL too expensive",
    "YOUR_CATEGORY doesn't work",
    "YOUR_COMPETITOR frustrated",
    "YOUR_PROCESS too slow"
  ]
}

Real Example:

{
  "keywords": [
    "ai agent roi problem",
    "ai tools implementation difficult",
    "machine learning costs too high",
    "chatbot integration frustrated",
    "automation workflow broken"
  ]
}

🎓 Best Practices:

  1. Use 5-15 keywords (not 50+) - Quality over quantity
  2. Mix tool names + pain signals - "salesforce frustrated" beats just "frustrated"
  3. Include competitor names - Find people ready to switch
  4. Test with Reddit first - Most conversational data
  5. Review first 10 results - Adjust keywords if off-target

🔑 How to Add Best Keywords

❌ Don't Use (Too Generic - Gets 0 Results)

problem
issue
bug
slow
expensive

Why? Platforms block generic searches, or return millions of irrelevant results.

✅ Do Use (Niche + Signal - Gets 20-50 Results Per Keyword)

Use this formula:

[Tool/Topic] + [Pain Signal]

Examples by Category

SaaS Tools:

salesforce too expensive
hubspot missing features
monday.com overwhelming
notion too slow
jira too complicated
slack distracting
zoom quality issues

Business Processes:

hiring process difficult
onboarding time consuming
inventory management tedious
payroll software expensive
customer support overwhelming

Common Tools:

excel too slow
google sheets limitations
powerpoint frustrating
email marketing expensive
spreadsheet manual work

🎯 Best Keyword Patterns

| Pattern | Example | What It Finds |
|---|---|---|
| [tool] struggling | "shopify struggling" | People having trouble with Shopify |
| [tool] frustrated | "excel frustrated" | Frustrated Excel users |
| [tool] too [pain] | "crm too slow" | Performance complaints |
| [tool] doesn't | "jira doesn't work" | Broken functionality |
| [process] tedious | "reporting tedious" | Manual work that needs automation |
| [tool] alternatives | "notion alternatives" | People ready to switch |
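
These patterns are easy to expand programmatically. A small helper (our own illustration, not part of the Actor) that crosses tool names with pain signals to build a keywords array for the input:

```javascript
// Expand tools x pain signals into "Niche + Signal" keywords,
// following the patterns in the table above. Illustrative helper only.
function buildKeywords(tools, signals) {
  const keywords = [];
  for (const tool of tools) {
    for (const signal of signals) {
      keywords.push(`${tool} ${signal}`);
    }
  }
  return keywords;
}

const keywords = buildKeywords(
  ['salesforce', 'notion', 'excel'],
  ['frustrated', 'too slow', 'alternatives']
);
console.log(keywords.length); // 3 tools x 3 signals = 9 keywords
console.log(keywords[0]);     // "salesforce frustrated"
```

Keep the result within the recommended 5-15 keywords; three tools by three signals already produces nine.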

💡 Pro Tips

  1. Use 5-15 keywords (not 50+) - Quality over quantity
  2. Mix tool names + pain signals - "salesforce frustrated" beats just "frustrated"
  3. Include competitor names - Find people complaining about competitors
  4. Test with Reddit first - Most conversational data
  5. Review first 10 results - Adjust keywords if results are off-target

πŸ“ How to Configure Input (Step-by-Step Guide)

1. Data Sources

Select which platforms to scrape:

  • ✅ Reddit (Start here - most data)
  • ✅ Hacker News (Tech/startup focus)
  • ⬜ GitHub (Only if targeting developers)
  • ⬜ Stack Overflow (Technical problems only)
  • ⬜ Product Hunt (Product feedback)

Tip: Start with 1-2 sources, expand gradually

2. Subreddits (For Reddit)

Add relevant subreddit names (without r/):

startups
SaaS
entrepreneur
smallbusiness
SideProject
productmanagement

3. Pain Point Keywords

Add 5-15 Niche + Signal keywords:

struggling with social media
excel too slow
salesforce frustrated
inventory management tedious
crm doesn't work

4. Max Posts Per Source

  • 30-50: Quick test (5-10 min)
  • 100-200: Full scan (15-30 min)
  • 500+: Deep research (30-60 min)

5. Minimum Pain Point Score

  • 15-25: Broad discovery (captures more, lower quality)
  • 30-40: Balanced (recommended)
  • 50-70: High quality only (fewer results)

6. Time Range

  • day: Very fresh discussions
  • week: Recent (recommended)
  • month: Broader coverage
  • year: Historical data

7. Advanced Features

| Feature | Recommendation |
|---|---|
| Deduplication | ✅ Always enable |
| URL Validation | ✅ Enable (slower but accurate) |
| Trend Detection | ✅ Enable for market research |
| Category Classification | ✅ Enable |
| Sentiment Analysis | ✅ Enable |
| Signal Classification | ✅ Enable |

8. Performance Settings

Max Concurrency:

  • 1-2: Safe, avoids blocking (recommended)
  • 3-5: Faster, higher risk of 403 errors

Max Retries:

  • 3-5: Recommended (handles temporary errors)

GitHub Token:

  • Optional: add a personal access token to raise GitHub API rate limits and avoid 403/504 errors on the GitHub source

📊 Sample Input Configurations

For SaaS Founders (Finding Product Ideas)

{
  "sources": ["reddit", "hackernews"],
  "subreddits": ["startups", "SaaS", "entrepreneur", "smallbusiness"],
  "keywords": [
    "struggling with social media",
    "excel too slow",
    "salesforce frustrated",
    "crm doesn't work",
    "email marketing expensive"
  ],
  "maxPosts": 50,
  "minScore": 15,
  "timeRange": "week",
  "maxConcurrency": 2
}

For Developers (Finding Technical Gaps)

{
  "sources": ["github", "stackoverflow", "hackernews"],
  "keywords": [
    "api documentation confusing",
    "authentication difficult",
    "debugging nightmare",
    "deployment complicated"
  ],
  "maxPosts": 30,
  "minScore": 20
}

📤 Output & Results

What You Get

  1. CSV File - Spreadsheet with all pain points
  2. JSON File - Structured data for APIs
  3. HTML Dashboard - Interactive filterable table with charts
  4. Reports - Keyword frequency, statistics, signal distribution

Example Output

Score | Title | Source | Signal | Urgency | URL
82 | "Struggling to manage social media..." | reddit | complaint | high | https://...
76 | "Excel is painfully slow..." | reddit | complaint | critical | https://...

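If you pull the JSON results into your own script, the ranking shown above is easy to reproduce. A short sketch; the field names (score, urgency, title, url) follow the example rows but should be verified against your actual dataset schema:

```javascript
// Sort pain points by score and keep only the urgent ones.
// Field names mirror the example output above; verify against your dataset.
function topUrgentPainPoints(items, minScore = 50) {
  return items
    .filter(p => p.score >= minScore && ['high', 'critical'].includes(p.urgency))
    .sort((a, b) => b.score - a.score)
    .map(p => `${p.score} | ${p.title} | ${p.url}`);
}

const sample = [
  { score: 82, title: 'Struggling to manage social media...', urgency: 'high', url: 'https://reddit.com/...' },
  { score: 76, title: 'Excel is painfully slow...', urgency: 'critical', url: 'https://reddit.com/...' },
  { score: 30, title: 'Minor gripe', urgency: 'low', url: 'https://example.com' },
];
console.log(topUrgentPainPoints(sample).length); // 2
```
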
🔧 Troubleshooting

"0 Pain Points Found"

Fix: Use this test config:

{
  "sources": ["reddit"],
  "subreddits": ["startups"],
  "keywords": ["struggling", "slow", "expensive"],
  "minScore": 10,
  "maxPosts": 20
}

"403 Forbidden Errors"

Fix:

  1. Enable RESIDENTIAL proxies in Apify settings
  2. Lower maxConcurrency to 1-2
  3. Add GitHub token for GitHub source

"504 Gateway Timeout" (GitHub)

Fix: Add GitHub Personal Access Token in input


💡 Best Practices

  1. ✅ Start Small: Test with 1 source + 5 keywords first
  2. ✅ Review First Results: Check if data matches your needs
  3. ✅ Iterate Keywords: Adjust based on what you find
  4. ✅ Enable Signal Intelligence: Get urgency + business value insights
  5. ✅ Export to CSV: Easier to analyze in Excel/Sheets
  6. ✅ Filter HTML Dashboard: Use filters to find high-urgency pain points

🎯 Quick Checklist

Before running:

  • Selected 1-2 sources (Reddit recommended)
  • Added 5-15 Niche + Signal keywords
  • Set minScore to 15-25
  • Enabled Deduplication
  • Enabled Signal Classification
  • (Optional) Added GitHub token

Expected Results: 20-80 pain points per run


Actor Integration & Workflows

Integrating with Other Actors

You can chain this Actor with others to build powerful workflows:

Pattern 1: Pain Point → Content Analysis

// Run Pain Point Discovery Actor
const run1 = await Apify.call('your-username/pain-point-discovery', {
  sources: ['reddit'],
  keywords: ['salesforce frustrated']
});

// Get dataset results
const dataset = await Apify.openDataset(run1.defaultDatasetId);
const painPoints = await dataset.getData();

// Send to Sentiment Analysis Actor
const run2 = await Apify.call('apify/sentiment-analysis', {
  texts: painPoints.items.map(p => p.content)
});

Pattern 2: Pain Point → Lead Enrichment

// 1. Find pain points
const painPoints = await runPainPointDiscovery();

// 2. Extract author usernames
const authors = painPoints.map(p => p.author);

// 3. Run LinkedIn Profile Scraper
const profiles = await Apify.call('apify/linkedin-profile-scraper', {
  usernames: authors
});

// 4. Enrich pain points with contact info
const enrichedData = mergePainPointsWithProfiles(painPoints, profiles);

Pattern 3: Pain Point → Market Research Report

// 1. Discover pain points
const painPoints = await runPainPointDiscovery();

// 2. Generate report with OpenAI
const run2 = await Apify.call('apify/openai', {
  model: 'gpt-4',
  messages: [{
    role: 'user',
    content: `Analyze these pain points and create a market research report: ${JSON.stringify(painPoints)}`
  }]
});

Common Integration Scenarios

| Scenario | Actors to Chain | Output |
|---|---|---|
| Lead Generation | Pain Point Discovery → Email Finder → CRM Export | Qualified leads with contact info |
| Competitive Analysis | Pain Point Discovery → Competitor Scraper → Report Generator | Competitive intelligence report |
| Content Ideas | Pain Point Discovery → OpenAI → Blog Post Generator | SEO-optimized blog posts |
| Product Roadmap | Pain Point Discovery → Categorization → Notion/Airtable Export | Prioritized feature backlog |

Accessing Actor Results

Via API:

// Get latest run results
const client = new ApifyClient({ token: 'YOUR_API_TOKEN' });
const run = await client.actor('your-username/pain-point-discovery').lastRun().dataset().listItems();
console.log(run.items); // Array of pain points

Via Webhook:

// Set up webhook to trigger on completion
const run = await Apify.call('your-username/pain-point-discovery', input, {
  webhooks: [{
    eventTypes: ['ACTOR.RUN.SUCCEEDED'],
    requestUrl: 'https://your-api.com/webhook'
  }]
});

🤖 Automation with Make.com (Integromat)

Setting Up Make.com Integration

Step 1: Create Make.com Scenario

  1. Go to make.com and create new scenario
  2. Add Apify module as first step
  3. Select "Run an Actor"

Step 2: Configure Apify Module

Set your Apify connection, pick the Actor, and fill in the input JSON (see "Make.com Module Configuration" below).

Step 3: Process Results

Add modules to process the pain points:

  • Filter: Keep only high-urgency pain points
  • Router: Split by signal type or category
  • Transformer: Format data for next steps

Example Make.com Workflows

Workflow 1: Auto-Generate Lead List

Schedule (Cron) → Daily at 9 AM
  ↓
Apify: Run Actor → Run Pain Point Discovery (inputs: reddit, keywords)
  ↓
Filter: High Urgency → Keep urgency = "high"
  ↓
Apify: LinkedIn Scraper → Extract author profiles
  ↓
Hunter.io: Email Finder → Find email addresses
  ↓
Google Sheets: Add Row → Add to leads spreadsheet

Make.com JSON Blueprint:

{
  "name": "Pain Point Lead Generation",
  "flow": [
    {
      "module": "apify:RunActor",
      "data": {
        "actorId": "your-username/pain-point-discovery",
        "input": {
          "sources": ["reddit", "hackernews"],
          "keywords": ["{{1.keywords}}"],
          "minScore": 40,
          "timeRange": "day"
        }
      }
    },
    {
      "module": "builtin:Filter",
      "filter": {
        "conditions": [[{
          "a": "{{2.urgency}}",
          "o": "equal",
          "b": "high"
        }]]
      }
    },
    {
      "module": "googlesheets:addRow",
      "data": {
        "spreadsheetId": "YOUR_SHEET_ID",
        "values": {
          "Title": "{{2.title}}",
          "URL": "{{2.url}}",
          "Score": "{{2.score}}",
          "Author": "{{2.author}}"
        }
      }
    }
  ]
}

Workflow 2: Weekly Market Research Report

1. Schedule: Every Monday 8 AM
2. Apify: Run Pain Point Discovery (week range)
3. Filter: minScore >= 50
4. Aggregator: Group by category
5. OpenAI: Generate summary report per category
6. Gmail: Email report to team
7. Slack: Post top 5 pain points to #product channel

Workflow 3: Real-time Alert System

1. Webhook: Listen for new Actor run completion
2. Iterator: Loop through pain points
3. Filter: urgency = "critical" OR score >= 80
4. Router:
- Path A: Slack notification to #urgent
- Path B: Create Notion page
- Path C: Add to Airtable "Hot Leads"

Make.com Module Configuration

Apify Module Settings:

Connection: [Your Apify API Token]
Actor: your-username/pain-point-discovery
Build: latest
Input:
  sources: ["reddit", "hackernews"]
  keywords: ["{{trigger.keywords}}"]
  minScore: {{trigger.minScore}}
  maxPosts: 50
Wait for completion: Yes
Timeout: 600 seconds

Example Filter (High-Value Pain Points):

Condition 1: score >= 60
Condition 2: urgency equals "high" OR "critical"
Condition 3: signal_business_value equals "high-value"
Operator: AND
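
The same filter can be written as a plain predicate, e.g. for a code step in Make.com or n8n. The field names here are assumptions based on this Actor's output and should be checked against your dataset:

```javascript
// The "High-Value Pain Points" filter above as a plain predicate.
// Field names (score, urgency, signal_business_value) are assumed.
function isHighValue(p) {
  return (
    p.score >= 60 &&
    ['high', 'critical'].includes(p.urgency) &&
    p.signal_business_value === 'high-value'
  );
}

console.log(isHighValue({ score: 75, urgency: 'critical', signal_business_value: 'high-value' })); // true
console.log(isHighValue({ score: 75, urgency: 'low', signal_business_value: 'high-value' }));      // false
```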

🔄 Automation with n8n

Setting Up n8n Integration

Step 1: Install n8n

npm install -g n8n
# or
docker run -it --rm --name n8n -p 5678:5678 n8nio/n8n

Step 2: Add Apify Credentials

  1. Go to n8n → Credentials
  2. Add new credential: Apify API
  3. Enter API Token from https://console.apify.com/account/integrations

Step 3: Create Workflow

  1. Add Apify node
  2. Operation: "Run Actor"
  3. Actor: your-username/pain-point-discovery

Example n8n Workflows

Workflow 1: Automated Content Pipeline

{
  "nodes": [
    {
      "name": "Schedule",
      "type": "n8n-nodes-base.cron",
      "parameters": {
        "triggerTimes": {
          "item": [{ "mode": "everyDay", "hour": 9 }]
        }
      }
    },
    {
      "name": "Run Pain Point Discovery",
      "type": "n8n-nodes-base.apify",
      "parameters": {
        "operation": "runActor",
        "actorId": "your-username/pain-point-discovery",
        "input": {
          "sources": ["reddit", "hackernews"],
          "keywords": ["marketing struggling", "sales process tedious"],
          "minScore": 30
        }
      }
    },
    {
      "name": "Filter High Priority",
      "type": "n8n-nodes-base.if",
      "parameters": {
        "conditions": {
          "number": [
            { "value1": "={{$json.score}}", "operation": "larger", "value2": 50 }
          ]
        }
      }
    },
    {
      "name": "Generate Content Ideas with OpenAI",
      "type": "n8n-nodes-base.openAi",
      "parameters": {
        "operation": "text",
        "model": "gpt-4",
        "prompt": "Create 3 blog post ideas based on this pain point: {{$json.title}}\n\nContext: {{$json.content}}"
      }
    },
    {
      "name": "Save to Notion",
      "type": "n8n-nodes-base.notion",
      "parameters": {
        "operation": "create",
        "resource": "databasePage",
        "databaseId": "YOUR_DATABASE_ID",
        "properties": {
          "Title": "={{$json.title}}",
          "Pain Point URL": "={{$json.url}}",
          "Content Ideas": "={{$node[\"Generate Content Ideas with OpenAI\"].json.choices[0].text}}",
          "Score": "={{$json.score}}",
          "Category": "={{$json.category}}"
        }
      }
    }
  ]
}

Workflow 2: Competitor Monitoring

Webhook → Trigger: daily or on-demand
  ↓
Set Variables → competitor_names = ["notion", "salesforce"]
  ↓
Function → Generate keywords: "{competitor} frustrated", etc.
  ↓
Apify: Run Pain Point Discovery → with competitor keywords
  ↓
Code: Analyze → Count by competitor, urgency, sentiment
  ↓
HTTP: Send to Webhook → POST to your analytics dashboard
  ↓
Slack: Alert → Notify if critical competitor issues detected

n8n Code Node Example (Data Processing):

// Process pain points and group by category
const items = $input.all();
const grouped = {};

items.forEach(item => {
  const category = item.json.category || 'Uncategorized';
  if (!grouped[category]) {
    grouped[category] = { count: 0, avgScore: 0, painPoints: [] };
  }
  grouped[category].count++;
  grouped[category].avgScore += item.json.score;
  grouped[category].painPoints.push({
    title: item.json.title,
    url: item.json.url,
    score: item.json.score
  });
});

// Calculate averages
Object.keys(grouped).forEach(cat => {
  grouped[cat].avgScore = Math.round(grouped[cat].avgScore / grouped[cat].count);
});

return [{ json: grouped }];

Workflow 3: CRM Auto-Enrichment

1. Schedule: Every 3 hours
2. Apify: Run Pain Point Discovery (recent posts only)
3. Filter: Contains author username
4. HTTP Request: Check if author exists in CRM
5. IF: Author NOT in CRM
- Apify: Run LinkedIn Scraper for profile
- HTTP Request: POST to CRM create contact
- Set Fields: pain_point = title, pain_score = score
6. IF: Author IN CRM
- HTTP Request: PATCH update contact
- Add: pain_points array, update last_activity
7. Slack: Daily summary of new leads

n8n Best Practices

  1. Use Webhooks for Real-time: Trigger workflows when Actor completes

    Apify Webhook URL: https://your-n8n.com/webhook/pain-points
  2. Error Handling: Add error workflow

    {
      "name": "Error Handler",
      "type": "n8n-nodes-base.errorTrigger",
      "parameters": {},
      "continueOnFail": false
    }
  3. Rate Limiting: Use Wait nodes between API calls

    Wait: 2 seconds (between Apify and external API calls)
  4. Data Transformation: Use Function nodes for complex logic

    // Deduplicate by URL
    const seen = new Set();
    return $input.all().filter(item => {
      if (seen.has(item.json.url)) return false;
      seen.add(item.json.url);
      return true;
    });

🔔 Webhook & Scheduling

Apify Webhooks

Setup Webhook Trigger:

// When creating an Actor run via API
const run = await client.actor('your-username/pain-point-discovery').call(input, {
  webhooks: [
    {
      eventTypes: ['ACTOR.RUN.SUCCEEDED'],
      requestUrl: 'https://your-webhook-url.com/pain-points',
      payloadTemplate: '{"runId": {{runId}}, "datasetId": {{defaultDatasetId}}}'
    }
  ]
});

Webhook Payload Example:

{
  "eventType": "ACTOR.RUN.SUCCEEDED",
  "eventData": {
    "actorId": "abc123",
    "runId": "xyz789",
    "defaultDatasetId": "dataset123"
  },
  "resource": {
    "id": "xyz789",
    "status": "SUCCEEDED",
    "stats": {
      "itemCount": 45
    }
  }
}
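
On the receiving end, a handler mostly needs to pull the dataset ID out of this payload and then fetch the items (e.g. with apify-client's `client.dataset(id).listItems()`). A minimal sketch; the success guard and null fallback are our own additions:

```javascript
// Extract the dataset ID from the webhook payload shown above.
// With it you can fetch results via apify-client:
//   new ApifyClient({ token }).dataset(datasetId).listItems()
function extractDatasetId(payload) {
  if (payload.eventType !== 'ACTOR.RUN.SUCCEEDED') return null;
  return payload.eventData?.defaultDatasetId ?? null;
}

const payload = {
  eventType: 'ACTOR.RUN.SUCCEEDED',
  eventData: { actorId: 'abc123', runId: 'xyz789', defaultDatasetId: 'dataset123' },
  resource: { id: 'xyz789', status: 'SUCCEEDED', stats: { itemCount: 45 } },
};
console.log(extractDatasetId(payload)); // dataset123
```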

Schedule Recurring Runs

Via Apify Console:

  1. Go to Actor → Schedules
  2. Create new schedule
  3. Set cron expression: 0 9 * * 1 (Every Monday 9 AM)
  4. Configure input

Via API:

await client.schedules().create({
  name: 'Weekly Pain Point Scan',
  actorId: 'your-username/pain-point-discovery',
  cronExpression: '0 9 * * 1',
  input: {
    sources: ['reddit', 'hackernews'],
    keywords: ['your', 'keywords'],
    minScore: 30
  }
});

Common Cron Schedules:

0 9 * * * Daily at 9 AM
0 9 * * 1 Every Monday at 9 AM
0 */6 * * * Every 6 hours
0 9 1 * * First day of month at 9 AM

📦 Technical Setup


# Install dependencies
npm install
# Run locally
apify run
# Run with custom input
apify run --input-file .actor/INPUT_TEST_QUICK.json
# Deploy to Apify
apify push

Happy Pain Point Hunting! 🎯