🚨 Reddit Keyword Monitor & Alerts
Track specific keywords and subreddits for net-new posts and comments, pushing instant social listening alerts directly to Slack or Discord webhooks.
Pricing: Pay per event
Rating: 0.0 (0 reviews)
Developer: Taro Yamada (太郎 山田)
Actor stats: 0 bookmarked · 3 total users · 1 monthly active user · last modified 13 days ago
🚨 Reddit Keyword Monitor Alerts
Track real-time brand mentions, monitor competitor discussions, and extract highly relevant leads directly from the front page of the internet. The Reddit Keyword Monitor is engineered specifically for continuous, recurring data extraction without the noise of duplicate results. By leveraging stateful diffing and snapshot keys, this tool establishes a baseline during its initial run and exclusively scrapes net-new posts and comments on all subsequent scheduled runs.

Social media managers, marketers, and lead generation teams use this tool to build automated social listening pipelines. Instead of manually searching Reddit or running redundant web scrapers that pull the same threads repeatedly, you can route fresh subreddit activity and specific keyword matches directly to your preferred channels. Configure target query routes, specify subreddits, or combine search filters to pinpoint exact discussions. The extracted data includes rich post details like author profiles, upvote counts, thread URLs, and full comment text.

Easily schedule the monitor to run daily or hourly, routing the scraped output to Slack, Discord, or custom webhooks. Whether you need to catch customer complaints early, identify trending topics within niche communities, or scrape high-intent discussions for sales outreach, this monitor delivers reliable, deduplicated results directly to your operational workflow.
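The baseline-then-diff behavior described above can be sketched in a few lines. This is an illustrative model of the idea, not the actor's internal code; the function and field names (`diff_run`, `id`) are hypothetical.

```python
def diff_run(snapshot: set[str], observed: list[dict], emit_on_first_run: bool = False):
    """Return (alerts, updated_snapshot) for one monitoring run."""
    first_run = len(snapshot) == 0
    new_items = [item for item in observed if item["id"] not in snapshot]
    updated = snapshot | {item["id"] for item in observed}
    if first_run and not emit_on_first_run:
        # Baseline run: record everything, alert on nothing.
        return [], updated
    return new_items, updated

# Run 1 establishes the baseline (0 alerts); Run 2 emits only net-new items.
snap: set[str] = set()
alerts1, snap = diff_run(snap, [{"id": "t3_a"}, {"id": "t3_b"}])
alerts2, snap = diff_run(snap, [{"id": "t3_b"}, {"id": "t3_c"}])
print(len(alerts1), [a["id"] for a in alerts2])
```

Run 1 returns an empty alert list while populating the snapshot; Run 2 alerts only on the previously unseen `t3_c`.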
Store Quickstart
- Start with `store-input.example.json` or Quickstart — Baseline Run (Run 1).
- Then use the upgrade ladder from `store-input.templates.json`:
  - Quickstart — Baseline Run (Run 1)
  - Recurring Proof — Net-new Alerts (Run 2+)
  - Webhook Handoff — Routed Alert Streams
- `sample-output.example.json` shows the stronger Run 2+ alert rows, and `live-proof.example.json` documents the baseline → recurring proof path.
Key Features
- 🔎 Route-based monitoring — keywords, subreddit streams, query routes, and combined filters
- ♻️ Stateful diffing — suppresses previously seen items using `snapshotKey`
- 💬 Comment-aware alerts — monitors comments where public Reddit endpoints support it
- 📡 Webhook-first operations — deliver to Slack/Discord/incident pipelines
- 🤝 Pack handoff ready — use the all-in-one actor for research/backfill, then return here for recurring alerts
Use Cases
| Who | Why |
|---|---|
| GTM teams | Recurring brand + competitor mention alerts |
| Community managers | Catch new subreddit risk threads quickly |
| Product teams | Track fresh pain points and feature requests |
| RevOps/Automation teams | Trigger webhook workflows only on net-new events |
Input
| Field | Type | Default | Description |
|---|---|---|---|
| keywords | array | — | Plain-text keywords for global Reddit monitoring. Used for post search plus recent comment scanning. |
| searchQueries | array | — | Reddit search queries to monitor for new posts. Query-only routes are post-only because public Reddit JSON does not expose direct comment search. |
| subreddits | array | — | Subreddits to monitor for new posts and comments (example: javascript). |
| routes | array | — | Optional JSON objects for combined subreddit+keyword or subreddit+query routing. |
| monitorComments | boolean | true | When enabled, scan recent comment streams where public endpoints support it. |
| postLimit | integer | 25 | How many recent posts to inspect per route. |
| commentLimit | integer | 50 | How many recent comments to inspect per route. |
| sort | string | "new" | Sort used for post endpoints. For recurring monitoring, "new" is usually best. |
Input Example
```json
{
  "monitorComments": true,
  "postLimit": 25,
  "commentLimit": 50,
  "sort": "new",
  "time": "day",
  "timeoutMs": 15000,
  "delayMs": 1200,
  "snapshotKey": "reddit-keyword-monitor-snapshots",
  "maxSnapshotItems": 5000,
  "emitOnFirstRun": false,
  "delivery": "dataset",
  "dryRun": false
}
```
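The example above omits the targeting fields. The following hedged illustration combines `keywords`, `subreddits`, and a route object; the field names come from the Input table, but the shape of the route object (`subreddit` + `keyword` keys) and all values are assumptions, so check the actor's input schema before copying.

```json
{
  "keywords": ["acme crm"],
  "subreddits": ["sales", "startups"],
  "routes": [
    { "subreddit": "sales", "keyword": "acme crm" }
  ],
  "monitorComments": true,
  "sort": "new",
  "snapshotKey": "acme-brand-monitor"
}
```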
Output
| Field | Type | Description |
|---|---|---|
| meta | object | Run summary: snapshot key, first-run flag, route/item/alert counts, and delivery mode. |
| alerts | array | Net-new posts and comments detected on this run. |
| errors | array | Per-route errors encountered during the run. |
Output Example
```json
{
  "meta": {
    "generatedAt": "2026-04-10T17:06:25.060Z",
    "snapshotKey": "reddit-keyword-monitor-snapshots",
    "firstRun": true,
    "emitOnFirstRun": false,
    "routeCount": 7,
    "observedItems": 188,
    "alertCount": 0,
    "errorCount": 0,
    "blockedCount": 0,
    "suppressedOnFirstRun": 188,
    "notes": [],
    "delivery": "dataset"
  },
  "alerts": [],
  "errors": []
}
```
API Usage
Run this actor programmatically using the Apify API. Replace YOUR_API_TOKEN with your token from Apify Console → Settings → Integrations.
cURL
```shell
curl -X POST "https://api.apify.com/v2/acts/taroyamada~reddit-keyword-monitor-alerts/run-sync-get-dataset-items?token=YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "monitorComments": true,
    "postLimit": 25,
    "commentLimit": 50,
    "sort": "new",
    "time": "day",
    "timeoutMs": 15000,
    "delayMs": 1200,
    "snapshotKey": "reddit-keyword-monitor-snapshots",
    "maxSnapshotItems": 5000,
    "emitOnFirstRun": false,
    "delivery": "dataset",
    "dryRun": false
  }'
```
Python
```python
from apify_client import ApifyClient

client = ApifyClient("YOUR_API_TOKEN")

# Note: Python uses True/False, not the JSON literals true/false.
run = client.actor("taroyamada/reddit-keyword-monitor-alerts").call(run_input={
    "monitorComments": True,
    "postLimit": 25,
    "commentLimit": 50,
    "sort": "new",
    "time": "day",
    "timeoutMs": 15000,
    "delayMs": 1200,
    "snapshotKey": "reddit-keyword-monitor-snapshots",
    "maxSnapshotItems": 5000,
    "emitOnFirstRun": False,
    "delivery": "dataset",
    "dryRun": False,
})

for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item)
```
JavaScript / Node.js
```javascript
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'YOUR_API_TOKEN' });

const run = await client.actor('taroyamada/reddit-keyword-monitor-alerts').call({
    monitorComments: true,
    postLimit: 25,
    commentLimit: 50,
    sort: 'new',
    time: 'day',
    timeoutMs: 15000,
    delayMs: 1200,
    snapshotKey: 'reddit-keyword-monitor-snapshots',
    maxSnapshotItems: 5000,
    emitOnFirstRun: false,
    delivery: 'dataset',
    dryRun: false,
});

const { items } = await client.dataset(run.defaultDatasetId).listItems();
console.log(items);
```
Tips & Limitations
- First run with
emitOnFirstRun: falseis expected to emit0alerts while creating the baseline. - Validate second-run behavior using the same
snapshotKeybefore enabling production schedules. - Use
delivery: "both"to keep dataset auditability while also sending real-time webhook alerts. - For historical backfill and broader source coverage, run reddit-all-in-one-scraper and keep this actor for recurring alerting.
FAQ
Why did my first run return 0 alerts?
If emitOnFirstRun is false, the first successful run creates the baseline snapshot and suppresses alerts by design. The second run proves recurring monitoring behavior by emitting only net-new items.
How do I prove setup worked before production?
Run the same input twice with the same snapshotKey. Confirm meta.firstRun becomes false and alertCount reflects only new activity.
Does it deduplicate across runs?
Yes. Snapshot state is stored under snapshotKey, so previously seen items are skipped unless you reset the key or change the route mix.
Can query-only routes monitor comments too?
No. Public Reddit JSON exposes post search, but not direct comment search for query-only routes. Use subreddit + keyword routes when comment alerts matter.
Can I send alerts straight to Slack, Discord, or my database?
Yes. Use delivery: "webhook" or delivery: "both" and forward the payload to your own router, Slack app, Discord bot, or warehouse ingest endpoint.
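As a sketch of the routing step, the snippet below shapes one alert row into a Slack incoming-webhook payload. The alert field names (`subreddit`, `title`, `permalink`) are illustrative assumptions, not the actor's documented alert schema; adapt them to the fields you see in your own dataset output.

```python
import json

def to_slack_payload(alert: dict) -> dict:
    # Slack incoming webhooks accept a minimal {"text": "..."} body.
    text = (
        f"New Reddit mention in r/{alert['subreddit']}: "
        f"{alert['title']} — {alert['permalink']}"
    )
    return {"text": text}

# Hypothetical alert row for illustration.
alert = {
    "subreddit": "startups",
    "title": "Anyone tried Acme CRM?",
    "permalink": "https://reddit.com/r/startups/comments/abc123",
}
payload = json.dumps(to_slack_payload(alert))
print(payload)
```

Your router would POST `payload` to the Slack webhook URL (or reshape it for Discord or a warehouse ingest endpoint).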
Related Actors
Reddit Intelligence Pack workflow:
- 📡 Reddit All-in-One Scraper — Research/backfill companion for subreddit, search, user, and URL pulls.
- 📰 Article Extractor — Extract clean text from links found in Reddit posts.
- 💬 Reddit Scraper (Legacy) — Proxy-sensitive fallback and migration path, not the primary monitoring actor.
Cost
Pay Per Event:
- apify-actor-start: $0.00005 (default start event)
- apify-default-dataset-item: $0.003 per post, comment, error, or no-results item

Example: 1,000 items = $0.00005 + (1,000 × $0.003) ≈ $3.00
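Using the per-item rate listed above, a run's cost can be sketched as:

```python
# Worked example of the pay-per-event pricing: one start event
# plus a flat per-item charge for each dataset item.
ACTOR_START = 0.00005  # charged once per run
PER_ITEM = 0.003       # per post, comment, error, or no-results item

def run_cost(items: int) -> float:
    return ACTOR_START + items * PER_ITEM

print(round(run_cost(1000), 2))  # cost for 1,000 items
```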
No subscription required — you only pay for what you use.
⭐ Was this helpful?
If this actor saved you time, please leave a ★ rating on Apify Store. It takes 10 seconds, helps other developers discover it, and keeps updates free.
Bug report or feature request? Open an issue on the Issues tab of this actor.