
Dataset Result Gate

Pricing: Pay per usage


Conditional pipeline gate. Fails if the previous actor's dataset is empty, succeeds if it has results — stopping unnecessary downstream runs before they start.


Developer: Vít Tuhý (maintained by Community)

Actor stats: 1 total user · 0 monthly active users · 0 bookmarks · last modified 2 days ago


Stop downstream actors from running when there's nothing to process. This actor checks whether the previous actor's output dataset contains any results and either passes or blocks the pipeline accordingly, at the cost of only a few seconds of compute per execution.


Example use case: monitor a social media profile and only scrape if there's a new post

Say you run a daily job that checks a competitor's Instagram for new posts. If they posted something, you want to enrich it — pull comments, run sentiment analysis, store it. If not, you want nothing to happen.

Without a gate, your enrichment actors would run every day regardless, wasting compute and money.

With Dataset Result Gate:

Instagram Profile Scraper (daily schedule)
└─ on SUCCESS → Dataset Result Gate
   ├─ 0 new posts → STOP. Nothing else runs.
   └─ 1+ new posts → Comment Scraper starts → Sentiment Actor starts → ...

This pattern works for any scenario where a scraper may return empty results:

  • Price monitoring — only notify if prices changed
  • News scraping — only process if new articles appeared
  • Lead generation — only enrich if new profiles were found
  • Job listing monitoring — only alert if new positions are posted

How it works

  1. Receives the output dataset ID from the previous actor run
  2. Checks the clean item count (actual results, excluding error records)
  3. 0 clean items → run ends as FAILED → downstream integrations don't fire
  4. 1+ clean items → run ends as SUCCEEDED → downstream integrations fire normally

The actor uses cleanItemCount — the same number shown in the "Clean" column in Apify Console — so what you see in the UI matches what the gate acts on.
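The check described above can be sketched with the apify-client Python package. This is a minimal illustration, not the actor's actual source; the helper names are mine:

```python
import sys


def is_gate_open(clean_item_count: int) -> bool:
    """Gate decision: open (pass) only when the dataset holds clean items."""
    return clean_item_count > 0


def check_gate(token: str, dataset_id: str) -> None:
    """Fetch dataset metadata and fail the run when there are no clean items."""
    from apify_client import ApifyClient  # pip install apify-client

    dataset = ApifyClient(token).dataset(dataset_id).get()
    if dataset is None:
        sys.exit("Dataset not found: " + dataset_id)
    # cleanItemCount is the number shown in the "Clean" column in Console
    if not is_gate_open(dataset.get("cleanItemCount", 0)):
        sys.exit(1)  # run ends FAILED -> downstream integrations don't fire
```

Exiting with a non-zero code is what marks the run FAILED, which in turn keeps any "Actor run succeeded" integrations from firing.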


Setup

You need two integrations in Apify Console. No code changes required.

Step 1 — Connect your source actor to the gate

  1. Open your source actor (e.g. Instagram Profile Scraper) in Console
  2. Go to the Integrations tab
  3. Add integration:
    • Trigger: Actor run succeeded
    • Action: Start actor → select Dataset Result Gate
  4. In the input configuration, add:
{
  "datasetId": "{{resource.defaultDatasetId}}"
}

{{resource.defaultDatasetId}} is an Apify template variable — it's automatically replaced with the source actor's output dataset ID at runtime. You don't need to fill it manually.
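For a quick manual test outside of a Console integration, the gate can also be started directly with the apify-client Python package. A sketch, where the actor reference and dataset ID are placeholders you'd substitute:

```python
def build_gate_input(dataset_id: str) -> dict:
    # The same shape the Console integration produces once
    # {{resource.defaultDatasetId}} has been resolved at runtime.
    return {"datasetId": dataset_id}


def run_gate(token: str, actor_ref: str, dataset_id: str) -> str:
    """Start the gate synchronously and return the finished run's status."""
    from apify_client import ApifyClient  # pip install apify-client

    client = ApifyClient(token)
    run = client.actor(actor_ref).call(run_input=build_gate_input(dataset_id))
    return (run or {}).get("status", "UNKNOWN")  # "SUCCEEDED" or "FAILED"
```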

Step 2 — Connect the gate to your target actor

  1. Open Dataset Result Gate in Console
  2. Go to the Integrations tab
  3. Add integration:
    • Trigger: Actor run succeeded
    • Action: Start actor → select your target actor
  4. Configure the target actor's input as needed

That's it. The gate handles the conditional logic.
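Under the hood, the two integrations amount to a chain like the following sketch. Real pipelines run through Console integrations, not a script, and the gate and target actor references here are placeholders:

```python
def should_start_target(gate_run_status: str) -> bool:
    """An "Actor run succeeded" trigger fires only for SUCCEEDED runs."""
    return gate_run_status == "SUCCEEDED"


def run_pipeline(token: str) -> None:
    from apify_client import ApifyClient  # pip install apify-client

    client = ApifyClient(token)

    scraper_run = client.actor("apify/instagram-profile-scraper").call()
    if not scraper_run or scraper_run["status"] != "SUCCEEDED":
        return  # the first "Actor run succeeded" trigger would not fire

    gate_run = client.actor("<username>/dataset-result-gate").call(
        run_input={"datasetId": scraper_run["defaultDatasetId"]}
    )
    if gate_run and should_start_target(gate_run["status"]):
        client.actor("<username>/comment-scraper").call()  # placeholder target
```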


Input

  • datasetId (String, required): ID of the dataset to check. Pass {{resource.defaultDatasetId}} when invoking via an Apify Console integration.

Why not use n8n (or Zapier, Make) for this?

You can build conditional logic in n8n or Make — but every execution counts against your task/operation quota. If your scraper runs 10× a day and you're checking results each time, that's 300 operations a month just for the gate logic, before any actual work happens.

Dataset Result Gate runs entirely on Apify, billed only by compute time (which for a dataset metadata check is a few seconds at most). If you're already running actors on Apify, keeping the conditional logic inside the platform is simpler and cheaper than routing through an external automation tool.
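The quota arithmetic above, spelled out (assuming a 30-day month and one gate check per scraper run):

```python
runs_per_day = 10
days_per_month = 30

# In an external automation tool, every gate check counts
# against the task/operation quota.
gate_ops_per_month = runs_per_day * days_per_month

print(gate_ops_per_month)  # 300
```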


Notes

  • The actor checks cleanItemCount, not the total item count. If your source actor produces error records alongside real results, those won't inflate the count.
  • The actor has no output of its own — it exists purely to pass or block the pipeline.
  • Works with any source actor, not just scrapers. Any actor that writes to a dataset can be gated.