Decide which scraped entries should be included in the output by using a custom function.
If not set, all scraped entries are included.
The filter is applied after outputPickFields, outputRenameFields, and outputTransform, so `entry` already reflects those steps.
Example:
/**
* Inputs:
* `entry` - Scraped entry.
* `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
* `ctx.input` - The input object that was passed to this Actor.
* `ctx.state` - An object you can use to persist state across all your custom functions.
* `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
* See https://crawlee.dev/docs/guides/got-scraping
* `ctx.itemCacheKey` - A function you can use to get a cache ID for the current `entry`.
*                      It takes the entry itself and a list of properties to use for hashing.
*                      By default, you should pass `input.cachePrimaryKeys` to it.
*
*/
// async (entry, { io, input, state, sendRequest, itemCacheKey }) => {
// // Example: Keep only entries that have at least 5 images
// return entry.images.length >= 5;
// };
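Another sketch, this time using `ctx.state` and `ctx.itemCacheKey` to drop duplicate entries within a run. The exact return value of `itemCacheKey` is assumed from the description above, so a simple JSON-based stub stands in for it here to keep the sketch self-contained:

```javascript
// Stub for ctx.itemCacheKey, assumed to hash the given properties of the entry.
// In the real filter function, use the `itemCacheKey` passed in via the context.
const itemCacheKey = (entry, props) => JSON.stringify(props.map((p) => entry[p]));

// Sketch: keep only the first occurrence of each entry, keyed by cachePrimaryKeys.
const filterEntry = async (entry, { input, state, itemCacheKey }) => {
  state.seenKeys ??= new Set(); // persists across calls via ctx.state
  const key = itemCacheKey(entry, input.cachePrimaryKeys);
  if (state.seenKeys.has(key)) return false; // duplicate, drop it
  state.seenKeys.add(key);
  return true; // first time seen, keep it
};
```

The same `state` object is shared across all custom functions, so later hooks can also inspect `state.seenKeys` if needed.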