Sort Dataset Items
Pricing
Pay per usage
Add this actor as a webhook to your scraper to sort the dataset by the index field
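The actor accepts input in two shapes: when triggered by a webhook it receives a resource object whose defaultDatasetId points at the finished run's dataset, and when run manually it expects an explicit datasetId. A minimal sketch of both shapes and the resolution logic, where the event type and dataset ID values are illustrative placeholders:

```javascript
// Sketch of the two input shapes this actor accepts. Only
// `resource.defaultDatasetId` (webhook runs) or `datasetId` (manual runs)
// is actually read; the other fields shown here are illustrative.
const webhookInput = {
    eventType: 'ACTOR.RUN.SUCCEEDED', // assumed webhook event type
    resource: {
        defaultDatasetId: 'someDatasetId', // placeholder ID
    },
};

const manualInput = {
    datasetId: 'someDatasetId', // placeholder ID
};

// The same resolution logic main.js uses: prefer the webhook resource,
// otherwise fall back to the manually supplied datasetId.
const datasetId = webhookInput.resource
    ? webhookInput.resource.defaultDatasetId
    : webhookInput.datasetId;
console.log(datasetId);
```

Either way, the actor ends up with a single dataset ID to pass on for sorting.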
Total users: 10
Monthly users: 2
Runs succeeded: >99%
Last modified: 2 years ago
.editorconfig
root = true
[*]
indent_style = space
indent_size = 4
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
end_of_line = lf
.eslintrc
{
    "extends": "@apify"
}
.gitignore
# This file tells Git which files shouldn't be added to source control
.idea
node_modules
Dockerfile
# First, specify the base Docker image. You can read more about
# the available images at https://sdk.apify.com/docs/guides/docker-images
# You can also use any other image from Docker Hub.
FROM apify/actor-node:16

# Second, copy just package.json and package-lock.json since they should be
# the only files that affect "npm install" in the next step, to speed up the build.
COPY package*.json ./

# Install NPM packages, skip optional and development dependencies to
# keep the image small. Avoid logging too much and print the dependency
# tree for debugging.
RUN npm --quiet set progress=false \
    && npm install --only=prod --no-optional \
    && echo "Installed NPM packages:" \
    && (npm list --only=prod --no-optional --all || true) \
    && echo "Node.js version:" \
    && node --version \
    && echo "NPM version:" \
    && npm --version

# Next, copy the remaining files and directories with the source code.
# Since we do this after NPM install, quick builds will be really fast
# for most source file changes.
COPY . ./

# Optionally, specify how to launch the source code of your actor.
# By default, Apify's base Docker images define the CMD instruction
# that runs the Node.js source code using the command specified
# in the "scripts.start" section of the package.json file.
# In short, the instruction looks something like this:
#
# CMD npm start
INPUT_SCHEMA.json
{
    "title": "Input schema for the apify_project actor.",
    "type": "object",
    "schemaVersion": 1,
    "properties": {
        "datasetId": {
            "title": "Dataset Id",
            "type": "string",
            "description": "Dataset Id of the dataset to sort. You don't need to provide this if you use a webhook.",
            "editor": "textfield"
        }
    },
    "required": []
}
apify.json
{
    "env": { "npm_config_loglevel": "silent" }
}
main.js
const Apify = require('apify');

const transformFunction = (items) => {
    items.sort((a, b) => {
        return a.index - b.index;
    });
    return items;
};

Apify.main(async () => {
    // Get input of the actor (here only for demonstration purposes).
    const input = await Apify.getInput();
    console.log('Input:');
    console.dir(input);

    let {
        // either called from webhook or directly
        resource,
        datasetId,
    } = input;

    if (resource) {
        datasetId = resource.defaultDatasetId;
    }

    const actorInput = {
        datasetIds: [datasetId],
        preDedupTransformFunction: transformFunction,
    };

    await Apify.metamorph(
        'lukaskrivka/dedup-datasets',
        actorInput,
    );
});
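The metamorph call hands the dataset over to lukaskrivka/dedup-datasets along with preDedupTransformFunction, so the actual ordering comes down to the numeric comparator in transformFunction. A standalone sketch of that comparator on made-up dataset items (the url values are placeholders):

```javascript
// Standalone sketch of the sort used in main.js, applied to made-up items.
const transformFunction = (items) => {
    items.sort((a, b) => a.index - b.index);
    return items;
};

const items = [
    { index: 2, url: 'https://example.com/b' }, // placeholder items
    { index: 0, url: 'https://example.com/c' },
    { index: 1, url: 'https://example.com/a' },
];

console.log(transformFunction(items).map((item) => item.index)); // [ 0, 1, 2 ]
```

Note that Array.prototype.sort mutates the array in place, and the numeric comparator matters: the default sort would compare index values as strings, so 10 would land before 2.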
package.json
{
    "name": "project-empty",
    "version": "0.0.1",
    "description": "This is a boilerplate of an Apify actor.",
    "dependencies": {
        "apify": "^2.0.7"
    },
    "devDependencies": {
        "@apify/eslint-config": "^0.1.3",
        "eslint": "^7.0.0"
    },
    "scripts": {
        "start": "node main.js",
        "lint": "./node_modules/.bin/eslint ./src --ext .js,.jsx",
        "lint:fix": "./node_modules/.bin/eslint ./src --ext .js,.jsx --fix",
        "test": "echo \"Error: oops, the actor has no tests yet, sad!\" && exit 1"
    },
    "author": "It's not you it's me",
    "license": "ISC"
}