Sort Dataset Items

Developed by Lukáš Křivka. Maintained by Community.

Add this actor as a webhook to your scraper to sort the dataset by the index field.

Rating: 0.0 (0)
Pricing: Pay per usage
Total users: 10
Monthly users: 2
Runs succeeded: >99%
Last modified: 2 years ago
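
How the webhook hookup typically looks: the sorter is attached to the scraper so that it runs automatically once the scraper's run succeeds. Below is a minimal sketch using the Apify SDK v2 from inside the scraper; the actor ID lukaskrivka~sort-dataset-items is an assumed placeholder for this actor, and APIFY_TOKEN is assumed to be available in the run's environment.

const Apify = require('apify');

Apify.main(async () => {
    // Register a webhook so the platform starts the sorting actor when this
    // scraper run succeeds. This only takes effect when running on the Apify platform.
    await Apify.addWebhook({
        eventTypes: ['ACTOR.RUN.SUCCEEDED'],
        // "Run actor" API endpoint of the sorter; the default webhook payload
        // passes this run's object under `resource`, which carries defaultDatasetId.
        requestUrl: `https://api.apify.com/v2/acts/lukaskrivka~sort-dataset-items/runs?token=${process.env.APIFY_TOKEN}`,
    });

    // ...the scraper then pushes items that include an `index` field, e.g.:
    // await Apify.pushData({ index: 0, url: 'https://example.com' });
});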

.editorconfig

root = true
[*]
indent_style = space
indent_size = 4
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
end_of_line = lf

.eslintrc

{
    "extends": "@apify"
}

.gitignore

# This file tells Git which files shouldn't be added to source control
.idea
node_modules

Dockerfile

# First, specify the base Docker image. You can read more about
# the available images at https://sdk.apify.com/docs/guides/docker-images
# You can also use any other image from Docker Hub.
FROM apify/actor-node:16
# Second, copy just package.json and package-lock.json since they should be
# the only files that affect "npm install" in the next step, to speed up the build
COPY package*.json ./
# Install NPM packages, skip optional and development dependencies to
# keep the image small. Avoid logging too much and print the dependency
# tree for debugging
RUN npm --quiet set progress=false \
&& npm install --only=prod --no-optional \
&& echo "Installed NPM packages:" \
&& (npm list --only=prod --no-optional --all || true) \
&& echo "Node.js version:" \
&& node --version \
&& echo "NPM version:" \
&& npm --version
# Next, copy the remaining files and directories with the source code.
# Since we do this after NPM install, quick build will be really fast
# for most source file changes.
COPY . ./
# Optionally, specify how to launch the source code of your actor.
# By default, Apify's base Docker images define the CMD instruction
# that runs the Node.js source code using the command specified
# in the "scripts.start" section of the package.json file.
# In short, the instruction looks something like this:
#
# CMD npm start

INPUT_SCHEMA.json

{
    "title": "Input schema for the apify_project actor.",
    "type": "object",
    "schemaVersion": 1,
    "properties": {
        "datasetId": {
            "title": "Dataset ID",
            "type": "string",
            "description": "ID of the dataset to sort. You don't need to provide this if the actor is called from a webhook.",
            "editor": "textfield"
        }
    },
    "required": []
}
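
For a direct run without a webhook, the only input is the dataset ID. A minimal sketch with the apify-client package; the actor ID lukaskrivka/sort-dataset-items and the dataset ID are placeholders, and the token is assumed to be in the environment.

const { ApifyClient } = require('apify-client');

const client = new ApifyClient({ token: process.env.APIFY_TOKEN });

(async () => {
    // Start the sorting actor with an explicit dataset ID and wait for it to finish.
    const run = await client.actor('lukaskrivka/sort-dataset-items').call({
        datasetId: 'YOUR_DATASET_ID',
    });
    console.log(`Run ${run.id} finished with status ${run.status}`);
})();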

apify.json

{
    "env": { "npm_config_loglevel": "silent" }
}

main.js

const Apify = require('apify');

// Sorts dataset items in ascending order by their numeric `index` field.
const transformFunction = (items) => {
    items.sort((a, b) => a.index - b.index);
    return items;
};

Apify.main(async () => {
    // Get the input of the actor.
    const input = await Apify.getInput();
    console.log('Input:');
    console.dir(input);

    let {
        // The actor is either called from a webhook (`resource`) or directly (`datasetId`).
        resource,
        datasetId,
    } = input;

    // When triggered by a webhook, the payload contains the originating run
    // object under `resource`, so use that run's default dataset.
    if (resource) {
        datasetId = resource.defaultDatasetId;
    }

    const actorInput = {
        datasetIds: [datasetId],
        // Actor input is serialized to JSON, so the transform function has to be
        // passed as a string rather than as a function object.
        preDedupTransformFunction: transformFunction.toString(),
    };

    // Hand the run over to the dedup-datasets actor, which loads the dataset,
    // applies the transform function (here: sorting), and outputs the result.
    await Apify.metamorph(
        'lukaskrivka/dedup-datasets',
        actorInput,
    );
});
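
For reference, main.js accepts two input shapes: a plain datasetId when the actor is started directly, or the resource object that the default webhook payload includes, which carries the triggering run's defaultDatasetId. A trimmed sketch of both (IDs are placeholders):

// Started directly (console, API, or apify-client):
const directInput = { datasetId: 'abc123' };

// Started by an ACTOR.RUN.SUCCEEDED webhook; the default payload template
// puts the originating run object under `resource`:
const webhookInput = {
    resource: {
        defaultDatasetId: 'abc123',
        // ...other run fields omitted
    },
};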

package.json

{
    "name": "project-empty",
    "version": "0.0.1",
    "description": "This is a boilerplate of an Apify actor.",
    "dependencies": {
        "apify": "^2.0.7"
    },
    "devDependencies": {
        "@apify/eslint-config": "^0.1.3",
        "eslint": "^7.0.0"
    },
    "scripts": {
        "start": "node main.js",
        "lint": "./node_modules/.bin/eslint ./src --ext .js,.jsx",
        "lint:fix": "./node_modules/.bin/eslint ./src --ext .js,.jsx --fix",
        "test": "echo \"Error: oops, the actor has no tests yet, sad!\" && exit 1"
    },
    "author": "It's not you it's me",
    "license": "ISC"
}