Profesia.sk Scraper
jurooravec/profesia-sk-scraper

3-day trial, then $25.00/month. No credit card required.
A one-stop shop for all data on Profesia.sk. Extract job offers, lists of companies, positions, and locations. Job offers include salary, full textual info, company details, and more.

The code examples below show how to run the Actor and get its results. To run the code, you need an Apify account. Replace <YOUR_API_TOKEN> in the code with your API token, which you can find under Settings > Integrations in Apify Console.
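If you prefer not to hardcode the token, you can load it from an environment variable instead. This is only a minimal sketch of that approach; the APIFY_TOKEN variable name is an example and assumes you export it yourself before running the script.

import { ApifyClient } from 'apify-client';

// Read the API token from an environment variable instead of hardcoding it.
// Assumes you ran `export APIFY_TOKEN=<YOUR_API_TOKEN>` beforehand (example variable name).
const client = new ApifyClient({
    token: process.env.APIFY_TOKEN,
});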

import { ApifyClient } from 'apify-client';

// Initialize the ApifyClient with your Apify API token
const client = new ApifyClient({
    token: '<YOUR_API_TOKEN>',
});

// Prepare Actor input
const input = {
    "datasetType": "jobOffers",
    "jobOfferFilterMinSalaryPeriod": "month",
    "inputExtendFromFunction": `/**
         * Inputs:
         *
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async ({ io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Load Actor config from GitHub URL (public)
        //   const config = await sendRequest.get('https://raw.githubusercontent.com/username/project/main/config.json').json();
        //
        //   // Increase concurrency during off-peak hours
        //   // NOTE: Imagine we're targeting a small server that can be slower during the day
        //   const hours = new Date().getUTCHours();
        //   const isOffPeak = hours < 6 || hours > 20;
        //   config.maxConcurrency = isOffPeak ? 8 : 3;
        //
        //   return config;
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "startUrlsFromFunction": `/**
         * Inputs:
         *
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async ({ io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Create and load URLs from a Dataset by combining multiple fields
        //   const dataset = await io.openDataset(datasetNameOrId);
        //   const data = await dataset.getData();
        //   const urls = data.items.map((item) => `https://example.com/u/${item.userId}/list/${item.listId}`);
        //   return urls;
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "requestMaxEntries": 50,
    "requestTransform": `/**
         * Inputs:
         * `request` - Request holding URL to be scraped.
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async (request, { io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Tag requests
        //   // (maybe because we use RequestQueue that pools multiple scrapers)
        //   request.userData.tag = "VARIANT_A";
        //   return request;
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "requestTransformBefore": `/**
         * Inputs:
         *
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async ({ io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Fetch data or run code BEFORE requests are processed.
        //   state.categories = await sendRequest.get('https://example.com/my-categories').json();
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "requestTransformAfter": `/**
         * Inputs:
         *
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async ({ io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Fetch data or run code AFTER requests are processed.
        //   delete state.categories;
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "requestFilter": `/**
         * Inputs:
         * `request` - Request holding URL to be scraped.
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async (request, { io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Filter requests based on their tag
        //   // (maybe because we use RequestQueue that pools multiple scrapers)
        //   return request.userData.tag === "VARIANT_A";
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "requestFilterBefore": `/**
         * Inputs:
         *
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async ({ io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Fetch data or run code BEFORE requests are processed.
        //   state.categories = await sendRequest.get('https://example.com/my-categories').json();
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "requestFilterAfter": `/**
         * Inputs:
         *
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async ({ io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Fetch data or run code AFTER requests are processed.
        //   delete state.categories;
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "outputMaxEntries": 50,
    "outputTransform": `/**
         * Inputs:
         * `entry` - Scraped entry.
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async (entry, { io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Add extra custom fields like aggregates
        //   return {
        //     ...entry,
        //     imagesCount: entry.images.length,
        //   };
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "outputTransformBefore": `/**
         * Inputs:
         *
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async ({ io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Fetch data or run code BEFORE entries are scraped.
        //   state.categories = await sendRequest.get('https://example.com/my-categories').json();
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "outputTransformAfter": `/**
         * Inputs:
         *
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async ({ io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Fetch data or run code AFTER entries are scraped.
        //   delete state.categories;
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "outputFilter": `/**
         * Inputs:
         * `entry` - Scraped entry.
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async (entry, { io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Filter entries based on number of images they have (at least 5)
        //   return entry.images.length >= 5;
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "outputFilterBefore": `/**
         * Inputs:
         *
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async ({ io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Fetch data or run code BEFORE entries are scraped.
        //   state.categories = await sendRequest.get('https://example.com/my-categories').json();
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "outputFilterAfter": `/**
         * Inputs:
         *
         * `ctx.io` - Apify Actor class, see https://docs.apify.com/sdk/js/reference/class/Actor.
         * `ctx.input` - The input object that was passed to this Actor.
         * `ctx.state` - An object you can use to persist state across all your custom functions.
         * `ctx.sendRequest` - Fetch remote data. Uses 'got-scraping', same as Apify's `sendRequest`.
         *                       See https://crawlee.dev/docs/guides/got-scraping
         * `ctx.itemCacheKey` - A function you can use to get cacheID for current `entry`.
         *                        It takes the entry itself, and a list of properties to be used for hashing.
         *                        By default, you should pass `input.cachePrimaryKeys` to it.
         *
         */
        // async ({ io, input, state, sendRequest, itemCacheKey }) => {
        //   // Example: Fetch data or run code AFTER entries are scraped.
        //   delete state.categories;
        //
        //   /* ========== SEE BELOW FOR MORE EXAMPLES ========= */
        //
        //   /**
        //    * ======= ACCESSING DATASET ========
        //    * To save/load/access entries in Dataset.
        //    * Docs:
        //    * - https://docs.apify.com/platform/storage/dataset
        //    * - https://docs.apify.com/sdk/js/docs/guides/result-storage#dataset
        //    * - https://docs.apify.com/sdk/js/docs/examples/map-and-reduce
        //    */
        //   // const dataset = await io.openDataset('MyDatasetId');
        //   // const info = await dataset.getInfo();
        //   // console.log(info.itemCount);
        //   // // => 0
        //
        //   /**
        //    * ======= ACCESSING REMOTE DATA ========
        //    * Use `sendRequest` to get data from the internet:
        //    * Docs:
        //    * - https://github.com/apify/got-scraping
        //    */
        //   // const catFact = await sendRequest.get('https://cat-fact.herokuapp.com/facts/5887e1d85c873e0011036889').json();
        //   // console.log(catFact.text);
        //   // // => "Cats make about 100 different sounds. Dogs make only about 10.",
        //
        //   /**
        //    * ======= USING CACHE ========
        //    * To save the entry to the KeyValue cache (or retrieve it), you can use
        //    * `itemCacheKey` to create the entry's ID for you:
        //    */
        //   // const cacheId = itemCacheKey(item, input.cachePrimaryKeys);
        //   // const cache = await io.openKeyValueStore('MyStoreId');
        //   // cache.setValue(cacheId, entry);
        // };`,
    "maxRequestRetries": 3,
    "maxRequestsPerMinute": 120,
    "minConcurrency": 1,
    "requestHandlerTimeoutSecs": 180,
    "logLevel": "info",
    "errorReportingDatasetId": "REPORTING"
};

(async () => {
    // Run the Actor and wait for it to finish
    const run = await client.actor("jurooravec/profesia-sk-scraper").call(input);

    // Fetch and print Actor results from the run's dataset (if any)
    console.log('Results from dataset');
    console.log(`💾 Check your data here: https://console.apify.com/storage/datasets/${run.defaultDatasetId}`);
    const { items } = await client.dataset(run.defaultDatasetId).listItems();
    items.forEach((item) => {
        console.dir(item);
    });
})();

// 📚 Want to learn more 📖? Go to → https://docs.apify.com/api/client/js/docs
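For large runs, fetching the whole dataset with a single listItems() call can be slow or memory-heavy. Below is a minimal sketch of paging through the results in batches instead, assuming the same client and run objects as in the example above; the batch size of 1000 is an arbitrary choice.

// Page through the run's dataset in batches.
// Place this inside the async function above, after the run has finished.
const datasetClient = client.dataset(run.defaultDatasetId);
const limit = 1000; // arbitrary batch size
let offset = 0;

while (true) {
    const { items, total } = await datasetClient.listItems({ offset, limit });
    items.forEach((item) => console.dir(item));
    offset += items.length;
    if (items.length === 0 || offset >= total) break;
}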
Developer: Maintained by Community
Actor metrics
  • 1 monthly user
  • 1 star
  • 49.2% runs succeeded
  • Created in Apr 2023
  • Modified 11 months ago