
Jobnet Scraper

Pricing

from $3.00 / 1,000 results

Scrape jobnet.dk — Denmark's official public job portal with 21,000+ listings. Contact data, regional filters, and incremental monitoring.

Developer: Black Falcon Data (Maintained by Community)

Last modified: 2 days ago

What does Jobnet Scraper do?

Jobnet Scraper extracts structured job data from jobnet.dk, including contact details, company metadata, full descriptions, and location data. It supports keyword search and controllable result limits, so you can run the same query consistently over time. The actor also offers detail enrichment (full descriptions, company metadata, and contact information where the source provides them), incremental monitoring that returns only new or changed results on recurring runs, and a compact output mode for AI-agent and MCP workflows.

What data can you extract from jobnet.dk?

Each record includes core listing fields (jobId, title, cvrNumber, occupation, municipality, postalCode, postalDistrictName, country, and more), detail fields when enrichment is enabled (description), contact and apply information (contactPersons and applyUrl), and company metadata (employerName and employerCity). In standard mode, every field is always present: unavailable data points are returned as null, never omitted. In compact mode, only core fields are returned.

Enable detail enrichment in the input to get richer fields such as full descriptions, company metadata, and contact information where the source provides them.
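As an illustration of the standard/compact distinction, the sketch below reduces a full record to the core fields listed above. The core-field list is assumed from this overview; the actor's exact compact schema may include additional fields.

```python
# Illustration only: reduce a full job record to core listing fields,
# mirroring what compact mode returns. Field list assumed from the
# overview above, not the actor's authoritative schema.

CORE_FIELDS = [
    "jobId", "title", "cvrNumber", "occupation",
    "municipality", "postalCode", "postalDistrictName", "country",
]

def to_compact(record: dict) -> dict:
    """Keep only core fields; missing values come back as None."""
    return {field: record.get(field) for field in CORE_FIELDS}
```

Note that, as in standard mode, absent values are kept as explicit None rather than dropped, which keeps downstream column handling predictable.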

Input

The main inputs are a search keyword and a result limit. Additional filters and options are available in the input schema.

Key parameters:

  • query — Job search keywords. Leave empty to browse all listings.
  • region — Filter by Danish region. (default: "")
  • employmentType — Permanent or temporary position. (default: "")
  • workHoursType — Full time, part time, or non-fixed hours. (default: "")
  • kmRadius — Geographic search radius in km. Applies when a postal code is used. (default: 50)
  • maxResults — Maximum total results to return (0 = unlimited). (default: 25)
  • includeDetails — Fetch full job description and contact data from detail endpoint. (default: true)
  • descriptionMaxLength — Truncate description to N characters. 0 = no truncation. (default: 0)
  • compact — Return core fields only — useful for AI-agent and MCP workflows. (default: false)
  • incrementalMode — Only return new or changed listings since last run. (default: false)
  • stateKey — Stable identifier for this tracked universe (e.g. your search query). Used to isolate incremental state between different searches.

Input example

{
  "query": "softwareudvikler",
  "region": "",
  "employmentType": "",
  "workHoursType": "",
  "kmRadius": 50,
  "maxResults": 5,
  "includeDetails": true,
  "descriptionMaxLength": 0,
  "compact": false,
  "incrementalMode": false
}

Output

Each run produces a dataset of structured job records. Results can be downloaded as JSON, CSV, or Excel from the Dataset tab in Apify Console.

Example job record

{
  "jobId": "54ac467e-38d2-454a-b7ad-ada6ece246ad",
  "title": "Børnehuset Lille Tornhøj i Aalborg Øst søger 3 pædagoger",
  "employerName": "www.aalborg.dk",
  "cvrNumber": "29189420",
  "occupation": "Pædagog",
  "municipality": "Aalborg",
  "postalCode": 9220,
  "postalDistrictName": "Aalborg Øst",
  "country": "Danmark",
  "isPartTime": false,
  "isExternal": false,
  "noFixedWorkplace": false,
  "isDisabilityFriendly": false,
  "datePosted": "2026-04-04T00:00:00+02:00",
  "applicationDeadline": "2026-04-16T10:00:00+02:00",
  "applicationDeadlineStatus": "ExpirationDate",
  "availablePositions": 1,
  "description": "<p>Vi søger dig, som brænder for at give børn en udviklende og legende dag. Dig, som skaber tryghed og nærvær. Dig, som har øje for fællesskabet og kan bevare overblikket. Dig, som er bevidst om egen...",
  "employerCity": null,
  "contactPersons": [
    {
      "name": null,
      "email": null,
      "phone": "25200560",
      "title": null
    }
  ],
  "applyUrl": "https://aalborg.career.emply.com/ad/bornehuset-lille-tornhoj-i-aalborg-ost-soger-3-paedagoger/trzkoa/da",
  "externalUrl": null,
  "portalUrl": "https://jobnet.dk/find-job/54ac467e-38d2-454a-b7ad-ada6ece246ad",
  "viewCount": 15,
  "scrapedAt": "2026-04-04T17:27:26.833Z",
  "source": "jobnet.dk",
  "changeType": null
}
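Once a dataset is exported as JSON, records like the one above can be post-processed directly. The sketch below flattens contactPersons and applyUrl into outreach rows; field names follow the example record, and extract_contacts is a helper written for this illustration, not part of the actor.

```python
# Sketch: build an outreach list from exported job records by pulling
# each contact person's phone/email together with the job's apply URL.
# `records` is the list of job dicts from a JSON dataset export.

def extract_contacts(records):
    rows = []
    for job in records:
        # contactPersons may be null in the export; treat that as empty.
        for person in job.get("contactPersons") or []:
            rows.append({
                "jobId": job["jobId"],
                "title": job["title"],
                "phone": person.get("phone"),
                "email": person.get("email"),
                "applyUrl": job.get("applyUrl"),
            })
    return rows
```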

How to scrape jobnet.dk

  1. Go to Jobnet Scraper in Apify Console.
  2. Enter a search keyword.
  3. Set maxResults to control how many results you need.
  4. Enable includeDetails if you need full descriptions, contact info, or company data.
  5. Click Start and wait for the run to finish.
  6. Export the dataset as JSON, CSV, or Excel.
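The console steps above can also be run programmatically with the Apify Python client (pip install apify-client). The actor ID and token placeholders below are assumptions; use the values shown on the actor's page and in your Apify account.

```python
# Sketch: run the scraper via the Apify Python client. <ACTOR_ID> and
# <YOUR_APIFY_TOKEN> are placeholders, not real values.

run_input = {
    "query": "softwareudvikler",
    "maxResults": 25,
    "includeDetails": True,
}

def run_scrape(token: str, actor_id: str):
    """Start a run and yield the structured job records from its dataset."""
    from apify_client import ApifyClient  # requires `pip install apify-client`

    client = ApifyClient(token)
    run = client.actor(actor_id).call(run_input=run_input)
    yield from client.dataset(run["defaultDatasetId"]).iterate_items()
```

Fields omitted from run_input fall back to the defaults listed in the input schema above.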

Use cases

  • Extract job data from jobnet.dk for market research and competitive analysis.
  • Monitor new and changed listings on scheduled runs without processing the full dataset every time.
  • Build outreach lists using contact details and apply URLs from listings.
  • Research company hiring patterns, employer profiles, and industry distribution.
  • Use structured location data for regional analysis, mapping, and geo-targeting.
  • Feed structured data into AI agents, MCP tools, and automated pipelines using compact mode.
  • Export clean, structured data to dashboards, spreadsheets, or data warehouses.

How much does it cost to scrape jobnet.dk?

Jobnet Scraper uses pay-per-event pricing. You pay a small fee when the run starts and then for each result that is actually produced.

  • Run start: $0.01 per run
  • Per result: $0.003 per job record

Example costs:

  • 10 results: $0.04
  • 100 results: $0.31
  • 500 results: $1.51
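The example costs above follow directly from the two event prices; a one-line helper makes the arithmetic explicit.

```python
# Cost model from the pricing above: $0.01 per run start plus
# $0.003 per job record produced.

def run_cost(results: int) -> float:
    """Total cost in USD for a run that yields `results` records."""
    return round(0.01 + 0.003 * results, 2)
```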

Incremental runs only charge for new or changed results, making recurring monitoring cost-efficient.

FAQ

How many results can I get from jobnet.dk?

The number of results depends on the search query and available listings on jobnet.dk. Use the maxResults parameter to control how many results are returned per run.

Does Jobnet Scraper support recurring monitoring?

Yes. Enable incremental mode to only receive new or changed listings on subsequent runs. This is ideal for scheduled monitoring where you want to track changes over time without re-processing the full dataset.
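Conceptually, incremental mode behaves like a diff against state from the previous run. The sketch below shows that idea client-side by hashing each record; it is an illustration only, not the actor's implementation, which manages this state for you when incrementalMode is true.

```python
# Conceptual illustration of incremental monitoring: keep a map of
# jobId -> content hash from the last run and emit only listings whose
# hash is new or changed. Not the actor's actual implementation.

import hashlib
import json

def diff_listings(records, previous_state):
    """Return (new_or_changed_records, new_state)."""
    new_state, changed = {}, []
    for job in records:
        digest = hashlib.sha256(
            json.dumps(job, sort_keys=True, default=str).encode()
        ).hexdigest()
        new_state[job["jobId"]] = digest
        if previous_state.get(job["jobId"]) != digest:
            changed.append(job)
    return changed, new_state
```

The stateKey input plays the role of a namespace for this state, so two different searches tracked on a schedule do not overwrite each other's history.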

Can I integrate Jobnet Scraper with other apps?

Yes. Jobnet Scraper works with Apify's integrations to connect with tools like Zapier, Make, Google Sheets, Slack, and more. You can also use webhooks to trigger actions when a run completes.

Can I use Jobnet Scraper with the Apify API?

Yes. You can start runs, manage inputs, and retrieve results programmatically through the Apify API. Client libraries are available for JavaScript, Python, and other languages.

Can I use Jobnet Scraper through an MCP Server?

Yes. Apify provides an MCP Server that lets AI assistants and agents call this actor directly. Use compact mode and descriptionMaxLength to keep payloads manageable for LLM context windows.
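The same payload-trimming that descriptionMaxLength performs server-side can also be sketched client-side, e.g. when preparing records already in hand for an LLM context window. trim_description below is a helper written for this illustration.

```python
# Sketch: cap the description field before passing records to an LLM,
# mirroring the actor's descriptionMaxLength option (0 = no truncation).

def trim_description(record: dict, max_length: int) -> dict:
    """Return a copy of `record` with description truncated to max_length chars."""
    trimmed = dict(record)
    desc = trimmed.get("description")
    if desc and max_length > 0:
        trimmed["description"] = desc[:max_length]
    return trimmed
```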

This actor extracts publicly available data from jobnet.dk. Web scraping of public information is generally considered legal, but you should always review the target site's terms of service and ensure your use case complies with applicable laws and regulations, including GDPR where relevant.

Your feedback

If you have questions, need a feature, or found a bug, please open an issue on the actor's page in Apify Console. Your feedback helps us improve.
