RemoteOK Scraper - Remote Job Listings

Pricing: from $1.00 / 1,000 results

Scrape remoteok.com - the world’s largest remote job board. Filter by skill tag, location, or company. Salary range and direct apply URL on every result. Incremental mode detects new and changed listings. Compact output for AI agents and MCP workflows.

Developer: Black Falcon Data (Maintained by Community)

Actor stats: 0 bookmarks, 4 total users, 2 monthly active users, last modified 10 hours ago.

What does RemoteOK Scraper do?

RemoteOK Scraper extracts structured job data from remoteok.com, including salary data, contact details, company metadata, and full descriptions. It supports tag, location, and company filters and a controllable result limit, so you can run the same query consistently over time.

Key features

  • Incremental mode — recurring runs emit and charge only for listings that are new or whose tracked content changed. First run builds the baseline state; subsequent runs emit only new or changed records.
  • Compact mode — AI-agent and MCP-friendly payloads with core fields only.

What data can you extract from remoteok.com?

Each result includes core listing fields (jobId, title, location, tags, salaryMin, salaryMax, salaryText, url, and more), detail fields when enrichment is enabled (description), contact and apply information (applyUrl), and company metadata (company). In standard mode, all fields are always present; unavailable data points are returned as null, never omitted. In compact mode, only core fields are returned.
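The compact projection can also be sketched locally. This is an illustration only (the actor applies compact mode server-side); the core field list follows the compact parameter description in this document.

```python
# Illustrative sketch: reduce a standard job record to the compact core
# fields listed in this document. Missing values become None, never omitted,
# mirroring the standard-mode null convention described above.
CORE_FIELDS = [
    "jobId", "title", "company", "location", "tags",
    "salaryText", "applyUrl", "url", "postedDate",
]

def to_compact(record: dict) -> dict:
    """Keep only core fields; absent fields come back as None."""
    return {field: record.get(field) for field in CORE_FIELDS}
```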

Input

The main inputs are optional tag, location, and company filters and a result limit. Additional filters and options are available in the input schema.

Key parameters:

  • tag — Filter by skill or technology tag, e.g. "javascript", "python", "devops", "react". Leave blank for all recent remote jobs.
  • location — Filter jobs by geographic restriction, e.g. "usa", "europe", "worldwide". Note: many remote jobs have no location restriction.
  • company — Filter jobs by company name, e.g. "github", "stripe". Leave blank for all companies.
  • maxResults — Maximum number of jobs to return (0 = all available, up to ~96 most recent). (default: 25)
  • descriptionMaxLength — Truncate job description HTML to N characters. 0 = no truncation. (default: 0)
  • compact — Return core fields only (jobId, title, company, location, tags, salaryText, applyUrl, url, postedDate). Ideal for AI-agent and MCP workflows. (default: false)
  • incrementalMode — Only emit jobs not seen in a previous run. Tracks state per stateKey. Ideal for daily schedulers — avoids duplicate processing. (default: false)
  • stateKey — Scope key for incremental state. Defaults to a hash of tag+location+company. Override when running multiple schedules against the same filters.
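The stateKey default is documented only as "a hash of tag+location+company"; the exact derivation is not specified. The sketch below is a hypothetical illustration of the idea: identical filters map to the same key, so two schedules running the same filters would share incremental state unless you override stateKey.

```python
import hashlib

def default_state_key(tag: str = "", location: str = "", company: str = "") -> str:
    """Hypothetical illustration of a filter-derived state key.

    The actor's actual hashing scheme is not documented; the point is
    only that the key is deterministic in (tag, location, company).
    """
    raw = "|".join([tag, location, company])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()[:16]
```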

Input examples

Basic search — Keyword-driven search with a result cap.

→ Full payload per result — all standard fields populated where the source provides them.

{
  "tag": "javascript",
  "maxResults": 50
}

Incremental tracking — Only emit jobs that changed since the previous run with this stateKey.

→ First run builds the baseline state. Subsequent runs emit only records that are new or whose tracked content changed. Set emitUnchanged: true to include unchanged records as well.

{
  "tag": "javascript",
  "maxResults": 200,
  "incrementalMode": true,
  "stateKey": "javascript-tracker"
}

Compact output for AI agents — Return only core fields for AI-agent and MCP workflows.

→ Small payload with the most important fields — ideal for piping into LLMs without token overhead.

{
  "tag": "javascript",
  "maxResults": 50,
  "compact": true
}

Output

Each run produces a dataset of structured job records. Results can be downloaded as JSON, CSV, or Excel from the Dataset tab in Apify Console.

Example job record

{
  "jobId": "649bc6d9f6dcd2c34fe2c42d358f19ea58a8a76f518e765ace9c596331bd40c4",
  "title": "Clinical Psychologist",
  "company": "LifeStance Health",
  "location": "Crofton, MD",
  "tags": [
    "adult",
    "lead",
    "medical",
    "health",
    "healthcare",
    "non tech"
  ],
  "salaryMin": 0,
  "salaryMax": 0,
  "salaryText": "$0 – $0",
  "description": "<p><span style=\"font-size: 11pt\">At LifeStance Health, we believe in a truly healthy society where mental and physical healthcare are unified to make lives better. Our mission is to help people lead h...",
  "applyUrl": "https://remoteOK.com/remote-jobs/remote-clinical-psychologist-lifestance-health-1130929",
  "url": "https://remoteOK.com/remote-jobs/remote-clinical-psychologist-lifestance-health-1130929",
  "logoUrl": null,
  "postedDate": "2026-03-28T16:01:20+00:00",
  "isVerified": false,
  "portalUrl": "https://remoteOK.com/remote-jobs/remote-clinical-psychologist-lifestance-health-1130929",
  "scrapedAt": "2026-03-29T21:32:21.790Z",
  "source": "remoteok.com",
  "changeType": "new"
}
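Exported records are plain JSON, so local post-processing is straightforward. A small sketch, assuming (as the example record above suggests) that salaryMin/salaryMax of 0 means the source listed no salary:

```python
def with_salary(records: list[dict]) -> list[dict]:
    """Keep records whose salary range is actually populated.

    Assumption: salaryMax of 0 (as in the example record) means the
    source provided no salary, so those records are filtered out.
    """
    return [r for r in records if (r.get("salaryMax") or 0) > 0]
```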

Incremental fields

When incrementalMode: true, each record also carries:

  • changeType — one of new, updated, unchanged, reappeared, expired (the example record above shows new).
  • firstSeenAt, lastSeenAt — ISO-8601 timestamps tracking the listing across runs.
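A downstream pipeline typically branches on changeType. A minimal sketch, assuming the lowercase values seen in the example record; the routing action names here are illustrative, not part of the actor:

```python
def route(record: dict) -> str:
    """Decide what a downstream pipeline does with an incremental record.

    The action names ('insert', 'upsert', ...) are hypothetical labels
    for this sketch; only the changeType values come from the actor.
    """
    change = (record.get("changeType") or "").lower()
    if change in ("new", "reappeared"):
        return "insert"       # listing (re)entered the result set
    if change == "updated":
        return "upsert"       # tracked content changed
    if change == "expired":
        return "mark_closed"  # listing disappeared from the source
    return "skip"             # 'unchanged' or unknown
```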

How to scrape remoteok.com

  1. Go to RemoteOK Scraper in Apify Console.
  2. Configure the optional tag, location, and company filters.
  3. Set maxResults to control how many results you need.
  4. Click Start and wait for the run to finish.
  5. Export the dataset as JSON, CSV, or Excel.

Use cases

  • Extract job data from remoteok.com for market research and competitive analysis.
  • Track salary trends across regions and categories over time.
  • Monitor new and changed listings on scheduled runs without processing the full dataset every time.
  • Build outreach lists using contact details and apply URLs from listings.
  • Research company hiring patterns, employer profiles, and industry distribution.
  • Feed structured data into AI agents, MCP tools, and automated pipelines using compact mode.
  • Export clean, structured data to dashboards, spreadsheets, or data warehouses.

How much does it cost to scrape remoteok.com?

RemoteOK Scraper uses pay-per-event pricing. You pay a small fee when the run starts and then for each result that is actually produced.

  • Run start: $0.005 per run
  • Per result: $0.001 per job record

Example costs (run start plus per-result events):

  • 10 results: $0.015
  • 100 results: $0.105
  • 200 results: $0.205
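A small helper computes the exact per-run cost from the two event prices (displayed prices elsewhere may be rounded to the cent):

```python
# Event prices from the pricing section above.
RUN_START_USD = 0.005   # flat fee charged when the run starts
PER_RESULT_USD = 0.001  # charged per job record actually produced

def run_cost(results: int) -> float:
    """Exact cost of a single run: start fee plus per-result events."""
    return round(RUN_START_USD + results * PER_RESULT_USD, 4)
```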

Example: recurring monitoring savings

These examples compare full re-scrapes with incremental runs at different churn rates. Churn is the share of listings that are new or whose tracked content changed since the previous run. Actual churn depends on your query breadth, source activity, and polling frequency — the scenarios below are examples, not predictions.

Example setup: 100 results per run, daily polling (30 runs/month). Event-pricing examples scale linearly with result count.

  Churn rate                     Full re-scrape run   Incremental run   Savings vs full   Monthly cost after baseline
  5% — stable niche query        $0.105               $0.010            $0.095 (90%)      $0.30
  15% — moderate broad query     $0.105               $0.020            $0.085 (81%)      $0.60
  30% — high-volume aggregator   $0.105               $0.035            $0.070 (67%)      $1.05

Full re-scrape monthly cost at daily polling: $3.15. First month with incremental costs $0.40 / $0.68 / $1.12 for the 5% / 15% / 30% scenarios because the first run builds baseline state at full cost before incremental savings apply.
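The scenario arithmetic above can be reproduced with a short model. Assumptions as stated in the setup: 100 results per run, 30 runs per month, and the first incremental month pays one full-cost baseline run; displayed dollar values may be rounded.

```python
# Event prices from the pricing section above.
RUN_START = 0.005
PER_RESULT = 0.001

def monthly_costs(results: int, churn: float, runs: int = 30):
    """Return (full_rescrape_month, incremental_month_after_baseline,
    first_incremental_month) in USD under pay-per-event pricing.

    churn is the fraction of listings that are new or changed per run.
    """
    full_run = RUN_START + results * PER_RESULT
    inc_run = RUN_START + int(results * churn) * PER_RESULT
    full_month = full_run * runs
    inc_month = inc_run * runs
    # First incremental month: one baseline run at full cost,
    # then (runs - 1) cheap incremental runs.
    first_month = full_run + inc_run * (runs - 1)
    return round(full_month, 3), round(inc_month, 3), round(first_month, 3)
```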

FAQ

How many results can I get from remoteok.com?

The number of results depends on the search query and available listings on remoteok.com. Use the maxResults parameter to control how many results are returned per run.

Does RemoteOK Scraper support recurring monitoring?

Yes. Enable incremental mode to only receive new or changed listings on subsequent runs. This is ideal for scheduled monitoring where you want to track changes over time without re-processing the full dataset.

Can I integrate RemoteOK Scraper with other apps?

Yes. RemoteOK Scraper works with Apify's integrations to connect with tools like Zapier, Make, Google Sheets, Slack, and more. You can also use webhooks to trigger actions when a run completes.

Can I use RemoteOK Scraper with the Apify API?

Yes. You can start runs, manage inputs, and retrieve results programmatically through the Apify API. Client libraries are available for JavaScript, Python, and other languages.

Can I use RemoteOK Scraper through an MCP Server?

Yes. Apify provides an MCP Server that lets AI assistants and agents call this actor directly. Use compact mode and descriptionMaxLength to keep payloads manageable for LLM context windows.
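As a local illustration of what descriptionMaxLength does to payload size (the actor applies the truncation server-side before results are stored; this sketch just mirrors the documented semantics, where 0 means no truncation):

```python
def truncate_description(record: dict, max_len: int) -> dict:
    """Cap the description HTML at max_len characters; 0 = no truncation.

    Returns a shallow copy when truncation applies, leaving the
    original record untouched.
    """
    if max_len <= 0:
        return record
    desc = record.get("description")
    if isinstance(desc, str) and len(desc) > max_len:
        record = {**record, "description": desc[:max_len]}
    return record
```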

This actor extracts publicly available data from remoteok.com. Web scraping of public information is generally considered legal, but you should always review the target site's terms of service and ensure your use case complies with applicable laws and regulations, including GDPR where relevant.

Your feedback

If you have questions, need a feature, or found a bug, please open an issue on the actor's page in Apify Console. Your feedback helps us improve.

You might also like

Getting started with Apify

New to Apify? Create a free account with $5 credit — no credit card required.

  1. Sign up — $5 platform credit included
  2. Open this actor and configure your input
  3. Click Start — export results as JSON, CSV, or Excel

Need more later? See Apify pricing.