Multi-ATS Company Jobs Scraper

Scrape job listings from company career pages across 5 ATS platforms: Greenhouse, Workday, SmartRecruiters, Lever, and Ashby. Standardized output with 15 fields per job. Pure HTTP, no browser needed.

Pricing: Pay per event · Developer: Stas Persiianenko · Maintained by Community · Last modified: 4 days ago

What does Multi-ATS Company Jobs Scraper do?

Multi-ATS Company Jobs Scraper extracts job listings from company career pages across 5 major ATS (Applicant Tracking System) platforms: Greenhouse, Workday, SmartRecruiters, Lever, and Ashby. Just paste a company's career page URL, and get standardized job data — title, location, department, employment type, salary, and more — in a single, unified format.

This is the easiest way to monitor job openings at any company. Try it now by clicking Start with the default input (Airbnb on Greenhouse). No API key or login required.

Unlike other scrapers that only work with one ATS, this actor auto-detects which platform a company uses and applies the right extraction logic. Pure HTTP — no browser overhead, blazing fast.

Who is Multi-ATS Company Jobs Scraper for?

🎯 HR Tech Companies & Job Boards

  • Aggregate job listings from hundreds of companies into your job board
  • Monitor new openings across your client portfolio
  • Build ATS-agnostic job feeds for your platform

📊 Recruiters & Talent Acquisition Teams

  • Track open positions at target companies across any ATS
  • Monitor competitor hiring activity and team growth
  • Filter by department, location, or keyword to find relevant openings

📈 Market Researchers & Analysts

  • Analyze hiring trends across industries and geographies
  • Track workforce expansion signals for investment decisions
  • Build datasets of job market supply by company, role type, or location

🤖 Automation Engineers & Data Teams

  • Feed structured job data into CRM, ATS, or enrichment pipelines
  • Schedule daily scrapes for real-time job monitoring
  • Integrate with Zapier, Make, Google Sheets, or any API client

Why use Multi-ATS Company Jobs Scraper?

✅ 5 ATS platforms in one actor — Greenhouse, Workday, SmartRecruiters, Lever, and Ashby
✅ Standardized output — 15 fields per job, same format regardless of ATS
✅ Auto-detection — paste any career page URL, the scraper figures out the rest
✅ Pure HTTP — no browser, no Playwright, 256MB memory, minimal compute cost
✅ No API key or login required — all data from public career pages
✅ Resilient — gracefully skips invalid companies, never crashes on bad input
✅ Predictable pricing — pay per job listing extracted, not per compute minute
✅ Full Apify platform — API access, scheduling, webhooks, 3,000+ integrations

What data can you extract?

Each job listing includes up to 15 standardized fields:

| Field | Description | Example |
|---|---|---|
| 📌 `title` | Job title | "Senior Software Engineer" |
| 🏢 `company` | Company name | "Airbnb" |
| 📍 `location` | Primary location | "San Francisco, CA" |
| 🏷️ `department` | Department or team | "Engineering" |
| 💼 `employmentType` | Full-time, Part-time, Contract | "Full-time" |
| 🏠 `workplaceType` | Remote, Hybrid, On-site | "Remote" |
| 🔗 `url` | Direct job posting URL | `https://boards.greenhouse.io/...` |
| 📝 `applyUrl` | Application URL | `https://boards.greenhouse.io/.../apply` |
| 📅 `postedAt` | When the job was posted | "2026-03-15T09:00:00Z" |
| 📄 `description` | Full HTML job description | `<div>About the role...</div>` |
| 💰 `salary` | Compensation info (if available) | "$150,000 - $200,000" |
| 🎓 `experienceLevel` | Required experience level | "Mid-Senior Level" |
| 🆔 `jobId` | Platform-specific job ID | "7649441" |
| ⚙️ `atsSource` | ATS platform detected | "greenhouse" |
| 🔑 `companySlug` | Company identifier on ATS | "airbnb" |

How much does it cost to scrape company job listings?

This Actor uses pay-per-event pricing — you pay only for what you scrape. No monthly subscription. All platform costs are included.

| Volume | Free | Starter ($29/mo) | Scale ($199/mo) | Business ($999/mo) |
|---|---|---|---|---|
| Per job listing | $0.00115 | $0.001 | $0.00078 | $0.0006 |
| 1,000 jobs | $1.15 | $1.00 | $0.78 | $0.60 |
| 10,000 jobs | $11.50 | $10.00 | $7.80 | $6.00 |

Higher-tier plans get additional volume discounts (up to 72% off at Diamond tier).
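As a sanity check on the table above, per-run cost is simply the per-run start fee (listed in the FAQ as $0.005) plus jobs × per-listing rate:

```python
# Per-listing rates from the pricing table above; start fee from the FAQ.
RATES = {"free": 0.00115, "starter": 0.001, "scale": 0.00078, "business": 0.0006}

def estimate_cost(jobs: int, plan: str = "free", start_fee: float = 0.005) -> float:
    """Estimated USD cost for one run: start fee plus a per-listing event charge."""
    return start_fee + jobs * RATES[plan]

print(f"${estimate_cost(10_000):.2f}")           # 10,000 jobs on the free tier
print(f"${estimate_cost(10_000, 'scale'):.2f}")  # same volume on the Scale plan
```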

Real-world cost examples:

| Company | ATS | Jobs | Duration | Cost (Free tier) |
|---|---|---|---|---|
| Airbnb | Greenhouse | 253 | ~2s | ~$0.30 |
| VISA | SmartRecruiters | 933 | ~10s | ~$1.08 |
| Spotify | Lever | 166 | ~1s | ~$0.20 |
| Ramp | Ashby | 132 | ~1s | ~$0.16 |

💡 Free plan ($5 credits): scrape ~4,300 job listings across any ATS.

How to scrape company job listings

  1. Go to the Multi-ATS Company Jobs Scraper page on Apify Store
  2. Click Start to try with the default input (Airbnb on Greenhouse)
  3. To add your own companies, paste career page URLs into the Company Career Page URLs field:
    • Greenhouse: https://boards.greenhouse.io/airbnb
    • Workday: https://walmart.wd5.myworkdayjobs.com/WalmartExternal
    • SmartRecruiters: smartrecruiters:VISA
    • Lever: lever:spotify
    • Ashby: https://jobs.ashbyhq.com/ramp
  4. Optionally set filters (keyword, location, department) and max jobs per company
  5. Click Start and wait for results
  6. Download results as JSON, CSV, or Excel from the Dataset tab

Input example — multiple companies:

```json
{
  "companyUrls": [
    "https://boards.greenhouse.io/airbnb",
    "https://jobs.ashbyhq.com/ramp",
    "lever:spotify",
    "smartrecruiters:VISA",
    "https://walmart.wd5.myworkdayjobs.com/WalmartExternal"
  ],
  "maxJobsPerCompany": 50,
  "includeDescription": true
}
```

Input example — filtered search:

```json
{
  "companyUrls": ["https://boards.greenhouse.io/airbnb"],
  "filterDepartment": "Engineering",
  "filterLocation": "San Francisco",
  "maxJobsPerCompany": 100
}
```

Input parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| `companyUrls` | array | `[]` | Career page URLs or `ats:slug` format entries |
| `maxJobsPerCompany` | integer | `100` | Maximum jobs to extract per company (0 = unlimited) |
| `searchQuery` | string | `""` | Filter jobs by keyword in title/description |
| `filterLocation` | string | `""` | Filter by location text (partial match) |
| `filterDepartment` | string | `""` | Filter by department name (partial match) |
| `includeDescription` | boolean | `true` | Include full HTML job descriptions |

Output example

```json
{
  "title": "Senior Software Engineer",
  "company": "Airbnb",
  "location": "San Francisco, CA",
  "department": "Software Engineering",
  "employmentType": null,
  "workplaceType": null,
  "url": "https://careers.airbnb.com/positions/7649441?gh_jid=7649441",
  "applyUrl": "https://careers.airbnb.com/positions/7649441?gh_jid=7649441#app",
  "postedAt": "2026-02-24T09:04:33-05:00",
  "description": "<div><h3>About the role...</h3></div>",
  "salary": null,
  "experienceLevel": null,
  "jobId": "7649441",
  "atsSource": "greenhouse",
  "companySlug": "airbnb"
}
```

Tips for best results

💡 Start small — try with 1-2 companies and maxJobsPerCompany: 10 to verify your URLs work before scaling up.

💡 ATS detection — the scraper auto-detects the ATS from the URL. If auto-detection fails, use the ats:slug format: greenhouse:airbnb, lever:spotify, smartrecruiters:VISA, ashby:ramp.

💡 Finding career page URLs — Google [company name] careers to find their career page. Look at the URL to identify the ATS platform.

💡 Skip descriptions for speed — set includeDescription: false if you only need job metadata (title, location, department). This is 2-10x faster for Workday and SmartRecruiters.

💡 Schedule daily runs — use Apify scheduling to monitor job openings daily and get notified of new positions via webhooks or integrations.

💡 Workday URLs — Workday career pages follow the pattern {company}.wd{N}.myworkdayjobs.com/{board}. You can find the exact URL on the company's career page.
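To illustrate the auto-detection and `ats:slug` tips above, here is a hypothetical sketch of how URL-based detection could work. The regex patterns and helper name are illustrative assumptions, not the actor's actual internal code:

```python
import re

# Hypothetical URL patterns per ATS (illustrative only; the actor's real logic may differ).
PATTERNS = {
    "greenhouse":      re.compile(r"boards\.greenhouse\.io/([\w-]+)"),
    "lever":           re.compile(r"jobs\.lever\.co/([\w-]+)"),
    "ashby":           re.compile(r"jobs\.ashbyhq\.com/([\w-]+)"),
    "smartrecruiters": re.compile(r"careers\.smartrecruiters\.com/([\w-]+)"),
    "workday":         re.compile(r"([\w-]+)\.wd\d+\.myworkdayjobs\.com"),
}

def detect_ats(entry: str):
    """Return (ats, slug) for a career-page URL or an explicit ats:slug entry."""
    if ":" in entry and "//" not in entry:       # explicit form, e.g. "lever:spotify"
        ats, slug = entry.split(":", 1)
        return ats, slug
    for ats, pattern in PATTERNS.items():
        if (match := pattern.search(entry)):
            return ats, match.group(1)
    return None, None                            # unsupported URL — the actor skips it

detect_ats("https://boards.greenhouse.io/airbnb")  # -> ('greenhouse', 'airbnb')
```

When detection returns nothing for your URL, falling back to the explicit `ats:slug` form is the reliable path.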

Integrations

📊 Multi-ATS Jobs Scraper → Google Sheets — export daily job listings to a shared spreadsheet for your recruiting team to review.

💬 Multi-ATS Jobs Scraper → Slack/Discord — get instant alerts when new engineering roles open at target companies.

🔄 Multi-ATS Jobs Scraper → Make/Zapier — trigger enrichment workflows when new jobs appear: look up hiring manager on LinkedIn, add to CRM, send outreach.

📅 Scheduled monitoring — run daily/weekly to track hiring trends. Compare job counts over time to detect team expansions or contractions.

🔗 Webhooks — get real-time notifications when a run completes. Process results immediately in your pipeline.

🗄️ Multi-ATS Jobs Scraper → Database — pipe standardized job data into PostgreSQL, MongoDB, or any database for analytics dashboards.
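The scheduled-monitoring pattern above boils down to diffing two runs on the stable `(atsSource, jobId)` pair. A minimal sketch — in practice the items would come from the run's dataset via the Apify client, but here they are inlined sample records:

```python
# Sample records shaped like the actor's output (inlined for illustration).
yesterday = [
    {"jobId": "7649441", "atsSource": "greenhouse", "title": "Senior Software Engineer"},
    {"jobId": "7649500", "atsSource": "greenhouse", "title": "Data Scientist"},
]
today = yesterday + [
    {"jobId": "7651000", "atsSource": "greenhouse", "title": "Staff Engineer"},
]

def new_jobs(previous, current):
    """Return jobs present in the current run but absent from the previous one."""
    seen = {(job["atsSource"], job["jobId"]) for job in previous}
    return [job for job in current if (job["atsSource"], job["jobId"]) not in seen]

for job in new_jobs(yesterday, today):
    print(job["title"])  # feed this into a Slack webhook, CRM, or database insert
```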

Using the Apify API

Node.js:

```javascript
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'YOUR_API_TOKEN' });

const run = await client.actor('automation-lab/multi-ats-jobs-scraper').call({
  companyUrls: ['https://boards.greenhouse.io/airbnb', 'lever:spotify'],
  maxJobsPerCompany: 50,
  includeDescription: false,
});

const { items } = await client.dataset(run.defaultDatasetId).listItems();
console.log(items);
```

Python:

```python
from apify_client import ApifyClient

client = ApifyClient("YOUR_API_TOKEN")

run = client.actor("automation-lab/multi-ats-jobs-scraper").call(run_input={
    "companyUrls": ["https://boards.greenhouse.io/airbnb", "lever:spotify"],
    "maxJobsPerCompany": 50,
    "includeDescription": False,
})

items = client.dataset(run["defaultDatasetId"]).list_items().items
print(items)
```

cURL:

```shell
curl -X POST "https://api.apify.com/v2/acts/automation-lab~multi-ats-jobs-scraper/runs?token=YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "companyUrls": ["https://boards.greenhouse.io/airbnb"],
    "maxJobsPerCompany": 50,
    "includeDescription": false
  }'
```

Use with AI agents via MCP

Multi-ATS Company Jobs Scraper is available as a tool for AI assistants that support the Model Context Protocol (MCP).

Add the Apify MCP server to your AI client — this gives you access to all Apify actors, including this one:

Setup for Claude Code

```shell
claude mcp add --transport http apify "https://mcp.apify.com"
```

Setup for Claude Desktop, Cursor, or VS Code

Add this to your MCP config file:

```json
{
  "mcpServers": {
    "apify": {
      "url": "https://mcp.apify.com"
    }
  }
}
```

Your AI assistant will use OAuth to authenticate with your Apify account on first use.

Example prompts

Once connected, try asking your AI assistant:

  • "Use automation-lab/multi-ats-jobs-scraper to find all open engineering jobs at Airbnb and Ramp"
  • "Scrape all remote job listings from Spotify's Lever career page and VISA's SmartRecruiters page"
  • "Get a list of all jobs at Walmart's Workday career site that mention 'data science' in the title"

Learn more in the Apify MCP documentation.

Web scraping of publicly available data is generally legal, as confirmed by the US Ninth Circuit Court of Appeals in hiQ Labs v. LinkedIn (2022). This scraper only accesses publicly available career pages and job listings that companies intentionally publish for public viewing.

This Actor does not require login credentials, does not bypass any access controls, and only extracts data that companies have made publicly available. Users are responsible for ensuring their use of the extracted data complies with applicable laws and the terms of service of the respective platforms.

For more information, see Apify's guide on the legality of web scraping.

FAQ

How fast is this scraper? Very fast. Most companies complete in 1-5 seconds because we use direct API endpoints (no browser rendering). A typical run scraping 5 companies finishes in under 10 seconds.

How much does it cost to scrape 10,000 jobs? At the free tier: approximately $11.50 (10,000 × $0.00115). With a Scale plan ($199/mo): approximately $7.80. The start fee is $0.005 per run.

What's the difference between this and your Greenhouse Jobs Scraper? The Greenhouse Jobs Scraper is specialized for Greenhouse with deeper features (application questions, advanced filters). This Multi-ATS scraper covers 5 platforms with a standardized output format — ideal when you need to scrape across multiple ATS platforms.

Why are some fields null? Not all ATS platforms expose the same data. For example, Greenhouse doesn't provide salary or employment type in listings. Workday doesn't include department in search results. Fields are null when the source ATS doesn't provide that information.
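Because nulls are expected rather than errors, it is worth coalescing missing fields to a display default before exporting or rendering. A small sketch (the helper name is illustrative):

```python
# One record shaped like the actor's output; Greenhouse omits salary and employment type.
job = {
    "title": "Senior Software Engineer",
    "location": "San Francisco, CA",
    "salary": None,
    "employmentType": None,
}

def display(record: dict, field: str, default: str = "N/A") -> str:
    """Return the field's value, or a default when the source ATS omitted it."""
    value = record.get(field)
    return value if value is not None else default

row = [display(job, f) for f in ("title", "location", "salary", "employmentType")]
print(row)
```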

Why does the scraper skip my company URL? The scraper needs to detect which ATS platform to use from the URL. If auto-detection fails, use the explicit format: greenhouse:slug, lever:slug, smartrecruiters:slug, ashby:slug, or workday:slug. Check that the URL matches one of the supported ATS patterns.

Which ATS platforms are supported? Currently: Greenhouse (220K+ companies), Workday (10K+ companies), SmartRecruiters (4K+ companies), Lever (2K+ companies), and Ashby (800+ companies). More platforms will be added based on user demand.
