ATS Job Intelligence API (AI-Powered)

Pricing: from $20.00 / 1,000 jobs enriched with AI

Developer: Enos Gabriel (Maintained by Community)
Extract and enrich job listings from ATS platforms like Greenhouse, Lever, Ashby, Workday, and Rippling using AI. Detect hiring signals, classify jobs, and generate structured insights for lead generation, job boards, and automation.

What does ATS Job Intelligence API do?

ATS Job Intelligence API is an AI-powered data extraction tool that scrapes job listings from multiple Applicant Tracking Systems (ATS) and enriches them with artificial intelligence. It automatically detects which ATS a company uses, fetches all available job data, and uses AI to provide deeper insights.

Key features:

  • Auto-detects ATS — No need to know which system a company uses. The actor figures it out automatically.
  • 5 ATS platforms supported — Greenhouse, Lever, Ashby, Workday, and Rippling
  • AI Enrichment — Optional AI-powered analysis adds:
    • Seniority classification (Junior, Mid, Senior, Unknown)
    • Tech stack detection (programming languages, frameworks, tools)
    • Category classification (Engineering, Product, Marketing, Sales, etc.)
    • Hiring signal detection (LOW, MEDIUM, HIGH urgency)
    • AI-generated role summary
  • Standardized output — Same schema regardless of ATS source
  • Built for B2B workflows — Clean JSON ready for enrichment pipelines
  • No browser overhead — Uses direct API calls for speed and reliability

Why use ATS Job Intelligence API?

  • Save hours of research — Manually finding and scraping each company's career page takes time. This actor does it in seconds.
  • AI-powered insights — Go beyond raw job data with AI enrichment that classifies seniority, detects tech stacks, and identifies hiring signals.
  • Consistent data — Every job comes with the same fields: title, department, location, salary, description, and AI insights.
  • Scale your outreach — Process hundreds of companies in a single run for recruitment, sales intelligence, or market research.
  • Works with your stack — Output integrates seamlessly with Clay, Apollo, Instantly, Zapier, Make, or any tool that consumes JSON.

How to use

  1. Go to the Input tab
  2. Add companies to the companies array with their domain (e.g., { "domain": "stripe.com" })
  3. Optionally add careers_url if you know the career page URL for more accurate detection
  4. Configure filters (keywords, location, department) if needed
  5. Click Start and download your dataset

Input

| Field | Required | Description |
| --- | --- | --- |
| companies | Yes | Array of { name, domain, careers_url } objects |
| filters.keywords | No | Comma-separated keywords to filter jobs |
| filters.location | No | Location filter (e.g., "remote", "San Francisco") |
| filters.department | No | Department name to filter |
| filters.posted_within_days | No | Only jobs posted within N days (0 = no limit) |
| ats_override | No | Force a specific ATS (greenhouse, lever, ashby, workday, rippling) |
| include_description | No | Include full job descriptions (default: true) |
| include_salary | No | Include salary data where available (default: false) |
| max_jobs_per_company | No | Max jobs per company (default: 100) |
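The exact matching logic behind filters.keywords and filters.location is internal to the actor; the sketch below is only a plausible client-side illustration of the comma-separated semantics (the matches_filters helper is hypothetical, not part of the actor):

```python
def matches_filters(job: dict, keywords: str = "", location: str = "") -> bool:
    """Illustrative filter: any comma-separated keyword must appear in the
    title/description, and the location filter must appear in the job location."""
    text = f"{job.get('job_title', '')} {job.get('description_text', '')}".lower()
    if keywords:
        terms = [k.strip().lower() for k in keywords.split(",") if k.strip()]
        if not any(t in text for t in terms):
            return False
    if location and location.lower() not in job.get("location", "").lower():
        return False
    return True

jobs = [
    {"job_title": "Senior Backend Engineer", "location": "Remote", "description_text": ""},
    {"job_title": "Account Executive", "location": "New York", "description_text": ""},
]
kept = [j for j in jobs if matches_filters(j, keywords="engineer,product", location="remote")]
```

Narrowing with keywords before enabling AI enrichment keeps per-job costs down, since only the jobs that survive filtering are enriched.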

Example Input

{
  "companies": [
    { "name": "Stripe", "domain": "stripe.com" },
    { "name": "Linear", "domain": "linear.app", "careers_url": "https://jobs.ashbyhq.com/linear" }
  ],
  "filters": {
    "keywords": "engineer,product",
    "location": "remote"
  },
  "max_jobs_per_company": 50
}
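The same input can be submitted programmatically. The sketch below assumes the apify-client Python package (`pip install apify-client`) and an APIFY_TOKEN environment variable; the actor ID string is a placeholder, not the real one:

```python
import os

# Build the run input shown above.
run_input = {
    "companies": [
        {"name": "Stripe", "domain": "stripe.com"},
        {"name": "Linear", "domain": "linear.app",
         "careers_url": "https://jobs.ashbyhq.com/linear"},
    ],
    "filters": {"keywords": "engineer,product", "location": "remote"},
    "max_jobs_per_company": 50,
}

if os.getenv("APIFY_TOKEN"):  # only call out to Apify when a token is configured
    from apify_client import ApifyClient

    client = ApifyClient(os.environ["APIFY_TOKEN"])
    # Placeholder actor ID -- replace with the actor's actual ID from the Store.
    run = client.actor("username/ats-job-intelligence-api").call(run_input=run_input)
    for item in client.dataset(run["defaultDatasetId"]).iterate_items():
        print(item["job_title"], item.get("ai"))
```

The call blocks until the run finishes, then streams the dataset items, so the output can be piped straight into an enrichment pipeline.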

Output

Each job record includes the following fields:

| Field | Type | Description |
| --- | --- | --- |
| job_id | string | Unique job ID from the ATS |
| company_name | string | Company name |
| company_domain | string | Company domain |
| ats_platform | string | Source ATS (greenhouse, lever, ashby, workday, rippling) |
| job_title | string | Job title |
| department | string/null | Department |
| team | string/null | Team |
| location | string | Job location |
| remote | boolean/string | true, false, or "hybrid" |
| employment_type | string/null | full_time, part_time, contract, intern |
| seniority | string/null | junior, mid, senior, staff, director, vp |
| job_url | string | URL to the job posting |
| apply_url | string | URL to apply |
| description_html | string | Full job description (HTML) |
| description_text | string | Full job description (plain text) |
| salary_min | number/null | Minimum salary |
| salary_max | number/null | Maximum salary |
| salary_currency | string/null | Currency code (USD, EUR, etc.) |
| posted_at | string/null | Posted date (ISO 8601) |
| updated_at | string/null | Last updated (ISO 8601) |
| scraped_at | string | Timestamp when scraped |
| ai | object/null | AI enrichment (when ai: true) |

AI Enriched Output

When ai: true is enabled, each job includes an ai object:

{
  "job_id": "123456",
  "company_name": "Stripe",
  "job_title": "Senior Backend Engineer",
  "location": "Remote",
  "ai": {
    "seniority": "Senior",
    "tech_stack": ["Go", "PostgreSQL", "Redis", "gRPC", "Kubernetes"],
    "remote": true,
    "category": "Engineering",
    "hiring_signal": "HIGH",
    "summary": "Looking for a Senior Backend Engineer to build scalable payment infrastructure using Go and cloud-native technologies."
  }
}

You can download the dataset in various formats: JSON, CSV, Excel, XML, or RSS.
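Once exported as JSON, the ai object is consumed like any other field. A small sketch (the prioritize helper is hypothetical; field names come from the schema above) that surfaces HIGH-urgency roles first:

```python
SIGNAL_RANK = {"HIGH": 0, "MEDIUM": 1, "LOW": 2}

def prioritize(jobs: list) -> list:
    """Sort AI-enriched jobs so HIGH hiring signals come first; jobs with no
    `ai` object (enrichment disabled or failed) sort last."""
    def rank(job: dict) -> int:
        ai = job.get("ai") or {}
        return SIGNAL_RANK.get(ai.get("hiring_signal"), 3)
    return sorted(jobs, key=rank)

jobs = [
    {"job_title": "Data Analyst", "ai": {"hiring_signal": "LOW"}},
    {"job_title": "Senior Backend Engineer", "ai": {"hiring_signal": "HIGH"}},
    {"job_title": "Recruiter", "ai": None},
]
ordered = prioritize(jobs)
```

Because jobs without an ai object sink to the bottom rather than raising errors, the sort also tolerates runs where enrichment was skipped for some records.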

Configuring GROQ_API_KEY

The AI enrichment uses Groq's API. The API key is already configured in the Actor — you don't need to set it up.

If you need to update the key (e.g., to use your own account), you can add it as an environment variable:

For Apify Console

  1. Open your Actor in Apify Console
  2. Go to Settings → Environment variables
  3. Add a new variable named GROQ_API_KEY, with your Groq API key as its value
  4. Save and re-run the actor with ai: true

For Local Development

Create a .env file in the project root:

GROQ_API_KEY=your_groq_api_key_here

Then run the actor locally with apify run.
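Either way, the key ends up as an environment variable at runtime; a minimal sketch of reading it (the groq_api_key helper is illustrative, not the actor's actual code):

```python
import os

def groq_api_key() -> str:
    """Read GROQ_API_KEY from the environment, whether it was set via a local
    .env file (loaded by `apify run`) or via Actor environment variables.
    Returns an empty string when the key is absent."""
    return os.environ.get("GROQ_API_KEY", "")
```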

Note: The AI enrichment is optional. The actor works perfectly without it — just omit "ai": true from your input.

Supported ATS Platforms

| ATS | API Type | Salary Data | AI Enrichment |
| --- | --- | --- | --- |
| Greenhouse | REST API | Yes (per-job, with include_salary) | Supported |
| Lever | REST API | Yes (optional) | Supported |
| Ashby | REST API | Yes (with include_description) | Supported |
| Workday | CXS API | Not available | Supported |
| Rippling | REST API | Not available | Supported |

Use Cases

Lead Generation

Identify companies actively hiring in your target market. Use AI insights to prioritize prospects based on hiring signals and tech stack alignment.

Job Intelligence Platforms

Build job boards with enriched data. AI classification helps categorize roles, detect seniority levels, and identify remote opportunities automatically.

SaaS Product Features

Add job intelligence to your product. Track competitor hiring, monitor market trends, or power recruitment automation features.

Market Research

Analyze hiring patterns across industries. Use tech stack data to understand technology adoption trends and identify emerging tools.

Pricing

Pay-per-event (PPE) model:

| Resource | Cost |
| --- | --- |
| Actor start | $0.00005 per run |
| Job extracted (basic) | $0.003 per job |
| Job enriched with AI | $0.02 per job |

Example:

  • Scraping 50 companies with 500 total jobs (no AI): $1.50
  • Same run with AI enrichment enabled: ~$10.50
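The per-event rates above can be sanity-checked with a small estimator (rates copied from the pricing table; actual billing is done by Apify and may differ):

```python
def estimate_cost(runs: int, basic_jobs: int, ai_jobs: int) -> float:
    """Estimate pay-per-event cost: $0.00005 per actor start, $0.003 per
    basic extracted job, $0.02 per AI-enriched job. Compute costs are
    billed separately and not included here."""
    return round(runs * 0.00005 + basic_jobs * 0.003 + ai_jobs * 0.02, 2)

# 500 jobs across 50 companies in a single run, no AI enrichment:
no_ai = estimate_cost(runs=1, basic_jobs=500, ai_jobs=0)  # $1.50
```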

The AI version provides enriched job intelligence including:

  • Seniority classification
  • Tech stack extraction
  • Hiring signals
  • Structured summaries

This adds significant value compared to raw scraping and is designed for:

  • Lead generation platforms
  • Job intelligence products
  • Automation workflows

Note: Compute costs are billed separately by Apify.

Tips

  • Provide careers_url when possible for more accurate ATS detection
  • Disable include_description for faster runs when you only need metadata
  • Use ats_override if you know the ATS to skip detection entirely
  • Use filters.keywords to reduce the number of results and save on costs
  • The AI model used is llama-3.1-8b-instant via Groq — fast and cost-effective

FAQ

Q: Do I need a proxy? A: No. Greenhouse, Lever, and Ashby expose public APIs that don't require proxies. Workday and Rippling may benefit from proxies for large batches, but the actor works without them.

Q: What if ATS detection fails? A: The actor will attempt to fetch using all available connectors as fallback. You can also use ats_override to force a specific ATS.
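The public job-board endpoints for Greenhouse, Lever, and Ashby follow predictable URL patterns, which is what makes detection-by-probing feasible. A sketch of building candidate URLs (the slug is assumed to match the company name, which is not always true, and the actor's real detection logic may differ):

```python
def candidate_endpoints(slug: str) -> dict:
    """Build candidate public job-board API URLs for a company slug.
    A detector could GET each one and keep whichever responds with jobs."""
    return {
        "greenhouse": f"https://boards-api.greenhouse.io/v1/boards/{slug}/jobs",
        "lever": f"https://api.lever.co/v0/postings/{slug}?mode=json",
        "ashby": f"https://api.ashbyhq.com/posting-api/job-board/{slug}",
    }

urls = candidate_endpoints("stripe")
```

Probing in this order matches the no-proxy FAQ answer above: these three platforms expose public APIs, so a plain GET per candidate URL is usually enough to identify the ATS.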

Q: Can I scrape unlimited jobs? A: Yes, but use max_jobs_per_company to control costs and run time. The default is 100 jobs per company.

Q: Why is Lever returning 0 jobs? A: Some companies disable public job APIs. Use careers_url or try another company as a test.

Q: What happens if AI enrichment fails? A: The actor gracefully continues without AI data. Jobs are still returned, just without the ai enrichment field. This ensures your pipeline doesn't break due to AI issues.

Q: Is this legal? A: This actor uses publicly available APIs provided by each ATS for job listings. It respects the terms of service of each platform. For Workday and other enterprise systems, ensure you have permission before scraping.

Q: How does the AI work? A: The actor uses Groq's API with the llama-3.1-8b-instant model to analyze job titles and descriptions, extracting structured insights like tech stacks, seniority, and hiring signals.

Need help?