ATS Job Intelligence API (AI-Powered)
Pricing
from $20.00 / 1,000 jobs enriched (AI)
Developer: Enos Gabriel
Extract and enrich job listings from ATS platforms like Greenhouse, Lever, Ashby, Workday, and Rippling using AI. Detect hiring signals, classify jobs, and generate structured insights for lead generation, job boards, and automation.
What does ATS Job Intelligence API do?
ATS Job Intelligence API is an AI-powered data extraction tool that scrapes job listings from multiple Applicant Tracking Systems (ATS) and enriches them with artificial intelligence. It automatically detects which ATS a company uses, fetches all available job data, and uses AI to provide deeper insights.
Key features:
- Auto-detects ATS — No need to know which system a company uses. The actor figures it out automatically.
- 5 ATS platforms supported — Greenhouse, Lever, Ashby, Workday, and Rippling
- AI Enrichment — Optional AI-powered analysis adds:
- Seniority classification (Junior, Mid, Senior, Unknown)
- Tech stack detection (programming languages, frameworks, tools)
- Category classification (Engineering, Product, Marketing, Sales, etc.)
- Hiring signal detection (LOW, MEDIUM, HIGH urgency)
- AI-generated role summary
- Standardized output — Same schema regardless of ATS source
- Built for B2B workflows — Clean JSON ready for enrichment pipelines
- No browser overhead — Uses direct API calls for speed and reliability
Why use ATS Job Intelligence API?
- Save hours of research — Manually finding and scraping each company's career page takes time. This actor does it in seconds.
- AI-powered insights — Go beyond raw job data with AI enrichment that classifies seniority, detects tech stacks, and identifies hiring signals.
- Consistent data — Every job comes with the same fields: title, department, location, salary, description, and AI insights.
- Scale your outreach — Process hundreds of companies in a single run for recruitment, sales intelligence, or market research.
- Works with your stack — Output integrates seamlessly with Clay, Apollo, Instantly, Zapier, Make, or any tool that consumes JSON.
How to use
- Go to the Input tab
- Add companies to the `companies` array with their domain (e.g., `{ "domain": "stripe.com" }`)
- Optionally add `careers_url` if you know the career page URL for more accurate detection
- Configure filters (keywords, location, department) if needed
- Click Start and download your dataset
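The steps above can also be done programmatically by assembling the input JSON before submitting a run. A minimal sketch in Python; the helper name and its defaults are illustrative, not part of the actor:

```python
def build_actor_input(companies, keywords=None, location=None, max_jobs=100, ai=False):
    """Assemble the input payload for the ATS Job Intelligence actor."""
    payload = {
        "companies": [
            # Keep only the fields the input schema accepts
            {k: v for k, v in c.items() if k in ("name", "domain", "careers_url")}
            for c in companies
        ],
        "max_jobs_per_company": max_jobs,
    }
    filters = {}
    if keywords:
        filters["keywords"] = ",".join(keywords)  # comma-separated, per the input schema
    if location:
        filters["location"] = location
    if filters:
        payload["filters"] = filters
    if ai:
        payload["ai"] = True
    return payload

# Mirrors the Example Input in this README
payload = build_actor_input(
    [{"name": "Stripe", "domain": "stripe.com"}],
    keywords=["engineer", "product"],
    location="remote",
    max_jobs=50,
)
```

The resulting dict can be passed as the run input via the Apify API or client library of your choice.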
Input
| Field | Required | Description |
|---|---|---|
| `companies` | Yes | Array of `{ name, domain, careers_url }` objects |
| `filters.keywords` | No | Comma-separated keywords to filter jobs |
| `filters.location` | No | Location filter (e.g., "remote", "San Francisco") |
| `filters.department` | No | Department name to filter |
| `filters.posted_within_days` | No | Only jobs posted within N days (0 = no limit) |
| `ats_override` | No | Force a specific ATS (greenhouse, lever, ashby, workday, rippling) |
| `include_description` | No | Include full job descriptions (default: true) |
| `include_salary` | No | Include salary data where available (default: false) |
| `max_jobs_per_company` | No | Max jobs per company (default: 100) |
| `ai` | No | Enable AI enrichment (default: false) |
Example Input
```json
{
  "companies": [
    { "name": "Stripe", "domain": "stripe.com" },
    { "name": "Linear", "domain": "linear.app", "careers_url": "https://jobs.ashbyhq.com/linear" }
  ],
  "filters": {
    "keywords": "engineer,product",
    "location": "remote"
  },
  "max_jobs_per_company": 50
}
```
Output
Each job record includes the following fields:
| Field | Type | Description |
|---|---|---|
| `job_id` | string | Unique job ID from the ATS |
| `company_name` | string | Company name |
| `company_domain` | string | Company domain |
| `ats_platform` | string | Source ATS (greenhouse, lever, ashby, workday, rippling) |
| `job_title` | string | Job title |
| `department` | string/null | Department |
| `team` | string/null | Team |
| `location` | string | Job location |
| `remote` | boolean/string | true, false, or "hybrid" |
| `employment_type` | string/null | full_time, part_time, contract, intern |
| `seniority` | string/null | junior, mid, senior, staff, director, vp |
| `job_url` | string | URL to the job posting |
| `apply_url` | string | URL to apply |
| `description_html` | string | Full job description (HTML) |
| `description_text` | string | Full job description (plain text) |
| `salary_min` | number/null | Minimum salary |
| `salary_max` | number/null | Maximum salary |
| `salary_currency` | string/null | Currency code (USD, EUR, etc.) |
| `posted_at` | string/null | Posted date (ISO 8601) |
| `updated_at` | string/null | Last updated (ISO 8601) |
| `scraped_at` | string | Timestamp when scraped |
| `ai` | object/null | AI enrichment (present when `ai: true`) |
AI Enriched Output
When `ai: true` is enabled, each job includes an `ai` object:

```json
{
  "job_id": "123456",
  "company_name": "Stripe",
  "job_title": "Senior Backend Engineer",
  "location": "Remote",
  "ai": {
    "seniority": "Senior",
    "tech_stack": ["Go", "PostgreSQL", "Redis", "gRPC", "Kubernetes"],
    "remote": true,
    "category": "Engineering",
    "hiring_signal": "HIGH",
    "summary": "Looking for a Senior Backend Engineer to build scalable payment infrastructure using Go and cloud-native technologies."
  }
}
```
You can download the dataset in various formats: JSON, CSV, Excel, XML, or RSS.
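Downstream tools such as spreadsheets or CRMs usually want one flat row per job. A minimal sketch of flattening a record; the field names follow the output schema above, while the helper itself and its column choices are illustrative:

```python
def flatten_job(job):
    """Flatten one dataset record into a flat dict, tolerating a missing `ai` object."""
    ai = job.get("ai") or {}  # `ai` is null when enrichment was disabled or failed
    return {
        "job_id": job["job_id"],
        "company": job["company_name"],
        "title": job["job_title"],
        "location": job.get("location"),
        "remote": job.get("remote"),
        "ats": job.get("ats_platform"),
        "seniority": ai.get("seniority"),
        "hiring_signal": ai.get("hiring_signal"),
        # Join the tech stack list so it fits a single CSV cell
        "tech_stack": ";".join(ai.get("tech_stack", [])),
    }
```

Rows built this way drop straight into `csv.DictWriter` or a pandas DataFrame.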
Configuring GROQ_API_KEY
The AI enrichment uses Groq's API. The API key is already configured in the Actor — you don't need to set it up.
If you need to update the key (e.g., to use your own account), you can add it as an environment variable:
For Apify Console
- Open your Actor in Apify Console
- Go to Settings → Environment variables
- Add a new variable:
  - Name: `GROQ_API_KEY`
  - Value: your Groq API key (get one at https://console.groq.com/)
- Save and re-run the actor with `ai: true`
For Local Development
Create a .env file in the project root:
```
GROQ_API_KEY=your_groq_api_key_here
```
Then run the actor locally with `apify run`.
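A minimal sketch of the kind of check the actor presumably performs at startup; the function is illustrative, not taken from the actor's source:

```python
import os

def ai_available(env=os.environ):
    """AI enrichment needs GROQ_API_KEY; without it the actor should
    fall back to basic extraction instead of failing the run."""
    return bool(env.get("GROQ_API_KEY"))
```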
Note: The AI enrichment is optional. The actor works perfectly without it — just omit `"ai": true` from your input.
Supported ATS Platforms
| ATS | API Type | Salary Data | AI Enrichment |
|---|---|---|---|
| Greenhouse | REST API | Yes (per-job, with include_salary) | Supported |
| Lever | REST API | Yes (optional) | Supported |
| Ashby | REST API | Yes (with include_description) | Supported |
| Workday | CXS API | Not available | Supported |
| Rippling | REST API | Not available | Supported |
Use Cases
Lead Generation
Identify companies actively hiring in your target market. Use AI insights to prioritize prospects based on hiring signals and tech stack alignment.
Job Intelligence Platforms
Build job boards with enriched data. AI classification helps categorize roles, detect seniority levels, and identify remote opportunities automatically.
SaaS Product Features
Add job intelligence to your product. Track competitor hiring, monitor market trends, or power recruitment automation features.
Market Research
Analyze hiring patterns across industries. Use tech stack data to understand technology adoption trends and identify emerging tools.
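The `tech_stack` lists in AI-enriched records aggregate naturally into adoption counts. A short sketch using only the standard library; the sample records are invented for illustration:

```python
from collections import Counter

def tech_stack_trends(jobs):
    """Count technology mentions across AI-enriched job records,
    skipping jobs where enrichment was disabled or failed."""
    counts = Counter()
    for job in jobs:
        ai = job.get("ai") or {}
        counts.update(ai.get("tech_stack", []))
    return counts.most_common()

jobs = [
    {"ai": {"tech_stack": ["Go", "Kubernetes"]}},
    {"ai": {"tech_stack": ["Go", "PostgreSQL"]}},
    {"ai": None},  # enrichment missing for this job
]
# tech_stack_trends(jobs) → [("Go", 2), ("Kubernetes", 1), ("PostgreSQL", 1)]
```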
Pricing
Pay-per-event (PPE) model:
| Resource | Cost |
|---|---|
| Actor start | $0.00005 per run |
| Job extracted (basic) | $0.003 per job |
| Job enriched with AI | $0.02 per job |
Example:
- Scraping 50 companies with 500 total jobs (no AI): **$1.50**
- Same run with AI enrichment enabled: ~$10.50
The AI version provides enriched job intelligence including:
- Seniority classification
- Tech stack extraction
- Hiring signals
- Structured summaries
This adds significant value compared to raw scraping and is designed for:
- Lead generation platforms
- Job intelligence products
- Automation workflows
Note: Compute costs are billed separately by Apify.
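The rate card above can be turned into a quick estimate before launching a large run. A minimal sketch; exactly how the basic and AI events combine on an enriched run is an assumption here (modelling the AI event as replacing the basic event gives ≈$10 for 500 jobs, close to the listing's ~$10.50 figure):

```python
ACTOR_START = 0.00005  # per run
JOB_BASIC = 0.003      # per job extracted (basic)
JOB_AI = 0.02          # per job enriched with AI

def estimate_cost(total_jobs, ai=False, runs=1):
    """Rough PPE cost in USD; Apify compute costs are billed separately."""
    # Assumption: an enriched job is charged the AI event instead of the basic one
    per_job = JOB_AI if ai else JOB_BASIC
    return runs * ACTOR_START + total_jobs * per_job

# estimate_cost(500) ≈ $1.50, matching the no-AI example above
```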
Tips
- Provide `careers_url` when possible for more accurate ATS detection
- Disable `include_description` for faster runs when you only need metadata
- Use `ats_override` if you know the ATS to skip detection entirely
- Use `filters.keywords` to reduce the number of results and save on costs
- The AI model used is `llama-3.1-8b-instant` via Groq — fast and cost-effective
FAQ
Q: Do I need a proxy?
A: No. Greenhouse, Lever, and Ashby expose public APIs that don't require proxies. Workday and Rippling may benefit from proxies for large batches, but the actor works without them.
Q: What if ATS detection fails?
A: The actor will attempt to fetch using all available connectors as a fallback. You can also use `ats_override` to force a specific ATS.
Q: Can I scrape unlimited jobs?
A: Yes, but use `max_jobs_per_company` to control costs and run time. The default is 100 jobs per company.
Q: Why is Lever returning 0 jobs?
A: Some companies disable public job APIs. Use `careers_url` or try another company as a test.
Q: What happens if AI enrichment fails?
A: The actor gracefully continues without AI data. Jobs are still returned, just without the `ai` enrichment field. This ensures your pipeline doesn't break due to AI issues.
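The fallback behaviour described here is a common wrapper pattern. A minimal sketch; `enrich` stands in for the actual Groq call and is purely illustrative:

```python
def enrich_safely(job, enrich):
    """Attempt AI enrichment; on any failure, return the job with `ai` set
    to None so downstream consumers still receive all basic fields."""
    try:
        job["ai"] = enrich(job)
    except Exception:
        job["ai"] = None  # swallow the error; the pipeline keeps running
    return job
```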
Q: Is this legal?
A: This actor uses publicly available APIs provided by each ATS for job listings. It respects the terms of service of each platform. For Workday and other enterprise systems, ensure you have permission before scraping.
Q: How does the AI work?
A: The actor uses Groq's API with the `llama-3.1-8b-instant` model to analyze job titles and descriptions, extracting structured insights like tech stacks, seniority, and hiring signals.
Need help?
- Report issues on GitHub
- Check the Apify documentation
- Contact support through the Apify Console