Lever Jobs Scraper
Pricing
from $50.00 / 1,000 job postings
Scrape Lever-hosted job postings by company slug — title, team, department, commitment, location, country, workplace type, plain-text and HTML descriptions, apply URL, and posted date. Built for recruiter sourcing, ATS competitive analytics, and sales prospecting into companies that are actively hiring.
Developer: Stephan Corbeil
Actor stats: 0 bookmarked · 2 total users · 1 monthly active user · last modified 2 days ago
🎯 Lever Jobs Scraper — Open API Postings
Scrape Lever-hosted job boards in bulk via Lever's official open postings API. Feed in a list of company slugs (the path segment in jobs.lever.co/{slug}), get back every live posting with title, team, department, location, plain-text and HTML descriptions, and apply URL.
Built for recruiters and sourcing tools (track who's hiring senior roles), ATS competitive analysts (Lever vs. Greenhouse vs. Ashby adoption tracking), sales/BD teams (prospect into companies actively scaling teams), and job aggregators / talent platforms that need clean structured postings.
What you get per posting
- `id`, `company_slug` — Lever's posting ID + the slug you fed in
- `title` — job title
- `team`, `department`, `commitment` — categorical breakdown (e.g. Engineering, Backend, Full-Time)
- `location`, `all_locations`, `country`, `workplace_type` — onsite / remote / hybrid + geographic data
- `hosted_url` — canonical Lever-hosted job page
- `apply_url` — direct apply link
- `created_at`, `updated_at` — Unix-ms timestamps
- `tags` — any tags Lever exposes
- `description_html`, `description_plain` — main body in both formats
- `opening_html`, `opening_plain` — intro paragraph (often pitch / mission)
- `description_body_html`, `description_body_plain` — main responsibilities + qualifications
- `lists` — Lever's structured lists (Responsibilities, Qualifications, etc.)
- `additional_html`, `additional_plain` — equal-opportunity statement, comp band, perks
- `source` — `"lever.co"`
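Two details worth knowing when you consume these fields: timestamps are Unix milliseconds, and `workplace_type` is a plain string you can filter on. A minimal sketch (the `row` dict here is hypothetical sample data shaped like a dataset row, not real output):

```python
from datetime import datetime, timezone

# Hypothetical row shaped like the fields above
row = {
    "title": "Senior Backend Engineer, Platform",
    "workplace_type": "remote",
    "created_at": 1738473600000,  # Unix milliseconds, as Lever returns it
}

# Divide by 1000 to get seconds before converting to an aware datetime
posted = datetime.fromtimestamp(row["created_at"] / 1000, tz=timezone.utc)

# Simple remote-role filter on workplace_type
is_remote = row["workplace_type"] == "remote"

print(posted.date(), is_remote)  # → 2025-02-02 True
```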
Use cases
- Recruiter sourcing — pull every senior IC role at 75 Lever-hosted Series B/C companies and search descriptions for tech-stack keywords.
- ATS market share research — count active Lever postings per company per quarter to track Lever vs. Greenhouse adoption.
- Sales prospecting — companies that just added a "Head of Engineering" or "VP Sales" role are prime ICP for many B2B tools.
- Job aggregator backfill — feed your remote-jobs board with structured Lever data refreshed nightly.
- Comp-band benchmarking — many Lever postings publish salary ranges in the description; extract them downstream.
- Investor / portfolio monitoring — track headcount growth at portfolio cos. via posting cadence.
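For the comp-band use case, the downstream extraction can be as simple as a regex pass over `description_plain`. A rough sketch (the pattern is a heuristic covering common formats like "$150,000 - $190,000" and "$120k–$160k"; real postings vary widely):

```python
import re

# Heuristic for ranges like "$150,000 - $190,000" or "$120k–$160k"
SALARY_RE = re.compile(
    r"\$(\d{1,3}(?:,\d{3})+|\d+)\s*([kK])?\s*(?:-|–|to)\s*"
    r"\$?(\d{1,3}(?:,\d{3})+|\d+)\s*([kK])?"
)

def extract_salary_range(text):
    """Return (low, high) in whole dollars, or None if no range is found."""
    m = SALARY_RE.search(text)
    if not m:
        return None
    def dollars(num, k):
        n = int(num.replace(",", ""))
        return n * 1000 if k else n  # "150k" -> 150000
    low, low_k, high, high_k = m.groups()
    return dollars(low, low_k), dollars(high, high_k)

print(extract_salary_range("Base pay: $150,000 - $190,000 plus equity."))
# → (150000, 190000)
```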
Quick start
Input:
```json
{
  "companies": ["clari", "figma", "box"],
  "maxJobs": 200,
  "includeContent": true
}
```
Sample output row:
```json
{
  "id": "abc123-...-def",
  "company_slug": "clari",
  "title": "Senior Backend Engineer, Platform",
  "team": "Engineering",
  "department": "Platform",
  "commitment": "Full-Time",
  "location": "Sunnyvale, CA",
  "country": "United States",
  "workplace_type": "hybrid",
  "hosted_url": "https://jobs.lever.co/clari/abc123",
  "apply_url": "https://jobs.lever.co/clari/abc123/apply",
  "created_at": 1738473600000,
  "updated_at": 1745030400000,
  "description_plain": "Clari is hiring...",
  "source": "lever.co"
}
```
Run via Python (apify-client)
```python
from apify_client import ApifyClient

client = ApifyClient("YOUR_APIFY_TOKEN")

run = client.actor("nexgendata/lever-jobs-scraper").call(run_input={
    "companies": ["clari", "figma", "box"],
    "maxJobs": 500,
    "includeContent": True,
})

for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item["title"], "-", item["company_slug"], "-", item["location"])
```
Run via cURL
```bash
curl -X POST "https://api.apify.com/v2/acts/nexgendata~lever-jobs-scraper/run-sync-get-dataset-items?token=YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"companies":["clari","figma"],"maxJobs":50}'
```
Integrations
- Zapier — fire on each new posting, push into Airtable / Notion / Slack DM.
- Make.com — pipe new postings into HubSpot or Salesforce as new prospect signals.
- n8n — schedule + dedupe + push into Postgres for trend analysis.
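The dedupe step in a pipeline like the n8n one boils down to "have I seen this posting `id` before?". A minimal sketch, using sqlite as a stand-in for the Postgres target (table name and posting ids are hypothetical):

```python
import sqlite3

def new_postings(conn, postings):
    """Return only postings whose Lever id hasn't been seen before,
    recording new ids in a 'seen' table as we go."""
    conn.execute("CREATE TABLE IF NOT EXISTS seen (id TEXT PRIMARY KEY)")
    fresh = []
    for p in postings:
        already = conn.execute(
            "SELECT 1 FROM seen WHERE id = ?", (p["id"],)
        ).fetchone()
        if already is None:
            conn.execute("INSERT INTO seen (id) VALUES (?)", (p["id"],))
            fresh.append(p)
    conn.commit()
    return fresh

conn = sqlite3.connect(":memory:")
batch1 = [{"id": "a1"}, {"id": "b2"}]
batch2 = [{"id": "b2"}, {"id": "c3"}]  # b2 repeats on the second run
print([p["id"] for p in new_postings(conn, batch1)])  # → ['a1', 'b2']
print([p["id"] for p in new_postings(conn, batch2)])  # → ['c3']
```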
Pricing
Pay-per-event (PPE):
- Actor start: $0.00005 (negligible)
- Per posting: $0.05
Cost calculator:
| Postings returned | Cost |
|---|---|
| 10 (smoke test) | $0.50 |
| 100 | $5.00 |
| 1,000 | $50.00 |
| 5,000 (multi-company sweep) | $250.00 |
You only pay for postings actually pushed to your dataset — companies that fail or return nothing incur no charge.
FAQ
Q: Where do I find the Lever slug?
A: Visit the company's careers page. If the URL is jobs.lever.co/{slug} (or it redirects to one), {slug} is what you want.
Q: What about Greenhouse / Ashby / Workday?
A: Different ATS, different actors. See Related Actors below.
Q: Do I need a Lever API key?
A: No. The public postings API at api.lever.co/v0/postings/{slug}?mode=json requires no auth.
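Since no auth is involved, you can also hit that endpoint yourself with nothing but the standard library. A sketch using the URL shape above (`clari` is just an example slug):

```python
import json
import urllib.request

def postings_url(slug: str) -> str:
    # Lever's public postings endpoint; mode=json returns an array of postings
    return f"https://api.lever.co/v0/postings/{slug}?mode=json"

def fetch_postings(slug):
    """Fetch all live postings for one Lever-hosted company."""
    with urllib.request.urlopen(postings_url(slug)) as resp:
        return json.load(resp)

# e.g. fetch_postings("clari") returns a list of posting dicts
print(postings_url("clari"))
```

The actor adds batching, retries, field normalization, and per-posting billing on top of this raw call.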
Q: How fresh is the data?
A: Real-time. Postings show up within minutes of the company adding them in Lever.
Q: What if a company moved off Lever?
A: You'll get an empty array (logged) and zero charges for that company. The actor moves on.
Q: Can I get just the metadata?
A: Yes — set includeContent: false. Cuts row size by 80–95%.
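For a metadata-only run, the quick-start input shrinks to the same fields with `includeContent` flipped:

```json
{
  "companies": ["clari", "figma", "box"],
  "maxJobs": 200,
  "includeContent": false
}
```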
Related actors
- Greenhouse Jobs Scraper — same shape, for Greenhouse-hosted boards.
- WeWorkRemotely Jobs Scraper — remote-first job aggregator.
- HN Who's Hiring Scraper — monthly Hacker News hiring threads.
Built and maintained by NexGenData — affordable, focused web scrapers for B2B data teams.