# Job Finder – Multi-Site Job Scraper

Pricing: $20.00/month + usage
Scrapes LinkedIn, Indeed, Glassdoor, ZipRecruiter, and Google Jobs simultaneously. Returns structured, deduplicated job listings with detailed fields such as title, company, location, salary, and remote options. Optionally exports results as a CSV.
Developer: Jamshaid Arif
Last modified: 5 days ago
An Apify actor that searches multiple job boards simultaneously and returns structured, deduplicated results.
## Supported Sites
| Site | Key |
|---|---|
| LinkedIn | linkedin |
| Indeed | indeed |
| Glassdoor | glassdoor |
| ZipRecruiter | zip_recruiter |
| Google Jobs | google |
## Input Example
```json
{
  "search_term": "python developer",
  "location": "United States",
  "site_names": ["linkedin", "indeed"],
  "results_wanted": 100,
  "hours_old": 72,
  "distance": 50,
  "job_type": "fulltime",
  "is_remote": false,
  "country_indeed": "United States",
  "export_csv": true
}
```
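Fields omitted from the input fall back to the defaults listed under Input Parameters. A minimal sketch of that merge, mirroring the documented defaults (the actor's internal default handling may differ):

```python
import json

# Defaults mirroring the Input Parameters table (assumed; the actor's
# internal handling may differ).
DEFAULTS = {
    "search_term": "python developer",
    "location": "United States",
    "site_names": ["linkedin", "indeed"],
    "results_wanted": 100,
    "hours_old": 72,
    "distance": 50,
    "is_remote": False,
    "country_indeed": "United States",
    "export_csv": True,
}

def with_defaults(user_input: dict) -> dict:
    """Overlay user-supplied fields on the documented defaults."""
    return {**DEFAULTS, **user_input}

# A partial input only needs the fields you want to change.
run_input = with_defaults({"search_term": "data engineer", "results_wanted": 10})
print(json.dumps(run_input, indent=2))
```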
## Input Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| search_term | string | python developer | Job title / keywords |
| location | string | United States | Geographic area |
| site_names | string[] | ["linkedin", "indeed"] | Which boards to scrape |
| results_wanted | integer | 100 | Max results per site |
| hours_old | integer | 72 | Only jobs posted within the last N hours |
| distance | integer | 50 | Search radius in miles |
| job_type | string | (any) | One of fulltime, parttime, internship, contract |
| is_remote | boolean | false | Only remote positions |
| country_indeed | string | United States | Country for the Indeed domain |
| export_csv | boolean | true | Also save a CSV to the key-value store |
## Output
Each record in the default dataset contains:

site, title, company, company_url, location, date_posted, job_url, job_type, salary_source, interval, min_amount, max_amount, currency, description, emails, is_remote, scraped_at, search_term, search_location
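Records from different boards are deduplicated. The actor's exact matching rule isn't documented; a minimal sketch of the idea, keyed on the job URL with a title/company/location fallback (an assumption, not the actor's actual logic):

```python
def dedupe_jobs(records):
    """Keep the first occurrence of each job, keyed on job_url when
    present, otherwise on (title, company, location)."""
    seen = set()
    unique = []
    for rec in records:
        key = rec.get("job_url") or (
            rec.get("title"), rec.get("company"), rec.get("location")
        )
        if key in seen:
            continue  # already collected from another board
        seen.add(key)
        unique.append(rec)
    return unique
```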
If `export_csv` is enabled, a ready-to-download `jobs.csv` file is also saved to the run's default key-value store.
## Running Locally
```sh
# Install deps
pip install -r requirements.txt

# Create input
mkdir -p storage/key_value_stores/default
echo '{"search_term":"data engineer","location":"New York","site_names":["linkedin"],"results_wanted":10,"hours_old":24}' \
  > storage/key_value_stores/default/INPUT.json

# Run
python -m src
```
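After a local run, results can be inspected straight from the on-disk dataset. A sketch assuming the Apify SDK's local storage convention of one JSON file per dataset item under `storage/datasets/default` (verify against your run's actual layout):

```python
import json
from pathlib import Path

def load_local_dataset(base="storage/datasets/default"):
    """Read dataset items written by a local run: one numbered JSON
    file per item under the dataset directory (assumed layout)."""
    items = []
    for path in sorted(Path(base).glob("*.json")):
        items.append(json.loads(path.read_text()))
    return items
```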
## Deploying to Apify
```sh
apify login
apify push
```
Then trigger runs from the Apify Console, API, or schedule them on a cron.
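Triggering a run via the Apify REST API uses `POST /v2/acts/{actorId}/runs`. A stdlib sketch that prepares the request; the actor ID and token below are placeholders you must replace with your own:

```python
import json
import urllib.request

API_BASE = "https://api.apify.com/v2"

def build_run_request(actor_id: str, token: str, run_input: dict) -> urllib.request.Request:
    """Prepare (but do not send) a POST that starts an actor run."""
    url = f"{API_BASE}/acts/{actor_id}/runs?token={token}"
    return urllib.request.Request(
        url,
        data=json.dumps(run_input).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually start a run (requires a valid token and network access):
# req = build_run_request("your-username~job-finder", "YOUR_APIFY_TOKEN",
#                         {"search_term": "python developer"})
# with urllib.request.urlopen(req) as resp:
#     run = json.load(resp)["data"]
```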