Upwork Jobs Scraper
Pricing
Pay per usage
Rating: 0.0 (0)
Developer: jts
Actor stats: bookmarked 0, total users 1, monthly active users 0
Last modified: 9 hours ago
Upwork Jobs Scraper
What this Actor does
Search public Upwork job postings by keyword and filters, then write normalized jobs with budgets, hourly ranges, skills, proposal counts, visible client history, and direct job URLs to the dataset.
What data can you get?
Each run writes structured Upwork job result rows to the Apify dataset. Common fields include:
| Field | What it means |
|---|---|
| jobId | Upwork job ciphertext or public id. |
| title | Job title. |
| paymentType | fixed or hourly. |
| budget | Fixed-price budget object. |
| hourlyRange | Hourly min/max range. |
| experienceLevel | Normalized experience level. |
| clientCountry | Visible client country. |
| skills | Required skills. |
| proposalsCount | Visible proposal/applicant count. |
| jobUrl | Direct Upwork job URL. |
| result | Full normalized job result. |
How to use Upwork Jobs Scraper
- Open this Actor in Apify Console.
- Paste the input JSON below or fill the form fields in the Input tab.
- Click Start and wait for the run to finish.
- Open the Dataset tab to preview rows, then export JSON, CSV, Excel, XML, or HTML.
- For production, schedule the Actor or call it from the Apify API, SDKs, webhooks, or MCP server.
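For API-based runs, the steps above can be sketched with the Apify Python client. The Actor ID below is a placeholder (copy the real one from the Console URL), and the client call is commented out so the sketch stays self-contained:

```python
import json

# Placeholder Actor ID -- replace with the real identifier from Apify Console.
ACTOR_ID = "jts/upwork-jobs-scraper"

def build_run_input(query, payment_type=None, experience_level=None, limit=25):
    """Assemble the Actor input JSON from the documented fields."""
    run_input = {"q": query, "limit": limit}
    if payment_type:
        run_input["paymentType"] = payment_type
    if experience_level:
        run_input["experienceLevel"] = experience_level
    return run_input

run_input = build_run_input("typescript", payment_type="hourly", experience_level="expert")

# With the apify-client package and an API token, the same payload starts a run:
#   from apify_client import ApifyClient
#   client = ApifyClient(token)
#   run = client.actor(ACTOR_ID).call(run_input=run_input)
#   items = client.dataset(run["defaultDatasetId"]).list_items().items
print(json.dumps(run_input))
```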
When to use this
- Public job-market and skill-demand research on Upwork postings
- Freelancer lead monitoring at conservative request rates
- Internal recruiting and sales prospecting where you comply with Upwork terms
When NOT to use this
- Bypassing Upwork access controls, captcha, or login walls
- Bulk proposal blasting, spam, or harassment
- Collecting freelancer or client private account data
Input example
```json
{
  "q": "typescript",
  "paymentType": "hourly",
  "experienceLevel": "expert",
  "limit": 25
}
```
Input fields
Provide one query or an array of search requests. Optional filters include category, budget range, payment type, experience level, limit, and offset.
- q (string): Public Upwork job search phrase, for example "typescript api".
- category (string): Optional category label or Upwork occupation id.
- paymentType (string): Optional fixed or hourly filter.
- experienceLevel (string): Optional entry, intermediate, or expert filter.
- budgetMin (integer): Optional minimum fixed-price budget.
- budgetMax (integer): Optional maximum fixed-price budget.
- limit (integer): Results per query, 1-50.
- searches (array): Optional array of query strings or search objects. Overrides the top-level query when provided.
- apiBaseUrl (string, optional override): Overrides the upstream API base URL. Defaults to runtime.baseUrl from service.json.
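Since the searches array overrides the top-level query, several searches can run in one batch. A sketch of such a payload, using only field names from the input description above:

```python
import json

# Batch payload: the `searches` array overrides the top-level `q` when present.
batch_input = {
    "searches": [
        {"q": "typescript api", "paymentType": "hourly", "experienceLevel": "expert"},
        {"q": "python scraping", "paymentType": "fixed", "budgetMin": 500},
        "react native",  # bare strings act as query-only searches
    ],
    "limit": 25,
}
print(json.dumps(batch_input, indent=2))
```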
Output dataset
Each dataset item is one normalized Upwork job result.
- jobId: Upwork job ciphertext or public id.
- title: Job title.
- paymentType: fixed or hourly.
- budget: Fixed-price budget object.
- hourlyRange: Hourly min/max range.
- experienceLevel: Normalized experience level.
- clientCountry: Visible client country.
- skills: Required skills.
- proposalsCount: Visible proposal/applicant count.
- jobUrl: Direct Upwork job URL.
- result: Full normalized job result.
Sample output
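No sample is bundled here; the item below is an illustrative shape only, with placeholder values for a subset of the fields listed above, not real Upwork data:

```json
{
  "jobId": "~021234567890abcdef",
  "title": "TypeScript API developer",
  "paymentType": "hourly",
  "budget": null,
  "hourlyRange": { "min": 40, "max": 70 },
  "experienceLevel": "expert",
  "clientCountry": "United States",
  "skills": ["TypeScript", "Node.js", "REST API"],
  "proposalsCount": 12,
  "jobUrl": "https://www.upwork.com/jobs/~021234567890abcdef"
}
```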
How can you use the Upwork Jobs Scraper data?
- Monitor new Upwork jobs for a target skill or service niche
- Rank freelance leads by budget, experience level, proposal count, and client history
- Track skill demand across public Upwork job postings
- Enrich CRM workflows with direct Upwork job URLs and normalized budget fields
- Build alerts for high-value hourly or fixed-price opportunities
- Analyze freelance market trends without maintaining a custom scraper
- Normalize upstream jobs and freelance data into one Apify dataset contract.
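The lead-ranking use case can run locally on exported dataset items. Field names follow the output dataset section; the scoring weights are illustrative assumptions, not part of the Actor:

```python
# Rank exported job items by a simple lead score:
# higher budget and experience raise the score, more proposals lower it.
def lead_score(item):
    hourly = item.get("hourlyRange") or {}
    budget_signal = hourly.get("max") or (item.get("budget") or {}).get("amount") or 0
    experience = {"entry": 0, "intermediate": 1, "expert": 2}.get(item.get("experienceLevel"), 0)
    competition = item.get("proposalsCount") or 0
    return budget_signal + 10 * experience - competition

items = [
    {"jobId": "a", "hourlyRange": {"min": 30, "max": 60}, "experienceLevel": "expert", "proposalsCount": 5},
    {"jobId": "b", "budget": {"amount": 40}, "experienceLevel": "entry", "proposalsCount": 30},
]
ranked = sorted(items, key=lead_score, reverse=True)
print([i["jobId"] for i in ranked])  # → ['a', 'b']
```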
API, integrations, and MCP
- Run this Actor from Apify Console, schedules, webhooks, REST API, or the Apify SDKs.
- Connect the Actor to AI agents through the Apify MCP server.
- Export dataset items to JSON, CSV, Excel, XML, HTML, Google Sheets, S3, or downstream BI tools.
- Use the generated INPUT.example.json as the baseline payload for API calls and scheduled runs.
FAQ
How much does it cost to use this Actor?
Pay-per-call: $0.0005 per actor start, $0.001 per job item. A 25-job search costs ~$0.03; daily monitoring on 30 search profiles stays under $25/month — priced relative to one closed lead.
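The arithmetic behind those figures, as a quick check:

```python
def run_cost(items, starts=1, start_fee=0.0005, item_fee=0.001):
    """Pay-per-call cost: $0.0005 per Actor start plus $0.001 per job item."""
    return starts * start_fee + items * item_fee

one_search = run_cost(25)                          # a single 25-job search
monthly = run_cost(25 * 30 * 30, starts=30 * 30)   # 30 profiles, daily, for 30 days
print(one_search, monthly)
```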
Can I use this Actor with the Apify API?
Yes. Use the same input JSON shown above with Apify's REST API, JavaScript SDK, Python SDK, schedules, or webhooks.
Can I use this Actor through an MCP server?
Yes. Add this Actor to the Apify MCP server so AI agents can run it and consume the resulting dataset.
Is it legal to use this data?
Use this Actor only for the allowed use cases:

- Public job-market research
- Freelance lead monitoring at conservative request rates
- Internal sourcing workflows
- Skill-demand analytics and CRM enrichment where the caller complies with Upwork terms and applicable law

Do not use it for prohibited workflows:

- Bypassing Upwork access controls, captcha, login walls, or rate limits
- High-volume extraction that violates Upwork terms
- Collecting account-only or private Upwork data
- Automated spam, proposal blasting, harassment, credential collection, or resale of raw Upwork data as a standalone dataset
Developer notes
Set UPWORK_JOBS_SCRAPER_API_KEY as an Apify secret environment variable. The Actor forwards it as X-API-Key to the upstream API.
Use apiBaseUrl only for local or staging QA. Production runs default to the deployed API base URL from service.json.
For advanced testing, a requests array can call explicit API paths; normal users should use the service-specific fields above.
Run locally

```shell
apify run
```

Deploy

```shell
apify push
```
Changelog
- 0.1.0 (2026-04-30) — Initial release: query + paymentType/experienceLevel filters, batch endpoint.