
Upwork Scraper

Pricing: Pay per usage
Scrape Upwork job listings at scale. Extract titles, descriptions, budgets, skills, client info, and more from search results or individual job pages. Supports multiple modes: fast search scraping, detailed modal extraction, and authenticated scraping for enhanced data.

Rating: 0.0 (0)

Developer: The Empire Strikes Back (Maintained by Community)

Actor stats: 0 bookmarked | 9 total users | 3 monthly active users | last modified a day ago


What does Upwork Scraper do?

This Actor scrapes Upwork job listings using real browser automation with Cloudflare bypass. It extracts structured job data and outputs it to the Apify Dataset in JSON format.

Use it to monitor new job postings, analyze market trends, track client activity, or build custom job feeds.
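As a sketch of what a run configuration might look like, the snippet below builds an input object for a search scrape. The key names (`mode`, `searchQuery`, `maxPages`, and so on) are illustrative assumptions, not the Actor's documented schema -- check the input tab in the Apify Console for the real field names:

```python
import json

# Hypothetical run input for a Lite-mode search scrape.
# Key names are illustrative; consult the Actor's input schema
# in the Apify Console for the actual keys.
run_input = {
    "mode": "lite",                    # lite | deep | lite_login | deep_login | post | agency
    "searchQuery": "React developer",  # keywords to search for
    "experienceLevel": ["Intermediate", "Expert"],
    "maxPages": 1,                     # start with 1 page to test filters
    "proxyConfiguration": {
        "useApifyProxy": True,
        "apifyProxyGroups": ["RESIDENTIAL"],  # Upwork blocks datacenter IPs
    },
}

print(json.dumps(run_input, indent=2))
```

With the apify-client package, an input like this would typically be passed as `client.actor("<actor-id>").call(run_input=run_input)`, after which items can be read from the run's default dataset.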

Scraping modes

| Mode | Speed | Data depth | Description |
| --- | --- | --- | --- |
| Lite | Fast | Comprehensive | Extracts all job data from search pages. Best for most use cases. |
| Deep | Slow | Maximum | Like Lite, but also opens job modals for extra details (activity metrics, full descriptions). |
| Lite Login | Fast | Enhanced | Logs in first, then runs Lite mode. Unlocks client scores and spending data. |
| Deep Login | Slow | Maximum + Enhanced | Logs in first, then runs Deep mode. Most complete data possible. |
| Post | Per-job | Full detail | Scrapes individual job pages by URL. Most comprehensive for specific jobs. |
| Agency | Medium | Agency profiles | Scrapes Top Rated Plus agency profiles from talent search pages. |

Which mode should I choose?

  • Start with Lite -- it's fast and covers most fields. One page returns up to 50 jobs.
  • Use Deep when you need activity metrics (proposals, interviewing count, invites) or full descriptions.
  • Use Login modes when you need client spending amounts and star ratings.
  • Use Post when you have specific job URLs and need maximum detail.
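The decision rules above can be sketched as a small helper. The lowercase mode identifiers are assumptions for illustration, not the Actor's exact mode values:

```python
def pick_mode(need_activity=False, need_client_spend=False, job_urls=None):
    """Illustrative mode chooser mirroring the advice above.
    Mode names are hypothetical identifiers, not the Actor's schema."""
    if job_urls:                  # specific job URLs -> Post mode, maximum detail
        return "post"
    if need_client_spend:         # spending amounts and star ratings require login
        return "deep_login" if need_activity else "lite_login"
    if need_activity:             # proposals / interviewing / invites need modals
        return "deep"
    return "lite"                 # fast default, up to 50 jobs per page

print(pick_mode())                                # -> lite
print(pick_mode(need_activity=True))              # -> deep
print(pick_mode(need_client_spend=True))          # -> lite_login
```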

Search filters

When using Lite, Deep, or Login modes, you can filter results using the built-in input fields:

  • Search Query -- keywords like "React developer" or "Docker"
  • Experience Level -- Entry, Intermediate, Expert (multi-select)
  • Hourly Rate -- min/max range (filters for hourly jobs only)
  • Fixed Price -- min/max range (filters for fixed-price jobs only)
  • Client Hires -- minimum number of past client hires
  • Project Duration -- week, month, 1-6 months, ongoing (multi-select)
  • Hours per Week -- part-time or full-time (multi-select)
  • Contract to Hire -- only jobs that may convert to full-time

Note: Hourly rate and fixed price filters are mutually exclusive. If both are set, hourly rate takes priority.
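That priority rule can be expressed as a short pre-processing step. The key names (`hourlyMin`, `fixedMax`, etc.) are illustrative, not the Actor's exact input fields:

```python
def resolve_budget_filters(filters):
    """Apply the documented rule: hourly-rate and fixed-price filters
    are mutually exclusive, and hourly rate wins when both are set.
    Key names here are hypothetical, for illustration only."""
    f = dict(filters)
    has_hourly = "hourlyMin" in f or "hourlyMax" in f
    has_fixed = "fixedMin" in f or "fixedMax" in f
    if has_hourly and has_fixed:
        # Drop the fixed-price bounds; hourly rate takes priority.
        f.pop("fixedMin", None)
        f.pop("fixedMax", None)
    return f

print(resolve_budget_filters({"hourlyMin": 30, "fixedMax": 500}))
```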

Output data

Each job is stored as one item in the Dataset. Fields are only included when data is available.

Core fields (all modes)

| Field | Type | Example |
| --- | --- | --- |
| title | string | "SAP Integration Suite Specialist" |
| url | string | "https://www.upwork.com/jobs/..." |
| job_id | string | "1944659549969744966" |
| description_text | string | Full job description |
| skills | array | ["Python", "Docker", "AWS"] |
| publish_time | string | "2025-06-16T15:12:59.652Z" |
| experience_level | string | "Expert" |
| project_length | string | "1-3 months" |
| hours_per_week | string | "30+ hrs/week" |

Budget fields

| Field | Type | Description |
| --- | --- | --- |
| budget_hourly_min_usd | number | Min hourly rate (USD) |
| budget_hourly_max_usd | number | Max hourly rate (USD) |
| budget_total_usd | number | Fixed-price budget (USD) |

Client fields (Deep and Login modes)

| Field | Type | Example |
| --- | --- | --- |
| client_country | string | "United States" |
| client_city | string | "New York" |
| client_score | number | 4.91 |
| client_feedback_count | integer | 498 |
| client_spent | integer | 270000 (USD) |
| client_total_hires | integer | 979 |
| client_active_hires | integer | 331 |
| client_company_size | string | "Small company (2-9 people)" |
| client_member_since | string | "Aug 15, 2017" |

Activity fields (Deep modes)

| Field | Type | Example |
| --- | --- | --- |
| proposals | string | "Less than 5" |
| interviewing | string | "0" |
| invites_sent | integer | 4 |
| last_viewed | string | "27 seconds ago" |
| payment_verified | string | "Payment method verified" |

Classification fields

| Field | Type | Example |
| --- | --- | --- |
| category_name | string | "Market Research & Product Reviews" |
| category_group_name | string | "Admin Support" |
| total_jobs_with_hires | integer | 464 |
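Because fields are only present when data is available, downstream code should read items defensively. A short post-processing sketch over items shaped like the tables above (the sample values are invented for illustration):

```python
# Sample dataset items shaped like the field tables above (values invented).
items = [
    {"title": "SAP Integration Suite Specialist", "skills": ["SAP", "API"],
     "budget_hourly_min_usd": 50, "budget_hourly_max_usd": 90},
    {"title": "Dockerize a Flask app", "skills": ["Python", "Docker"],
     "budget_total_usd": 400},
]

# Keep hourly jobs paying at least $40/hr, or any job tagged "Docker".
# .get() with defaults handles fields that a given item may lack.
matches = [
    job["title"] for job in items
    if job.get("budget_hourly_min_usd", 0) >= 40
    or "Docker" in job.get("skills", [])
]
print(matches)
```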

Proxy requirements

Upwork blocks datacenter IPs. This Actor requires residential proxies to bypass Cloudflare. The default configuration uses Apify's residential proxy group, which works out of the box.

Limits and performance

  • Each search page returns up to 50 jobs
  • Lite mode: ~50 jobs in about 60-90 seconds
  • Deep mode: ~50 jobs in about 3-5 minutes (modal interactions add time)
  • Default memory: 4096 MB | Default timeout: 3600 seconds
  • Output limit is automatically capped at Max Pages x 50
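The cap and runtime figures above translate into simple back-of-the-envelope estimates. A sketch, using the midpoint of the documented 60-90 second Lite-mode range:

```python
JOBS_PER_PAGE = 50  # documented page size

def output_cap(max_pages):
    """Upper bound on returned items, per the Max Pages x 50 rule."""
    return max_pages * JOBS_PER_PAGE

def lite_runtime_estimate_s(max_pages):
    """Rough Lite-mode runtime in seconds: ~60-90 s per page,
    midpoint (75 s) used here. Deep mode runs several times slower."""
    return max_pages * 75

print(output_cap(4))               # at most 200 items
print(lite_runtime_estimate_s(4))  # ~300 seconds
```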

Tips

  • Start with 1 page to test your filters before scaling up
  • Use the Dataset export to get results as JSON, CSV, or Excel
  • For recurring scrapes, save your configuration as a Task and set up a Schedule
  • Login modes require valid Upwork credentials -- the Actor handles the full login flow automatically