USAJobs Scraper

Scrape federal government job listings from USAJOBS.gov. Search by keyword, location, salary, agency, GS grade, security clearance, remote work, and 20+ filters. Extract job details: duties, qualifications, salary, benefits, required documents, key requirements, evaluations, and application links.

Pricing: $15.00/month + usage
Rating: 0.0 (0 reviews)
Developer: ParseForge (Maintained by Community)

Actor stats: 0 bookmarked · 5 total users · 3 monthly active users · last modified 21 hours ago


πŸ›οΈ USAJOBS Federal Jobs Scraper

πŸ•’ Last updated: 2026-05-05

Collect federal government job listings from USAJOBS.gov without coding or API keys. Search by keyword, location, salary range, agency, GS grade level, and 20+ filters. Perfect for anyone looking to download USAJOBS data as CSV, build a federal jobs database, monitor government hiring trends, or find a USAJOBS API alternative that works out of the box.

The USAJOBS Federal Jobs Scraper collects up to 50+ data fields per job listing, including salary, qualifications, duties, benefits, and application links, with smart keyword-to-series mapping that works without an API key.

✨ What Does It Do?

  • πŸ“ positionTitle - Identify open federal positions by their official title to match your skills or hiring needs
  • πŸ’° salaryMin / salaryMax - Compare GS pay scales and salary ranges across agencies to benchmark federal compensation
  • 🌍 locationDisplay / locations - Find jobs in specific cities, states, or remote positions with full geographic coordinates
  • 🏒 organizationName / departmentName - Filter by federal agency or department to target specific branches of government
  • πŸ”’ securityClearance - Know upfront if a position requires Secret, Top Secret, or no clearance at all
  • πŸ“‹ majorDuties / qualificationSummary - Review full job duties, qualification requirements, and education criteria before applying

πŸ”§ Input

  • Keyword - The search term to find jobs (e.g., "software engineer", "nurse", "data analyst"). Automatically maps to federal occupational series codes
  • Location - City or state to filter results (e.g., "Washington, DC", "San Francisco")
  • Radius (miles) - Search within a distance from your location, up to 200 miles
  • Max Items - How many job listings to collect. Free users get up to 100, paid users up to 1,000,000
  • Minimum/Maximum Pay Grade (GS) - Filter by General Schedule grade level (01-15)
  • Minimum/Maximum Salary - Set salary range boundaries
  • Position Title - Filter by exact position title (contains match)
  • Agency/Organization Code - Target specific agency subelements (e.g., "HE38;HE46")
  • Job Category Code - Filter by occupational series (e.g., "2210" for IT, "0610" for Nursing)
  • Schedule Type - Full-time, part-time, shift work, intermittent, or job sharing
  • Hiring Path - Who may apply: public, veterans, federal employees, etc.
  • Security Clearance - Required clearance level from Not Required to Top Secret/SCI
  • Remote Work - Filter for remote-only or exclude remote positions
  • Posted Within (days) - Jobs posted in the last N days (0-60)
  • Sort By - Order results by open date, close date, title, salary, location, or agency
  • Travel Required - Filter by travel percentage from Not Required to 76%+
  • Relocation Offered - Filter by relocation expense reimbursement

Example input:

{
  "keyword": "software engineer",
  "maxItems": 10
}
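
A fuller input combining several of the filters above might look like the sketch below. Only `keyword` and `maxItems` are confirmed by the minimal example; the other key names (`location`, `radius`, `remoteOnly`) are guesses based on the form labels, so check the Actor's input schema for the exact spelling:

```json
{
  "keyword": "nurse",
  "location": "Washington, DC",
  "radius": 50,
  "remoteOnly": false,
  "maxItems": 100
}
```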

πŸ“Š Output

Each job listing includes up to 50+ data fields. Download as JSON, CSV, or Excel.

πŸ“ Position Title🏒 Agency NameπŸ›οΈ Department Name
🌍 Location DisplayπŸ“ City, State, CountryπŸ—ΊοΈ Longitude / Latitude
πŸ’° Min SalaryπŸ’° Max SalaryπŸ’΅ Pay Interval
πŸ“Š Pay GradeπŸ“ˆ Grade Range⬆️ Promotion Potential
πŸ“‹ Schedule TypeπŸ“„ Appointment Type🏷️ Service Type
πŸ‘₯ Hiring PathsπŸ”’ Security Clearance🏠 Remote / Telework
✈️ Travel RequiredπŸ’Š Drug Test RequiredπŸ‘” Supervisory Status
πŸ”„ Relocation Offeredβœ… Opening StatusπŸ‘€ Total Openings
πŸ“… Open DateπŸ“… Close DateπŸ“… Expire Date
πŸ”— Job URLπŸ”— Apply URLπŸ”— Benefits URL
πŸ“ Qualification SummaryπŸ“ Job SummaryπŸ“ Major Duties
πŸŽ“ EducationπŸ“‹ RequirementsπŸ“‹ Evaluations
πŸ“ How to ApplyπŸ“ What to Expect NextπŸ“„ Required Documents
🎁 Benefitsℹ️ Other InformationπŸ”‘ Key Requirements
πŸ‘€ Who May ApplyπŸ“ Position DescriptionπŸ“‚ Job Categories
🏒 Organization CodesπŸ†” Position IDπŸ†” Control Number
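
Once downloaded, the records are straightforward to analyze. A minimal sketch that averages salary midpoints per agency, using the field names documented above (`organizationName`, `salaryMin`, `salaryMax`); the sample records are invented for illustration:

```python
# Sketch: summarize exported job records by agency.
# Field names follow the documented output; the sample records are invented.
from collections import defaultdict

jobs = [
    {"organizationName": "Veterans Health Administration", "salaryMin": 72000, "salaryMax": 94000},
    {"organizationName": "Veterans Health Administration", "salaryMin": 88000, "salaryMax": 114000},
    {"organizationName": "Air Force Materiel Command", "salaryMin": 99000, "salaryMax": 129000},
]

def avg_salary_by_agency(records):
    """Average the midpoint of each job's salary range per agency."""
    midpoints = defaultdict(list)
    for job in records:
        midpoint = (job["salaryMin"] + job["salaryMax"]) / 2
        midpoints[job["organizationName"]].append(midpoint)
    return {agency: sum(mids) / len(mids) for agency, mids in midpoints.items()}

print(avg_salary_by_agency(jobs))
```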

πŸ’Ž Why Choose the USAJOBS Scraper?

| Feature | Our Actor | USA Jobs Scraper (shahidirfan) | USAJobs Listing Scraper (powerbox) |
| --- | --- | --- | --- |
| No API key required | βœ”οΈ | ❌ | ❌ |
| Smart keyword-to-series mapping | βœ”οΈ | ❌ | ❌ |
| 50+ output fields per job | βœ”οΈ | ❌ | ❌ |
| 20+ search filters | βœ”οΈ | ❌ | Partial |
| Security clearance filter | βœ”οΈ | ❌ | ❌ |
| Remote work filter | βœ”οΈ | ❌ | ❌ |
| GS pay grade filtering | βœ”οΈ | ❌ | ❌ |
| Salary range filtering | βœ”οΈ | ❌ | ❌ |
| Promotion potential field | βœ”οΈ | ❌ | ❌ |
| Position opening status | βœ”οΈ | ❌ | ❌ |
| Key requirements extraction | βœ”οΈ | ❌ | ❌ |
| Location coordinates (lat/lng) | βœ”οΈ | ❌ | ❌ |
| Up to 1M results per run | βœ”οΈ | ❌ | ❌ |
| Pay-per-event pricing | βœ”οΈ | ❌ | ❌ |

πŸ“‹ How to Use

No technical skills required. Follow these simple steps:

  1. Sign Up: Create a free account with $5 in free credit
  2. Find the Tool: Search for "USAJOBS Federal Jobs Scraper" in the Apify Store and configure your input
  3. Run It: Click "Start" and watch your results appear

That's it. No coding, no setup, no complicated configuration. Once the run finishes, export your data in CSV, Excel, or JSON format.

🎯 Business Use Cases

  • πŸ“Š HR Analysts - Track federal hiring volume across agencies and GS grades monthly to forecast government workforce expansion in your sector
  • πŸ’Ό Government Contractors - Monitor new IT Specialist and Cybersecurity positions at DoD agencies to identify upcoming contract opportunities before RFPs drop
  • πŸ”¬ Policy Researchers - Collect salary data across all federal nursing positions nationwide to analyze geographic pay disparities in government healthcare


✨ Why choose this Actor

  • 🎯 Built for the job. Scoped specifically to this data source, so you skip the parser engineering entirely.
  • πŸ”– Structured output. Clean, typed fields ready for analysis, dashboards, or downstream pipelines.
  • ⚑ Fast. Optimized request patterns return results in seconds, not minutes.
  • πŸ” Always fresh. Every run pulls live data, so the dataset reflects the source as of run time.
  • 🌐 No infra to manage. Apify handles proxies, retries, scaling, scheduling, and storage.
  • πŸ›‘οΈ Reliable. Battle-tested across many runs and edge cases, with graceful error handling.
  • 🚫 No code required. Configure in the UI, run from the CLI, schedule via cron, or call from any language with the Apify SDK.

πŸ“Š Production-grade structured data without the engineering overhead of building and maintaining your own scraper.


πŸ“ˆ How it compares to alternatives

| Approach | Cost | Coverage | Refresh | Filters | Setup |
| --- | --- | --- | --- | --- | --- |
| ⭐ USAJOBS Federal Jobs Scraper (this Actor) | $5 free credit, then pay-per-use | Full source coverage | Live per run | Source-native filters supported | ⚑ 2 min |
| Build your own scraper | Engineering hours | Full once built | Whenever you maintain it | Custom code | 🐒 Days to weeks |
| Paid managed APIs | $$$ monthly | Vendor-defined | Live | Vendor-defined | ⏳ Hours |
| Third-party data dumps | Varies | Subset, often stale | Periodic | None | πŸ•’ Variable |

Pick this Actor when you want broad coverage, server-side filtering, and no pipeline maintenance.


πŸš€ How to use

  1. πŸ“ Sign up. Create a free account with $5 credit (takes 2 minutes).
  2. 🌐 Open the Actor. Go to the USAJOBS Federal Jobs Scraper page on the Apify Store.
  3. 🎯 Set input. Configure the input fields in the form (or paste a JSON), then set maxItems.
  4. πŸš€ Run it. Click Start and let the Actor collect your data.
  5. πŸ“₯ Download. Grab your results in the Dataset tab as CSV, Excel, JSON, or XML.

⏱️ Total time from signup to downloaded dataset: 3-5 minutes. No coding required.


πŸ’Ό Business use cases

πŸ“Š Data & Analytics

  • Build trend reports and dashboards from live source data
  • Feed BI tools, warehouses, and ML pipelines with structured records
  • Run periodic snapshots to track changes over time
  • Compare segments, regions, or categories with consistent fields

🏒 Operations & Strategy

  • Monitor competitor moves, pricing, and inventory shifts
  • Build internal directories and lookup tools backed by current data
  • Power workflows that depend on fresh source records
  • Cut manual data-gathering time from hours to minutes

🎯 Marketing & Growth

  • Identify market opportunities and trending topics
  • Research target audiences and customer personas at scale
  • Power lead-generation pipelines with verified records
  • Track sentiment, reviews, or social signals over time

πŸ› οΈ Engineering & Product

  • Prototype features that need real-world data without owning a crawler
  • Replace fragile in-house scrapers with a managed Actor
  • Wire datasets into your apps via the Apify API or webhooks
  • Skip the proxy, retry, and parsing maintenance entirely

🌟 Beyond business use cases

Data like this powers more than commercial workflows. The same structured records support research, education, civic projects, and personal initiatives.

πŸŽ“ Research and academia

  • Empirical datasets for papers, thesis work, and coursework
  • Longitudinal studies tracking changes across snapshots
  • Reproducible research with cited, versioned data pulls
  • Classroom exercises on data analysis and ethical scraping

🎨 Personal and creative

  • Side projects, portfolio demos, and indie app launches
  • Data visualizations, dashboards, and infographics
  • Content research for bloggers, YouTubers, and podcasters
  • Hobbyist collections and personal trackers

🀝 Non-profit and civic

  • Transparency reporting and accountability projects
  • Advocacy campaigns backed by public-interest data
  • Community-run databases for local issues
  • Investigative journalism on public records

πŸ§ͺ Experimentation

  • Prototype AI and machine-learning pipelines with real data
  • Validate product-market hypotheses before engineering spend
  • Train small domain-specific models on niche corpora
  • Test dashboard concepts with live input


❓ Frequently Asked Questions

πŸ’³ Do I need a paid Apify plan to run this actor?

No. You can start right now on the free Apify plan, which includes $5 in free monthly credit. That is enough to run this actor several times and explore the output before committing to anything. Paid plans unlock higher limits, more concurrent runs, and larger datasets. Create a free Apify account here to get started.

🚨 What happens if my run fails or returns no results?

Failed runs are not charged. If the source site changes, proxies get rate-limited, or a specific input matches nothing, re-run the actor or open our contact form and we will investigate. You can also check the run log in the Apify console to see why the run stopped.

πŸ“ How many items can I scrape per run?

Free users are limited to 100 items per run so you can preview the output and confirm the actor works for your use case. Paid users can raise maxItems up to 1,000,000 per run. Upgrade here if you need full scale.

πŸ•’ How fresh is the data?

Every run fetches live data at the moment of execution. There is no cache or delay: the records you get reflect what the source returned at that moment. Schedule the actor to maintain a rolling snapshot of the data you need.
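
A rolling snapshot also makes change detection easy. The sketch below diffs two scheduled runs keyed on the Control Number output field (`controlNumber` as a JSON key is an assumption, and the snapshot contents are invented):

```python
# Sketch: diff two scheduled snapshots to find newly posted and closed jobs.
# Keyed on controlNumber (the Control Number output field); the exact JSON
# key spelling is assumed, and the snapshot contents are invented samples.
def diff_snapshots(previous, current):
    """Return (new_jobs, removed_jobs) between two snapshot lists."""
    prev_ids = {job["controlNumber"] for job in previous}
    curr_ids = {job["controlNumber"] for job in current}
    new_jobs = [j for j in current if j["controlNumber"] not in prev_ids]
    removed_jobs = [j for j in previous if j["controlNumber"] not in curr_ids]
    return new_jobs, removed_jobs

yesterday = [{"controlNumber": "801", "positionTitle": "IT Specialist"},
             {"controlNumber": "802", "positionTitle": "Nurse"}]
today = [{"controlNumber": "802", "positionTitle": "Nurse"},
         {"controlNumber": "803", "positionTitle": "Data Analyst"}]

new, removed = diff_snapshots(yesterday, today)
print([j["positionTitle"] for j in new])      # -> ['Data Analyst']
print([j["positionTitle"] for j in removed])  # -> ['IT Specialist']
```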

πŸ§‘β€πŸ’» Can I call this actor from my own code?

Yes. Apify exposes every actor as a REST endpoint and ships first-class SDKs for Node.js and Python. You can start a run, read the dataset, and handle webhooks from your own app in a few lines. All you need is your Apify API token.
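
A short sketch with the Python client. The apify-client package and its ApifyClient API are real; the Actor ID shown (`parseforge/usajobs-scraper`) is a placeholder, so substitute the real ID from the Store page:

```python
# Sketch: run the Actor from Python with the official apify-client package.
# The Actor ID below is a placeholder; copy the real one from the Store page.
import os

def build_run_input(keyword: str, max_items: int = 10) -> dict:
    """Assemble the run input; keyword and maxItems match the documented schema."""
    if max_items < 1:
        raise ValueError("max_items must be at least 1")
    return {"keyword": keyword, "maxItems": max_items}

def run_scraper(keyword: str, max_items: int = 10):
    from apify_client import ApifyClient  # pip install apify-client
    client = ApifyClient(os.environ["APIFY_TOKEN"])
    run = client.actor("parseforge/usajobs-scraper").call(  # placeholder ID
        run_input=build_run_input(keyword, max_items)
    )
    # Read the dataset items produced by the finished run.
    return list(client.dataset(run["defaultDatasetId"]).iterate_items())

# Example usage (requires a real APIFY_TOKEN and network access):
#   jobs = run_scraper("software engineer", max_items=5)
#   print(jobs[0].get("positionTitle"))
```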

πŸ“€ How do I export the data?

Every Apify dataset can be downloaded in one click from the console as CSV, JSON, JSONL, Excel, HTML, XML, or RSS. You can also pull results programmatically via the Apify API or stream them into BigQuery, S3, and other destinations through built-in integrations.
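
If you prefer to shape the export yourself, here is a minimal sketch that flattens JSON records into CSV with only the standard library; the field names follow the documented output, and the records are invented samples:

```python
# Sketch: flatten dataset records into CSV using only the standard library.
# Field names follow the documented output; the records are invented samples.
import csv
import io

def records_to_csv(records, fields):
    """Serialize dicts to CSV text, keeping only the requested fields."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(records)
    return buffer.getvalue()

records = [
    {"positionTitle": "Nurse", "salaryMin": 72000, "organizationName": "VHA", "applyUrl": "https://example.gov"},
    {"positionTitle": "IT Specialist", "salaryMin": 99000, "organizationName": "DISA", "applyUrl": "https://example.gov"},
]
csv_text = records_to_csv(records, ["positionTitle", "salaryMin", "organizationName"])
print(csv_text)
```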

πŸ“… Can I schedule the actor to run automatically?

Yes. Use the Apify scheduler to run the actor on any cadence, from hourly to monthly. Results are saved to your dataset and can be delivered to webhooks, email, Slack, cloud storage, or automation tools such as Zapier and Make.


πŸ”Œ Integrate with any app

USAJOBS Federal Jobs Scraper connects to any cloud service via Apify integrations:

  • Make - Automate multi-step workflows
  • Zapier - Connect with 5,000+ apps
  • Slack - Get run notifications in your channels
  • Airbyte - Pipe results into your warehouse
  • GitHub - Trigger runs from commits and releases
  • Google Drive - Export datasets straight to Sheets

You can also use webhooks to trigger downstream actions when a run finishes. Push fresh data into your product backend, or alert your team in Slack.


πŸ’‘ More ParseForge Actors

Browse our complete collection of data extraction tools for more.

πŸš€ Ready to Start?

Create a free account with $5 credit and collect your first 100 results for free. No coding, no setup.

πŸ†˜ Need Help?

  • Check the FAQ section above for common questions
  • Visit the Apify support page for documentation and tutorials
  • Contact us to request a new scraper, propose a custom project, or report an issue at Tally contact form

⚠️ Disclaimer

This Actor is an independent tool and is not affiliated with, endorsed by, or sponsored by USAJOBS, the Office of Personnel Management (OPM), or any U.S. government agency. All trademarks mentioned are the property of their respective owners.


πŸ’‘ Pro Tip: browse the complete ParseForge collection for more reference-data scrapers.