# Computrabajo LATAM Jobs Scraper
Pricing: from $0.01 / 1,000 results
Scrapes job listings from Computrabajo across 10 Latin American countries. Supports keyword search, pagination, and outputs structured JSON.
Developer: Alberto Diaz
Last modified: 9 days ago
Scraper for Computrabajo — the #1 job portal in Latin America. Extracts job listings across 10 countries in a single run.
Apify Store: (coming soon)
## Countries supported

`pe`, `mx`, `co`, `ar`, `cl`, `ec`, `ve`, `bo`, `py`, `uy`
## Output
Each job listing returns:
| Field | Description |
|---|---|
| `title` | Job title |
| `company` | Company name |
| `location` | City and region |
| `country` | ISO country code |
| `url` | Direct link to the offer |
| `posted_at` | Posting date (relative, as shown on site) |
| `modality` | Remote / On-site / Hybrid (when available) |
| `salary` | Salary (when shown publicly) |
| `platform` | Always `computrabajo` |
| `scraped_at` | UTC timestamp of the scrape |
| `run_id` | ID to trace all results from the same run |
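For reference, a single dataset item with the fields above might look like this (all values are illustrative, not a real listing):

```json
{
  "title": "Desarrollador Python Backend",
  "company": "Acme Perú",
  "location": "Lima, Lima",
  "country": "pe",
  "url": "https://pe.computrabajo.com/...",
  "posted_at": "hace 2 días",
  "modality": "Remoto",
  "salary": null,
  "platform": "computrabajo",
  "scraped_at": "2025-01-01T09:00:00Z",
  "run_id": "abc123"
}
```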
## Input

```json
{
  "keyword": "desarrollador python",
  "countries": ["pe", "mx", "co"],
  "max_pages": 5,
  "proxy_enabled": false
}
```
| Field | Required | Default | Description |
|---|---|---|---|
| `keyword` | Yes | — | Search term |
| `countries` | No | `["pe"]` | List of country codes |
| `max_pages` | No | `5` | Pages per country (1–50, ~20 results each) |
| `proxy_enabled` | No | `false` | Enable Apify residential proxies |
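The actor's own validation code isn't shown here, but the defaults and bounds in the table above could be enforced with a sketch like this (the function name `normalize_input` is hypothetical):

```python
# Country codes accepted by the scraper (from the "Countries supported" list).
VALID_COUNTRIES = {"pe", "mx", "co", "ar", "cl", "ec", "ve", "bo", "py", "uy"}

def normalize_input(raw: dict) -> dict:
    """Apply the defaults and bounds from the input table (illustrative sketch)."""
    if not raw.get("keyword"):
        raise ValueError("keyword is required")
    countries = raw.get("countries") or ["pe"]
    unknown = set(countries) - VALID_COUNTRIES
    if unknown:
        raise ValueError(f"unsupported country codes: {sorted(unknown)}")
    # Clamp max_pages to the documented 1-50 range.
    max_pages = min(max(int(raw.get("max_pages", 5)), 1), 50)
    return {
        "keyword": raw["keyword"],
        "countries": countries,
        "max_pages": max_pages,
        "proxy_enabled": bool(raw.get("proxy_enabled", False)),
    }
```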
## Local Development

```bash
# Setup
make setup

# Run (reads input.json)
make run

# Run tests
make test

# Health check against live site
make health
```
Edit `input.json` to change the search:

```json
{
  "platform": "computrabajo",
  "countries": ["pe", "ar"],
  "keyword": "contador",
  "max_pages": 3
}
```
## Deploy
Deployments to Apify Store are automated via GitHub Actions on every push to main.
To set it up the first time:

- Get your token at apify.com → Settings → Integrations
- Add it as `APIFY_TOKEN` in GitHub → Settings → Secrets → Actions
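The workflow file itself is not reproduced in this README; a minimal sketch of what the deploy job could look like is shown below (the file path and the use of `apify-cli` are assumptions, not the repository's actual workflow):

```yaml
# .github/workflows/deploy.yml (sketch)
name: Deploy to Apify
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm install -g apify-cli
      # apify-cli reads the APIFY_TOKEN environment variable for auth
      - run: apify push
        env:
          APIFY_TOKEN: ${{ secrets.APIFY_TOKEN }}
```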
## Monitoring
A health check runs every Monday at 9am UTC against PE, MX and CO. If any country returns 0 results (broken selectors, site change), the workflow fails and GitHub sends an email alert automatically.
To trigger manually: GitHub → Actions → Scraper Health Monitor → Run workflow.
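The monitoring logic boils down to: probe each country and fail the run if any probe comes back empty. A minimal sketch (the `fetch_count` callable is a hypothetical stand-in for the scraper, injected so the check is testable without hitting the live site):

```python
from typing import Callable, Iterable, List

def check_health(countries: Iterable[str], fetch_count: Callable[[str], int]) -> List[str]:
    """Return the countries whose probe yielded zero results (empty list = healthy)."""
    return [c for c in countries if fetch_count(c) == 0]

def main(fetch_count: Callable[[str], int]) -> None:
    # Same countries the weekly workflow checks.
    failed = check_health(["pe", "mx", "co"], fetch_count)
    if failed:
        # A non-zero exit fails the GitHub Actions job, triggering the email alert.
        raise SystemExit(f"health check failed for: {', '.join(failed)}")
```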