Naukri.com Job Scraper
Scrape job listings from Naukri.com — India's largest job board with 90M+ registered job seekers and 1M+ active recruiters. Extract structured job data including titles, companies, salaries, skills, experience requirements, work mode, and AmbitionBox company ratings — all without a browser, API key, or login.
What does Naukri.com Job Scraper do?
Naukri.com Job Scraper connects to Naukri's JSON search API and extracts comprehensive job listing data across any keyword and location. It processes multiple pages of results automatically, parses salary data from INR amounts into structured fields, extracts skills arrays, and includes AmbitionBox company ratings directly from the search response.
Unlike browser-based competitors that use 2048–4096MB memory and take minutes per run, this actor uses pure HTTP with 256MB memory — delivering faster results at a fraction of the cost.
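As an illustration of the salary normalization described above, here is a minimal sketch. The `parse_salary` helper is hypothetical, not part of the actor's code; it assumes 1 lakh = ₹100,000:

```python
import re

LAKH = 100_000  # 1 lakh = ₹100,000

def parse_salary(display: str):
    """Parse a Naukri salary string like '6-15 Lacs PA' into (min, max) INR.

    Returns (None, None) when the employer hides the salary
    (e.g. 'Not disclosed').
    """
    match = re.search(r"(\d+(?:\.\d+)?)\s*-\s*(\d+(?:\.\d+)?)\s*Lacs", display or "")
    if not match:
        return (None, None)
    low, high = (float(g) for g in match.groups())
    return (int(low * LAKH), int(high * LAKH))

print(parse_salary("6-15 Lacs PA"))   # (600000, 1500000)
print(parse_salary("Not disclosed"))  # (None, None)
```

The actor delivers `salaryMin`/`salaryMax` already parsed, so this is only needed if you process raw display strings yourself.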
Key capabilities:
- 🔍 Search any job title, skill, or technology (Python, product manager, data analyst, etc.)
- 📍 Filter by city — Bangalore, Mumbai, Delhi, Hyderabad, Chennai, Pune, or all of India
- 💰 Extract structured salary data (min/max in INR, display string)
- 🧑‍💻 Filter by experience level, work mode (remote/hybrid/office), and salary range
- ⭐ Includes AmbitionBox company ratings and review counts
- 📄 Full job descriptions included in output
- 🏢 Vacancy count per listing
Who is Naukri.com Job Scraper for?
HR teams and talent acquisition professionals at Indian companies monitoring the job market:
- Track competitor hiring: which companies are hiring your target roles?
- Monitor salary benchmarks: what are others paying for specific skills?
- Run weekly extractions to detect hiring velocity changes
Recruitment agencies and staffing firms building candidate pipelines:
- Extract thousands of job postings to identify high-volume clients
- Monitor which companies are actively hiring at scale
- Build lead lists of companies with open positions in target industries
Market researchers and data analysts studying India's job market:
- Salary trend analysis across cities, skills, and experience levels
- Demand signals for specific technologies (Python, React, AWS)
- India tech hiring velocity by company and sector
Job aggregation and alerting tools built on top of Naukri data:
- Schedule daily runs to detect new listings matching specific criteria
- Feed into Google Sheets or databases for ongoing monitoring
- Trigger Slack/email alerts when target companies post new roles
Why use Naukri.com Job Scraper?
- ✅ Pure HTTP — no browser overhead: 256MB vs 2048–4096MB for Playwright-based competitors
- ✅ Fast: fetches 100 jobs in under 30 seconds
- ✅ Structured output: salary parsed into INR min/max, skills as array, work mode extracted
- ✅ AmbitionBox ratings included: company ratings and review counts in every job record
- ✅ No API key or login required: public search API
- ✅ Pay only for what you scrape: PPE pricing — no subscription
- ✅ Works across all Indian cities and remote jobs
- ✅ Pagination handled automatically: scrapes up to 5,000 jobs per run
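The automatic pagination amounts to a fetch-until-full loop. A minimal sketch, with `fetch_page` standing in for the actor's internal HTTP call (Naukri's real endpoint and parameters are not shown here):

```python
def collect_jobs(fetch_page, max_jobs, page_size=20):
    """Accumulate jobs page by page (Naukri returns batches of 20)
    until max_jobs is reached or a page comes back empty."""
    jobs = []
    page = 1
    while len(jobs) < max_jobs:
        batch = fetch_page(page)
        if not batch:
            break  # no more results
        jobs.extend(batch)
        page += 1
    return jobs[:max_jobs]

# Simulated API: 3 pages of 20 jobs, then empty
fake_pages = {1: [{"jobId": i} for i in range(20)],
              2: [{"jobId": i} for i in range(20, 40)],
              3: [{"jobId": i} for i in range(40, 60)]}
result = collect_jobs(lambda p: fake_pages.get(p, []), max_jobs=50)
print(len(result))  # 50
```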
What data can you extract?
| Field | Description | Example |
|---|---|---|
| `jobId` | Unique Naukri job ID | 250326037855 |
| `title` | Job title | Python Developer |
| `companyName` | Hiring company | Cognizant |
| `companyUrl` | Company jobs page on Naukri | https://www.naukri.com/cognizant-jobs-careers-2114 |
| `companyLogoUrl` | Company logo URL | https://img.naukimg.com/logo_images/... |
| `location` | Job location | Bengaluru |
| `workMode` | Work arrangement | hybrid, remote, null (office) |
| `experienceText` | Display experience range | 6-11 Yrs |
| `experienceMin` | Minimum years required | 6 |
| `experienceMax` | Maximum years required | 11 |
| `salary` | Salary display string | 6-15 Lacs PA |
| `salaryMin` | Minimum salary (INR) | 600000 |
| `salaryMax` | Maximum salary (INR) | 1500000 |
| `salaryCurrency` | Currency code | INR |
| `skills` | Required skills array | ["Python", "Django", "AWS"] |
| `jobDescription` | Full job description | `<p>We are looking for...` |
| `postedDate` | Exact post timestamp | 2026-03-25 16:41:37 |
| `postedDateRelative` | Human-readable date | 1 Day Ago |
| `applyByDate` | Application deadline | 2:11 AM |
| `jobUrl` | Direct link to job listing | https://www.naukri.com/job-listings-... |
| `vacancy` | Number of open positions | 15 |
| `companyRating` | AmbitionBox rating (0–5) | 3.7 |
| `companyReviewsCount` | Number of employee reviews | 60787 |
| `ambitionBoxUrl` | Link to company reviews | https://www.ambitionbox.com/... |
| `keyword` | Search keyword used | python developer |
| `locationSearched` | Location filter used | bangalore |
| `scrapedAt` | Extraction timestamp | 2026-04-02T10:30:00.000Z |
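Once exported, these fields are easy to aggregate. A small sketch using made-up sample records with the field names above:

```python
from collections import Counter

jobs = [  # made-up sample records using the fields above
    {"companyName": "Cognizant", "salaryMin": 600000, "skills": ["Python", "AWS"]},
    {"companyName": "Infosys", "salaryMin": 900000, "skills": ["Python", "Django"]},
    {"companyName": "Cognizant", "salaryMin": None, "skills": ["Java"]},
]

# Average disclosed minimum salary (null salaries are skipped)
disclosed = [j["salaryMin"] for j in jobs if j["salaryMin"] is not None]
avg_min = sum(disclosed) / len(disclosed)

# Most frequent skills across listings
skill_counts = Counter(s for j in jobs for s in j["skills"])

print(avg_min)                      # 750000.0
print(skill_counts.most_common(1))  # [('Python', 2)]
```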
How much does it cost to scrape Naukri.com jobs?
This Actor uses pay-per-event pricing — you pay only for the jobs you scrape. No monthly subscription. All platform costs are included.
| | Free ($5 credit) | Starter ($29/mo) | Scale ($199/mo) | Business ($999/mo) |
|---|---|---|---|---|
| Per job | $0.00115 | $0.001 | $0.00078 | $0.0006 |
| 1,000 jobs | $1.15 | $1.00 | $0.78 | $0.60 |
| 5,000 jobs | $5.75 | $5.00 | $3.90 | $3.00 |
Plus a flat run start fee of $0.005 per run (covers infrastructure startup).
Real-world cost examples:
| Use case | Jobs | Est. Duration | Cost (Free tier) |
|---|---|---|---|
| Quick keyword test | 20 | ~5s | ~$0.03 |
| Single city search | 100 | ~15s | ~$0.12 |
| Multi-city analysis | 500 | ~60s | ~$0.58 |
| Full niche extraction | 2,000 | ~4min | ~$2.31 |
Free plan estimate: With $5 free credits, you can scrape approximately 4,300 jobs on the free tier (including run start fees).
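The costs above follow one formula: the $0.005 run start fee plus the number of jobs multiplied by the per-job rate for your plan. A quick estimator, with rates copied from the pricing table:

```python
START_FEE = 0.005  # flat run start fee, charged once per run

PER_JOB = {  # rates from the pricing table above
    "free": 0.00115,
    "starter": 0.001,
    "scale": 0.00078,
    "business": 0.0006,
}

def run_cost(jobs: int, plan: str = "free") -> float:
    """Estimated cost in USD for a single run scraping `jobs` listings."""
    return round(START_FEE + jobs * PER_JOB[plan], 4)

print(run_cost(1000))              # 1.155
print(run_cost(5000, "business"))  # 3.005
```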
How to scrape Naukri.com jobs
- Go to the Naukri.com Job Scraper page on Apify Store
- Click Try for free
- In the Search section, enter your keyword (e.g. `react developer`, `data scientist`, `product manager`)
- Optionally set a Location (e.g. `bangalore`, `mumbai`, `delhi`, `hyderabad`)
- Set Max Jobs — start with `20` for testing, increase for production
- (Optional) Apply Filters: experience range, salary minimum, work mode
- Click Start and wait for results (usually under 30 seconds for 100 jobs)
- Download results as JSON, CSV, or Excel from the Dataset tab
Example inputs for different scenarios:
```json
// Freshers: entry-level jobs with no experience required
{
  "keyword": "software engineer",
  "location": "bangalore",
  "maxJobs": 100,
  "experienceMin": 0,
  "experienceMax": 2
}
```

```json
// Remote jobs above a salary threshold
{
  "keyword": "python developer",
  "maxJobs": 200,
  "workMode": "remote",
  "salaryMin": 10
}
```

```json
// Recent postings for salary benchmarking
{
  "keyword": "data scientist",
  "location": "mumbai",
  "maxJobs": 500,
  "sortBy": "date"
}
```
Input parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| `keyword` | string | required | Job title, skill, or technology to search |
| `location` | string | (all India) | City filter: `bangalore`, `mumbai`, `delhi`, `hyderabad`, `chennai`, `pune`, `remote` |
| `maxJobs` | integer | 100 | Maximum jobs to extract (1–5,000) |
| `experienceMin` | integer | 0 | Minimum years of experience required |
| `experienceMax` | integer | — | Maximum years of experience required |
| `salaryMin` | integer | — | Minimum salary in INR lakhs/year (e.g., 10 = ₹10 LPA) |
| `workMode` | select | any | `any`, `remote`, `hybrid`, `work-from-office` |
| `sortBy` | select | relevance | `relevance` (best match) or `date` (most recent first) |
| `proxyConfiguration` | object | Apify proxy | Proxy settings — default Apify datacenter works well |
Output examples
Each extracted job is saved as one item in the dataset:
```json
{
  "jobId": "250326037855",
  "title": "Python Developer",
  "companyName": "Cognizant",
  "companyUrl": "https://www.naukri.com/cognizant-jobs-careers-2114",
  "companyLogoUrl": "https://img.naukimg.com/logo_images/groups/v1/4156.gif",
  "location": "Bengaluru",
  "workMode": "hybrid",
  "experienceText": "6-11 Yrs",
  "experienceMin": 6,
  "experienceMax": 11,
  "salary": "6-15 Lacs PA",
  "salaryMin": 600000,
  "salaryMax": 1500000,
  "salaryCurrency": "INR",
  "skills": ["Python", "Django", "AWS", "Development"],
  "jobDescription": "<p>Hiring for Python developer</p><p>Skill: Python,Django and AWS</p>...",
  "postedDate": "2026-03-25 16:41:37",
  "postedDateRelative": "1 Day Ago",
  "applyByDate": "2:11 AM",
  "jobUrl": "https://www.naukri.com/job-listings-hiring-for-python-developer-cognizant-bengaluru-6-to-11-years-250326037855",
  "vacancy": 15,
  "companyRating": 3.7,
  "companyReviewsCount": 60787,
  "ambitionBoxUrl": "https://www.ambitionbox.com/reviews/cognizant-reviews",
  "keyword": "python developer",
  "locationSearched": "bangalore",
  "scrapedAt": "2026-04-02T10:30:00.000Z"
}
```
Tips for best results
- 🚀 Start small: use `maxJobs: 20` for your first run to verify the output format before scaling up
- 📍 Location matters: Naukri has far more listings for major cities — try `bangalore`, `mumbai`, `hyderabad`, or `pune` for the most results
- 🔄 Schedule regular runs: set up daily or weekly runs in Apify's scheduler to track hiring velocity over time
- 💡 Keyword tips: use lowercase (`python developer`, not `Python Developer`), and try both full (`machine learning engineer`) and abbreviated (`ml engineer`) forms
- 📊 CSV for Excel analysis: download as CSV to open salary and experience data directly in spreadsheets
- 🔍 Use `sortBy: "date"` for monitoring new postings — combine with webhooks for real-time alerts
- 🏢 Company monitoring: use a company name as the keyword to track all openings from a specific employer
- ⚡ Salary filter: use `salaryMin` to cut noise — setting `salaryMin: 5` (₹5 LPA) filters out most low-quality listings
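The monitoring tip above (scheduled runs sorted by date) boils down to diffing consecutive runs by `jobId`. A minimal sketch:

```python
def new_postings(previous_run, current_run):
    """Return jobs in current_run whose jobId was not seen in previous_run."""
    seen = {job["jobId"] for job in previous_run}
    return [job for job in current_run if job["jobId"] not in seen]

# Made-up example: compare yesterday's dataset with today's
yesterday = [{"jobId": "1", "title": "Python Developer"}]
today = [{"jobId": "1", "title": "Python Developer"},
         {"jobId": "2", "title": "Data Engineer"}]
print(new_postings(yesterday, today))  # [{'jobId': '2', 'title': 'Data Engineer'}]
```

Feed only the new postings into your Slack or email alert to avoid duplicate notifications.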
Integrations
Naukri Job Scraper → Google Sheets: Export job data as CSV/Google Sheets to build a salary benchmarking dashboard. Add Apify's Google Sheets integration in the Integrations tab — new runs automatically append to your sheet.
Naukri Job Scraper → Slack/Discord alerts: Use Make (formerly Integromat) to watch for new Naukri runs, filter for specific companies or salary ranges, and post matching jobs to a Slack channel. Great for recruitment teams monitoring target companies.
Naukri Job Scraper → Database (PostgreSQL/Airtable): Use Apify webhooks to trigger a Zapier workflow that writes each new job record to Airtable or a PostgreSQL table. Build a rolling 30-day database of tech jobs in your target city.
Scheduled runs for market intelligence: Add a daily cron run at 9 AM IST (3:30 AM UTC) with sortBy: "date" to capture the previous day's job postings. Over time, build a time-series dataset of hiring trends by skill and city.
Naukri Job Scraper → Lead enrichment: Filter scraped jobs by company size signals (vacancy count, AmbitionBox review count) to identify high-volume hiring companies as potential recruitment agency leads.
Using the Apify API
You can run this actor programmatically via the Apify API. Here are examples in Node.js, Python, and cURL:
Node.js
```javascript
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'YOUR_APIFY_TOKEN' });

const run = await client.actor('automation-lab/naukri-scraper').call({
    keyword: 'data engineer',
    location: 'bangalore',
    maxJobs: 500,
    sortBy: 'date',
});

const dataset = await client.dataset(run.defaultDatasetId).listItems();
console.log(`Scraped ${dataset.items.length} jobs`);
for (const job of dataset.items) {
    console.log(`${job.title} at ${job.companyName} — ${job.salary}`);
}
```
Python
```python
from apify_client import ApifyClient

client = ApifyClient(token='YOUR_APIFY_TOKEN')

run = client.actor('automation-lab/naukri-scraper').call(run_input={
    'keyword': 'data engineer',
    'location': 'bangalore',
    'maxJobs': 500,
    'sortBy': 'date',
})

dataset = client.dataset(run['defaultDatasetId']).list_items()
print(f"Scraped {len(dataset.items)} jobs")
for job in dataset.items:
    print(f"{job['title']} at {job['companyName']} — {job.get('salary', 'Not disclosed')}")
```
cURL
```bash
# Start the actor run
curl -X POST "https://api.apify.com/v2/acts/automation-lab~naukri-scraper/runs?token=YOUR_APIFY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"keyword": "data engineer", "location": "bangalore", "maxJobs": 500, "sortBy": "date"}'

# Fetch results (replace DATASET_ID with the defaultDatasetId from the response)
curl "https://api.apify.com/v2/datasets/DATASET_ID/items?format=json&token=YOUR_APIFY_TOKEN"
```
Use with AI agents via MCP
Naukri.com Job Scraper is available as a tool for AI assistants that support the Model Context Protocol (MCP).
Add the Apify MCP server to your AI client — this gives you access to all Apify actors, including this one:
Setup for Claude Code
```bash
claude mcp add --transport http apify "https://mcp.apify.com"
```
Setup for Claude Desktop, Cursor, or VS Code
Add this to your MCP config file:
```json
{
  "mcpServers": {
    "apify": {
      "url": "https://mcp.apify.com"
    }
  }
}
```
Your AI assistant will use OAuth to authenticate with your Apify account on first use.
Example prompts
Once connected, try asking your AI assistant:
- "Use automation-lab/naukri-scraper to find all remote Python developer jobs in India posted in the last 3 days, and show me the salary ranges"
- "Scrape 200 data science job listings from Naukri in Bangalore and identify which companies are hiring most actively"
- "Get all machine learning engineer jobs on Naukri requiring 3-6 years experience and create a skills frequency report"
Learn more in the Apify MCP documentation.
Is it legal to scrape Naukri.com?
Web scraping public data is generally legal in most jurisdictions. Naukri.com displays job listings publicly without requiring login. This actor only extracts publicly visible data — the same information anyone can see by visiting the website.
Ethical scraping principles we follow:
- Respectful request rates — no aggressive crawling
- Public data only — no login, no personal data beyond what employers post publicly
- No PII extraction — focuses on job listings, not user profiles
- Compliant with GDPR and India's DPDP Act for public business data
Always review Naukri.com's Terms of Service for the most current usage guidelines. Use scraped data responsibly and in accordance with applicable laws.
FAQ
How many jobs can I scrape per run? Up to 5,000 jobs per run. Naukri's search API paginates results in batches of 20. For most searches, the practical limit is 1,000–2,000 jobs before results repeat or become irrelevant.
How much does it cost to scrape 1,000 Naukri jobs? On the free tier: $0.005 (start fee) + 1,000 × $0.00115 = $1.155 total. On the Starter plan ($29/month): $1.005 total. With $5 free Apify credits, you can scrape approximately 4,300 jobs.
How is this different from competitors? Most Naukri scrapers on the Store use Playwright (2048–4096MB) and are free. This actor uses a pure HTTP approach (256MB, no browser), making it faster and more cost-efficient. It also parses salary into structured min/max INR fields, extracts skills as an array, and includes AmbitionBox ratings — fields competitors often skip.
Why are some salary fields null?
Naukri allows employers to hide salary information. When `salaryDetail.hideSalary` is `true`, salary fields will be `null`. This is data the employer chose not to disclose — it's not a scraper limitation.
Why are results empty or fewer than expected?
- Your keyword may have limited listings — try a broader term
- Location spelling matters: use `bangalore`, not `Bangalore` (lowercase works best)
- Some searches return fewer than `maxJobs` — this is normal when there aren't enough matching listings
Is there a rate limit? The actor stays polite to Naukri's API by adding a 500–1000ms delay between page requests. For very large runs (1,000+ jobs), this adds some time but prevents blocks. You won't hit rate limits under normal usage.
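The inter-request delay can be sketched as a jittered sleep (illustrative only, not the actor's actual code):

```python
import random
import time

def polite_delay(min_s=0.5, max_s=1.0):
    """Sleep a random 500-1000ms between page requests to avoid blocks."""
    pause = random.uniform(min_s, max_s)
    time.sleep(pause)
    return pause

# Usage between page fetches:
# for page in range(1, pages + 1):
#     fetch_page(page)
#     polite_delay()
```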
Can I scrape multiple keywords in one run? Currently one keyword per run. For multiple keywords, use Apify's scheduler to run the actor multiple times with different inputs, or use the API to launch parallel runs.
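One keyword per run makes multi-keyword extraction a fan-out problem. A sketch using `apify-client` with parallel runs — `build_inputs` and `launch_all` are illustrative helpers, not actor features, and the commented usage requires a real token:

```python
from concurrent.futures import ThreadPoolExecutor

def build_inputs(keywords, location="bangalore", max_jobs=200):
    """One run input per keyword, since the actor takes a single keyword."""
    return [{"keyword": kw, "location": location, "maxJobs": max_jobs}
            for kw in keywords]

def launch_all(client, inputs, workers=4):
    """Start one actor run per input in parallel and wait for all of them."""
    actor = client.actor("automation-lab/naukri-scraper")
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda i: actor.call(run_input=i), inputs))

# Usage (requires apify-client and a real token; not run here):
# from apify_client import ApifyClient
# client = ApifyClient(token="YOUR_APIFY_TOKEN")
# runs = launch_all(client, build_inputs(["python developer", "data engineer"]))
# print([r["defaultDatasetId"] for r in runs])
```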
Other job scrapers and tools
Looking for jobs from other platforms? Here are other automation-lab scrapers:
- Google Jobs Scraper — search jobs aggregated across all platforms via Google
- LinkedIn Jobs Scraper — extract LinkedIn job listings with company data
- Glassdoor Jobs Scraper — scrape Glassdoor jobs with salary estimates and company reviews
- Greenhouse Jobs Scraper — extract job listings from companies using Greenhouse ATS
- LinkedIn Job Enrichment — enrich job URLs with full LinkedIn job details