Naukri Job Scraper

Scrape job listings from Naukri.com, India's largest job board. Extract title, company, salary, location, experience, skills & description. Export JSON/CSV/Excel. No API key needed.

Pricing: Pay per event
Developer: Stas Persiianenko (Maintained by Community)
Actor stats: 1 bookmark · 531 total users · 163 monthly active users · last modified a day ago

Naukri.com Job Scraper

Scrape job listings from Naukri.com — India's largest job board with 90M+ registered job seekers and 1M+ active recruiters. Extract structured job data including titles, companies, salaries, skills, experience requirements, work mode, and AmbitionBox company ratings — no API key or login required.

What does Naukri.com Job Scraper do?

Naukri.com Job Scraper uses a headless Chromium browser with residential proxies to bypass Naukri's Akamai bot protection, intercepts the internal search API responses, and extracts comprehensive job listing data across any keyword and location. It processes multiple pages of results automatically, parses salary data from INR amounts into structured fields, extracts skills arrays, and includes AmbitionBox company ratings directly from the search response.
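For illustration, the string-to-INR salary conversion could be sketched like this. This is a hypothetical re-implementation, not the actor's actual parser; it only handles the "X-Y Lacs PA" format shown in the examples below (one lakh = ₹100,000):

```python
import re

LAKH = 100_000  # 1 lakh = ₹100,000

def parse_salary(display):
    """Parse a Naukri salary string like '6-15 Lacs PA' into (min, max) INR.

    Returns (None, None) when the string doesn't match (e.g. undisclosed
    salaries). Hypothetical sketch; a production parser would also need to
    handle crore amounts and monthly formats.
    """
    match = re.search(r"(\d+(?:\.\d+)?)\s*-\s*(\d+(?:\.\d+)?)\s*Lacs", display or "")
    if not match:
        return (None, None)
    low, high = (float(g) for g in match.groups())
    return (int(low * LAKH), int(high * LAKH))

print(parse_salary("6-15 Lacs PA"))   # (600000, 1500000)
print(parse_salary("Not disclosed"))  # (None, None)
```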

Key capabilities:

  • 🔍 Search any job title, skill, or technology (Python, product manager, data analyst, etc.)
  • 📍 Filter by city — Bangalore, Mumbai, Delhi, Hyderabad, Chennai, Pune, or all of India
  • 💰 Extract structured salary data (min/max in INR, display string)
  • 🧑‍💻 Filter by experience level, work mode (remote/hybrid/office), and salary range
  • ⭐ Includes AmbitionBox company ratings and review counts
  • 📄 Full job descriptions included in output
  • 🏢 Vacancy count per listing

Who is Naukri.com Job Scraper for?

HR teams and talent acquisition professionals at Indian companies monitoring the job market:

  • Track competitor hiring: which companies are hiring your target roles?
  • Monitor salary benchmarks: what are others paying for specific skills?
  • Run weekly extractions to detect hiring velocity changes

Recruitment agencies and staffing firms building candidate pipelines:

  • Extract thousands of job postings to identify high-volume clients
  • Monitor which companies are actively hiring at scale
  • Build lead lists of companies with open positions in target industries

Market researchers and data analysts studying India's job market:

  • Salary trend analysis across cities, skills, and experience levels
  • Demand signals for specific technologies (Python, React, AWS)
  • India tech hiring velocity by company and sector

Job aggregation and alerting tools built on top of Naukri data:

  • Schedule daily runs to detect new listings matching specific criteria
  • Feed into Google Sheets or databases for ongoing monitoring
  • Trigger Slack/email alerts when target companies post new roles

Why use Naukri.com Job Scraper?

  • Reliable: Playwright + residential proxies bypass Akamai bot protection consistently
  • Structured output: salary parsed into INR min/max, skills as array, work mode extracted
  • AmbitionBox ratings included: company ratings and review counts in every job record
  • No API key or login required: public search API
  • Pay only for what you scrape: PPE pricing — no subscription
  • Works across all Indian cities and remote jobs
  • Pagination handled automatically: scrapes up to 5,000 jobs per run

What data can you extract?

| Field | Description | Example |
|---|---|---|
| jobId | Unique Naukri job ID | 250326037855 |
| title | Job title | Python Developer |
| companyName | Hiring company | Cognizant |
| companyUrl | Company jobs page on Naukri | https://www.naukri.com/cognizant-jobs-careers-2114 |
| companyLogoUrl | Company logo URL | https://img.naukimg.com/logo_images/... |
| location | Job location | Bengaluru |
| workMode | Work arrangement | hybrid, remote, null (office) |
| experienceText | Display experience range | 6-11 Yrs |
| experienceMin | Minimum years required | 6 |
| experienceMax | Maximum years required | 11 |
| salary | Salary display string | 6-15 Lacs PA |
| salaryMin | Minimum salary (INR) | 600000 |
| salaryMax | Maximum salary (INR) | 1500000 |
| salaryCurrency | Currency code | INR |
| skills | Required skills array | ["Python", "Django", "AWS"] |
| jobDescription | Full job description | <p>We are looking for... |
| postedDate | ISO 8601 post timestamp | 2026-03-25T11:11:39.139Z |
| postedDateRelative | Human-readable date | 1 Day Ago |
| applyByDate | Application deadline | 2:11 AM |
| jobUrl | Direct link to job listing | https://www.naukri.com/job-listings-... |
| vacancy | Number of open positions | 15 |
| companyRating | AmbitionBox rating (0–5) | 3.7 |
| companyReviewsCount | Number of employee reviews | 60787 |
| ambitionBoxUrl | Link to company reviews | https://www.ambitionbox.com/... |
| keyword | Search keyword used | python developer |
| locationSearched | Location filter used | bangalore |
| scrapedAt | Extraction timestamp | 2026-04-02T10:30:00.000Z |

How much does it cost to scrape Naukri.com jobs?

This Actor uses pay-per-event pricing — you pay only for the jobs you scrape. No monthly subscription. All platform costs are included.

| Volume | Free ($5 credit) | Starter ($29/mo) | Scale ($199/mo) | Business ($999/mo) |
|---|---|---|---|---|
| Per job | $0.0023 | $0.002 | $0.00156 | $0.0012 |
| 1,000 jobs | $2.30 | $2.00 | $1.56 | $1.20 |
| 5,000 jobs | $11.50 | $10.00 | $7.80 | $6.00 |

Plus a one-time run start fee of $0.005 per run (covers infrastructure startup).
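The arithmetic behind these numbers is a flat start fee plus a per-job event charge. A small estimator, sketched as a hypothetical helper (the plan keys are my own labels; the prices come from the table above):

```python
RUN_START_FEE = 0.005  # one-time fee per run, USD

# Per-job price by plan, taken from the pricing table
PER_JOB = {"free": 0.0023, "starter": 0.002, "scale": 0.00156, "business": 0.0012}

def estimate_cost(jobs, plan="free"):
    """Estimated run cost in USD: start fee plus per-job events."""
    return round(RUN_START_FEE + jobs * PER_JOB[plan], 4)

print(estimate_cost(1000))           # 2.305
print(estimate_cost(5000, "scale"))  # 7.805
```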

Real-world cost examples:

| Use case | Jobs | Est. duration | Cost (Free tier) |
|---|---|---|---|
| Quick keyword test | 20 | ~80 s | ~$0.05 |
| Single city search | 100 | ~3 min | ~$0.24 |
| Multi-city analysis | 500 | ~15 min | ~$1.16 |
| Full niche extraction | 2,000 | ~60 min | ~$4.61 |

Free plan estimate: With $5 free credits, you can scrape approximately 2,100 jobs on the free tier (including run start fees).

How to scrape Naukri.com jobs

  1. Go to the Naukri.com Job Scraper page on Apify Store
  2. Click Try for free
  3. In the Search section, enter your keyword (e.g. react developer, data scientist, product manager)
  4. Optionally set a Location (e.g. bangalore, mumbai, delhi, hyderabad)
  5. Set Max Jobs — start with 20 for testing, increase for production
  6. (Optional) Apply Filters: experience range, salary minimum, work mode
  7. Click Start and wait for results (typically ~80 seconds for 20 jobs)
  8. Download results as JSON, CSV, or Excel from the Dataset tab

Example inputs for different scenarios:

Freshers: entry-level jobs with no experience required

```json
{
  "keyword": "software engineer",
  "location": "bangalore",
  "maxJobs": 100,
  "experienceMin": 0,
  "experienceMax": 2
}
```

Remote jobs above a salary threshold

```json
{
  "keyword": "python developer",
  "maxJobs": 200,
  "workMode": "remote",
  "salaryMin": 10
}
```

Recent postings for salary benchmarking

```json
{
  "keyword": "data scientist",
  "location": "mumbai",
  "maxJobs": 500,
  "sortBy": "date"
}
```

Input parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| keyword | string | required | Job title, skill, or technology to search |
| location | string | (all India) | City filter: bangalore, mumbai, delhi, hyderabad, chennai, pune, remote |
| maxJobs | integer | 100 | Maximum jobs to extract (1–5,000) |
| experienceMin | integer | 0 | Minimum years of experience required |
| experienceMax | integer | (none) | Maximum years of experience required |
| salaryMin | integer | (none) | Minimum salary in INR lakhs/year (e.g., 10 = ₹10 LPA) |
| workMode | select | any | any, remote, hybrid, work-from-office |
| sortBy | select | relevance | relevance (best match) or date (most recent first) |
| proxyConfiguration | object | Residential proxy | Proxy settings — residential proxies required for Naukri's Akamai protection (used automatically) |
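Before launching a run via the API, it can be worth sanity-checking your input against these constraints. A minimal client-side validator sketch (a hypothetical helper, not part of the actor; the allowed values mirror the table above):

```python
ALLOWED_WORK_MODES = {"any", "remote", "hybrid", "work-from-office"}
ALLOWED_SORTS = {"relevance", "date"}

def validate_input(run_input):
    """Return a list of problems with a run input dict (empty list = looks valid)."""
    problems = []
    if not run_input.get("keyword"):
        problems.append("keyword is required")
    max_jobs = run_input.get("maxJobs", 100)
    if not (1 <= max_jobs <= 5000):
        problems.append("maxJobs must be between 1 and 5000")
    if run_input.get("workMode", "any") not in ALLOWED_WORK_MODES:
        problems.append("workMode must be one of: " + ", ".join(sorted(ALLOWED_WORK_MODES)))
    if run_input.get("sortBy", "relevance") not in ALLOWED_SORTS:
        problems.append("sortBy must be 'relevance' or 'date'")
    return problems

print(validate_input({"keyword": "python developer", "maxJobs": 200}))  # []
```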

Output examples

Each extracted job is saved as one item in the dataset:

```json
{
  "jobId": "250326037855",
  "title": "Python Developer",
  "companyName": "Cognizant",
  "companyUrl": "https://www.naukri.com/cognizant-jobs-careers-2114",
  "companyLogoUrl": "https://img.naukimg.com/logo_images/groups/v1/4156.gif",
  "location": "Bengaluru",
  "workMode": "hybrid",
  "experienceText": "6-11 Yrs",
  "experienceMin": 6,
  "experienceMax": 11,
  "salary": "6-15 Lacs PA",
  "salaryMin": 600000,
  "salaryMax": 1500000,
  "salaryCurrency": "INR",
  "skills": ["Python", "Django", "AWS", "Development"],
  "jobDescription": "<p>Hiring for Python developer</p><p>Skill: Python,Django and AWS</p>...",
  "postedDate": "2026-03-25T11:11:37.000Z",
  "postedDateRelative": "1 Day Ago",
  "applyByDate": "2:11 AM",
  "jobUrl": "https://www.naukri.com/job-listings-hiring-for-python-developer-cognizant-bengaluru-6-to-11-years-250326037855",
  "vacancy": 15,
  "companyRating": 3.7,
  "companyReviewsCount": 60787,
  "ambitionBoxUrl": "https://www.ambitionbox.com/reviews/cognizant-reviews",
  "keyword": "python developer",
  "locationSearched": "bangalore",
  "scrapedAt": "2026-04-02T10:30:00.000Z"
}
```
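Once downloaded, records like this are easy to aggregate. Here is a sketch of two common analyses (skills frequency and average disclosed salary) using the field names from the output schema; both helpers are hypothetical post-processing code, not part of the actor:

```python
from collections import Counter

def skills_frequency(items):
    """Count how often each skill appears across scraped job records."""
    counter = Counter()
    for item in items:
        counter.update(item.get("skills") or [])
    return counter

def average_salary_midpoint(items):
    """Mean of (salaryMin + salaryMax) / 2 over jobs that disclose salary, in INR."""
    midpoints = [(j["salaryMin"] + j["salaryMax"]) / 2
                 for j in items
                 if j.get("salaryMin") and j.get("salaryMax")]
    return sum(midpoints) / len(midpoints) if midpoints else None

# Tiny sample shaped like the dataset items above
jobs = [
    {"skills": ["Python", "Django"], "salaryMin": 600000, "salaryMax": 1500000},
    {"skills": ["Python", "AWS"], "salaryMin": None, "salaryMax": None},
]
print(skills_frequency(jobs).most_common(1))  # [('Python', 2)]
print(average_salary_midpoint(jobs))          # 1050000.0
```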

Tips for best results

  • 🚀 Start small: use maxJobs: 20 for your first run to verify the output format before scaling up
  • 📍 Location matters: Naukri has far more listings for major cities — try bangalore, mumbai, hyderabad, pune for the most results
  • 🔄 Schedule regular runs: set up daily or weekly runs in Apify's scheduler to track hiring velocity over time
  • 💡 Keyword tips: use lowercase (python developer not Python Developer), try both full (machine learning engineer) and abbreviated (ml engineer) forms
  • 📊 CSV for Excel analysis: download as CSV to open salary and experience data directly in spreadsheets
  • 🔍 Use sortBy: "date" for monitoring new postings — combine with webhooks for real-time alerts
  • 🏢 Company monitoring: run with just the company name as keyword to track all openings from specific companies
  • 💰 Salary filter: set salaryMin: 5 (₹5 LPA) to cut low-quality listings; postings above that threshold are usually genuine roles

Integrations

Naukri Job Scraper → Google Sheets: Export job data as CSV/Google Sheets to build a salary benchmarking dashboard. Add Apify's Google Sheets integration in the Integrations tab — new runs automatically append to your sheet.

Naukri Job Scraper → Slack/Discord alerts: Use Make (formerly Integromat) to watch for new Naukri runs, filter for specific companies or salary ranges, and post matching jobs to a Slack channel. Great for recruitment teams monitoring target companies.

Naukri Job Scraper → Database (PostgreSQL/Airtable): Use Apify webhooks to trigger a Zapier workflow that writes each new job record to Airtable or a PostgreSQL table. Build a rolling 30-day database of tech jobs in your target city.

Scheduled runs for market intelligence: Add a daily cron run at 9 AM IST (3:30 AM UTC) with sortBy: "date" to capture the previous day's job postings. Over time, build a time-series dataset of hiring trends by skill and city.

Naukri Job Scraper → Lead enrichment: Filter scraped jobs by company size signals (vacancy count, AmbitionBox review count) to identify high-volume hiring companies as potential recruitment agency leads.
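The lead-enrichment idea above can be sketched as a simple post-processing filter over the dataset. This hypothetical helper uses the vacancy and companyName fields from the output schema:

```python
def high_volume_hirers(items, min_vacancies=10):
    """Deduplicated companies whose listings advertise many open positions,
    sorted by highest vacancy count first."""
    companies = {}
    for job in items:
        name = job.get("companyName")
        vacancies = job.get("vacancy") or 0
        if name and vacancies >= min_vacancies:
            companies[name] = max(companies.get(name, 0), vacancies)
    return sorted(companies.items(), key=lambda kv: -kv[1])

# Sample records shaped like the actor's output
jobs = [
    {"companyName": "Cognizant", "vacancy": 15},
    {"companyName": "SmallCo", "vacancy": 1},
    {"companyName": "Cognizant", "vacancy": 8},
]
print(high_volume_hirers(jobs))  # [('Cognizant', 15)]
```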

Using the Apify API

You can run this Actor programmatically via the Apify API. Here are examples in Node.js, Python, and cURL:

Node.js

```javascript
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'YOUR_APIFY_TOKEN' });

const run = await client.actor('automation-lab/naukri-scraper').call({
    keyword: 'data engineer',
    location: 'bangalore',
    maxJobs: 500,
    sortBy: 'date',
});

const dataset = await client.dataset(run.defaultDatasetId).listItems();
console.log(`Scraped ${dataset.items.length} jobs`);

for (const job of dataset.items) {
    console.log(`${job.title} at ${job.companyName} (${job.salary ?? 'Not disclosed'})`);
}
```

Python

```python
from apify_client import ApifyClient

client = ApifyClient(token='YOUR_APIFY_TOKEN')

run = client.actor('automation-lab/naukri-scraper').call(run_input={
    'keyword': 'data engineer',
    'location': 'bangalore',
    'maxJobs': 500,
    'sortBy': 'date',
})

dataset = client.dataset(run['defaultDatasetId']).list_items()
print(f"Scraped {len(dataset.items)} jobs")

for job in dataset.items:
    print(f"{job['title']} at {job['companyName']} ({job.get('salary') or 'Not disclosed'})")
```

cURL

```bash
# Start the Actor run
curl -X POST "https://api.apify.com/v2/acts/automation-lab~naukri-scraper/runs?token=YOUR_APIFY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "keyword": "data engineer",
    "location": "bangalore",
    "maxJobs": 500,
    "sortBy": "date"
  }'

# Fetch results (replace DATASET_ID with the defaultDatasetId from the response)
curl "https://api.apify.com/v2/datasets/DATASET_ID/items?format=json&token=YOUR_APIFY_TOKEN"
```

Use with AI agents via MCP

Naukri.com Job Scraper is available as a tool for AI assistants that support the Model Context Protocol (MCP).

Add the Apify MCP server to your AI client — this gives direct access to this scraper:

Setup for Claude Code

```bash
claude mcp add --transport http apify "https://mcp.apify.com?tools=automation-lab/naukri-scraper"
```

Setup for Claude Desktop, Cursor, or VS Code

Add this to your MCP config file:

```json
{
  "mcpServers": {
    "apify": {
      "url": "https://mcp.apify.com?tools=automation-lab/naukri-scraper"
    }
  }
}
```

Your AI assistant will use OAuth to authenticate with your Apify account on first use.

Example prompts

Once connected, try asking your AI assistant:

  • "Use automation-lab/naukri-scraper to find all remote Python developer jobs in India posted in the last 3 days, and show me the salary ranges"
  • "Scrape 200 data science job listings from Naukri in Bangalore and identify which companies are hiring most actively"
  • "Get all machine learning engineer jobs on Naukri requiring 3-6 years experience and create a skills frequency report"

Learn more in the Apify MCP documentation.

Is it legal to scrape Naukri.com?

Web scraping public data is generally legal in most jurisdictions. Naukri.com displays job listings publicly without requiring login. This Actor only extracts publicly visible data — the same information anyone can see by visiting the website.

Ethical scraping principles we follow:

  • Respectful request rates — no aggressive crawling
  • Public data only — no login, no personal data beyond what employers post publicly
  • No PII extraction — focuses on job listings, not user profiles
  • Compliant with GDPR and India's DPDP Act for public business data

Always review Naukri.com's Terms of Service for the most current usage guidelines. Use scraped data responsibly and in accordance with applicable laws.

FAQ

How many jobs can I scrape per run? Up to 5,000 jobs per run. Naukri's search API paginates results in batches of 20. For most searches, the practical limit is 1,000–2,000 jobs before results repeat or become irrelevant.

How much does it cost to scrape 1,000 Naukri jobs? On the free tier: $0.005 (start fee) + 1,000 x $0.0023 = $2.305 total. On the Starter plan ($29/month): $2.005 total. With $5 free Apify credits, you can scrape approximately 2,100 jobs.

How is this different from competitors? This actor uses Playwright with residential proxies and API response interception to reliably bypass Naukri's Akamai bot protection. It parses salary into structured min/max INR fields, extracts skills as an array, and includes AmbitionBox ratings — fields competitors often skip.

Why are some salary fields null? Naukri allows employers to hide salary information. When salaryDetail.hideSalary is true, salary fields will be null. This is data the employer chose not to disclose — it's not a scraper limitation.

Why are results empty or fewer than expected?

  • Your keyword may have limited listings — try a broader term
  • Location spelling matters: use bangalore not Bangalore (lowercase works best)
  • Some searches return fewer than maxJobs — this is normal when there aren't enough matching listings

Is there a rate limit? We add a 2–4 second delay between page navigations to avoid detection. Each page yields ~20 jobs, so larger runs take proportionally longer (the first page takes ~80 seconds including browser startup; subsequent pages are faster, as the duration table above shows). This pacing is necessary to maintain reliable access through Naukri's Akamai bot protection.

Can I scrape multiple keywords in one run? Currently one keyword per run. For multiple keywords, use Apify's scheduler to run the actor multiple times with different inputs, or use the API to launch parallel runs.
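The multi-keyword pattern can be sketched as building one input per keyword, then launching each as its own run via the Apify client shown in the API section. The build_run_inputs helper is hypothetical:

```python
def build_run_inputs(keywords, base=None):
    """Build one run input dict per keyword; shared settings go in `base`.
    Each dict can then be passed to a separate actor run."""
    base = base or {}
    return [{**base, "keyword": kw} for kw in keywords]

inputs = build_run_inputs(
    ["python developer", "data engineer"],
    base={"location": "bangalore", "maxJobs": 200},
)
print(len(inputs))           # 2
print(inputs[0]["keyword"])  # python developer
```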

Other job scrapers and tools

Looking for jobs from other platforms? Here are other automation-lab scrapers: