Fiverr Gig & Seller Scraper

Scrape Fiverr gigs and seller profiles. Extract gig title, seller level, rating, reviews, all pricing packages, seller bio, skills, and certifications.

Developer: BotFlowTech · Pricing: pay per usage
Extract gig listings, full gig details, and seller profiles from Fiverr.com at scale. Built on the Apify platform using Playwright for full JavaScript-rendered page support and stealth fingerprint rotation to handle Fiverr's anti-bot protections.
Capabilities
- Search-based scraping — Enter any keyword and scrape all matching gig listings with pagination support
- Individual gig scraping — Extract complete details from a specific gig URL including all pricing packages
- Seller profile scraping — Pull full seller profiles including bio, languages, skills, education, certifications, and all their listed gigs
- Flexible filtering — Sort by best selling, newest, rating, or price; filter by category and price range
- Smart data extraction — Primary extraction from embedded `__NEXT_DATA__` JSON, with DOM fallback for maximum reliability
- Stealth operation — Browser fingerprint rotation, randomised delays (2–4s), and viewport randomisation
Use Cases
Competitor Research
Monitor what your competitors are offering, how they price their packages, and what keywords drive traffic to top-selling gigs. Track seller levels and review counts over time.
Lead Generation
Identify sellers in your niche for partnership outreach, affiliate programs, or talent acquisition. Filter by level (Top Rated, Level 2) and review count to find established freelancers.
Market Analysis
Understand price distribution across a service category. Collect hundreds of gigs for a query and analyse average prices, delivery times, and common deliverables at each tier.
Finding Freelancers at Scale
Sourcing agencies and businesses can use this actor to build qualified shortlists of freelancers offering specific services, filtered by seller level, price, and rating.
Input
```json
{
  "searchQuery": "python web scraping",
  "category": "programming-tech",
  "maxGigs": 100,
  "sortBy": "best_selling",
  "minPrice": 10,
  "maxPrice": 200,
  "proxyConfiguration": {
    "useApifyProxy": true,
    "apifyProxyGroups": ["RESIDENTIAL"]
  }
}
```
Or scrape a specific gig and its seller profile:
```json
{
  "gigUrl": "https://www.fiverr.com/exampleuser/build-a-python-web-scraper",
  "sellerUrl": "https://www.fiverr.com/exampleuser"
}
```
Input Fields
| Field | Type | Default | Description |
|---|---|---|---|
| searchQuery | string | — | Keywords to search on Fiverr |
| category | string | — | Category slug/ID filter (optional) |
| maxGigs | integer | 50 | Max gig listings to collect from search |
| gigUrl | string | — | Direct URL to a specific gig page |
| sellerUrl | string | — | Direct URL to a seller profile page |
| sortBy | enum | best_selling | Sort order for search results |
| minPrice | integer | 0 | Minimum starting price filter (USD) |
| maxPrice | integer | 0 | Maximum starting price filter (USD, 0 = no limit) |
| maxRequestRetries | integer | 3 | Retry attempts for failed page loads |
| proxyConfiguration | object | — | Proxy settings (Apify Residential recommended) |
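The defaults in the table can be summarised as a small normalisation step. The sketch below is illustrative only (the `ActorInput` interface and `normalizeInput` helper are not part of the actor's public API); it applies the documented defaults and enforces that at least one of the three entry points is provided.

```typescript
// Hypothetical input shape mirroring the table above.
interface ActorInput {
  searchQuery?: string;
  gigUrl?: string;
  sellerUrl?: string;
  maxGigs?: number;
  sortBy?: string;
  minPrice?: number;
  maxPrice?: number;
  maxRequestRetries?: number;
}

// Apply the documented defaults; require at least one entry point.
function normalizeInput(raw: ActorInput): ActorInput {
  if (!raw.searchQuery && !raw.gigUrl && !raw.sellerUrl) {
    throw new Error("Provide searchQuery, gigUrl, or sellerUrl");
  }
  return {
    ...raw,
    maxGigs: raw.maxGigs ?? 50,
    sortBy: raw.sortBy ?? "best_selling",
    minPrice: raw.minPrice ?? 0,
    maxPrice: raw.maxPrice ?? 0, // 0 means no upper price limit
    maxRequestRetries: raw.maxRequestRetries ?? 3,
  };
}
```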
Output
All scraped records are saved to the Apify Dataset. There are three record types, identified by the `type` field.
Search Result Record
```json
{
  "type": "search_result",
  "gigTitle": "I will build a professional Python web scraper",
  "sellerUsername": "techfreelancer99",
  "sellerLevel": "Level 2 Seller",
  "rating": 4.9,
  "reviewCount": 312,
  "startingPrice": 30,
  "currency": "USD",
  "deliveryTime": "3 days",
  "gigUrl": "https://www.fiverr.com/techfreelancer99/build-a-python-web-scraper",
  "thumbnailUrl": "https://fiverr-res.cloudinary.com/images/...",
  "tags": ["python", "web scraping", "data extraction", "automation"],
  "ordersInQueue": 5,
  "searchQuery": "python web scraping",
  "scrapedAt": "2026-04-01T16:10:00.000Z"
}
```
Gig Detail Record
```json
{
  "type": "gig_detail",
  "gigUrl": "https://www.fiverr.com/techfreelancer99/build-a-python-web-scraper",
  "gigTitle": "I will build a professional Python web scraper",
  "description": "Welcome! I specialise in building custom, production-ready Python scrapers using Playwright, Scrapy, and Requests. All code is fully documented...",
  "category": "Programming & Tech",
  "subcategory": "Web Scraping",
  "packages": {
    "basic": {
      "name": "Basic",
      "price": 30,
      "description": "Simple single-page scraper, up to 5 fields",
      "deliveryDays": 2,
      "revisions": "1",
      "deliverables": ["Source code", "CSV output", "1 revision"]
    },
    "standard": {
      "name": "Standard",
      "price": 75,
      "description": "Multi-page scraper with pagination, up to 20 fields",
      "deliveryDays": 4,
      "revisions": "3",
      "deliverables": ["Source code", "CSV + JSON output", "Proxy support", "3 revisions"]
    },
    "premium": {
      "name": "Premium",
      "price": 150,
      "description": "Full production scraper with API endpoint, auth handling, scheduling",
      "deliveryDays": 7,
      "revisions": "Unlimited",
      "deliverables": ["Source code", "REST API", "Docker setup", "Documentation", "Unlimited revisions"]
    }
  },
  "rating": 4.9,
  "reviewCount": 312,
  "ordersInQueue": 5,
  "tags": ["python", "playwright", "data extraction"],
  "seller": {
    "username": "techfreelancer99",
    "level": "Level 2 Seller",
    "bio": "Full-stack developer with 5 years of specialisation in data engineering and web automation...",
    "memberSince": "October 2019",
    "responseTime": "1 hour",
    "lastDelivery": "about 2 hours",
    "portfolioCount": 12,
    "country": "United Kingdom",
    "profileUrl": "https://www.fiverr.com/techfreelancer99"
  },
  "scrapedAt": "2026-04-01T16:12:00.000Z"
}
```
Seller Profile Record
```json
{
  "type": "seller_profile",
  "profileUrl": "https://www.fiverr.com/techfreelancer99",
  "username": "techfreelancer99",
  "displayName": "Alex T.",
  "level": "Level 2 Seller",
  "bio": "Full-stack developer specialising in data engineering and web automation. 5+ years professional experience.",
  "country": "United Kingdom",
  "memberSince": "October 2019",
  "responseTime": "1 hour",
  "lastDelivery": "about 2 hours",
  "totalReviews": 451,
  "avgRating": 4.95,
  "languages": ["English (Fluent)", "French (Conversational)"],
  "education": [
    {
      "country": "United Kingdom",
      "collegeName": "University of Manchester",
      "degree": "BSc",
      "major": "Computer Science",
      "graduationYear": "2018"
    }
  ],
  "certifications": ["AWS Certified Solutions Architect"],
  "skills": ["Python", "Playwright", "Scrapy", "REST APIs", "Docker", "PostgreSQL"],
  "gigs": [
    {
      "gigTitle": "I will build a professional Python web scraper",
      "gigUrl": "https://www.fiverr.com/techfreelancer99/build-a-python-web-scraper",
      "startingPrice": 30,
      "rating": 4.9,
      "reviewCount": 312
    },
    {
      "gigTitle": "I will create a custom Apify actor for any website",
      "gigUrl": "https://www.fiverr.com/techfreelancer99/build-custom-apify-actor",
      "startingPrice": 50,
      "rating": 5.0,
      "reviewCount": 139
    }
  ],
  "scrapedAt": "2026-04-01T16:14:00.000Z"
}
```
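When post-processing a downloaded dataset, the `type` field is the discriminator between the three record shapes above. A minimal sketch (the `splitByType` helper is illustrative, not part of the actor):

```typescript
// Any dataset record carries a discriminating "type" field.
type AnyRecord = { type: string; [key: string]: unknown };

// Partition a downloaded dataset into the three documented record types.
function splitByType(records: AnyRecord[]) {
  return {
    searchResults: records.filter((r) => r.type === "search_result"),
    gigDetails: records.filter((r) => r.type === "gig_detail"),
    sellerProfiles: records.filter((r) => r.type === "seller_profile"),
  };
}
```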
Technical Notes
Anti-Bot Handling
Fiverr uses PerimeterX and dynamic CSS class names. This actor mitigates detection through:
- Browser fingerprint rotation (Chrome and Firefox profiles on Windows/macOS)
- Randomised viewport sizes and delays between requests
- `navigator.webdriver` property override
- Low concurrency (2 parallel browsers maximum)
- Residential proxy support (strongly recommended for large runs)
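The delay and viewport randomisation can be sketched as below. The 2–4 s delay window comes from the docs; the specific viewport pool is an illustrative assumption, not the actor's actual list.

```typescript
// Uniform random delay within the documented 2-4 second window.
function randomDelayMs(minMs = 2000, maxMs = 4000): number {
  return minMs + Math.floor(Math.random() * (maxMs - minMs + 1));
}

// Illustrative viewport pool (assumed values, not the actor's real set).
const VIEWPORTS = [
  { width: 1280, height: 720 },
  { width: 1366, height: 768 },
  { width: 1920, height: 1080 },
];

// Pick a random viewport for each new browser context.
function randomViewport() {
  return VIEWPORTS[Math.floor(Math.random() * VIEWPORTS.length)];
}
```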
Data Extraction Strategy
The actor first attempts to extract data from Fiverr's embedded __NEXT_DATA__ JSON script tag, which contains the full structured data the React frontend uses. If this is unavailable (e.g., due to page structure changes), it falls back to direct DOM selection.
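The two-stage strategy can be sketched as follows. This is a simplified illustration: the JSON path into `__NEXT_DATA__` is hypothetical (Fiverr's actual payload structure differs), and the fallback uses a regex on the raw HTML rather than real DOM selectors so the example stays self-contained.

```typescript
// Stage 1: pull the embedded __NEXT_DATA__ JSON out of the page HTML.
function extractNextData(html: string): unknown | null {
  const m = html.match(/<script id="__NEXT_DATA__"[^>]*>([\s\S]*?)<\/script>/);
  if (!m) return null;
  try {
    return JSON.parse(m[1]);
  } catch {
    return null; // malformed JSON: treat as unavailable
  }
}

// Stage 2: prefer the structured data, fall back to the <title> tag.
function extractGigTitle(html: string): string | null {
  const data = extractNextData(html) as any;
  // This path is a hypothetical example, not Fiverr's real schema.
  const fromJson = data?.props?.pageProps?.gig?.title;
  if (typeof fromJson === "string") return fromJson;
  const t = html.match(/<title>([^<]*)<\/title>/);
  return t ? t[1].trim() : null;
}
```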
Pagination
Search results are paginated using Fiverr's `page` query parameter. The actor automatically detects and enqueues next pages until `maxGigs` is reached.
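The enqueueing logic amounts to appending `page=2`, `page=3`, and so on to the search URL until enough listings are covered. A sketch, assuming roughly 48 results per search page (an illustrative estimate, not a documented figure):

```typescript
// Build the follow-up page URLs needed to cover maxGigs listings,
// given that page 1 is already being scraped.
function nextPageUrls(baseUrl: string, maxGigs: number, perPage = 48): string[] {
  const pages = Math.ceil(maxGigs / perPage);
  const urls: string[] = [];
  for (let page = 2; page <= pages; page++) {
    const u = new URL(baseUrl);
    u.searchParams.set("page", String(page));
    urls.push(u.toString());
  }
  return urls;
}
```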
Pricing
$1.00 per 1,000 gigs scraped (search results + gig details combined).
Seller profile pages are counted separately at the same rate. Proxy costs (if using Apify Residential Proxies) are billed additionally per Apify's standard compute unit pricing.
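For budgeting, the record charges reduce to simple arithmetic (the `estimateUsd` helper below is illustrative; proxy and compute-unit costs are extra and not modelled):

```typescript
// Estimate actor usage cost at the documented $1.00 per 1,000 records,
// with seller-profile records billed at the same rate.
function estimateUsd(gigRecords: number, sellerProfiles = 0): number {
  const RATE_PER_1000 = 1.0;
  return ((gigRecords + sellerProfiles) / 1000) * RATE_PER_1000;
}
```

For example, 2,000 gig records plus 500 seller profiles would cost $2.50 before proxy charges.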
Running Locally
```bash
# Install dependencies
npm install

# Install Playwright browsers
npx playwright install chromium

# Build TypeScript
npm run build

# Run (requires APIFY_TOKEN env variable or local storage)
npm start
```
For local development with hot reload:
```bash
npm run dev
```
License
Apache 2.0