
ScraperCodeGenerator
An intelligent web scraping tool that automatically generates custom scraping code for any website.
Rating: 0.0 (0)
Pricing: Pay per usage
Total users: 0
Monthly users: 1
Last modified: a day ago
You can access ScraperCodeGenerator programmatically from your own applications through the Apify API; code samples are available in several languages below. To use the Apify API, you need an Apify account and your API token, which you can find under Integrations settings in the Apify Console.
$ echo '{
  "targetUrl": "https://books.toscrape.com/",
  "userGoal": "Get me a list of all the books on the first page. For each book, I want its title, price, star rating, and whether it is in stock.",
  "actors": [
    {
      "name": "cheerio-scraper",
      "enabled": true,
      "input": {
        "maxRequestRetries": 3,
        "requestTimeoutSecs": 30,
        "maxPagesPerCrawl": 1,
        "pageFunction": "async function pageFunction(context) {\n const { request, log, $ } = context;\n try {\n const title = $('\''title'\'').text() || '\'''\'';\n const html = $('\''html'\'').html() || '\'''\'';\n return {\n url: request.url,\n title: title,\n html: html\n };\n } catch (error) {\n log.error('\''Error in pageFunction:'\'', error);\n return {\n url: request.url,\n title: '\'''\'',\n html: '\'''\''\n };\n }\n}",
        "proxyConfiguration": {
          "useApifyProxy": true
        }
      }
    },
    {
      "name": "web-scraper",
      "enabled": false,
      "input": {
        "maxRequestRetries": 3,
        "requestTimeoutSecs": 30,
        "maxPagesPerCrawl": 1,
        "pageFunction": "async function pageFunction(context) {\n const { request, log, page } = context;\n try {\n const title = await page.title();\n const html = await page.content();\n return {\n url: request.url,\n title: title,\n html: html\n };\n } catch (error) {\n log.error('\''Error in pageFunction:'\'', error);\n return {\n url: request.url,\n title: '\'''\'',\n html: '\'''\''\n };\n }\n}",
        "proxyConfiguration": {
          "useApifyProxy": true
        }
      }
    },
    {
      "name": "website-content-crawler",
      "enabled": true,
      "input": {
        "maxCrawlPages": 1,
        "crawler": "playwright",
        "proxyConfiguration": {
          "useApifyProxy": true
        }
      }
    },
    {
      "name": "playwright-scraper",
      "enabled": false,
      "input": {
        "maxRequestRetries": 2,
        "requestTimeoutSecs": 45,
        "maxPagesPerCrawl": 1,
        "pageFunction": "async function pageFunction(context) {\n const { request, log, page } = context;\n try {\n const title = await page.title();\n const html = await page.content();\n return {\n url: request.url,\n title: title,\n html: html\n };\n } catch (error) {\n log.error('\''Error in pageFunction:'\'', error);\n return {\n url: request.url,\n title: '\'''\'',\n html: '\'''\''\n };\n }\n}",
        "proxyConfiguration": {
          "useApifyProxy": true
        }
      }
    }
  ]
}' | apify call ohlava/scrapercodegenerator --silent --output-dataset
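If you would rather call the Actor from your own code than from the shell, the same input can be sent through the apify-client package for Node.js. The sketch below is a minimal, non-authoritative example: the <YOUR_API_TOKEN> placeholder, the shortened input object, and the assumption of Node 18+ with ES modules (for top-level await) are illustrative, not part of the Actor's documentation.

// Minimal sketch using the apify-client package from Node.js.
// Assumptions: Node 18+ with "type": "module" so top-level await works,
// and an API token taken from Integrations settings in the Apify Console.
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: '<YOUR_API_TOKEN>' });

// Shortened input; in practice, pass the same object as in the CLI example above.
const input = {
    targetUrl: 'https://books.toscrape.com/',
    userGoal: 'Get me a list of all the books on the first page.',
    // "actors": [...]  // omitted here for brevity
};

// Start the Actor run and wait for it to finish.
const run = await client.actor('ohlava/scrapercodegenerator').call(input);

// Read the results from the run's default dataset.
const { items } = await client.dataset(run.defaultDatasetId).listItems();
console.dir(items, { depth: null });

Note that call() blocks until the run finishes; the client also exposes start() if you prefer to kick off the run and poll its status yourself.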
ScraperCodeGenerator API through CLI
The Apify CLI is the official tool that allows you to use ScraperCodeGenerator locally, providing convenience functions and automatic retries on errors.
Install the Apify CLI
$ npm i -g apify-cli
$ apify login
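After logging in, you can run the Actor straight from your terminal. As a small usage sketch (the input.json file name is just an example), save the JSON input shown above to a file and pipe it to the same apify call command:

$ cat input.json | apify call ohlava/scrapercodegenerator --silent --output-dataset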
Other API clients include: