# SEO Audit Tool
Comprehensive SEO analysis for your entire website. Crawl all pages, identify technical issues, and get actionable insights to improve your search engine rankings.
## What does this actor do?
This SEO audit tool crawls your entire website and performs detailed analysis on each page, identifying errors and issues that could negatively impact your SEO performance. Get comprehensive reports on technical SEO, content quality, and on-page optimization.
## Key Features
- **Single Page or Full Site Crawl**: Audit a single page quickly or analyze your entire website
- **Technical SEO Checks**: Validate meta tags, headers, and HTML structure
- **Image Optimization**: Identify missing alt tags and broken images
- **Link Analysis**: Detect broken links and nofollow issues
- **Content Quality**: Assess content length and keyword optimization
- **Mobile Optimization**: Check viewport and responsive design elements
- **Structured Data**: Validate JSON-LD and microdata implementation
- **Google Analytics**: Verify tracking code integration
- **Pay-Per-Event Pricing**: Only pay for pages successfully audited
## Input Configuration
### Required Fields
- **Start URL**: The webpage URL where the audit begins (e.g., `https://apify.com`)
### Optional Fields
- **Crawl all pages recursively** (default: `false`):
  - `false` - Audits only the single URL provided (fast, recommended for quick checks)
  - `true` - Crawls all pages on the website (comprehensive site audit)
- **Max pages** (default: `10`): Maximum number of pages to crawl (only applies when recursive crawling is enabled)
- **Max depth** (default: unlimited): Maximum crawl depth (only applies when recursive crawling is enabled)
- **Proxy configuration**: Optional proxy settings (Apify Proxy requires a valid API token)
- **User Agent**: Custom user agent string for testing mobile/desktop views
- **Viewport width/height**: Custom viewport dimensions for responsive testing
- **Page navigation timeout**: Timeout in milliseconds for page loading (default: `60000`)
- **Max request retries**: Number of retry attempts for failed requests
- **SEO params**: Override default SEO validation parameters
### Example Input
```json
{
  "startUrl": "https://apify.com",
  "crawlRecursive": false,
  "maxRequestsPerCrawl": 10,
  "pageTimeout": 60000
}
```
## Output & Dataset Schema
All results are stored in the Apify dataset. Each page audit produces a comprehensive JSON object with the following structure:
### Core Page Information
- `url` - URL of the audited page
- `title` - Page title from the `<title>` tag
- `isLoaded` - Whether the page loaded successfully
### Technical SEO
- `isDoctype` - HTML doctype declaration present
- `isCharacterEncode` - Meta charset tag present
- `isViewport` - Viewport meta tag present (mobile optimization)
- `robotsFileExists` - robots.txt file exists at the domain root
- `faviconExists` - favicon.ico exists at the domain root
- `pageIsBlocked` - Page has a noindex/nofollow robots meta tag
- `isUsingFlash` - Page uses Flash content (deprecated)
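The domain-root checks (`robotsFileExists`, `faviconExists`) depend only on the page's origin. A minimal sketch of how those URLs can be derived, illustrative rather than the actor's internal code:

```javascript
// Sketch: derive the domain-root resources that checks like
// robotsFileExists and faviconExists probe, from any audited page URL.
function domainRootUrls(pageUrl) {
  const { origin } = new URL(pageUrl);
  return {
    robotsTxt: `${origin}/robots.txt`,
    favicon: `${origin}/favicon.ico`,
  };
}

// Any deep page resolves to the same domain-root resources.
const urls = domainRootUrls('https://apify.com/store/categories/seo');
console.log(urls.robotsTxt); // https://apify.com/robots.txt
```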
### Meta Tags & Content
- `isMetaDescription` - Meta description tag exists
- `metaDescription` - Content of the meta description
- `isMetaDescriptionEnoughLong` - Meta description length is within the recommended limit (under 140 characters by default)
- `isTitle` - Title tag exists
- `isTitleEnoughLong` - Title length is within the recommended range (10-70 characters by default)
### Headings
- `isH1` - H1 heading exists
- `h1` - Content of the H1 heading
- `isH1OnlyOne` - Exactly one H1 tag present (SEO best practice)
- `isH2` - H2 headings exist
### Links Analysis
- `linksCount` - Total number of links on the page
- `isTooEnoughLinks` - Link count is within the acceptable range (under 3,000 by default)
- `internalNoFollowLinks` - Array of internal links with the nofollow attribute
- `internalNoFollowLinksCount` - Count of internal nofollow links
- `brokenLinks` - Array of internal broken links
- `brokenLinksCount` - Count of internal broken links
- `externalBrokenLinks` - Array of external broken links
- `externalBrokenLinksCount` - Count of external broken links
### Images
- `notOptimizedImages` - Array of images missing alt attributes
- `notOptimizedImagesCount` - Count of images without alt tags
- `brokenImages` - Array of broken image URLs
- `brokenImagesCount` - Count of broken images
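Alt-text detection can be approximated over raw HTML. The sketch below uses a deliberately simplified regex and is only illustrative; the actor inspects the rendered DOM rather than matching tags with a pattern:

```javascript
// Simplified sketch: flag <img> tags that carry no alt attribute.
// Regex matching of HTML is fragile; this only illustrates the idea
// behind the notOptimizedImages field.
function findImagesWithoutAlt(html) {
  const imgTags = html.match(/<img\b[^>]*>/gi) || [];
  return imgTags.filter((tag) => !/\balt\s*=/i.test(tag));
}

const html = '<img src="a.png" alt="Logo"><img src="b.png">';
console.log(findImagesWithoutAlt(html).length); // 1
```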
### Content Quality
- `wordsCount` - Total word count on the page
- `isContentEnoughLong` - Content meets the minimum word-count threshold (350 words by default)
### Analytics & Tracking
- `isGoogleAnalyticsObject` - Google Analytics object (`ga`) is present
- `isGoogleAnalyticsFunc` - Google Analytics tracking function is present
### Mobile & Performance
- `isAmp` - Page uses AMP (Accelerated Mobile Pages)
- `isNotIframe` - Page is free of iframes
### Structured Data
- `jsonLd` - JSON-LD structured data
  - `isJsonLd` - Whether JSON-LD is present
  - `jsonLdData` - Parsed JSON-LD data
- `microdata` - Microdata structured data
  - `isMicrodata` - Whether microdata is present
  - `microdata` - Array of parsed microdata items
### Error Handling
- `errorMessage` - Error message if the page failed to load (only present when `isLoaded` is `false`)
For the complete field-level JSON schema specification, see `./OUTPUT_SCHEMA.json`.
The Actor output configuration is defined in `./.actor/output_schema.json`, following the Apify Actor output schema specification.
## Sample Output
### Successful Page Audit
```json
{
  "url": "https://apify.com/",
  "title": "Web Scraping, Data Extraction and Automation - Apify",
  "isLoaded": true,
  "isGoogleAnalyticsObject": true,
  "isGoogleAnalyticsFunc": false,
  "isCharacterEncode": true,
  "isMetaDescription": true,
  "metaDescription": "Apify extracts data from websites, crawls lists of URLs and automates workflows on the web. Turn any website into an API in a few minutes!",
  "isMetaDescriptionEnoughLong": true,
  "isDoctype": true,
  "isTitle": true,
  "isTitleEnoughLong": true,
  "isH1": true,
  "h1": "The web scraping and automation platform",
  "isH1OnlyOne": true,
  "isH2": true,
  "linksCount": 91,
  "isTooEnoughLinks": true,
  "internalNoFollowLinks": [],
  "internalNoFollowLinksCount": 0,
  "notOptimizedImages": [],
  "notOptimizedImagesCount": 0,
  "wordsCount": 1373,
  "isContentEnoughLong": false,
  "isViewport": true,
  "isAmp": false,
  "isNotIframe": true,
  "isUsingFlash": false,
  "pageIsBlocked": false,
  "robotsFileExists": true,
  "faviconExists": true,
  "brokenLinks": [],
  "brokenLinksCount": 0,
  "externalBrokenLinks": [],
  "externalBrokenLinksCount": 0,
  "brokenImages": [],
  "brokenImagesCount": 0,
  "jsonLd": {
    "isJsonLd": true,
    "jsonLdData": {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Apify",
      "url": "https://apify.com"
    }
  },
  "microdata": {
    "isMicrodata": false,
    "microdata": []
  }
}
```
### Failed Page Load
```json
{
  "url": "https://example.com/broken-page",
  "isLoaded": false,
  "errorMessage": "Navigation timeout of 60000 ms exceeded"
}
```
## SEO Issues Detected
The tool automatically identifies common SEO issues:
- **Missing or poorly optimized meta tags** - Title too short/long, missing meta description
- **Heading structure problems** - Missing H1, multiple H1 tags, no H2 tags
- **Image optimization issues** - Missing alt attributes on images
- **Broken links and images** - Non-functional internal/external links
- **Content quality concerns** - Insufficient word count
- **Mobile optimization gaps** - Missing viewport meta tag
- **Technical SEO problems** - Missing robots.txt, favicon, or doctype
- **Internal nofollow links** - Potential link equity issues
- **Deprecated technologies** - Flash content usage
## Pricing
This actor uses pay-per-event pricing:
- $0.001 per page successfully audited
- Failed pages are not charged
- Single page audits are very cost-effective
- Full site crawls are charged only for successfully processed pages
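Under this model, the cost of a run is simply the number of successfully audited pages times the per-page rate. A quick estimate helper, using the $0.001 rate stated above:

```javascript
// Estimated cost under pay-per-event pricing: $0.001 per successfully
// audited page. Failed pages are not charged.
const PRICE_PER_PAGE_USD = 0.001;

function estimateCostUsd(successfulPages) {
  return successfulPages * PRICE_PER_PAGE_USD;
}

console.log(estimateCostUsd(1));   // 0.001 - a single-page audit
console.log(estimateCostUsd(500)); // 0.5   - a 500-page site crawl
```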
## Customizing SEO Parameters
You can override the default SEO validation thresholds using the `seoParams` input:
```json
{
  "startUrl": "https://example.com",
  "seoParams": {
    "maxTitleLength": 70,
    "minTitleLength": 10,
    "maxMetaDescriptionLength": 160,
    "maxLinksCount": 3000,
    "maxWordsCount": 500,
    "outputLinks": true,
    "workingStatusCodes": [200, 301, 302, 304]
  }
}
```
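A threshold pair such as `minTitleLength`/`maxTitleLength` translates into a boolean output field like `isTitleEnoughLong`. A sketch of that mapping, with parameter names taken from the example above; the actor's exact logic may differ:

```javascript
// Sketch: how a min/max threshold pair becomes a boolean check.
// Defaults mirror the seoParams example (minTitleLength: 10, maxTitleLength: 70).
function isTitleEnoughLong(title, { minTitleLength = 10, maxTitleLength = 70 } = {}) {
  const len = (title || '').trim().length;
  return len >= minTitleLength && len <= maxTitleLength;
}

console.log(isTitleEnoughLong('Web Scraping and Automation Platform')); // true
console.log(isTitleEnoughLong('Home')); // false (shorter than minTitleLength)
```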
## Use Cases
- **Pre-launch SEO audit** - Validate new websites before going live
- **Ongoing monitoring** - Regular SEO health checks
- **Migration validation** - Ensure SEO integrity after site migrations
- **Competitor analysis** - Audit competitor websites for insights
- **Content optimization** - Identify pages needing content improvements
- **Technical SEO fixes** - Find and fix broken links, missing tags, and more
## Getting Started
1. Enter your website URL in the **Start URL** field
2. Choose single page or recursive crawling mode
3. Configure optional parameters (proxy, viewport, etc.)
4. Run the actor
5. Export results as JSON, CSV, or Excel
6. Fix the identified SEO issues to improve your rankings!
## API Integration & Automation
You can automate SEO audits using the Apify API. This is useful for:
- Scheduled audits (daily/weekly monitoring)
- CI/CD pipeline integration
- Bulk auditing multiple websites
- Custom reporting dashboards
### Using the Apify API Client
#### Node.js Example
```javascript
const { ApifyClient } = require('apify-client');

// Initialize the client with your API token
const client = new ApifyClient({
    token: 'YOUR_APIFY_TOKEN',
});

// Run the SEO audit actor
const run = await client.actor('YOUR_USERNAME/seo-audit-tool').call({
    startUrl: 'https://example.com',
    crawlRecursive: false,
    pageTimeout: 60000,
});

// Fetch results from the dataset
const { items } = await client.dataset(run.defaultDatasetId).listItems();

// Process the SEO audit results
items.forEach((item) => {
    console.log(`URL: ${item.url}`);
    console.log(`Title: ${item.title}`);
    console.log(`SEO Issues: ${item.brokenLinksCount} broken links, ${item.notOptimizedImagesCount} images without alt tags`);
});
```
#### Python Example
```python
from apify_client import ApifyClient

# Initialize the client with your API token
client = ApifyClient('YOUR_APIFY_TOKEN')

# Run the SEO audit actor
run = client.actor('YOUR_USERNAME/seo-audit-tool').call(run_input={
    'startUrl': 'https://example.com',
    'crawlRecursive': False,
    'pageTimeout': 60000,
})

# Fetch results from the dataset
items = client.dataset(run['defaultDatasetId']).list_items().items

# Process the SEO audit results
for item in items:
    print(f"URL: {item['url']}")
    print(f"Title: {item['title']}")
    print(f"SEO Issues: {item['brokenLinksCount']} broken links, {item['notOptimizedImagesCount']} images without alt tags")
```
### Using the REST API
```bash
# Start the actor run
curl -X POST https://api.apify.com/v2/acts/YOUR_USERNAME~seo-audit-tool/runs \
  -H "Authorization: Bearer YOUR_APIFY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"startUrl": "https://example.com", "crawlRecursive": false, "pageTimeout": 60000}'

# Get the run status (replace RUN_ID with the ID from the previous response)
curl https://api.apify.com/v2/acts/YOUR_USERNAME~seo-audit-tool/runs/RUN_ID \
  -H "Authorization: Bearer YOUR_APIFY_TOKEN"

# Download the dataset (replace DATASET_ID with defaultDatasetId from the run status)
curl https://api.apify.com/v2/datasets/DATASET_ID/items \
  -H "Authorization: Bearer YOUR_APIFY_TOKEN"
```
### Scheduled Audits
Set up automated audits using Apify Schedules:
1. Go to **Schedules** in your Apify Console
2. Create a new schedule (e.g., daily at 9 AM)
3. Select the SEO Audit Tool actor
4. Configure your input parameters
5. Enable notifications for failures
### Webhook Integration
Receive audit results automatically via webhooks:
```javascript
const run = await client.actor('YOUR_USERNAME/seo-audit-tool').call({
    startUrl: 'https://example.com',
    crawlRecursive: false,
}, {
    webhooks: [{
        eventTypes: ['ACTOR.RUN.SUCCEEDED'],
        requestUrl: 'https://your-server.com/webhook',
    }],
});
```
### Bulk Auditing Multiple Sites
```javascript
const websites = [
    'https://example1.com',
    'https://example2.com',
    'https://example3.com',
];

// Run audits in parallel
const runs = await Promise.all(websites.map((url) =>
    client.actor('YOUR_USERNAME/seo-audit-tool').call({
        startUrl: url,
        crawlRecursive: false,
    })
));

// Collect all results
const allResults = await Promise.all(runs.map((run) =>
    client.dataset(run.defaultDatasetId).listItems()
));
```
### API Documentation
For complete API documentation, see the Apify API reference.