SEO Audit Tool — Lighthouse Scores & Core Web Vitals

Pricing

from $6.50 / 1,000 pages audited


Audit websites across 21 SEO factors: titles, meta, headings, images, links, canonical, OG, Twitter Cards, schema, robots, HTTPS, URL structure. Per-page scores 0-100 with prioritized recommendations.


Rating: 0.0 (0)

Developer: junipr (Maintained by Community)

Actor stats: 0 bookmarked · 2 total users · 1 monthly active user · last modified 11 days ago

SEO Audit Tool

What does SEO Audit Tool do?

SEO Audit Tool performs a comprehensive on-page SEO analysis of any website. Point it at one or more URLs and it crawls the site, auditing each page for title tags, meta descriptions, heading hierarchy, image optimization, internal and external links, canonical URLs, Open Graph tags, Twitter Cards, structured data (JSON-LD), meta robots directives, HTTPS usage, viewport configuration, URL structure, and word count. Each page receives an SEO score from 0 to 100 based on the issues found.

The actor also performs site-level checks including robots.txt analysis, sitemap.xml detection, and broken link detection. After crawling, it produces a summary report with the average score, total issues grouped by severity (error, warning, info), and robots.txt/sitemap status. All results are structured JSON ready for dashboards, reports, or automated monitoring.
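The site-level checks boil down to fetching two well-known files and extracting simple counts. As a minimal sketch of that idea (these helper names are illustrative, not the actor's actual implementation), counting non-empty Disallow rules in robots.txt and `<loc>` entries in sitemap.xml might look like:

```python
import re

def count_disallow_rules(robots_txt: str) -> int:
    """Count Disallow rules with a non-empty path (illustrative heuristic)."""
    count = 0
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # a bare "Disallow:" blocks nothing
                count += 1
    return count

def count_sitemap_urls(sitemap_xml: str) -> int:
    """Count <loc> entries in a sitemap.xml document."""
    return len(re.findall(r"<loc>", sitemap_xml))
```

A real crawler would also handle sitemap index files and fetch errors; this only shows where the `disallowCount` and `urlCount` figures in the summary record come from conceptually.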

Features

  • 16 on-page SEO checks — title tag, meta description, headings, images, links, canonical URL, Open Graph, Twitter Card, structured data, meta robots, language, viewport, HTTPS, URL structure, and word count
  • SEO scoring — each page scored 0-100 based on issue severity and count
  • Site-level checks — robots.txt existence and disallow rules, sitemap.xml detection and URL count
  • Broken link detection — optionally verify all internal and external links return valid HTTP responses
  • Multi-page crawling — follow internal links to discover and audit up to 500 pages per site
  • Multiple starting URLs — audit several websites in a single run
  • Issue categorization — issues grouped by severity (error, warning, info) and category
  • Site summary report — aggregate scores, total issues by severity, robots.txt and sitemap status
  • Configurable concurrency — adjust parallel crawling speed and request delays
  • Pay-per-page pricing — only pay for pages successfully audited

Input Configuration

{
  "urls": ["https://example.com"],
  "maxPages": 20,
  "followLinks": true,
  "checkBrokenLinks": true,
  "checkRobotsTxt": true,
  "checkSitemap": true,
  "maxConcurrency": 5,
  "requestDelay": 500
}
| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| urls | string[] | ["https://crawlee.dev"] | List of website URLs to audit |
| maxPages | integer | 20 | Maximum pages to crawl per starting URL (1-500) |
| followLinks | boolean | true | Follow internal links to discover additional pages |
| checkBrokenLinks | boolean | true | Verify all links return valid HTTP responses |
| checkRobotsTxt | boolean | true | Fetch and analyze the site's robots.txt |
| checkSitemap | boolean | true | Fetch and analyze the site's sitemap.xml |
| maxConcurrency | integer | 5 | Maximum pages to crawl in parallel (1-20) |
| requestDelay | integer | 500 | Delay between requests in ms (0-10000) |
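Before launching a run, it can help to check the input against the documented ranges. The sketch below (a hypothetical helper, not part of the actor) validates a config dict against the limits in the table above:

```python
def validate_input(cfg: dict) -> list[str]:
    """Return a list of problems; empty means cfg is within documented ranges."""
    errors = []
    urls = cfg.get("urls", [])
    if not urls or not all(isinstance(u, str) and u.startswith("http") for u in urls):
        errors.append("urls must be a non-empty list of http(s) URLs")
    # Documented min/max for the integer parameters
    ranges = {"maxPages": (1, 500), "maxConcurrency": (1, 20), "requestDelay": (0, 10000)}
    for key, (lo, hi) in ranges.items():
        val = cfg.get(key)
        if val is not None and not (isinstance(val, int) and lo <= val <= hi):
            errors.append(f"{key} must be an integer between {lo} and {hi}")
    return errors
```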

Output Format

The dataset contains two types of records: per-page audit results and a site summary.

Page audit result:

{
  "type": "page",
  "url": "https://example.com/about",
  "statusCode": 200,
  "responseTimeMs": 342,
  "title": { "text": "About Us | Example", "length": 18, "isOptimal": true },
  "metaDescription": { "text": "Learn about our company...", "length": 89, "isOptimal": false },
  "headings": { "h1Count": 1, "h2Count": 3, "h3Count": 5, "hierarchyValid": true, "h1Text": "About Us" },
  "images": { "total": 8, "withAlt": 6, "withoutAlt": 2, "withDimensions": 5, "withLazyLoad": 3 },
  "links": { "internal": 12, "external": 4, "broken": 1, "nofollow": 0 },
  "canonical": { "exists": true, "url": "https://example.com/about", "isSelfReferencing": true },
  "openGraph": { "hasTitle": true, "hasDescription": true, "hasImage": true, "hasUrl": true },
  "twitterCard": { "hasCard": true, "hasTitle": true, "hasDescription": true },
  "structuredData": { "found": true, "types": ["Organization", "WebPage"] },
  "metaRobots": { "index": true, "follow": true },
  "language": "en",
  "hasViewport": true,
  "isHttps": true,
  "wordCount": 850,
  "score": 82,
  "issues": [
    { "severity": "warning", "message": "Meta description is too short (89 chars, recommended 120-160)", "category": "meta" },
    { "severity": "error", "message": "2 images missing alt text", "category": "images" }
  ],
  "scrapedAt": "2026-03-11T12:00:00.000Z"
}
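Because every record carries a `type` field and a numeric `score`, post-processing the dataset is straightforward. For example, a short Python sketch (the function name and threshold are illustrative) that pulls out the worst-scoring pages for prioritized fixes:

```python
def pages_needing_attention(records: list[dict], threshold: int = 70) -> list[dict]:
    """Return page records scoring below threshold, worst first."""
    pages = [r for r in records if r.get("type") == "page"]
    return sorted((p for p in pages if p["score"] < threshold),
                  key=lambda p: p["score"])
```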

Site summary:

{
  "type": "summary",
  "siteUrl": "https://example.com",
  "pagesAudited": 15,
  "averageScore": 78,
  "totalIssues": 42,
  "issuesBySeverity": { "error": 8, "warning": 22, "info": 12 },
  "robotsTxt": { "exists": true, "disallowCount": 3 },
  "sitemap": { "exists": true, "urlCount": 127 },
  "scrapedAt": "2026-03-11T12:05:00.000Z"
}
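The summary's aggregate fields can also be recomputed from the per-page records, which is useful when you merge datasets from several runs. A minimal sketch (not the actor's code) of that aggregation:

```python
def summarize(records: list[dict]) -> dict:
    """Recreate the summary aggregates from per-page records (illustrative)."""
    pages = [r for r in records if r.get("type") == "page"]
    counts = {"error": 0, "warning": 0, "info": 0}
    total_issues = 0
    for page in pages:
        issues = page.get("issues", [])
        total_issues += len(issues)
        for issue in issues:
            counts[issue["severity"]] += 1
    avg = round(sum(p["score"] for p in pages) / len(pages)) if pages else 0
    return {
        "pagesAudited": len(pages),
        "averageScore": avg,
        "totalIssues": total_issues,
        "issuesBySeverity": counts,
    }
```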

Usage Examples / Use Cases

  • Website launch audit — run a full SEO check before launching a new site to catch missing meta tags, broken links, and missing structured data
  • Ongoing monitoring — schedule regular audits to detect SEO regressions after content or code changes
  • Competitor analysis — audit competitor websites to understand their SEO strategy, structured data usage, and technical quality
  • Client reporting — generate structured audit data for SEO agency client reports
  • Content optimization — identify pages with low word counts, missing headings, or poor meta descriptions
  • Technical SEO — find broken links, missing canonical tags, HTTPS issues, and robots.txt problems across an entire site

Proxy Requirements (Optional)

SEO Audit Tool does not require a proxy for most websites. It uses lightweight HTTP crawling and respects rate limits. However, if you are auditing a website that blocks datacenter IPs or uses aggressive anti-bot protection, you can configure a proxy in the Apify Console run settings. For most use cases, the default configuration works without any proxy.

Pricing

This actor uses Pay-Per-Event (PPE) pricing: $6.50 per 1,000 pages audited ($0.0065 per event).

Pricing includes all platform compute costs — no hidden fees.

FAQ

How is the SEO score calculated?

Each page starts at 100 and loses points for each issue found. Errors deduct more points than warnings, and warnings more than info-level issues. The score reflects the overall on-page SEO health of that specific URL. The site summary provides an average across all audited pages.
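The actor does not document its exact per-severity weights, but the described scheme can be sketched as follows. The deduction values here (10, 5, 1) are purely hypothetical placeholders that preserve the stated ordering error > warning > info:

```python
# Hypothetical deductions; the actor's actual weights are not documented.
DEDUCTIONS = {"error": 10, "warning": 5, "info": 1}

def seo_score(issues: list[dict]) -> int:
    """Start at 100, deduct per issue by severity, clamp at 0."""
    score = 100
    for issue in issues:
        score -= DEDUCTIONS.get(issue["severity"], 0)
    return max(score, 0)
```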

Can I audit multiple websites in one run?

Yes. Add multiple URLs to the urls array and each one will be audited independently. The actor produces separate page results and summary records for each starting URL, making it easy to compare SEO health across different sites.

How does broken link detection work?

When checkBrokenLinks is enabled, the actor sends HEAD requests to every link found on each page (both internal and external) and reports links that return 4xx or 5xx status codes. This helps you find dead links before your visitors or search engines do.
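The mechanics described above are simple to sketch. Assuming a link counts as broken on a 4xx/5xx response or an unreachable host (this sketch is illustrative, not the actor's implementation, and a production crawler would also retry and fall back to GET for servers that reject HEAD):

```python
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

def is_broken_status(status: int) -> bool:
    """A link is reported broken when it returns a 4xx or 5xx response."""
    return 400 <= status < 600

def check_link(url: str, timeout: float = 10.0) -> bool:
    """Send a HEAD request and report whether the link looks broken."""
    try:
        req = Request(url, method="HEAD")
        with urlopen(req, timeout=timeout) as resp:
            return is_broken_status(resp.status)
    except HTTPError as e:
        return is_broken_status(e.code)
    except URLError:
        return True  # DNS failure, timeout, or refused connection
```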

How many pages can I audit per run?

Up to 500 pages per starting URL, controlled by the maxPages setting. The default is 20 pages, which covers most small to medium sites. For large sites, increase maxPages and consider lowering maxConcurrency to avoid overwhelming the target server.

Does the audit check page speed or Core Web Vitals?

No. This actor focuses on on-page SEO factors like meta tags, headings, links, structured data, and content quality. For performance auditing and Core Web Vitals analysis, use a dedicated performance tool like Website Performance Analyzer.