💡 Lighthouse Bulk Checker — Performance at Scale
Run Lighthouse audits on hundreds of URLs simultaneously. Get performance, SEO, accessibility & best practices scores. Monitor Core Web Vitals across your entire domain. Pay per audit.
Pricing
from $5.00 / 1,000 results
Developer: Stephan Corbeil
Google Lighthouse Checker
Audit web pages with Google Lighthouse to measure performance, accessibility, best practices, and SEO.
Features
- Performance Audits: Core Web Vitals, load times, render metrics
- Accessibility Checks: WCAG compliance, screen reader compatibility
- Best Practices: Security, modern web standards
- SEO Analysis: Meta tags, structured data, mobile-friendliness
- Batch Audits: Test multiple URLs at once
- Device Emulation: Mobile or desktop testing
Input Parameters
| Parameter | Required | Default | Description |
|---|---|---|---|
| urls | Yes | - | List of URLs to audit |
| categories | No | ["performance", "accessibility", "best-practices", "seo"] | Audit categories |
| device | No | "mobile" | Device type (`mobile` or `desktop`) |
| maxRequests | No | 10 | Maximum URLs to audit (1-100) |
Example Input
```json
{
  "urls": ["https://example.com", "https://apify.com"],
  "categories": ["performance", "accessibility", "seo"],
  "device": "mobile",
  "maxRequests": 10
}
```
Output
Results include:
- Scores: 0-100 for each category
- Core Web Vitals: LCP, FID, CLS, TBT
- Detailed Metrics: FCP, Speed Index, TTI
- Error details for failed audits
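The output fields above can be post-processed once the dataset is downloaded. A minimal sketch, assuming each dataset item carries a `url` plus a `scores` map of 0-100 category values (field names are assumptions from the output description; check the actual dataset schema):

```python
# Hypothetical sketch: flag pages whose Lighthouse scores fall below a
# threshold. The "url" and "scores" field names are assumptions based on
# the output description above, not the confirmed dataset schema.

def flag_low_scores(items, threshold=90):
    """Return (url, category, score) tuples for every score below threshold."""
    flagged = []
    for item in items:
        for category, score in item.get("scores", {}).items():
            if score < threshold:
                flagged.append((item["url"], category, score))
    return flagged

# Example with mocked dataset items:
sample = [
    {"url": "https://example.com", "scores": {"performance": 72, "seo": 98}},
    {"url": "https://apify.com", "scores": {"performance": 95, "seo": 91}},
]
print(flag_low_scores(sample))  # -> [('https://example.com', 'performance', 72)]
```

Swap the mocked `sample` for items iterated from the run's default dataset.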
Use Cases
- SEO audits and optimization
- Performance monitoring
- Accessibility compliance
- Pre-launch website checks
- Competitor analysis
Pricing
Pay-per-event: from $5.00 per 1,000 results, plus standard Apify platform compute time. See the Pricing section below for per-event rates.
💻 Code Example — Python
```python
from apify_client import ApifyClient

client = ApifyClient("YOUR_APIFY_TOKEN")

run = client.actor("nexgendata/google-lighthouse-checker").call(run_input={
    # Fill in the input shape from the actor's input_schema
})

for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item)
```
🌐 Code Example — cURL
```shell
curl -X POST "https://api.apify.com/v2/acts/nexgendata~google-lighthouse-checker/run-sync-get-dataset-items?token=YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{ /* input schema */ }'
```
❓ FAQ
Q: How do I get started?
A: Sign up at apify.com, grab your API token from Settings → Integrations, and run the actor via the Apify console, API, Python SDK, or any integration (Zapier, Make.com, n8n).
Q: What's the typical cost per run?
A: See the pricing section below. Most runs finish under $0.10 for typical batches.
Q: Is this actor maintained?
A: Yes. NexGenData maintains 140+ Apify actors and ships updates regularly. Bug reports via the Apify console issues tab get responses within 24 hours.
Q: Can I use the output commercially?
A: Yes — you own the output data. Check the target site's Terms of Service for any usage restrictions on the scraped content itself.
Q: How do I handle rate limits?
A: Apify manages concurrency and retries automatically. For very large batches (10K+ items), run multiple smaller jobs in parallel instead of one mega-job for better reliability.
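The batching advice above can be sketched in a few lines: split a large URL list into fixed-size chunks and submit each chunk as its own run. The chunk size and loop are illustrative; wire in your own actor call.

```python
# Minimal sketch of the "multiple smaller jobs" advice: split a large URL
# list into batches. Each batch would then be passed as the "urls" input
# of a separate actor run (actor call omitted here).

def chunk(urls, size=100):
    """Yield successive batches of at most `size` URLs."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

urls = [f"https://example.com/page/{n}" for n in range(250)]
batches = list(chunk(urls, 100))
print([len(b) for b in batches])  # -> [100, 100, 50]
```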
💰 Pricing
Pay-per-event pricing — you only pay for what you actually extract.
- Actor Start: $0.0001
- result: $0.0050
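A back-of-envelope estimate from the event prices above: one Actor Start event per run plus one result event per audited URL.

```python
# Cost estimate from the listed event prices: one Actor Start event per
# run plus one "result" event per audited URL. Assumes no other billable
# events apply.

ACTOR_START = 0.0001   # $ per run
PER_RESULT = 0.0050    # $ per result

def estimated_cost(num_results, num_runs=1):
    return round(num_runs * ACTOR_START + num_results * PER_RESULT, 4)

print(estimated_cost(1000))  # -> 5.0001  (matches "from $5.00 / 1,000 results")
```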
🚀 Apify Affiliate Program
New to Apify? Sign up with our referral link — you get free platform credits on signup, and you help fund the maintenance of this actor fleet.
Built and maintained by NexGenData — 140+ actors covering scraping, enrichment, MCP servers, and automation.