Page Size Analyzer
Pricing: Pay per event
Developer: Stas Persiianenko
Analyze web page size and resource breakdown. Count scripts, stylesheets, images, fonts, and inline code.
What does Page Size Analyzer do?
This actor analyzes web page size and provides a detailed resource breakdown. It measures HTML document size, counts external resources (scripts, stylesheets, images, fonts, iframes, videos), and calculates inline script, style, and comment bytes. Feed it a list of URLs and get back structured data about every page's weight and composition.
Use cases
- Performance engineers -- identify heavy pages with too many resources and prioritize optimization work
- SEO specialists -- heavy pages load more slowly, and page speed is a ranking factor; measure page weight to guide technical SEO
- Marketing teams -- audit landing pages before campaigns to ensure fast load times on mobile and desktop
- Development leads -- catch inline script and style bloat during code review before merging to production
- Competitive analysts -- compare page sizes across competitor sites to benchmark your own performance
- Accessibility consultants -- bloated pages with excessive resources create barriers for users on slow connections
Why use Page Size Analyzer?
- Batch processing -- analyze hundreds of URLs in a single run instead of checking pages one by one
- Structured JSON output -- every result follows a consistent schema, ready for dashboards and spreadsheets
- Resource-level detail -- goes beyond total size to break down scripts, styles, images, fonts, iframes, and videos
- Inline code measurement -- surfaces hidden bloat from inline scripts, styles, and HTML comments
- API and integration ready -- call it programmatically or connect to Make, Zapier, and other automation platforms
- Pay-per-event pricing -- you only pay for the pages you analyze, starting at $0.001 per URL
Input parameters
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| urls | array | Yes | -- | List of web page URLs to analyze for page size and resources |
Input example
{"urls": ["https://www.google.com","https://en.wikipedia.org/wiki/Web_scraping","https://example.com"]}
Output example
{"url": "https://example.com","title": "Example Domain","htmlSizeBytes": 1256,"htmlSizeKB": 1.2,"resourceCounts": {"scripts": 0,"stylesheets": 0,"images": 0,"fonts": 0,"iframes": 0,"videos": 0,"audios": 0},"estimatedResources": [],"totalResourceUrls": 0,"inlineStyleBytes": 375,"inlineScriptBytes": 0,"commentBytes": 0,"error": null,"checkedAt": "2026-03-01T12:00:00.000Z"}
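Each dataset item follows this schema, so downstream scripts can compute simple roll-ups without extra tooling. A minimal Python sketch using only the fields shown in the output example above:

```python
def summarize(record):
    """Reduce one Page Size Analyzer result to a compact summary dict."""
    counts = record["resourceCounts"]
    return {
        "url": record["url"],
        "htmlSizeKB": record["htmlSizeBytes"] / 1024,       # bytes -> kilobytes
        "externalResources": sum(counts.values()),          # scripts + styles + images + ...
        "inlineBytes": record["inlineScriptBytes"] + record["inlineStyleBytes"],
    }

# The example output record, trimmed to the fields used here.
example = {
    "url": "https://example.com",
    "htmlSizeBytes": 1256,
    "resourceCounts": {"scripts": 0, "stylesheets": 0, "images": 0,
                       "fonts": 0, "iframes": 0, "videos": 0, "audios": 0},
    "inlineStyleBytes": 375,
    "inlineScriptBytes": 0,
}
print(summarize(example))
```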
How much does it cost to analyze page size?
Page Size Analyzer uses Apify's pay-per-event pricing model. You are only charged for what you use.
| Event | Price | Description |
|---|---|---|
| Start | $0.035 | One-time per run |
| URL analyzed | $0.001 | Per page analyzed |
Cost examples:
- Analyzing 10 pages: $0.035 + (10 x $0.001) = $0.045
- Analyzing 100 pages: $0.035 + (100 x $0.001) = $0.135
- Analyzing 1,000 pages: $0.035 + (1,000 x $0.001) = $1.035
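The cost examples above all follow one formula: a fixed start fee plus a per-URL fee. A small helper for budgeting larger runs, with both prices hard-coded from the event table above:

```python
START_FEE = 0.035  # one-time "Start" event per run, in USD
PER_URL = 0.001    # "URL analyzed" event, per page

def run_cost(num_urls: int) -> float:
    """Estimated cost in USD for analyzing num_urls pages in a single run."""
    return round(START_FEE + num_urls * PER_URL, 3)

print(run_cost(10))    # matches the first cost example: 0.045
print(run_cost(1000))  # matches the third cost example: 1.035
```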
How to analyze web page size
- Go to Page Size Analyzer on Apify Store
- Enter one or more URLs in the urls field
- Click Start and wait for the analysis to finish
- Review the resource breakdown for each page
- Download results as JSON, CSV, or Excel
Using the Apify API
You can call Page Size Analyzer programmatically from any language using the Apify API. The actor slug is automation-lab/page-size-analyzer.
Node.js
```javascript
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'YOUR_TOKEN' });

const run = await client.actor('automation-lab/page-size-analyzer').call({
    urls: ['https://example.com', 'https://www.google.com'],
});

const { items } = await client.dataset(run.defaultDatasetId).listItems();
console.log(items);
```
Python
```python
from apify_client import ApifyClient

client = ApifyClient('YOUR_TOKEN')

run = client.actor('automation-lab/page-size-analyzer').call(run_input={
    'urls': ['https://example.com', 'https://www.google.com'],
})

items = client.dataset(run['defaultDatasetId']).list_items().items
print(items)
```
cURL
```bash
curl "https://api.apify.com/v2/acts/automation-lab~page-size-analyzer/runs" \
  -X POST \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -d '{"urls": ["https://example.com", "https://www.google.com"]}'
```
Use with AI agents via MCP
Page Size Analyzer is available as a tool for AI assistants via the Model Context Protocol (MCP).
Setup for Claude Code
```bash
claude mcp add --transport http apify "https://mcp.apify.com"
```
Setup for Claude Desktop, Cursor, or VS Code
{"mcpServers": {"apify": {"url": "https://mcp.apify.com"}}}
Example prompts
- "How large is this webpage?"
- "Analyze page weight for our top 10 landing pages"
Learn more in the Apify MCP documentation.
Integrations
Page Size Analyzer integrates with the major automation and data platforms through the Apify ecosystem:
- Make (formerly Integromat) -- trigger page size checks automatically and route the results to any of Make's 1,000+ app integrations.
- Zapier -- create Zaps that run page size analysis when new URLs are added to a spreadsheet or database.
- Google Sheets -- export results directly for ongoing tracking and trend visualization.
- Slack -- send alerts to a channel when a page exceeds a size threshold or when new bloat is detected.
- Webhooks -- get notified when a run finishes and post-process the data in your own backend systems.
- n8n -- orchestrate runs from n8n workflows or any tool that supports HTTP requests and the Apify REST API.
Tips and best practices
- Start small, then scale -- test with 3-5 URLs first to review the output schema before running large batches.
- Schedule regular runs -- set up a daily or weekly Apify schedule to track page weight trends over time.
- Combine with performance tools -- pair this actor with Website Performance Checker for a full picture of both page weight and server speed.
- Watch inline code -- high inlineScriptBytes or inlineStyleBytes values often indicate opportunities to externalize and cache code
- Filter by resource type -- use the resourceCounts breakdown to focus optimization on the heaviest category (scripts, images, etc.)
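The last two tips combine naturally in a post-processing script. A hedged sketch for a single result record; the 50 KB bloat threshold is an assumed example value, not a recommendation from the actor:

```python
BLOAT_THRESHOLD = 50_000  # assumed example threshold, in bytes

def heaviest_category(record):
    """Return the resource type with the highest count in resourceCounts."""
    counts = record["resourceCounts"]
    return max(counts, key=counts.get)

def has_inline_bloat(record, threshold=BLOAT_THRESHOLD):
    """Flag pages whose inline scripts or styles exceed the threshold."""
    return (record["inlineScriptBytes"] > threshold
            or record["inlineStyleBytes"] > threshold)

# Illustrative record with made-up counts.
record = {
    "resourceCounts": {"scripts": 12, "stylesheets": 3, "images": 40,
                       "fonts": 2, "iframes": 0, "videos": 0, "audios": 0},
    "inlineScriptBytes": 80_000,
    "inlineStyleBytes": 4_000,
}
print(heaviest_category(record), has_inline_bloat(record))
```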
Legality
This tool analyzes publicly accessible web content. Automated analysis of public web resources is standard practice in SEO and web development. Always respect robots.txt directives and rate limits when analyzing third-party websites. For personal data processing, ensure compliance with applicable privacy regulations.
FAQ
Can I analyze pages that require login or authentication? No. This actor fetches pages as a public visitor without cookies or session tokens. It is designed for publicly accessible URLs.
Does the actor load JavaScript and render the page? No. It fetches the raw HTML document and parses it statically. Resource counts are based on tags found in the HTML source, not on what a browser would render after JavaScript execution. This approach is faster and more cost-effective for bulk analysis.
What happens if a URL is invalid or unreachable?
The actor returns a result for that URL with the error field populated and all size metrics set to zero. Other URLs in the batch are still processed normally.
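Because failed URLs still produce a record with the error field populated, a batch can be split into successes and failures in one pass. A minimal sketch; the records and the error string are hypothetical:

```python
def partition(results):
    """Split dataset items into (successes, failures) by the error field."""
    ok = [r for r in results if r.get("error") is None]
    failed = [r for r in results if r.get("error") is not None]
    return ok, failed

results = [
    {"url": "https://example.com", "error": None, "htmlSizeBytes": 1256},
    {"url": "https://bad.invalid", "error": "request failed", "htmlSizeBytes": 0},
]
ok, failed = partition(results)
print(len(ok), len(failed))
```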
How accurate are the resource counts? Resource counts reflect what is present in the raw HTML source. Dynamically injected resources (loaded via JavaScript after page load) are not counted. For most server-rendered pages, the counts are accurate. For heavily client-rendered single-page applications, the counts may be lower than what a browser would load.
The reported page size is much smaller than what browser DevTools shows. Why?
The actor measures the raw HTML document size and counts resource references found in the source. It does not download external resources (images, scripts, CSS files) or account for their file sizes. Browser DevTools shows the total transferred size including all downloaded resources. Use this actor's resourceCounts to identify how many external resources exist, then combine with a performance tool for full transfer size analysis.
Can I export results to CSV or Excel? Yes. Apify datasets support export in JSON, CSV, Excel, XML, and other formats. After the run completes, use the dataset export feature in the Apify Console or the API to download results in your preferred format.
Other SEO and website analysis tools
- Pagination Detector -- detect pagination patterns and next/prev links on web pages
- Privacy Policy Detector -- find privacy policy and legal page links on websites
- Website Performance Checker -- measure server response times and performance metrics
- Website Health Report -- audit website health including errors and redirects
- Broken Link Checker -- find broken links and dead URLs on websites