Broken Link Monitor For Website Owners

Developer: Shahab Uddin. I build web automation and website auditing tools for website owners, agencies, and marketers.

Pricing: Pay per usage. Maintained by Community. Last modified: 6 days ago.

Broken Link Monitor for Website Owners

Find broken links, redirects, and website errors instantly. Scan any website and detect SEO issues that hurt rankings, user experience, and conversions.

๐Ÿ” What is this tool?

Broken Link Monitor for Website Owners is a website crawler and SEO audit tool that scans your website and detects:

โŒ Broken links (4xx / 5xx errors) ๐Ÿ” Redirects (301 / 302 / 307 / 308) โš ๏ธ Page errors and crawl issues ๐Ÿ”— Internal and external links ๐Ÿ“Š Website link health overview

Perfect for website owners, SEO experts, agencies, developers, and bloggers.

Why use this tool?

Broken links and redirects can:

- Hurt your Google rankings (SEO)
- Reduce user trust and experience
- Cause lost traffic and revenue
- Affect site crawlability and indexing

This tool helps you fix those issues fast.

Key Features

- Multi-page website crawler
- Internal + external link checker
- Technical SEO audit ready
- Smart SSL handling (auto retry)
- Fast scanning with safe defaults
- Export results (JSON / CSV)
- Intelligent crawl stopping logic
- Works even on sites with SSL issues

Best For

- Website owners fixing broken links
- SEO professionals doing technical audits
- Agencies managing client websites
- Developers maintaining site health
- Bloggers & ecommerce store owners

Input

Provide the following:

{
  "startUrl": "https://example.com/",
  "maxPages": 25,
  "checkExternalLinks": false,
  "ignoreSslErrors": false
}

Input Options

| Field | Description |
| --- | --- |
| startUrl | Website URL to scan |
| maxPages | Maximum pages to crawl (recommended: 10–50) |
| checkExternalLinks | Whether to check external links |
| ignoreSslErrors | Ignore SSL errors if the site has certificate issues |

Output
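As a rough illustration of how these input options could be checked before a crawl starts, here is a small sketch in Python. The helper `validate_input` is hypothetical (the actor's real validation code is not shown); field names and defaults follow the table above.

```python
# Hypothetical input validation sketch; not the actor's actual code.
# Field names and defaults follow the documented input schema.

def validate_input(actor_input: dict) -> dict:
    """Apply the documented defaults and basic sanity checks."""
    start_url = actor_input.get("startUrl", "")
    if not start_url.startswith(("http://", "https://")):
        raise ValueError("startUrl must be an absolute http(s) URL")

    max_pages = int(actor_input.get("maxPages", 25))
    # Clamp to the recommended 10-50 range from the table above;
    # the real actor may accept larger values.
    max_pages = max(1, min(max_pages, 50))

    return {
        "startUrl": start_url,
        "maxPages": max_pages,
        "checkExternalLinks": bool(actor_input.get("checkExternalLinks", False)),
        "ignoreSslErrors": bool(actor_input.get("ignoreSslErrors", False)),
    }
```

A call such as `validate_input({"startUrl": "https://example.com/", "maxPages": 100})` would fill in the missing booleans with their documented defaults and clamp `maxPages` to 50.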

Each result includes:

- sourcePage → page where the link was found
- linkUrl → detected link
- finalUrl → final URL after redirects
- statusCode → HTTP status
- isInternal → internal or external
- result → broken / redirect / error

Summary Example

{
  "startUrl": "https://example.com/",
  "crawledPages": 12,
  "foundLinks": 145,
  "brokenLinks": 5,
  "redirects": 9,
  "externalLinks": 60,
  "internalLinks": 85
}

How it works

1. Starts from your homepage
2. Crawls internal links (same domain)
3. Extracts all valid URLs
4. Checks the status of each link
5. Reports broken links, redirects, and errors
6. Stops based on maxPages or crawl limits

Smart Crawling System

- Only crawls internal pages
- Avoids duplicate URLs
- Handles relative & absolute links
- Ignores invalid links (mailto, tel, javascript)
- Continues even if some pages fail
- Automatically retries SSL errors when needed

Important Notes

- If a site has JavaScript-only navigation, some links may not be detected
- Increase maxPages for deeper scans
- Use ignoreSslErrors = true only if SSL issues occur

Why this tool is different

Unlike basic link checkers, this tool:

โœ”๏ธ Performs a real multi-page crawl โœ”๏ธ Detects technical SEO issues โœ”๏ธ Handles SSL failures automatically โœ”๏ธ Designed for real-world websites โœ”๏ธ Built for speed + accuracy ๐Ÿงช Example Use Cases SEO audit before website launch Fix broken links after migration Monitor site health regularly Improve Google indexing Find hidden crawl errors ๐Ÿ”ฅ SEO Keywords (Optimized)

Broken link checker, website audit tool, SEO audit tool, website crawler, link checker tool, technical SEO tool, find broken links, website health checker, crawl website, SEO scanner


Apify Integration

This actor is designed to run on the Apify platform. It uses Apify's input schema and dataset features for configuration and result storage.

How to use on Apify

1. Deploy the actor
   - Push this repository to your Apify account, or create a new actor and upload the code.
2. Configure input
   - The actor uses the following input schema (see input_schema.json):
     - startUrl (string, required): the first page to crawl (e.g., https://example.com/)
     - maxPages (integer, default: 25): maximum number of internal pages to crawl
     - checkExternalLinks (boolean, default: false): whether to check external links
     - ignoreSslErrors (boolean, default: false): ignore SSL certificate errors
3. Run the actor
   - The actor crawls the website, checks links, and stores results in the Apify dataset.
4. View results
   - Results are available in the run's dataset as JSON or CSV. Each record contains: sourcePage, linkUrl, finalUrl, statusCode, isInternal, result.
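The summary shown earlier (brokenLinks, redirects, internal/external counts) can be derived from these per-link dataset records. The sketch below shows one plausible way to do that; `summarize` is a hypothetical helper and the exact counting rules are an assumption based on the documented summary fields.

```python
# Hypothetical sketch: derive the run summary from per-link records.
# Record field names follow the dataset format documented above.
from collections import Counter

def summarize(start_url: str, crawled_pages: int, records: list[dict]) -> dict:
    results = Counter(r["result"] for r in records)
    return {
        "startUrl": start_url,
        "crawledPages": crawled_pages,
        "foundLinks": len(records),
        "brokenLinks": results["broken"],
        "redirects": results["redirect"],
        "externalLinks": sum(1 for r in records if not r["isInternal"]),
        "internalLinks": sum(1 for r in records if r["isInternal"]),
    }
```

For example, three records with results `ok`, `broken`, and `redirect` would yield `foundLinks: 3`, `brokenLinks: 1`, and `redirects: 1`.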

Example input (Apify UI or API)

{
  "startUrl": "https://example.com/",
  "maxPages": 25,
  "checkExternalLinks": false,
  "ignoreSslErrors": false
}

Example output (dataset record)

{
  "sourcePage": "https://example.com/",
  "linkUrl": "https://www.example.com/",
  "finalUrl": "https://www.example.com/",
  "statusCode": 200,
  "isInternal": false,
  "result": "ok"
}