Technical SEO MCP Server
AI-ready technical SEO auditing with 7 specialized tools. Check robots.txt, llms.txt, generate sitemaps, run Lighthouse & GTmetrix audits, test rich results, and perform comprehensive technical SEO audits - all through a unified MCP interface.
Built by John Rippy | johnrippy.link
What is MCP?
The Model Context Protocol (MCP) is an open standard that lets AI assistants and AI-powered tools such as Claude, ChatGPT, and Cursor call external tools. This MCP Server exposes 7 technical SEO tools that AI assistants can invoke directly.
Available Tools
1. check_robots_txt
Check and analyze a website's robots.txt file. Validates syntax, checks for common issues, and verifies bot access rules.
{"url": "https://example.com"}
Returns: Rules, sitemaps, crawl delays, issues, recommendations
2. check_llms_txt
Check if a website has an llms.txt file for AI/LLM crawlers. Analyzes AI-readiness and LLM access policies.
{"url": "https://example.com"}
Returns: AI-readiness score, detected AI policies, recommendations
3. generate_sitemap
Generate or analyze XML sitemap for a website. Crawls site structure and creates sitemap with priorities and change frequencies.
{"url": "https://example.com","maxPages": 100,"includeImages": true}
Returns: Complete sitemap XML, URL stats, change frequencies
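Once you have the sitemap XML back, it can be inspected with nothing but the Python standard library. The sketch below assumes the standard sitemaps.org namespace; the sample document and helper are illustrative, not part of the actor's output contract.

```python
# Sketch: summarize a sitemap XML string such as the one generate_sitemap returns.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def summarize_sitemap(xml_text: str) -> dict:
    """Count URLs and tally change frequencies in a sitemap document."""
    root = ET.fromstring(xml_text)
    urls = root.findall("sm:url", NS)
    freqs = {}
    for url in urls:
        cf = url.find("sm:changefreq", NS)
        if cf is not None:
            freqs[cf.text] = freqs.get(cf.text, 0) + 1
    return {"url_count": len(urls), "changefreq": freqs}

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><changefreq>daily</changefreq></url>
  <url><loc>https://example.com/about</loc><changefreq>monthly</changefreq></url>
</urlset>"""

print(summarize_sitemap(sample))
# → {'url_count': 2, 'changefreq': {'daily': 1, 'monthly': 1}}
```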
4. audit_technical_seo
Comprehensive technical SEO audit including meta tags, headers, schema markup, page speed indicators, mobile-friendliness, and crawlability issues.
{"url": "https://example.com","depth": 2,"checkMobile": true}
Returns: Overall score, category scores, issues by severity, recommendations
5. test_rich_results
Test structured data and schema.org markup for Google Rich Results eligibility. Validates JSON-LD, Microdata, and RDFa.
{"url": "https://example.com"}
Returns: Eligibility status, detected schema types, validation errors/warnings
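For reference, this is the kind of JSON-LD markup the tool validates - a minimal schema.org Article object (illustrative values), which would sit on the page inside a `<script type="application/ld+json">` tag:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article",
  "author": {"@type": "Person", "name": "Jane Doe"},
  "datePublished": "2024-01-15"
}
```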
6. run_lighthouse
Run Google Lighthouse audit for performance, accessibility, best practices, SEO, and PWA scores.
{"url": "https://example.com","device": "mobile","categories": ["performance", "accessibility", "seo"]}
Returns: Category scores, Core Web Vitals, opportunities, diagnostics
7. run_gtmetrix
Run GTmetrix performance test for page speed, Core Web Vitals (LCP, TBT, CLS), and optimization recommendations.
{"url": "https://example.com","location": "San Antonio, TX","browser": "chrome"}
Returns: Grades, Web Vitals, page timing, size breakdown, recommendations
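All seven tools share the same request shape when called through the Apify run API shown in Quick Start below: a `tool` name, a `toolInput` object, and an optional `demoMode` flag. A small Python sketch of building that body (the helper itself is illustrative, not part of the actor):

```python
# Sketch: build the request body for calling one of the seven tools through
# the Apify run API. Field names follow the Quick Start curl example.
import json

ACTOR = "localhowl~technical-seo-mcp-server"
RUN_URL = f"https://api.apify.com/v2/acts/{ACTOR}/runs"

def build_tool_call(tool: str, tool_input: dict, demo: bool = True) -> dict:
    """Return the JSON body expected by the MCP server's run endpoint."""
    return {"tool": tool, "toolInput": tool_input, "demoMode": demo}

body = build_tool_call("check_robots_txt", {"url": "https://example.com"})
print(RUN_URL)
print(json.dumps(body, indent=2))
# POST this body with an "Authorization: Bearer YOUR_APIFY_TOKEN" header.
```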
Quick Start
Using with Apify (Direct)
# Run via API
curl -X POST "https://api.apify.com/v2/acts/localhowl~technical-seo-mcp-server/runs" \
  -H "Authorization: Bearer YOUR_APIFY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "tool": "run_lighthouse",
    "toolInput": {"url": "https://example.com", "device": "mobile"},
    "demoMode": true
  }'
Using with Claude Desktop
Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json):
{"mcpServers": {"technical-seo": {"command": "npx","args": ["-y", "@anthropic-ai/mcp-server-apify", "--actor", "localhowl~technical-seo-mcp-server"],"env": {"APIFY_TOKEN": "your_apify_token"}}}}
Using with n8n
Install the n8n-nodes-johnrippy-seo package for native n8n integration with this MCP server.
Example Prompts for AI Assistants
Once connected, you can ask your AI assistant:
- "Check the robots.txt file for example.com and identify any issues"
- "Run a Lighthouse audit on my-site.com for mobile"
- "Test if my homepage has valid structured data for rich results"
- "Generate a sitemap for my website with up to 200 pages"
- "Audit the technical SEO of competitor.com"
- "Check if example.com has an llms.txt file for AI crawlers"
- "Run a GTmetrix performance test on my landing page"
Use Cases
SEO Agencies
- Automated technical audits for client onboarding
- Regular monitoring of crawlability and indexability
- Competitor technical SEO analysis
Content Teams
- Verify structured data before publishing
- Check page speed impact of new content
- Monitor Core Web Vitals across key pages
Developers
- Pre-launch technical SEO checklist
- CI/CD integration for performance testing
- Schema markup validation
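For the CI/CD case, a typical pattern is to fail the build when a Lighthouse category drops below a floor. The sketch below assumes a flat dict of category scores in the 0-1 range (per the "category scores" the run_lighthouse tool returns); adapt the keys and thresholds to the actual actor output.

```python
# Sketch: a CI gate over Lighthouse category scores. The score-dict shape
# and thresholds are assumptions for illustration.
import sys

THRESHOLDS = {"performance": 0.80, "accessibility": 0.90, "seo": 0.90}

def failing_categories(scores: dict, thresholds: dict = THRESHOLDS) -> list:
    """Return the categories whose score falls below its threshold."""
    return [cat for cat, floor in thresholds.items()
            if scores.get(cat, 0.0) < floor]

if __name__ == "__main__":
    scores = {"performance": 0.72, "accessibility": 0.95, "seo": 0.91}
    failed = failing_categories(scores)
    if failed:
        print(f"Lighthouse gate failed: {failed}")
        sys.exit(1)
    print("Lighthouse gate passed")
```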
AI-First Workflows
- Claude or ChatGPT can audit sites on demand
- Automated technical SEO monitoring with AI analysis
- Natural language interface to technical SEO tools
Pricing
Pay-per-event pricing on Apify:
- Base cost per run
- Additional costs based on underlying actor usage
Demo mode is free and returns realistic sample data.
Related Tools
This MCP Server bundles these underlying Apify actors:
- robots-txt-checker - Robots.txt analysis
- llms-txt-checker - AI crawler policy checker
- sitemap-generator - XML sitemap generation
- technical-seo-auditor - Comprehensive technical audit
- rich-results-tester - Schema.org validation
- google-lighthouse-checker - Lighthouse audits
- gtmetrix-tester - GTmetrix performance testing
Support
- GitHub: Issues
- LinkedIn: John Rippy
- Website: johnrippy.link
License
MIT License - Free to use for any purpose.