Jan 8, 2026
New Features for AI-Powered Development
Making it easier than ever to integrate Apify with your AI coding assistants and agents
1. MCP Documentation Tools, No Authentication Required
Give your AI agents direct access to Apify's technical documentation without the hassle of API keys. We've released MCP documentation tools that allow AI agents to read and reference official Apify documentation instantly, with zero configuration overhead.
Why this matters:
- Zero-friction setup: No API keys, tokens, or complex authentication flows. Install and start using immediately.
- Reduced hallucinations: Your AI agent pulls from the actual documentation source, ensuring accurate and up-to-date responses.
- Read-only & secure: Pure information retrieval with no write access and no security risks to your account or infrastructure.
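For clients that take a JSON MCP configuration, a no-auth documentation server entry looks roughly like the sketch below. The server name and URL here are illustrative assumptions, not the exact endpoint; use the copy-ready config from our docs for the real values. Note the absence of any auth headers or tokens:

```json
{
  "mcpServers": {
    "apify-docs": {
      "url": "https://mcp.apify.com"
    }
  }
}
```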
2. One-Click Cursor & VS Code Integration
Connect your favorite AI-powered IDE to Apify's MCP server in seconds. We've added native integration options directly to our documentation's LLM dropdown, allowing you to configure Apify's MCP server in Cursor or VS Code with a single click.
What's new:
- Copy MCP Config — Instantly copy the pre-configured MCP server JSON to your clipboard
- Connect to Cursor — Opens Cursor directly with the Apify MCP configuration via cursor:// deeplink
- Connect to VS Code — Opens VS Code with the configuration via vscode: deeplink
- Web configurator fallback — If deeplinks aren't available, you're redirected to our web-based MCP configurator
Where to find it: Look for the LLM dropdown menu on any of our main documentation pages. Select your preferred IDE, and you're connected. No manual JSON editing required.
Why this matters:
- Ask your AI coding assistant questions while it references actual Apify documentation
- Get accurate code examples without switching context between your IDE and browser
- Debug issues with your agent having full access to relevant guides
3. Full Documentation Available as LLM-Ready Markdown
Index our entire documentation directly in your coding environment for maximum AI context. Our complete documentation is now available in LLM-optimized formats, designed specifically for AI coding assistants and RAG pipelines:
- llms.txt (https://docs.apify.com/llms.txt): a condensed index for quick context in smaller context windows
- llms-full.txt (https://docs.apify.com/llms-full.txt): the complete documentation text, suited to full indexing and comprehensive RAG systems
Why developers love this:
- Local indexing in Cursor/VS Code — Add these URLs to your IDE's documentation index. Your AI assistant will have persistent access to Apify's documentation without making external calls during conversations.
- Faster responses — Pre-indexed documentation means no latency from real-time lookups. Your agent responds instantly with accurate Apify knowledge.
- Offline-capable — Download llms-full.txt once and your AI assistant works with full Apify context even without internet access.
- RAG pipeline ready — These files are structured specifically for chunking and embedding in vector databases.
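As a sketch of the RAG-pipeline use case, the Markdown in llms-full.txt can be split at heading boundaries into chunks ready for embedding. The heading-based splitter below is one simple strategy under our own assumptions, not an official recipe, and the `sample` text is invented for illustration:

```python
import re

def chunk_markdown(text: str, max_chars: int = 2000) -> list[str]:
    """Split Markdown into chunks at heading boundaries,
    then cap each chunk at max_chars for embedding."""
    # Split before every heading line (#, ##, ..., ######).
    sections = re.split(r"\n(?=#{1,6} )", text)
    chunks = []
    for section in sections:
        section = section.strip()
        if not section:
            continue
        # Oversized sections are further split into fixed-size pieces.
        for start in range(0, len(section), max_chars):
            chunks.append(section[start:start + max_chars])
    return chunks

sample = "# Actors\nIntro text.\n\n## Running an Actor\nDetails here."
for chunk in chunk_markdown(sample):
    print(chunk.splitlines()[0])
```

Each chunk then goes through your embedding model and into a vector store; splitting on headings keeps each chunk on a single topic, which tends to improve retrieval quality.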
Quick setup for Cursor:
- Open Cursor Settings
- Navigate to Indexing and Docs
- Click Add Docs
- Add: https://docs.apify.com/llms-full.txt
Your Cursor AI will now reference Apify documentation when answering questions about the platform.
Getting Started
All three features work together to create a seamless AI-assisted development experience:
- Quick questions? → Use the MCP documentation tools (no auth required)
- IDE integration? → Click "Connect to Cursor" or "Connect to VS Code" in our docs
- Deep indexing? → Add llms-full.txt to your project for persistent context