
# Ollama-Apify-MCP

Bring powerful local AI into your Apify workflows.

This project connects Ollama’s locally-run language models with the Model Context Protocol (MCP) and Apify’s scraping & automation platform. It enables you to process scraped data, extract insights, and generate intelligent responses — all without external APIs.


## 🧠 Overview

The Ollama-Apify-MCP Actor bridges Apify workflows with local LLMs via MCP, allowing AI-driven analysis and reasoning while preserving privacy and reducing costs.
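As a concrete illustration of the local inference the Actor builds on, here is a minimal sketch of a call to Ollama's standard REST API. The model name and prompt are placeholders, and this is an illustration rather than the Actor's exact internal code:

```python
import requests

# Assumes Ollama is running locally on its default port (11434)
# and that the "llama3" model has already been pulled.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # placeholder; use any model you have pulled
        "prompt": "Summarize this scraped page text: ...",
        "stream": False,    # return one JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])  # the model's generated text
```

Because the model runs on your own machine, no scraped data leaves it, which is where the privacy and cost benefits come from.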


## 🚀 Key Features

- 🔗 **Local LLM integration** – run models like Llama, Mistral, CodeLlama, and more through Ollama
- 🧩 **MCP-based communication** – standards-compliant protocol for tool interaction
- ⚙️ **Automatic context handling & preprocessing** – improves model response quality
- 🛠️ **Extensible tool architecture** – easily add custom MCP tools and resources (see the sketch after this list)
- 🔁 **Robust error handling & retries** – reliable execution in workflows
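To show what the extensible tool architecture can look like in practice, here is a hedged sketch using the official MCP Python SDK's `FastMCP` helper. The `word_count` tool is a hypothetical example, not one of the Actor's built-in tools, and the Actor's actual registration code may differ:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("ollama-apify-mcp")

@mcp.tool()
def word_count(text: str) -> int:
    """Count the words in a piece of scraped text (hypothetical example)."""
    return len(text.split())

if __name__ == "__main__":
    # Serve over HTTP so MCP clients can connect.
    mcp.run(transport="streamable-http")
```

Any function registered this way is advertised to connected MCP clients, which can then invoke it by name with typed arguments.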

## 📦 Quick Start

Use the Actor as a remote MCP server in Cursor, GitHub Copilot, Claude Code, or Claude Desktop by adding it to your client's MCP configuration:

```json
{
  "mcpServers": {
    "ollama-apify-mcp": {
      "url": "https://lenticular-negative--ollama-apify-mcp.apify.actor/mcp?token={YOUR_TOKEN}"
    }
  }
}
```
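Replace `{YOUR_TOKEN}` with your Apify API token, which you can find in your Apify Console account settings. Passing the token as a query parameter like this is an alternative to the `Authorization` header shown under Deploy to Apify below.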

## 💻 Run Locally

```bash
pip install -r requirements.txt
APIFY_META_ORIGIN=STANDBY python -m src
```

The server runs at `http://localhost:3000/mcp`.
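One way to smoke-test the local server is with the official MCP Python SDK's streamable-HTTP client. This sketch just initializes a session and lists whatever tools the server exposes:

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # Connect to the locally running MCP server and list its tools.
    async with streamablehttp_client("http://localhost:3000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```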

## ☁️ Deploy to Apify

1. Push the repo to GitHub
2. Add it as an Actor in Apify Console
3. Enable Standby Mode
4. Deploy
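If you prefer the command line, the Apify CLI (listed under Requirements) can deploy the same code directly from a local checkout with `apify push`; Standby Mode is then enabled in the Actor's settings in the Console.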

The MCP endpoint is:

```
https://lenticular-negative--ollama-apify-mcp.apify.actor/mcp
```

Include your Apify API token in the request headers:

```
Authorization: Bearer <APIFY_TOKEN>
```
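The local smoke-test sketch from Run Locally should work against the hosted endpoint as well, assuming the SDK's streamable-HTTP client accepts a `headers` argument (true in recent versions of the official MCP Python SDK):

```python
import asyncio
import os

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

URL = "https://lenticular-negative--ollama-apify-mcp.apify.actor/mcp"

async def main() -> None:
    # Read the token from the environment rather than hard-coding it.
    headers = {"Authorization": f"Bearer {os.environ['APIFY_TOKEN']}"}
    async with streamablehttp_client(URL, headers=headers) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            print([tool.name for tool in (await session.list_tools()).tools])

asyncio.run(main())
```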

## 🎯 Use Cases

- 📊 Analyze & summarize scraped web data
- 🔐 Privacy-first local LLM processing
- ⚡ Low-latency on-device inference
- 🧱 Build AI tools inside Apify workflows

## 🧩 Requirements

- Python 3.7+
- Ollama installed locally
- Apify CLI (for deployment)

## ❤️ Contributing

PRs and feature ideas are welcome — feel free to extend tools, improve docs, or share sample workflows.


## 📄 License

MIT License