Ollama Model Scraper - Local LLM Library
Pricing
Pay per usage
Scrape Ollama model library for model names, sizes, quantization levels, parameter counts, and pull counts.
You can access the Ollama Model Scraper - Local LLM Library programmatically from your own applications using the Apify API. To use the API, you'll need an Apify account and your API token, which you can find under Integrations settings in the Apify Console.
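As a minimal sketch of the programmatic access described above: the snippet below starts an actor run via Apify's public REST API (`POST /v2/acts/{actorId}/runs`). The actor ID `tropical_quince~ollama-model-scraper` follows Apify's `username~actor-name` convention, and the `APIFY_TOKEN` environment variable name is an assumption for this example, not something mandated by the API.

```python
import json
import os
import urllib.request

APIFY_BASE = "https://api.apify.com/v2"


def run_actor_url(actor_id: str) -> str:
    # Apify's REST API addresses actors as "username~actor-name"
    return f"{APIFY_BASE}/acts/{actor_id}/runs"


def start_run(actor_id: str, token: str, run_input: dict) -> dict:
    # POST starts an actor run; the run's default dataset will hold
    # the scraped Ollama model records once the run finishes
    req = urllib.request.Request(
        run_actor_url(actor_id),
        data=json.dumps(run_input).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    token = os.environ["APIFY_TOKEN"]  # assumed env var name for your API token
    run = start_run("tropical_quince~ollama-model-scraper", token, {})
    print(run["data"]["id"])  # ID of the newly started run
```

The run executes asynchronously; you would then poll the run's status or fetch its dataset items through the same API once it succeeds.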
{
  "mcpServers": {
    "apify": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.apify.com/?tools=tropical_quince/ollama-model-scraper",
        "--header",
        "Authorization: Bearer <YOUR_API_TOKEN>"
      ]
    }
  }
}

Get a ready-to-use configuration for your MCP client, with the Ollama Model Scraper - Local LLM Library Actor preconfigured, at mcp.apify.com?tools=tropical_quince/ollama-model-scraper.
You can connect to the Apify MCP Server using a client such as Tester MCP Client or any other MCP client of your choice.
If you want to learn more about our Apify MCP implementation, check out our MCP documentation. To learn more about the Model Context Protocol in general, refer to the official MCP documentation or read our blog post.