Ollama Model Scraper - Local LLM Library
Pricing
Pay per usage
Scrape Ollama model library for model names, sizes, quantization levels, parameter counts, and pull counts.
You can access Ollama Model Scraper - Local LLM Library programmatically from your own applications by using the Apify API. To use the Apify API, you'll need an Apify account and your API token, which you can find under Integrations settings in Apify Console.
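As a sketch of the programmatic route, the snippet below calls the actor through the official `apify-client` Python package (`pip install apify-client`) and iterates over the resulting dataset. The actor ID and input fields mirror the CLI example on this page; the `APIFY_TOKEN` environment variable name is an assumption for illustration.

```python
import os

# Same input as the CLI example below: search the Ollama library
# for models matching these terms.
run_input = {
    "searchTerms": ["llama", "mistral"],
}

def fetch_models(token: str) -> list[dict]:
    """Run the actor on the Apify platform and return its dataset items."""
    # Imported lazily so the sketch can be read/run without the package installed.
    from apify_client import ApifyClient

    client = ApifyClient(token)
    run = client.actor("tropical_quince/ollama-model-scraper").call(run_input=run_input)
    return list(client.dataset(run["defaultDatasetId"]).iterate_items())

if __name__ == "__main__":
    token = os.environ.get("APIFY_TOKEN")  # assumed env var holding your API token
    if token:
        for item in fetch_models(token):
            print(item)
```

Each dataset item corresponds to one scraped model entry (name, sizes, quantization levels, parameter counts, pull counts).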
The Apify CLI is the official tool that allows you to use Ollama Model Scraper - Local LLM Library locally, providing convenience functions and automatic retries on errors.

$ echo '{
  "searchTerms": [
    "llama",
    "mistral"
  ]
}' | apify call tropical_quince/ollama-model-scraper --silent --output-dataset
Using installation script (macOS/Linux):

$ curl -fsSL https://apify.com/install-cli.sh | bash

Using installation script (Windows):

$ irm https://apify.com/install-cli.ps1 | iex

Using Homebrew:

$ brew install apify-cli

Using NPM:

$ npm install -g apify-cli

Other API clients include: