OpenRouter Proxy
Use any LLM without registering with individual AI providers. Use this Actor as a proxy for all your requests and pay only for the credit you actually use, thanks to pay-per-event pricing.
This Apify Actor provides a proxy for the OpenRouter API, allowing you to access multiple AI models through a unified OpenAI-compatible interface. All requests are charged to your Apify account on a pay-per-event basis.
What this Actor does
- Proxy access: Routes your API requests to OpenRouter's extensive collection of AI models
- OpenAI compatibility: Works seamlessly with the OpenAI SDK and any OpenAI-compatible client
- Transparent billing: Charges are applied to your Apify account at the same rates as OpenRouter
- Full feature support: Supports both streaming and non-streaming responses
- No API key management: Uses your Apify token for authentication, so there is no separate OpenRouter API key to manage
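Because the proxy is OpenAI-compatible, you can also call it with plain HTTP, no SDK required. The sketch below builds such a request against the Actor's endpoint from this README; the helper name `buildChatRequest` is illustrative, not part of the Actor.

```javascript
// Minimal no-SDK sketch: build an OpenAI-style chat completion request
// against the proxy. Only the base URL comes from this README; the helper
// is a hypothetical convenience for illustration.
const PROXY_BASE = 'https://michal-kalita--openrouter-proxy.apify.actor/api/v1';

function buildChatRequest(model, prompt, apifyToken) {
  return {
    url: `${PROXY_BASE}/chat/completions`,
    options: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        // Your Apify token, NOT an OpenRouter API key.
        Authorization: `Bearer ${apifyToken}`,
      },
      body: JSON.stringify({
        model,
        messages: [{ role: 'user', content: prompt }],
      }),
    },
  };
}

// Usage (requires a valid APIFY_TOKEN):
// const { url, options } = buildChatRequest('openrouter/auto', 'Hello!', process.env.APIFY_TOKEN);
// const res = await fetch(url, options);
// console.log((await res.json()).choices[0].message.content);
```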
Pricing
This Actor uses Apify's pay-per-event pricing model. Each API request counts as one event. The underlying OpenRouter API cost is included in the per-event price, plus a 10% fee that covers running the proxy server.
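As a quick illustration of the fee described above (the function name is ours, for illustration only): a request that costs $0.02 on OpenRouter would be billed as $0.022.

```javascript
// Illustration of the pricing model: the event charge is the underlying
// OpenRouter cost plus a 10% proxy fee.
const PROXY_FEE = 0.1;

function billedCostUsd(openRouterCostUsd) {
  return openRouterCostUsd * (1 + PROXY_FEE);
}

// A $0.02 OpenRouter request is billed as roughly $0.022.
console.log(billedCostUsd(0.02));
```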
Quick start
1. Install the OpenAI package
```shell
npm install openai
```
2. Basic usage
```javascript
import OpenAI from 'openai';

const openai = new OpenAI({
  baseURL: 'https://michal-kalita--openrouter-proxy.apify.actor/api/v1',
  apiKey: 'no-key-required-but-must-not-be-empty', // Any non-empty string works; do NOT use a real API key.
  defaultHeaders: {
    Authorization: `Bearer ${process.env.APIFY_TOKEN}`, // Apify token is loaded automatically in runtime
  },
});

async function main() {
  const completion = await openai.chat.completions.create({
    model: 'openrouter/auto',
    messages: [
      { role: 'user', content: 'What is the meaning of life?' },
    ],
  });

  console.log(completion.choices[0].message);
}

await main();
```
3. Streaming responses
```javascript
const stream = await openai.chat.completions.create({
  model: 'openrouter/auto',
  messages: [
    { role: 'user', content: 'Write a short story about a robot.' },
  ],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
```
Available models
This proxy supports all models available through OpenRouter, from providers including:
- OpenAI
- Anthropic
- Meta
- Perplexity
- And many more...
For a complete list of available models, visit OpenRouter's models page.
Authentication
The Actor authenticates with your Apify token. In Apify Actor environments, APIFY_TOKEN is available automatically. For local development, you can:
- Set the environment variable:
export APIFY_TOKEN=your_token_here
- Or pass it directly in the Authorization header
- Find your token in the Apify Console
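For local development, the options above can be sketched as a small helper that reads the token from the environment and builds the Authorization header yourself. The helper name is hypothetical; pass its result as `defaultHeaders` when constructing the OpenAI client, or directly in a `fetch()` call.

```javascript
// Local-development sketch: build the Authorization header from APIFY_TOKEN.
// The helper name is illustrative, not part of the Actor.
function apifyAuthHeader(token = process.env.APIFY_TOKEN) {
  if (!token) {
    throw new Error('APIFY_TOKEN is not set; export it or pass a token explicitly');
  }
  return { Authorization: `Bearer ${token}` };
}

// Usage with the OpenAI SDK:
// const openai = new OpenAI({
//   baseURL: 'https://michal-kalita--openrouter-proxy.apify.actor/api/v1',
//   apiKey: 'no-key-required-but-must-not-be-empty',
//   defaultHeaders: apifyAuthHeader(),
// });
```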
Support
For issues related to this Actor, please contact the Actor developer.