OpenAI Vector Store Integration

Developed by Jiří Spilka
Maintained by Apify
The Apify OpenAI Vector Store integration uploads data from Apify Actors to the OpenAI Vector Store linked to an OpenAI Assistant.

Rating: 4.8 (5)
Pricing: Pay per usage
Total users: 166
Monthly users: 28
Runs succeeded: 90%
Last modified: 4 months ago

You can access the OpenAI Vector Store Integration programmatically from your own applications using the Apify API. To use the Apify API, you’ll need an Apify account and your API token, which you can find under Integrations settings in Apify Console.
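For example, you can start the Actor from Python with the Apify API client. This is a minimal sketch: the input fields shown (vectorStoreId, openaiApiKey, datasetId) are illustrative placeholders for this example, so check the Actor's input schema in Apify Console for the exact field names.

# pip install apify-client
from apify_client import ApifyClient

# Authenticate with your Apify API token (Integrations settings in Apify Console).
client = ApifyClient("<YOUR_API_TOKEN>")

# Illustrative input only -- consult the Actor's input schema for the real field names.
run_input = {
    "vectorStoreId": "<YOUR_VECTOR_STORE_ID>",  # assumed field: target OpenAI Vector Store
    "openaiApiKey": "<YOUR_OPENAI_API_KEY>",    # assumed field: OpenAI API key
    "datasetId": "<APIFY_DATASET_ID>",          # assumed field: dataset produced by another Actor
}

# Start the Actor and wait for the run to finish.
run = client.actor("jiri.spilka/openai-vector-store-integration").call(run_input=run_input)
print("Run finished with status:", run["status"])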

Configure MCP server with OpenAI Vector Store Integration

{
  "mcpServers": {
    "local-actors-mcp-server": {
      "command": "npx",
      "args": [
        "-y",
        "@apify/actors-mcp-server",
        "--actors",
        "jiri.spilka/openai-vector-store-integration"
      ],
      "env": {
        "APIFY_TOKEN": "<YOUR_API_TOKEN>"
      }
    }
  }
}
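If you want to sanity-check this stdio configuration outside of a desktop client, the sketch below spawns the same command with the MCP Python SDK and lists the tools the server exposes. The mcp package and its client API are an assumption of this example, not part of the integration itself.

# pip install mcp
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Mirror the "command", "args" and "env" fields from the stdio config above.
server = StdioServerParameters(
    command="npx",
    args=["-y", "@apify/actors-mcp-server",
          "--actors", "jiri.spilka/openai-vector-store-integration"],
    env={**os.environ, "APIFY_TOKEN": "<YOUR_API_TOKEN>"},  # keep PATH so npx can be found
)

async def main() -> None:
    # Spawn the server over stdio and open an MCP session.
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())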

You can interact with the MCP server via standard input/output (stdio), as shown above, which is ideal for local integrations and command-line tools such as the Claude desktop client. Alternatively, you can interact with the server through Server-Sent Events (SSE) to send messages and receive responses, which looks as follows:

{
  "mcpServers": {
    "remote-actors-mcp-server": {
      "type": "sse",
      "url": "https://mcp.apify.com/sse?actors=jiri.spilka/openai-vector-store-integration",
      "headers": {
        "Authorization": "Bearer <YOUR_API_TOKEN>"
      }
    }
  }
}
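Before wiring up a full MCP client, you can confirm that the SSE endpoint and your token work with a short connectivity check. The sketch below uses the Python requests library (an assumption of this example) to open the stream and print incoming frames; it is not a complete MCP client.

# pip install requests
import requests

URL = "https://mcp.apify.com/sse?actors=jiri.spilka/openai-vector-store-integration"
HEADERS = {"Authorization": "Bearer <YOUR_API_TOKEN>"}

# Open the Server-Sent Events stream; the server keeps the connection alive.
with requests.get(URL, headers=HEADERS, stream=True, timeout=30) as response:
    response.raise_for_status()
    for line in response.iter_lines(decode_unicode=True):
        if line:          # skip blank SSE keep-alive lines
            print(line)   # prints "event: ..." / "data: ..." frames until interrupted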

You can connect to the Apify MCP Server using clients such as Tester MCP Client or any other supported MCP client.

If you want to learn more about our Apify MCP implementation, check out our MCP documentation. To learn more about the Model Context Protocol in general, refer to the official MCP documentation or read our blog post.