
Pinecone Integration
This integration transfers data from Apify Actors to a Pinecone database and is a good starting point for question-answering, search, or RAG use cases.
Max token length...
I can't push big databases; it seems the integration treats the whole dataset as one long chunk rather than a lot of smaller chunks. Hope there is a solution!
team2
For more context, this is the error I get:
Failed to update the database. Please ensure the following:
The database is configured properly. The vector dimension of your embedding model in the Actor input (Embedding settings → model) matches the one set up in the database. Error message: Error code: 400 - {'error': {'message': 'Requested 1,073,396 tokens, max 600,000 tokens per request', 'type': 'max_tokens_per_request', 'param': None, 'code': 'max_tokens_per_request'}}
So, when I push a database that is larger than 600,000 tokens, it fails. It is not just one oversized chunk: even if all chunks are only 500 tokens long, the run still fails once the total exceeds 600,000 tokens.
I wonder why. Who sets this limit? And is it possible to either:
- create a workaround so that a new request is started after every 600k tokens (a rough sketch of such batching follows below), or
- increase the limit?
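Until the Actor handles this itself, one possible workaround is to batch the embedding calls on your own side. The sketch below is only illustrative, not the Actor's actual code: it assumes OpenAI embeddings (as the error message suggests), uses tiktoken to count tokens, and flushes a new embeddings request whenever the next chunk would push the batch past the 600,000-token limit. Function and model names are placeholders.

```python
# Illustrative sketch: embed chunks in batches that stay under the
# provider's 600,000-token per-request limit reported in the error above.
import tiktoken
from openai import OpenAI

MAX_TOKENS_PER_REQUEST = 600_000   # limit from the error message
MAX_INPUTS_PER_REQUEST = 2_048     # OpenAI also caps the number of inputs per request

client = OpenAI()                  # assumes OPENAI_API_KEY is set in the environment
encoding = tiktoken.get_encoding("cl100k_base")

def embed_in_batches(chunks, model="text-embedding-3-small"):
    """Yield one embedding per chunk, splitting the work across several requests."""
    batch, batch_tokens = [], 0
    for chunk in chunks:
        tokens = len(encoding.encode(chunk))
        if batch and (
            batch_tokens + tokens > MAX_TOKENS_PER_REQUEST
            or len(batch) >= MAX_INPUTS_PER_REQUEST
        ):
            # Flush the current batch as its own embeddings request.
            yield from client.embeddings.create(model=model, input=batch).data
            batch, batch_tokens = [], 0
        batch.append(chunk)
        batch_tokens += tokens
    if batch:
        yield from client.embeddings.create(model=model, input=batch).data
```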
And another thing: instead of updating chunks in place, why is it not possible to just retrieve all chunks with matching URLs, push the new ones, and delete the old ones? Wouldn't this minimize the loading cost on Apify and reduce costs on Pinecone, where requesting so many chunks is quite expensive? In that case we would only query by the URL (a rough sketch of that flow follows below).
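For reference, a replace-by-URL flow like the one described above could look roughly like this. It is only a sketch under assumptions: it uses the Pinecone Python client, a metadata field named "url", and delete-by-metadata-filter, which works on pod-based indexes but not on serverless ones (there the matching IDs would have to be listed first); the index name and credentials are placeholders.

```python
# Illustrative sketch of a replace-by-URL strategy: delete every stored
# chunk whose metadata "url" matches, then upsert the freshly crawled
# chunks. Names and metadata keys are placeholders.
from pinecone import Pinecone

pc = Pinecone(api_key="YOUR_API_KEY")   # placeholder credentials
index = pc.Index("my-index")            # placeholder index name

def replace_page(url, new_vectors):
    """Drop all chunks stored for `url`, then insert the new ones in one upsert."""
    # Metadata-filter deletes are supported on pod-based indexes; on
    # serverless indexes the IDs would need to be listed and deleted by ID.
    index.delete(filter={"url": {"$eq": url}})
    # new_vectors: [{"id": ..., "values": [...], "metadata": {"url": url, ...}}, ...]
    index.upsert(vectors=new_vectors)
```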
team2
????
responsible_box
:)
responsible_box
:(
Actor Metrics
45 monthly users
25 bookmarks
95% runs succeeded
16 days response time
Created in Jun 2024
Modified a month ago