
Allow users to configure embed_batch_size or ThreadPoolExecutor size when calling Client.embed #534

Open
acompa opened this issue Jul 3, 2024 · 1 comment

acompa commented Jul 3, 2024

It looks like batching was added in #437. Thank you for implementing this; it's very helpful.

I notice that batching, as defined here, uses a fixed batch size. This can be suboptimal for clients submitting a large number of small documents, since we cannot configure the ThreadPoolExecutor size to parallelize many small payloads. As a result, a client might end up blocked waiting on small network responses.

Would it be possible to allow clients to configure either the ThreadPoolExecutor size or the embed_batch_size setting when calling embed?
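To illustrate the request, here is a minimal sketch (not the SDK's actual implementation) of fixed-size batching behind a bounded thread pool. The names `embed_batch_size` and `max_workers` follow the issue's wording; with many small documents, a small pool caps how many network round-trips can be in flight at once.

```python
from concurrent.futures import ThreadPoolExecutor


def embed_in_batches(texts, embed_one_batch, embed_batch_size=96, max_workers=4):
    """Split texts into fixed-size batches and embed them on a thread pool.

    `embed_one_batch` stands in for one network call that embeds a list
    of texts. Parallelism is capped by `max_workers`, which is the knob
    this issue asks to expose.
    """
    batches = [
        texts[i : i + embed_batch_size]
        for i in range(0, len(texts), embed_batch_size)
    ]
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        per_batch = list(pool.map(embed_one_batch, batches))
    # Flatten per-batch results back into one embedding per input text.
    return [emb for batch in per_batch for emb in batch]
```

For example, embedding 10 one-character documents with `embed_batch_size=3` produces four batches, but only `max_workers` of them run concurrently.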

@billytrend-cohere
Collaborator

Hey @acompa, thanks for the feedback. This will be fixed by #536! You will be able to pass your own executor in. I'll ping the thread when it's released.
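As a rough sketch of what passing an executor could look like: the constructor parameter name `thread_pool_executor` and the `FakeClient` stand-in below are assumptions for illustration, not the confirmed API from #536 — check the released SDK for the actual signature.

```python
from concurrent.futures import ThreadPoolExecutor


class FakeClient:
    """Stand-in for the real client, only to show the call shape.

    The `thread_pool_executor` parameter name is hypothetical.
    """

    def __init__(self, thread_pool_executor=None):
        self.pool = thread_pool_executor or ThreadPoolExecutor(max_workers=4)

    def embed(self, texts):
        # Each small payload becomes its own task on the caller-supplied
        # pool instead of queuing behind a fixed-size default.
        return list(self.pool.map(lambda t: [float(len(t))], texts))


# A caller with many small documents can size the pool themselves.
client = FakeClient(thread_pool_executor=ThreadPoolExecutor(max_workers=64))
vectors = client.embed(["short doc"] * 8)
```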
