How can I change embedding model for chromadb? #517
Answered by jinwook-chang
jinwook-chang asked this question in Q&A
Replies: 3 comments 1 reply
-
Can you elaborate? The snippet you posted will use ChromaDB's default embedding model. You want to change that? Change it to what?
-
# imports below assume a recent vanna release
from chromadb.utils import embedding_functions
from openai import OpenAI
from vanna.chromadb import ChromaDB_VectorStore
from vanna.openai import OpenAI_Chat

bge_embeddingFunction = embedding_functions.SentenceTransformerEmbeddingFunction(model_name="intfloat/multilingual-e5-large")  # your new model

client = OpenAI()  # OpenAI client for the chat model; reads OPENAI_API_KEY from the environment

class MyVanna(ChromaDB_VectorStore, OpenAI_Chat):
    def __init__(self, config=None):
        ChromaDB_VectorStore.__init__(self, config=config)
        OpenAI_Chat.__init__(self, client=client, config=config)

vn = MyVanna(config={'model': 'gpt-4o', 'temperature': 0, 'embedding_function': bge_embeddingFunction})
Answer selected by jinwook-chang
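For context, a minimal usage sketch of the vn object created in the accepted answer (the SQLite path, DDL, and question are placeholders, not from the thread); anything trained after this point is embedded with the model passed via 'embedding_function':

vn.connect_to_sqlite("my_database.sqlite")               # placeholder database path
vn.train(ddl="CREATE TABLE users (id INT, name TEXT)")   # stored and embedded with the custom model
print(vn.ask("How many users are there?"))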
-
I am unable to use OpenAIEmbeddingFunction with Chroma DB. Here is the error:
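For comparison, chromadb also provides an OpenAIEmbeddingFunction that can be passed through the same 'embedding_function' config key as in the accepted answer; a rough sketch, with the API key and embedding model name as placeholders (this does not address whatever error was hit above):

import os
from chromadb.utils import embedding_functions

# Sketch only: chromadb's OpenAI-backed embedding function, wired in like the SentenceTransformer one above
openai_ef = embedding_functions.OpenAIEmbeddingFunction(
    api_key=os.environ["OPENAI_API_KEY"],   # placeholder: your OpenAI key
    model_name="text-embedding-3-small",    # placeholder embedding model
)
vn = MyVanna(config={'model': 'gpt-4o', 'temperature': 0, 'embedding_function': openai_ef})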
-
I'm using the config below, but I'm not sure how to change the embedding model.