Since this already uses the OpenAI client, you can do something like this:

```python
from vanna.openai import OpenAI_Chat
from vanna.chromadb import ChromaDB_VectorStore
from openai import OpenAI

deepseek_client = OpenAI(api_key="<DeepSeek API Key>", base_url="https://api.deepseek.com")

class MyVanna(ChromaDB_VectorStore, OpenAI_Chat):
    def __init__(self, config=None):
        ChromaDB_VectorStore.__init__(self, config=config)
        OpenAI_Chat.__init__(self, client=deepseek_client, config=config)  # Your DeepSeek client is used here

vn = MyVanna(config={'model': 'deepseek-chat'})
```
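Not part of the original reply, but as a rough usage sketch: once `vn` is built this way it can be pointed at a database and queried like any other Vanna instance. The SQLite path, DDL, and question below are placeholder assumptions, not values from this thread:

```python
# Hedged usage sketch (placeholder names, not from the original comment)
vn.connect_to_sqlite("my_database.sqlite")  # hypothetical local SQLite file

# Optionally give Vanna some schema context so DeepSeek can generate better SQL
vn.train(ddl="CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, country TEXT)")

# The question is routed to deepseek-chat through the OpenAI-compatible client above
sql = vn.generate_sql("How many customers are there per country?")
print(sql)
```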
-
Looks like a good LLM, and the implementation doesn't seem too hard to do; we can use basically the same code as the OpenAI one.
https://api-docs.deepseek.com/
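If it helps to verify the endpoint first, here is a minimal sketch of a direct chat completion against DeepSeek's OpenAI-compatible API, based on those docs; the prompt is just an example:

```python
from openai import OpenAI

# DeepSeek exposes an OpenAI-compatible API, so the stock OpenAI client works
# once base_url is overridden (per https://api-docs.deepseek.com/).
client = OpenAI(api_key="<DeepSeek API Key>", base_url="https://api.deepseek.com")

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a SQL query that counts rows in a table named orders."},
    ],
)
print(response.choices[0].message.content)
```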