Issue: Different Query Predictions After Switching Vector DB to ChromaDB #682
Unanswered
VirendraSttl asked this question in Q&A
Replies: 1 comment

LLMs are fundamentally nondeterministic and can be sensitive to changes in both the embedding function and how the vector database performs similarity search, so you should not expect exactly the same behavior. Generally speaking, adding more question-SQL pairs to your training data should help.
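As a rough illustration (not from the original thread), adding question-SQL pairs is typically done through vn.train(question=..., sql=...). The question, table, and column names below are placeholders, and vn is assumed to be an already-configured Vanna instance:

# Illustrative only: placeholder question and schema, assuming an
# already-configured Vanna instance `vn` (see the setup sketch under
# "To Reproduce" below).
vn.train(
    question="Which customers placed an order last month?",
    sql="""
        SELECT c.customer_name
        FROM customers AS c
        JOIN orders AS o ON o.customer_id = c.customer_id
        WHERE o.order_date >= DATE('now', 'start of month', '-1 month')
    """,
)

The more such pairs the store holds for the kinds of questions you ask, the more likely the retrieved context steers the LLM toward the query shape you expect.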
Description
I have been using the Vanna AI package with my custom LLM and initially tested it with the default Vanna vector DB. Everything worked fine, and the query predictions were accurate. However, after switching the vector DB to ChromaDB, I noticed that the model generates different SQL queries, even though the underlying data is the same.
Instead of returning the customer name, the model now returns the customer ID. This behavior only started after the switch to ChromaDB, and the structure of the results (i.e., the customer data) has not changed. I tested multiple times while switching between the vector DBs and got the same outcome each time.
To Reproduce
Steps to reproduce the behavior:
1. Use the default Vanna vector DB with the custom LLM – query prediction works fine and returns the customer name.
from vanna.vannadb import VannaDB_VectorStore
2. Switch the vector DB to ChromaDB (a minimal setup sketch follows this list).
from vanna.chromadb import ChromaDB_VectorStore
3. Query predictions change, and the model returns customer IDs instead of names.
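For reference, a minimal sketch of the setup being switched, assuming a hypothetical MyCustomLLM wrapper class for the custom LLM and a placeholder ChromaDB path (neither is from the original report):

from vanna.chromadb import ChromaDB_VectorStore
# from vanna.vannadb import VannaDB_VectorStore  # the previous vector store

from my_llm import MyCustomLLM  # hypothetical wrapper around the custom LLM


# Vanna combines a vector store and an LLM via multiple inheritance;
# switching vector stores only changes the vector-store base class.
class MyVanna(ChromaDB_VectorStore, MyCustomLLM):
    def __init__(self, config=None):
        ChromaDB_VectorStore.__init__(self, config=config)
        MyCustomLLM.__init__(self, config=config)


vn = MyVanna(config={"path": "./chroma"})  # placeholder config
print(vn.generate_sql("List the customers who placed an order last month"))

Only the vector-store base class differs between the two setups; the custom LLM and the training data are the same.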
Expected Behavior
The query predictions should remain consistent when switching from the default Vanna vector DB to ChromaDB. The model should return customer names as before, not customer IDs.
Actual Behavior
After switching to ChromaDB, the generated queries return customer IDs instead of customer names, although the underlying data has not changed.
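One way to narrow this down (not part of the original report) is to compare the question-SQL pairs each vector store retrieves for the same question, since that retrieved context is what the LLM sees. A minimal sketch, assuming two already-trained instances vn_vannadb and vn_chroma (hypothetical names):

question = "List the customers who placed an order last month"  # placeholder

# get_similar_question_sql returns the stored question-SQL pairs most similar
# to the input question; differences between the two stores here would explain
# the differing generated SQL.
for label, vn in [("VannaDB", vn_vannadb), ("ChromaDB", vn_chroma)]:
    print(label, vn.get_similar_question_sql(question))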