I'm trying to use chat-ui to talk to a vLLM server that is serving Mixtral-8x22B through vLLM's OpenAI-compatible API. vLLM was throwing an error on every request from chat-ui: "Conversation roles must alternate user/assistant/user/assistant/...". After some googling I learned that this happens because Mixtral's chat template doesn't support the "system" role, and that I should set a chatPromptTemplate like the one below. But setting chatPromptTemplate had no effect on the queries chat-ui was sending to vLLM. Can anyone tell me what I'm doing wrong? Here is my MODELS config:
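(For illustration only, not the poster's exact config: a chat-ui `MODELS` entry of this shape, in `.env.local`, with the Handlebars-style Mistral-family chatPromptTemplate that chat-ui documents. The model name and baseURL are placeholders.)

```env
MODELS=`[
  {
    "name": "mixtral-8x22b",
    "chatPromptTemplate": "<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}}{{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s>{{/ifAssistant}}{{/each}}",
    "endpoints": [
      {
        "type": "openai",
        "baseURL": "http://localhost:8000/v1"
      }
    ]
  }
]`
```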
Replies: 1 comment
I don't know why chat-ui seemed to be ignoring the chatPromptTemplate, but I eventually fixed the problem on the vLLM side by overriding the default Mixtral chat template with the one below:
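(A sketch of such an override, not necessarily the exact file from this thread. One plausible explanation for the original symptom: with the `"openai"` endpoint type, chat-ui sends the structured messages list to the chat-completions API and lets the server render the prompt, so chatPromptTemplate may never be applied. The Jinja template below works around the error by folding any leading "system" message into the first `[INST]` block, so the remaining roles strictly alternate.)

```jinja
{{- bos_token }}
{%- if messages[0]['role'] == 'system' -%}
    {%- set system_message = messages[0]['content'] -%}
    {%- set loop_messages = messages[1:] -%}
{%- else -%}
    {%- set system_message = '' -%}
    {%- set loop_messages = messages -%}
{%- endif -%}
{%- for message in loop_messages -%}
    {%- if message['role'] == 'user' -%}
        {{- ' [INST] ' -}}
        {%- if loop.first and system_message -%}
            {{- system_message + '\n\n' -}}
        {%- endif -%}
        {{- message['content'] + ' [/INST]' -}}
    {%- elif message['role'] == 'assistant' -%}
        {{- ' ' + message['content'] + eos_token -}}
    {%- endif -%}
{%- endfor -%}
```

Saved to a file (the filename here is a placeholder), it can be passed to vLLM's OpenAI-compatible server with the `--chat-template` flag, e.g. `vllm serve mistralai/Mixtral-8x22B-Instruct-v0.1 --chat-template ./mixtral_no_system.jinja`.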