Configuring LibreChat to connect to Ollama appears to use the openai.js client instead of the ollama.js client. #5085
Replies: 3 comments 1 reply
-
This is expected; the Ollama client is only used when attaching images or when using agents. I use Ollama and do not experience the issue, but I can look into always using that client, as suggested.
-
Same challenge here. I am getting the AI response error "aborting request: connection error". Debug log:
-
This is a common issue with Ollama and most apps. Please see: continuedev/continue#2358
-
What happened?
Configuring LibreChat to connect to Ollama appears to use the openai.js client instead of the ollama.js client.
This results in minor incompatibilities due to the behavior of the OpenAI client, as described in issue #3330.
This assumption is based primarily on observations in the log files (e.g., chat-completion reports) and on the behavior encountered, which matches that of the OpenAI client.
Best regards
K.
Steps to Reproduce
Configure a custom provider (Ollama) as described in the documentation.
Look in the debug log; it shows that the OpenAIClient is being used.
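For context, the reproduction assumes a custom-endpoint entry along these lines in `librechat.yaml`. This is a minimal sketch, not the exact configuration from the report; the model name and port are illustrative (Ollama's OpenAI-compatible API is assumed to be on its default port 11434):

```yaml
# librechat.yaml — hypothetical custom endpoint pointing at a local Ollama server.
# With a configuration like this, LibreChat routes requests through its
# OpenAI-compatible client (openai.js) rather than the dedicated ollama.js client.
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"            # Ollama ignores the key, but the field is required
      baseURL: "http://localhost:11434/v1/"
      models:
        default: ["llama3"]       # illustrative model name
        fetch: true               # fetch the model list from the server
```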
What browsers are you seeing the problem on?
Firefox
Relevant log output
Screenshots
No response