Error when using streaming chat completion #44
Comments
Just to expand on this a little (I work with @Ali-Kalout :) ): it's not easy to tell what the malformed JSON looks like, but it seems possible that this PR could improve the situation. I also noticed yesterday that OpenAI has added a JSON mode to chat completions. Adding …
If I'm not mistaken, I just update my `Instructor.chat_completion` call:

```elixir
Instructor.chat_completion(
  model: @model,
  response_model: response_model,
  messages: messages,
  stream: stream,
  max_retries: @max_retries,
  # the important part below
  response_format: %{type: "json_object"}
)
```
I have this PR that helps with debugging: #46.
I'm also running into this same issue. I believe it's happening because of the use of …
Hi all, we sometimes get the following error when using the chat completion feature with the `stream` field set to `true`.
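To illustrate the setup being discussed, here is a minimal sketch combining a streaming call with the JSON-mode workaround suggested in the comments. This is a hedged sketch, not a confirmed fix: `SpamPrediction` is a hypothetical response model, and it assumes `Instructor.chat_completion/1` returns an enumerable of partial results when `stream: true` is passed.

```elixir
# Sketch only: assumes Instructor.chat_completion/1 yields an enumerable
# of partial results when `stream: true` is set. SpamPrediction is a
# hypothetical response model defined elsewhere in the application.
Instructor.chat_completion(
  model: "gpt-3.5-turbo",
  response_model: SpamPrediction,
  messages: [%{role: "user", content: "Classify this message."}],
  stream: true,
  # The workaround from the comments above: enable OpenAI's JSON mode
  # so the model is constrained to emit well-formed JSON.
  response_format: %{type: "json_object"}
)
|> Enum.each(&IO.inspect/1)
```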