
[Feature] JSON format output in ChatQnA #1249

Open
weichengintel opened this issue Dec 13, 2024 · 2 comments
Labels: feature (New feature or request)
Priority: P3-Medium
OS type: Ubuntu
Hardware type: Xeon-GNR
Running nodes: Single Node

Description

Add a feature that allows the LLM to output in JSON format, similar to what OpenAI provides, by including a parameter in the HTTP request.

Ref: Structured Outputs - OpenAI API

@weichengintel weichengintel added the feature New feature or request label Dec 13, 2024
xiguiw (Collaborator) commented Dec 25, 2024

@weichengintel

The example output format is of low priority.
Could you give some more background so that we can evaluate the priority of this feature?

weichengintel (Author) commented

> @weichengintel
>
> The example output format is of low priority. Could you give some more background so that we can evaluate the priority of this feature?

I also consider this low priority. My intention in proposing it is mainly to verify, from the output logs, which model is being used when interacting with the ChatQnA UI.
