
fix: changed stream parameter to streaming in curl command #1023

Closed

Conversation

jjmaturino

Description

Incorrect JSON key used in the curl command.

Issues

N/A

Type of change

List the type of change as below. Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)

Dependencies

N/A

Tests

Integration tests under tests/llms

@jjmaturino
Author

@lvliang-intel RFR 🛸

@xiguiw
Collaborator

xiguiw commented Dec 26, 2024

@jjmaturino

https://github.com/huggingface/text-generation-inference/blob/main/README.md#get-started
It's stream, not streaming.

Would you please let us know in which format or protocol streaming is used?

@jjmaturino
Author

@xiguiw

On the Pydantic model LLMParamsDoc, the attribute for stream is defined as streaming.

From my understanding and testing, Pydantic's JSON serialization and deserialization use the attribute name as the key in the JSON object.

streaming: bool = True

This is why I changed stream to streaming, to make the curl command work.

So if stream is the desired key, we will need to change the Pydantic model's field name... that seems like a breaking change, IMO.
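A minimal sketch of the behavior described above, using a simplified stand-in for LLMParamsDoc (the class here is illustrative, not the project's actual schema) and the Pydantic v2 API. It also sketches one possible non-breaking alternative: declaring stream as a validation alias so both keys are accepted on input.

```python
# Sketch only: a simplified stand-in for LLMParamsDoc, assuming Pydantic v2.
from pydantic import BaseModel, ConfigDict, Field

class LLMParamsDoc(BaseModel):
    # The JSON key follows the attribute name, so clients must send "streaming".
    streaming: bool = True

# Serialization emits the attribute name as the key.
print(LLMParamsDoc().model_dump_json())  # {"streaming":true}

# Possible non-breaking alternative: accept "stream" as an input alias
# while keeping the "streaming" attribute name in code.
class LLMParamsDocAliased(BaseModel):
    model_config = ConfigDict(populate_by_name=True)
    streaming: bool = Field(default=True, alias="stream")

# Both spellings now validate to the same field.
assert LLMParamsDocAliased.model_validate({"stream": False}).streaming is False
assert LLMParamsDocAliased.model_validate({"streaming": False}).streaming is False
```

With the alias in place, the documented curl command could keep the stream key without renaming the model's attribute; whether that is preferable to updating the README is a maintainer call.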

@jjmaturino
Author

@xiguiw Let me know if you want me to open up an issue and related pr for the topic discussed above!

@jjmaturino
Author

Created issue ticket

@jjmaturino jjmaturino closed this Dec 26, 2024
@jjmaturino jjmaturino force-pushed the fix-pg-textgen-readme branch from eb7aaef to 8d6b4b0 Compare December 26, 2024 07:01
@jjmaturino jjmaturino reopened this Dec 26, 2024
Signed-off-by: Juan Maturino <[email protected]>
@jjmaturino jjmaturino force-pushed the fix-pg-textgen-readme branch from d0f90c0 to c0ffb7c Compare December 26, 2024 07:05
@xiguiw
Collaborator

xiguiw commented Dec 30, 2024

@xiguiw

On the Pydantic model LLMParamsDoc, the attribute for stream is defined as streaming.

From my understanding and testing, Pydantic's JSON serialization and deserialization use the attribute name as the key in the JSON object.

streaming: bool = True

This is why I changed stream to streaming, to make the curl command work.

So if stream is the desired key, we will need to change the Pydantic model's field name... that seems like a breaking change, IMO.

@jjmaturino
Good catch!

Thanks! Let me check it.

@jjmaturino
Author

@xiguiw No problem! Let me know if I can help.

Also, any tips on getting PRs reviewed more quickly in the future?

I want to contribute as much as I can to this project and would like a tight feedback loop 🚀

Hope you have a great week and happy new year!

@xiguiw
Collaborator

xiguiw commented Dec 31, 2024

@xiguiw No problem! Let me know if I can help.

Also, any tips on getting PRs reviewed more quickly in the future?

I want to contribute as much as I can to this project and would like a tight feedback loop 🚀

Hope you have a great week and happy new year!

@jjmaturino
Thank you for your contribution!
Sorry for the late response on the PR review.
We noted this and are now trying to improve the process.

My suggestion for getting faster PR review feedback is to clearly describe the problem or feature the PR addresses, so we can understand the PR's priority.

Happy new year!

@xiguiw
Collaborator

xiguiw commented Dec 31, 2024

@jjmaturino
Could this PR be closed?

@jjmaturino
Author

@xiguiw Yes! I was talking about other PRs.

@jjmaturino
Author

Closed!

@jjmaturino jjmaturino closed this Dec 31, 2024