
Bug: BeamSearch of LLM inference does not take into account temperature parameter when choosing best beams #201

Open
Muxas opened this issue Nov 13, 2024 · 0 comments
Labels
bug Something isn't working

Comments

@Muxas
Member

Muxas commented Nov 13, 2024

Temperature is currently applied only when sampling tokens, but it should also shape the token probabilities used to compute the probability of a beam. Otherwise beams are ranked by untempered log-probabilities, which is inconsistent with the distribution the sampler actually draws from.
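A minimal sketch of the intended behavior, assuming a PyTorch-style tensor API (all names here are hypothetical for illustration, not NNTile's actual interface): the logits should be divided by the temperature before the log-softmax that feeds beam scoring, so that beam ranking and token sampling use the same tempered distribution.

```python
import torch

def score_beams(logits, beam_scores, temperature=1.0):
    """Extend each beam's score with tempered log-probabilities.

    logits:      (num_beams, vocab_size) raw model outputs for the last step
    beam_scores: (num_beams,) accumulated log-probability of each beam
    temperature: the same temperature the sampler uses
    """
    # The reported bug: beam scoring uses log_softmax(logits) directly,
    # ignoring temperature. Tempering the logits first fixes that.
    log_probs = torch.log_softmax(logits / temperature, dim=-1)
    # Candidate score of extending beam b with token t:
    # accumulated score of b + tempered log-prob of t.
    return beam_scores.unsqueeze(-1) + log_probs

# Example: 4 beams over a toy vocabulary of 10 tokens.
num_beams, vocab_size = 4, 10
logits = torch.randn(num_beams, vocab_size)
beam_scores = torch.zeros(num_beams)
candidates = score_beams(logits, beam_scores, temperature=0.7)
# The best beams are the top-scoring candidates across all extensions.
top_scores, top_idx = candidates.flatten().topk(num_beams)
```

With this change, temperature -> 0 makes beam selection approach greedy ranking and large temperatures flatten the scores, matching the sampler's behavior at the same setting.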

@Muxas Muxas added the bug Something isn't working label Nov 13, 2024
@github-project-automation github-project-automation bot moved this to To do in NNTile Nov 13, 2024