
Update llama-run to include temperature option #10899

Merged: 1 commit into ggerganov:master on Dec 23, 2024

Conversation

@ericcurtin (Contributor) commented on Dec 19, 2024:

This commit updates the examples/run/README.md file to include a new option for setting the temperature and updates the run.cpp file to parse this option.
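For readers who just want the gist of the change, here is a minimal sketch of the kind of option parsing the description refers to. This is not the actual run.cpp diff: the `RunOptions` struct, the `parse_options` function, and the 0.8 default are made up for illustration only.

```cpp
// Hypothetical sketch of parsing a --temp option; not the code merged in this PR.
#include <cstdio>
#include <cstdlib>
#include <string>

struct RunOptions {
    float       temperature = 0.8f; // illustrative default, not necessarily llama-run's
    std::string model;
};

// Walk argv, pick up --temp, and treat the remaining argument as the model.
// Returns true on success, false on a malformed command line.
static bool parse_options(int argc, char ** argv, RunOptions & opts) {
    for (int i = 1; i < argc; ++i) {
        const std::string arg = argv[i];
        if (arg == "--temp") {
            if (i + 1 >= argc) {
                fprintf(stderr, "error: --temp requires a value\n");
                return false;
            }
            opts.temperature = std::strtof(argv[++i], nullptr);
        } else {
            opts.model = arg;
        }
    }
    return !opts.model.empty();
}

int main(int argc, char ** argv) {
    RunOptions opts;
    if (!parse_options(argc, argv, opts)) {
        fprintf(stderr, "usage: llama-run [--temp T] MODEL\n");
        return 1;
    }
    printf("model: %s, temperature: %.2f\n", opts.model.c_str(), opts.temperature);
    return 0;
}
```

In the real tool the parsed value would feed into sampler setup rather than being printed; llama.cpp's sampling API exposes a temperature sampler (`llama_sampler_init_temp`) that such a value can be passed to.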

@ericcurtin force-pushed the llama-run-temp branch 3 times, most recently from ca259bd to cd61ea0, on December 19, 2024 at 14:43.
@ngxson (Collaborator) left a comment:


Tbh I'm not sure what the long-term plan for llama-run is.

My thought is that if we add --temp now, I'm pretty sure someone will also add other sampling params like top-k, top-p, DRY, etc. in the near future, to the point that it defeats the initial goal of llama-run, which is "just run".

Two review comments on examples/run/run.cpp were marked outdated and resolved.
@ericcurtin (Contributor, Author) commented:

> Tbh I'm not sure what the long-term plan for llama-run is.
>
> My thought is that if we add --temp now, I'm pretty sure someone will also add other sampling params like top-k, top-p, DRY, etc. in the near future, to the point that it defeats the initial goal of llama-run, which is "just run".

I had a use case for --temp.

If somebody has a use case for extra arguments, I don't have an immediate issue with merging them. Yes, I'd hope llama-run stays less complex than llama-cli, but personally I have no problem with people adding extra args if they need them.

This commit updates the `examples/run/README.md` file to include a new
option for setting the temperature and updates the `run.cpp` file to
parse this option.

Signed-off-by: Eric Curtin <[email protected]>
@ericcurtin (Contributor, Author) commented:

But of course, simple use cases like:

`llama-run smollm:135m`

should continue to work.
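For illustration, the option discussed above would be supplied alongside that form, e.g. `llama-run --temp 0.8 smollm:135m`; the 0.8 value is only an example, and the exact flag placement and default follow the merged option parsing and the updated README.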

@ericcurtin (Contributor, Author) commented:

This should be an easy review @slaren @ggerganov

@slaren merged commit dab76c9 into ggerganov:master on Dec 23, 2024.
48 checks passed.
@ericcurtin deleted the llama-run-temp branch on December 23, 2024 at 11:43.