Added temperature support for hosted LLM using vLLM

Changes made:

* Introduced default temperature of 0.7 in __init__ method
* Updated JSON payload in submit_prompt method to include temperature

This change allows users to control the randomness of the model's output. If not specified, it defaults to 0.7, providing a balance between creativity and coherence in the generated text.
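A minimal sketch of what the change might look like, assuming a requests-based client talking to vLLM's demo /generate endpoint; the class name VLLMHosted, the host parameter, and the response handling are illustrative assumptions, not the repository's actual code.

```python
import requests


class VLLMHosted:
    """Illustrative client for a vLLM-hosted model (names are assumptions)."""

    def __init__(self, host: str, temperature: float = 0.7):
        # Default of 0.7 balances creativity and coherence, per the commit.
        self.host = host
        self.temperature = temperature

    def submit_prompt(self, prompt: str) -> str:
        # The temperature is now included in the JSON payload sent to vLLM.
        payload = {
            "prompt": prompt,
            "temperature": self.temperature,
        }
        response = requests.post(f"{self.host}/generate", json=payload)
        response.raise_for_status()
        data = response.json()
        # vLLM's demo API server returns {"text": [...]}; the exact shape
        # may vary by deployment.
        return data["text"][0]
```

With this shape, a caller can pass a custom temperature (e.g. `VLLMHosted("http://localhost:8000", temperature=0.2)`) for more deterministic output, or omit it to get the 0.7 default.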