It looks like the prompt wasn't removed from the response when using the Hugging Face provider. #294
Comments
Hey @danny-su, I'm pretty sure this is an issue with the LLM and not the plugin, since the plugin sets the message from the LLM's response. Edit: Huh, unless this LLM returns the prompt in the response on purpose. Is this something that can be configured?
@Blarc You can remove the leading prompt by its length.
@danny-su I am still not exactly sure how to implement this.
@Blarc You need to remove the first n characters of the response, where n is the prompt's length; the Hugging Face API has no option for this.
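The length-based stripping suggested above can be sketched as follows. This is an illustrative sketch, not the plugin's actual code; the function name and fallback behavior are assumptions.

```python
def strip_prompt(prompt: str, response: str) -> str:
    """Drop an echoed prompt from the start of an LLM response."""
    # If the model echoes the prompt verbatim, cut the first
    # len(prompt) characters and trim leftover whitespace.
    if response.startswith(prompt):
        return response[len(prompt):].lstrip()
    # Otherwise leave the response untouched; as noted later in the
    # thread, a plain length-based cut fails when the echo is partial.
    return response
```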
@Blarc It seems not to work as expected.
@danny-su That is a bit odd, since it works fine for me. Does the generated message contain the whole prompt or only a part?
@Blarc It only contains part of the prompt.
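One way to handle a partial echo, as reported here, is to strip the longest suffix of the prompt that the response starts with. Again a hedged sketch, not the plugin's implementation:

```python
def strip_partial_prompt(prompt: str, response: str) -> str:
    """Remove the longest tail of the prompt echoed at the start of the response."""
    # Try progressively shorter suffixes of the prompt until one
    # matches the beginning of the response, then cut it off.
    for start in range(len(prompt)):
        tail = prompt[start:]
        if response.startswith(tail):
            return response[len(tail):].lstrip()
    # No overlap found: return the response unchanged.
    return response
```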
@Blarc, I got a connection time-out error when using Llama 3.3.