How to answer search results in Chinese?
In the backend, change {"role": "user", "content": query} to {"role": "user", "content": query + "请把你的回答翻译成中文"} (i.e. append "please translate your answer into Chinese" to the query); the model will then very likely answer in Chinese.
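The suggestion above can be sketched as a small helper, assuming an OpenAI-style messages list (the helper and constant names are illustrative, not from the project's code):

```python
# Append a Chinese-translation instruction to the user query before it is
# sent to the backend LLM. The suffix is the one suggested in the comment
# above; the function name is an illustrative assumption.
TRANSLATE_SUFFIX = "请把你的回答翻译成中文"  # "please translate your answer into Chinese"

def build_user_message(query: str) -> dict:
    """Build the user chat message with the translation instruction appended."""
    return {"role": "user", "content": query + TRANSLATE_SUFFIX}
```

The rest of the messages list (system prompt, search-result context) stays unchanged; only the final user turn carries the extra instruction.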
Query in Chinese and use a Chinese model such as Qwen as the backend LLM; the default output will then be in Chinese.
How do I set the backend to Qwen? Can you provide guidance?
First deploy your own OpenAI-style API endpoint, with FastChat or anything else you like. Then change the line base_url=f"https://{self.model}.lepton.run/api/v1/" to point at it, e.g. base_url="http://localhost:8000/v1/", and change the model name in "LLM_MODEL": "mixtral-8x7b". That's all. Note that Qwen will not generate the Related questions, which may raise an error, but it does not affect the model's answer.
base_url = f"https://{self.model}.lepton.run/api/v1/"  # original
base_url = "http://localhost:8000/v1/"                 # local deployment
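The swap described above can be sketched as follows. This is a minimal illustration, assuming the app builds its base URL in one place; the function name, flag, and the "qwen-14b-chat" model string are hypothetical:

```python
# Choose between the hosted Lepton endpoint and a locally deployed
# OpenAI-compatible server (e.g. one started with FastChat).
# All names here are illustrative assumptions, not the project's API.
LLM_MODEL = "qwen-14b-chat"                    # replaces "mixtral-8x7b"
LOCAL_API_BASE = "http://localhost:8000/v1/"   # local FastChat-style endpoint

def build_base_url(model: str, use_local: bool) -> str:
    """Return the chat-completion base URL for the chosen backend."""
    if use_local:
        return LOCAL_API_BASE
    return f"https://{model}.lepton.run/api/v1/"
```

Any client that accepts a custom base URL (such as the official OpenAI Python client's `base_url` parameter) can then be pointed at the local server without further code changes.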