Replies: 3 comments
-
For me it is like catching clouds while dancing in the dark: how do I set up the max_tokens setting inside the local-llm module?
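A minimal sketch of one way to control this, assuming the Local-LLM/ezLocalai server exposes an OpenAI-compatible chat endpoint (ezLocalai advertises OpenAI-style endpoints; the port, API key, and model name below are assumptions, not values from this thread):

```python
# Hedged sketch: pass max_tokens per request to a local OpenAI-compatible
# server. The base_url, api_key, and model name are assumptions.
import openai

client = openai.OpenAI(
    base_url="http://localhost:8091/v1",  # assumed local server address
    api_key="none",                       # local servers typically ignore this
)

response = client.chat.completions.create(
    model="Mistral-7B-OpenOrca",  # whichever model the server has loaded
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    max_tokens=1024,  # upper bound on tokens generated for the reply
)
print(response.choices[0].message.content)
```

If replies still come back short, the cap may also be enforced server-side, so the server's own max token setting may need raising as well.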
-
Local-LLM has gone through a lot of changes. It is multimodal now, so the name has changed to ezLocalai.
-
The Local provider has been replaced by ezLocalai in the latest release.
-
Hello,
I'm using AGiXT with Docker and a local LLM (Mistral-7B-OpenOrca) for inference. I've tried several LLMs, and each produces a response of only 100-150 tokens (~460 characters). The max_tokens setting is set to 16384 in both Defaults.py and config.json (in the agents/local agent folder). The gpt4all agent gives >350-token (~1500-character) responses. How can I set the max_tokens of the response for a local agent?
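Since the question already touches config.json, here is a hedged sketch of raising the limit there programmatically; the path agents/local/config.json and the "settings"/"MAX_TOKENS" key names are assumptions about AGiXT's agent layout, not confirmed against its source:

```python
# Hypothetical sketch: bump the token limit in an AGiXT agent's config.json.
# The file path and the "settings"/"MAX_TOKENS" keys are assumptions.
import json
from pathlib import Path

config_path = Path("agents/local/config.json")  # assumed agent folder

config = json.loads(config_path.read_text())
config.setdefault("settings", {})["MAX_TOKENS"] = "16384"  # kept as a string in this sketch
config_path.write_text(json.dumps(config, indent=2))
```

Note that even with a large max_tokens, the effective response length can still be capped by the model's context window or by a default inside the inference backend, so short replies may need fixing on the provider/server side rather than in the agent config.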