@ngocuyen1207
It's working fine on my end. I believe you didn't set up the Ollama server locally at "0.0.0.0:11434". You need to start the server locally first.

Install Ollama:
curl -fsSL https://ollama.com/install.sh | sh

Serve it locally:
ollama serve

Pull the desired model:
ollama pull llama3.2:1b

Check the endpoint:
curl http://0.0.0.0:11434/api/generate -d '{
  "model": "llama3.2:1b",
  "prompt": "Why is the sky blue?"
}'
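
If the curl check succeeds, you can run the same request from Python before going through adalflow. A minimal sketch, assuming only the requests package is installed; the URL and model name simply mirror the commands above:

import requests

# Same check as the curl command above; a ConnectionError on this call
# means no server is listening at 0.0.0.0:11434.
resp = requests.post(
    "http://0.0.0.0:11434/api/generate",
    json={
        "model": "llama3.2:1b",
        "prompt": "Why is the sky blue?",
        "stream": False,  # return a single JSON object instead of a stream
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["response"])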
Bug description
I tried to run this code:
from adalflow.core import Generator, ModelClientType, ModelClient

generator = Generator(
    model_client=ModelClientType.OLLAMA(host="0.0.0.0:11434"),
    model_kwargs={
        "model": "llama3.2",
    },
)
llm_response = generator.call(prompt_kwargs={"input_str": "Hi"})
Then I got this error:
Error calling the model: <TITLE>internal error - server connection terminated</TITLE>
internal error - server connection terminated
Description: internal error - server connection terminated
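
The HTML-wrapped message above is typically what a proxy or client returns when the request never reaches a running Ollama server, which matches the setup advice earlier in the thread. One quick way to confirm reachability, assuming the standalone ollama Python package is installed (the host mirrors the snippet above, with an explicit http:// scheme):

import ollama

# Raises a ConnectionError if nothing is listening at the given host,
# which would make the Generator call fail in the same way.
client = ollama.Client(host="http://0.0.0.0:11434")
print(client.list())  # lists locally pulled models when the server is up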
Environment
Linux