Error calling the Ollama #291

Open
ngocuyen1207 opened this issue Dec 2, 2024 · 2 comments
Labels
bug Something isn't working, either in /adalflow, /tutorials, or /use cases...

Comments

@ngocuyen1207

Bug description

I tried to run this code:
from adalflow.core import Generator, ModelClientType

generator = Generator(
    model_client=ModelClientType.OLLAMA(host="0.0.0.0:11434"),
    model_kwargs={
        "model": "llama3.2",
    },
)
llm_response = generator.call(prompt_kwargs={"input_str": "Hi"})

Then I got this error (the full 502 response is under "Error messages and logs" below):

Error calling the model: internal error - server connection terminated

What version are you seeing the problem on?

0.2.6

How to reproduce the bug

I tried to run this code:
from adalflow.core import Generator, ModelClientType

generator = Generator(
    model_client=ModelClientType.OLLAMA(host="0.0.0.0:11434"),
    model_kwargs={
        "model": "llama3.2",
    },
)
llm_response = generator.call(prompt_kwargs={"input_str": "Hi"})

Error messages and logs

Error calling the model: <HEAD><TITLE>internal error - server connection terminated</TITLE></HEAD>
<BODY BGCOLOR="white" FGCOLOR="black"><H1>internal error - server connection terminated</H1><HR>
<FONT FACE="Helvetica,Arial"><B>
Description: internal error - server connection terminated</B></FONT>
<HR>
<!-- default "internal error - server connection terminated" response (502) -->
</BODY>

Environment

  • OS: Linux

More info

No response

ngocuyen1207 added the bug label Dec 2, 2024
@liyin2015
Member

@liyin2015

@ajithvcoder
Contributor

@ngocuyen1207
It's working fine for me. I believe you didn't set up the Ollama server locally at "0.0.0.0:11434".
You need to set up the server first:

Install Ollama:
curl -fsSL https://ollama.com/install.sh | sh

Serve it locally. By default, ollama serve listens on 127.0.0.1:11434; set the OLLAMA_HOST environment variable if you need it bound to a different address:

ollama serve

Pull the desired model:
ollama pull llama3.2:1b
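
If the ollama Python package is installed (AdalFlow's Ollama client builds on it), you can also confirm the pull from Python; a minimal sketch:

import ollama

# Lists the models on the local server; llama3.2:1b should appear in the output.
print(ollama.list())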

Check the endpoint:

curl http://0.0.0.0:11434/api/generate -d '{
  "model": "llama3.2:1b",
  "prompt": "Why is the sky blue?"
}'
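
If you'd rather test from Python, here is a rough equivalent of the curl check (a minimal sketch; it assumes the requests package is available and adds "stream": False so the server returns one JSON object instead of a stream):

import requests

# Probe the local Ollama server directly, mirroring the curl check above.
resp = requests.post(
    "http://0.0.0.0:11434/api/generate",
    json={"model": "llama3.2:1b", "prompt": "Why is the sky blue?", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])  # the generated text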

Now run the code below:

from adalflow.core import Generator
from adalflow.components.model_client import OllamaClient

generator = Generator(
    model_client=OllamaClient(host="0.0.0.0:11434"),
    model_kwargs={
        "model": "llama3.2:1b",
        "stream": False,
    },
)

llm_response = generator.call(prompt_kwargs={"input_str": "why is sky blue"})
print(llm_response)
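
For reference, llm_response here should be AdalFlow's GeneratorOutput; as I read the 0.2.x API, its error field is set on failure and data holds the generated text (treat the exact field names as an assumption):

# Hedged sketch: inspect the GeneratorOutput instead of printing it wholesale.
# .error and .data are assumed field names from AdalFlow's GeneratorOutput.
if llm_response.error:
    print("Call failed:", llm_response.error)
else:
    print("Model replied:", llm_response.data)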
