ollama LOCAL_API_URL not working #26

Open
papiche opened this issue Sep 11, 2024 · 1 comment
Comments

papiche commented Sep 11, 2024

I tried to register my Ollama node in api_config.yaml:

SERPER_API_KEY: null
OPENAI_API_KEY: null
ANTHROPIC_API_KEY: null
LOCAL_API_KEY: anykey
LOCAL_API_URL: http://127.0.0.1:11434

But I encounter an error:

python webapp.py --api_config api_config.yaml
== Init decompose_model with model: gpt-4o
[INFO]2024-09-11 20:58:57,178 __init__.py:61: == LLMClient is not specified, use default llm client.
Traceback (most recent call last):
  File "/home/frd/workspace/OpenFactVerification/webapp.py", line 84, in <module>
    factcheck_instance = FactCheck(
                         ^^^^^^^^^^
  File "/home/frd/workspace/OpenFactVerification/factcheck/__init__.py", line 63, in __init__
    setattr(self, key, LLMClient(model=_model_name, api_config=self.api_config))
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/frd/workspace/OpenFactVerification/factcheck/utils/llmclient/gpt_client.py", line 15, in __init__
    self.client = OpenAI(api_key=self.api_config["OPENAI_API_KEY"])
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/frd/miniconda3/lib/python3.12/site-packages/openai/_client.py", line 105, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

What should I do? From the traceback it looks like the OpenAI client is always instantiated (decompose_model defaults to gpt-4o), so LOCAL_API_URL is never used.
Thanks
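For reference, Ollama's OpenAI-compatible API is served under the /v1 path, so if local endpoints were supported I would expect the config to look roughly like this (an untested guess on my part, not something I have confirmed this repo reads):

```yaml
SERPER_API_KEY: null
OPENAI_API_KEY: null
ANTHROPIC_API_KEY: null
LOCAL_API_KEY: ollama                    # Ollama ignores the key value, but a non-empty string is usually required
LOCAL_API_URL: http://127.0.0.1:11434/v1 # Ollama's OpenAI-compatible endpoint lives under /v1
```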

@PardonMySkillz
I am also encountering the same problem. It seems like Ollama is currently not supported.
