
Gemini openai API #17

Closed
dranoto opened this issue Dec 27, 2024 · 2 comments

Comments

dranoto commented Dec 27, 2024

I am unable to reach the Gemini API using the openai_compatible settings.

The URL should look like this:
url = f'{oai_compatible_url}/v1beta2/models/{self.model}'

and the oai_compatible_url in the .env is: "https://generativelanguage.googleapis.com"

Can you add this as a feature request? I'm happy to test. I tried to change lm.py myself, but it isn't working, either because I don't know what I'm doing or because there's another section I missed.

gkamer8 (Contributor) commented Dec 29, 2024

Hey, I'm glad you raised an issue, thank you!

So the problem is that Abbey adds a /v1 to the URL behind the scenes, but the Google OpenAI Compatible API doesn't use the /v1 convention. So, if you used "https://generativelanguage.googleapis.com/v1beta/openai" in your settings.yml file, an erroneous "/v1" would get added before "/chat/completions" (looking at Google docs here).
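
To make that concrete, here is roughly the mismatch (an illustrative sketch of the behavior described above, not Abbey's actual code):

base = "https://generativelanguage.googleapis.com/v1beta/openai"
old_url = f"{base}/v1/chat/completions"  # extra /v1 added behind the scenes; not a valid Google endpoint
new_url = f"{base}/chat/completions"     # what the Google OpenAI-compatible API expects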

I just put in a fix so that if the URL you provide includes a path (e.g., /v1beta/openai), the /v1 won't get added. Feel free to pull and give it a spin! Your URL for openai_compatible should be:

https://generativelanguage.googleapis.com/v1beta/openai

based on the linked Google documentation, with an entry in lms like:

- provider: openai_compatible
  model: "gemini-1.5-flash"
  context_length: 1_000_000
  name: "Gemini 1.5 Flash"

With that entry in place, it should work! Before you pull, confirm that you're on the most recent setup strategy, the one that uses settings.yml.

If you're using the old setup with a backend-specific .env file or shell script, you'll have to upgrade to the new setup (see the README), or modify the code yourself: just remove "/v1" anywhere it appears in the OpenAICompatible classes and rebuild.
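
For reference, the kind of path-aware check involved looks roughly like this (a sketch with illustrative names, not necessarily the exact code in lm.py):

from urllib.parse import urlparse

def build_chat_completions_url(base_url: str) -> str:
    # Hypothetical helper: only append /v1 when the base URL has no path of its own.
    root = base_url.rstrip("/")
    if not urlparse(root).path.strip("/"):
        root += "/v1"
    return f"{root}/chat/completions"

# "https://api.example.com"                                  -> ".../v1/chat/completions"
# "https://generativelanguage.googleapis.com/v1beta/openai"  -> ".../v1beta/openai/chat/completions"
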

gkamer8 closed this as completed Dec 29, 2024

gkamer8 (Contributor) commented Dec 29, 2024

Btw, I will put official Gemini API support on the roadmap ASAP, but it was easier to just fix the bug for now.
