
Add support for Azure, Palm, Anthropic, Cohere, Hugging Face Llama2 70b Models - using litellm #202

Open — wants to merge 1 commit into main
Conversation

ishaan-jaff

Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry — just open the pull request and ask the maintainers for help.

Motivation

  • Adding support for more models
  • Simplifying maintenance of existing model integrations


Modification

This PR adds support for models from all of the above-mentioned providers via https://github.com/BerriAI/litellm/

All models called through litellm share the same input/output interface.

Here's a sample of how it's used:

import os

from litellm import completion

## set ENV variables
# ENV variables can also be set in a .env file; see .env.example
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"
os.environ["HF_API_TOKEN"] = "hf-key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# OpenAI call
response = completion(model="gpt-3.5-turbo", messages=messages)

# Hugging Face Llama 2 call
response = completion(model="meta-llama/llama-2-7b-hf", messages=messages)

# Hugging Face Llama 2 Guanaco call
response = completion(model="TheBloke/llama-2-70b-Guanaco-QLoRA-fp16", messages=messages)

# Cohere call
response = completion(model="command-nightly", messages=messages)

# Anthropic call
response = completion(model="claude-instant-1", messages=messages)


BC-breaking (Optional)

Does the modification introduce changes that break the backward compatibility of the downstream repositories?
If so, please describe how it breaks the compatibility and how the downstream projects should modify their code to keep compatibility with this PR.

Use cases (Optional)

If this PR introduces a new feature, it is better to list some use cases here and update the documentation.

Checklist

Before PR:

  • Pre-commit or other linting tools are used to fix the potential lint issues.
  • Bug fixes are fully covered by unit tests, and the case that caused the bug is added to the unit tests.
  • The modification is covered by complete unit tests. If not, please add more unit tests to ensure correctness.
  • The documentation has been modified accordingly, like docstring or example tutorials.

After PR:

  • If the modification has potential influence on downstream or other related projects, this PR should be tested with those projects.
  • The CLA has been signed by all committers in this PR.

@ishaan-jaff
Author

@gaotongxiao can you please take a look at this PR when possible? 😊

Happy to add more docs/tests if this initial commit looks good

@ishaan-jaff
Author

We're rolling out support for all Hugging Face chat and text-generation models. Happy to add examples for any specific ones you'd like supported.

@gaotongxiao (Contributor) left a comment

Thanks for the PR! It's a nice feature and I believe that the community would love it. But I think there are still some aspects to improve:

The modification has been made to OpenAI class, which was initially targeted at OpenAI's API and included some unique features, such as the management of orgs: https://github.com/InternLM/opencompass/blob/2cba53aef1300ecd6bbef46588f6f6c6b7c17a81/opencompass/models/openai_api.py#L83-L87
and the adjustment of max_out_len based on the hardcoded model names:
https://github.com/InternLM/opencompass/blob/2cba53aef1300ecd6bbef46588f6f6c6b7c17a81/opencompass/models/openai_api.py#L155-L166

These snippets are not compatible with other models and may introduce unintended bugs in the future. I'd suggest creating a new class named LiteLLM in litellm_api.py and putting your implementation there, without the snippets mentioned above, so that it stays general.

If possible, consider adding a configuration like https://github.com/InternLM/opencompass/blob/main/configs/eval_gpt3.5.py demonstrating the usage of LiteLLM. You may also want to write a short usage guide at https://github.com/InternLM/opencompass/tree/main/docs/en/advanced_guides.
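To make the suggestion concrete, here is a rough sketch of what a standalone `LiteLLM` wrapper in `litellm_api.py` might look like. The class name follows the reviewer's suggestion, but the `generate` signature (a list of prompts in, a list of completions out) and the response-parsing details are assumptions for illustration, not the final OpenCompass interface:

```python
from typing import Dict, List, Optional


class LiteLLM:
    """Hypothetical sketch of the suggested LiteLLM wrapper class.

    Routes all providers through litellm's unified `completion` call,
    without any OpenAI-specific org management or hardcoded model names.
    """

    def __init__(self, model: str, max_out_len: int = 512):
        # Defer the litellm import so it can stay an optional dependency.
        from litellm import completion
        self._completion = completion
        self.model = model
        self.max_out_len = max_out_len

    @staticmethod
    def _to_messages(prompt: str) -> List[Dict[str, str]]:
        # litellm expects OpenAI-style chat messages.
        return [{"role": "user", "content": prompt}]

    def generate(self, inputs: List[str], max_out_len: Optional[int] = None) -> List[str]:
        max_tokens = max_out_len or self.max_out_len
        outputs = []
        for prompt in inputs:
            resp = self._completion(
                model=self.model,
                messages=self._to_messages(prompt),
                max_tokens=max_tokens,
            )
            # litellm returns an OpenAI-shaped response object.
            outputs.append(resp.choices[0].message.content)
        return outputs
```

Because the import happens inside `__init__`, users who never select this model do not need litellm installed.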

@@ -24,3 +24,4 @@ tokenizers>=0.13.3
torch>=1.13.1
tqdm==4.64.1
transformers>=4.29.1
litellm
Contributor comment:
We might not want to add it as a required dependency — it's a very nice feature, but not a must-have for many users.

@@ -1,6 +1,7 @@
import json
import os
import time
from litellm import completion
Contributor comment:

Since it won't be a required dependency, the import can be moved into __init__. It's generally better to ask users to install litellm at runtime, with a clear error message if it is missing.

@tonysy
Collaborator

tonysy commented Aug 21, 2023

@ishaan-jaff Hi, thanks for the contribution, any update plan?

@Leymore Leymore assigned gaotongxiao and unassigned Leymore Aug 25, 2023