Upgrade Transformers Library #460

Open
hemandee opened this issue Jul 29, 2024 · 1 comment

@hemandee

🚀 The feature, motivation and pitch

Requesting that the transformers library be upgraded to support Llama 3.1 models in SageMaker environments.

https://github.com/huggingface/transformers/releases/tag/v4.43.0
Currently, if I manually upgrade, the installation completes but I get the following warnings:

....

   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 9.4/9.4 MB 76.3 MB/s eta 0:00:00
Installing collected packages: transformers
  Attempting uninstall: transformers
    Found existing installation: transformers 4.40.2
    Uninstalling transformers-4.40.2:
      Successfully uninstalled transformers-4.40.2
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
autogluon-multimodal 0.8.3 requires pandas<1.6,>=1.4.1, but you have pandas 2.1.4 which is incompatible.
autogluon-multimodal 0.8.3 requires pytorch-lightning<1.10.0,>=1.9.0, but you have pytorch-lightning 2.0.9 which is incompatible.
autogluon-multimodal 0.8.3 requires scikit-learn<1.4.1,>=1.1, but you have scikit-learn 1.4.2 which is incompatible.
autogluon-multimodal 0.8.3 requires torch<1.14,>=1.9, but you have torch 2.0.0.post104 which is incompatible.
autogluon-multimodal 0.8.3 requires torchmetrics<0.12.0,>=0.11.0, but you have torchmetrics 1.0.3 which is incompatible.
autogluon-multimodal 0.8.3 requires torchvision<0.15.0, but you have torchvision 0.15.2a0+ab7b3e6 which is incompatible.
autogluon-multimodal 0.8.3 requires transformers[sentencepiece]<4.41.0,>=4.36.0, but you have transformers 4.43.3 which is incompatible.
Successfully installed transformers-4.43.3
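
For reference, a manual upgrade along these lines would produce output like the above (a sketch; the exact command used isn't shown here, so treat it as an assumption):

    # Hypothetical upgrade command; the log above ends with transformers 4.43.3 installed
    pip install --upgrade transformers==4.43.3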

Use case description

No response

Alternatives

No response

Additional context

No response

@aws-tianquaw
Contributor

Hi @hemandee, the AutoGluon package consumed by the SMD images currently only supports transformers up to version 4.40.x, so you will see these error messages when upgrading transformers to a newer version. If you only need transformers without AutoGluon, I'd suggest running micromamba install transformers==4.43.3 to install it. This way, micromamba will resolve the dependency group for you and remove AutoGluon to avoid the conflict. If you need autogluon together with transformers>=4.41, please open a feature request in https://github.com/autogluon/autogluon asking the package owners to add support.
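
A minimal sketch of the suggested route, assuming a SageMaker Distribution environment and that removing AutoGluon is acceptable:

    # Let micromamba resolve the environment; it may remove AutoGluon to satisfy the pin.
    micromamba install transformers==4.43.3

    # Hypothetical sanity check (not from this thread): confirm the installed version.
    python -c "import transformers; print(transformers.__version__)"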

Please let me know if you have further questions, thanks!
