....
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 9.4/9.4 MB 76.3 MB/s eta 0:00:00
Installing collected packages: transformers
Attempting uninstall: transformers
Found existing installation: transformers 4.40.2
Uninstalling transformers-4.40.2:
Successfully uninstalled transformers-4.40.2
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
autogluon-multimodal 0.8.3 requires pandas<1.6,>=1.4.1, but you have pandas 2.1.4 which is incompatible.
autogluon-multimodal 0.8.3 requires pytorch-lightning<1.10.0,>=1.9.0, but you have pytorch-lightning 2.0.9 which is incompatible.
autogluon-multimodal 0.8.3 requires scikit-learn<1.4.1,>=1.1, but you have scikit-learn 1.4.2 which is incompatible.
autogluon-multimodal 0.8.3 requires torch<1.14,>=1.9, but you have torch 2.0.0.post104 which is incompatible.
autogluon-multimodal 0.8.3 requires torchmetrics<0.12.0,>=0.11.0, but you have torchmetrics 1.0.3 which is incompatible.
autogluon-multimodal 0.8.3 requires torchvision<0.15.0, but you have torchvision 0.15.2a0+ab7b3e6 which is incompatible.
autogluon-multimodal 0.8.3 requires transformers[sentencepiece]<4.41.0,>=4.36.0, but you have transformers 4.43.3 which is incompatible.
Successfully installed transformers-4.43.3
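The conflicts pip reports above are plain version-specifier checks against installed distribution metadata. A minimal sketch of the same check, using `importlib.metadata` and the third-party `packaging` library (assumed available; pip itself vendors it). The helper name `find_conflicts` is illustrative, not part of pip:

```python
# Illustrative re-implementation of the check behind pip's warnings above:
# compare each installed distribution's declared requirements with the
# versions actually installed.
from importlib.metadata import distributions, version, PackageNotFoundError
from packaging.requirements import Requirement

def find_conflicts():
    conflicts = []
    for dist in distributions():
        for req_str in dist.requires or []:
            req = Requirement(req_str)
            # Skip requirements gated behind extras or environment markers.
            if req.marker and not req.marker.evaluate({"extra": ""}):
                continue
            try:
                installed = version(req.name)
            except PackageNotFoundError:
                continue  # optional dependency not installed at all
            if not req.specifier.contains(installed, prereleases=True):
                conflicts.append(f"{dist.metadata['Name']} requires {req_str}, "
                                 f"but you have {req.name} {installed}")
    return conflicts

for line in find_conflicts():
    print(line)
```

Running this in the image after the upgrade should surface the same autogluon-multimodal lines pip prints, since the transformers pin is no longer satisfied.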
Hi @hemandee, the AutoGluon package consumed by SMD images currently only supports transformers up to 4.40.x, so you will see these error messages when upgrading transformers to a higher version. If you only need transformers without autogluon, I'd suggest running micromamba install transformers==4.43.3. That way, micromamba will resolve the dependency group for you and remove AutoGluon to avoid the dependency conflict. If you need autogluon together with transformers>=4.41, please open a feature request at https://github.com/autogluon/autogluon asking the package owners to add that support.
Please let me know if you have further questions, thanks!
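The root cause in the log is simply autogluon-multimodal's pin on transformers. A quick sketch with the `packaging` library (the same check pip's resolver performs) shows why the image's 4.40.2 passes while 4.43.3 does not; the specifier string is copied from the pip error above:

```python
from packaging.specifiers import SpecifierSet

# autogluon-multimodal 0.8.3's transformers pin, from the pip log above
spec = SpecifierSet("<4.41.0,>=4.36.0")

print("4.40.2" in spec)  # version shipped in the image -> True
print("4.43.3" in spec)  # requested upgrade -> False, hence the conflict
```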
🚀 The feature, motivation and pitch
Looking for the transformers library to be upgraded to support Llama 3.1 models in SageMaker environments.
https://github.com/huggingface/transformers/releases/tag/v4.43.0
Currently, if I manually upgrade, I get the warnings shown in the pip log above, but the installation completes.
Use case description
No response
Alternatives
No response
Additional context
No response