Sana training is working, but the text embed cache uses more than 200 GB of disk space. How can I keep Gemma in memory and compute the embeddings during training instead of pre-caching them?
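What the question is asking for, sketched abstractly: compute text embeddings at step time from a resident encoder rather than reading them from a disk cache. The names here (`train_step`, `encode_prompts`) are illustrative stand-ins, not SimpleTuner APIs, and the toy encoder below only mimics the shape of the real Gemma forward pass.

```python
from typing import Callable, List

def train_step(prompts: List[str],
               encode_prompts: Callable[[List[str]], list]) -> list:
    """Compute text embeddings at step time instead of reading a cache.

    encode_prompts stands in for a text encoder's forward pass
    (e.g. Gemma held in GPU memory for the whole training run).
    """
    return encode_prompts(prompts)

# Toy stand-in encoder: maps each prompt to a fixed-size "embedding".
toy_encoder = lambda ps: [[float(len(p))] * 4 for p in ps]
embeds = train_step(["a cat", "a dog on a hill"], toy_encoder)
```

The trade-off is disk space versus VRAM and step time: the cache exists precisely so the encoder does not have to stay loaded.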
Answered by bghira, Dec 19, 2024
This isn't supported, and keeping the text encoder resident would add more than 9 GB to VRAM consumption.
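As rough context for that VRAM figure, a back-of-the-envelope estimate of just the weight storage for a Gemma-2-2B-class encoder (the ~2.6B parameter count and bf16 storage are assumptions; activations during the forward pass add more on top):

```python
def resident_encoder_vram_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Estimate VRAM (GiB) needed just to hold the encoder's weights.

    bytes_per_param=2 assumes bf16/fp16 storage; activation buffers
    during each forward pass consume additional memory beyond this.
    """
    return num_params * bytes_per_param / 1024**3

# Hypothetical figure: ~2.6B parameters for a Gemma-2-2B-class encoder.
weights_gb = resident_encoder_vram_gb(2.6e9)
```

Weights alone land near 5 GiB under these assumptions, so with activations the total overhead plausibly exceeds the 9 GB cited above.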
If you're using GH200s, you're already wasting money like an idiot.