Finetuning SigLiT/ SigLIP details, optimizers, hyperparams, learning rate schedulers. #128
sharadchandakacherla started this conversation in General
Has anyone tried finetuning a SigLiT/SigLIP-style model on a particular task? I am running into very low loss values and suspect the gradients are not moving at all. If anyone has tried it, I would like to learn from their experience.
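One quick way to confirm or rule out the "gradients not moving" suspicion is to print per-parameter gradient norms after `backward()`. The sketch below uses a tiny stand-in PyTorch model for brevity; the same loop applies unchanged to a real SigLIP checkpoint (e.g. one loaded via Hugging Face `transformers`). All-zero or `None` norms typically point at frozen parameters, a detached graph, or a loss that has collapsed to near zero.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Tiny stand-in for the actual model; swap in your finetuned SigLIP model here.
model = nn.Sequential(nn.Linear(16, 8), nn.Tanh(), nn.Linear(8, 2))

# Dummy batch standing in for your task's data.
x = torch.randn(4, 16)
target = torch.randint(0, 2, (4,))

loss = nn.functional.cross_entropy(model(x), target)
loss.backward()

# Inspect gradient flow: a param with grad None was never in the graph
# (e.g. requires_grad=False); a norm of exactly 0.0 means no signal reached it.
for name, p in model.named_parameters():
    norm = p.grad.norm().item() if p.grad is not None else None
    print(f"{name}: grad norm = {norm}")
```

If the norms are nonzero but tiny, the issue is more likely the learning rate or loss scaling than a broken graph; comparing norms across layers also shows whether only the head is learning while the backbone stays frozen.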