[Solved] Awful LORA results after update #1033
3 comments · 2 replies
-
Have you tried halving the LR? Kohya has made many updates to his code, so it's quite possible one of them is the cause.
-
I think I can confirm this. No matter what I do, results come out badly overcooked. I have tried many radically different LRs, different optimizers, different dropout rates, etc. EDIT: Or maybe not? I'm looking at some of my earliest attempts with settings I thought were really bad, and they actually look okay-ish. I'll try again.
-
Same problem here; very bad results since the update.
-
Hi!
EDIT: Reinstalling fixed my issue. I guess something broke during the update.
So I updated the kohya_ss GUI for the first time in about 2 months, and my LoRA training now gets overcooked almost instantly, even when I lower the LR and increase the weight_decay.
Using my old config files also gives overcooked LoRAs, even though they used to work very well. Just to rule out the data, I tried the exact same dataset, but it still overcooks quickly.
The update roughly doubled the number of settings for my training, so I suspect one of these new settings is the cause, but I can't find any good documentation on them: Scale weight norms, the different dropouts, block settings, and convolutions.
There are short explanations in the GUI but as a novice when it comes to machine learning they don't really help me.
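In case it helps anyone compare, here's a rough sketch of how I understand those GUI options mapping onto kohya's `train_network.py` command line, with everything new left at what should be a "disabled/neutral" value. The flag names here are my assumption from the sd-scripts help output and may differ between versions, so please check `python train_network.py --help` on your own install before trusting any of them:

```shell
# Hedged sketch -- flag names assumed from kohya sd-scripts, verify with --help.
accelerate launch train_network.py \
  --scale_weight_norms=0 \
  --network_dropout=0 \
  --network_args "rank_dropout=0" "module_dropout=0"
# Scale weight norms = 0 -> no norm clipping (assumed default)
# network_dropout / rank_dropout / module_dropout = 0 -> dropouts off
# Leaving out conv_dim/conv_alpha in --network_args should skip the
# convolution (LoCon-style) layers entirely.
```

If anyone knows which of these defaults actually changed in the update, that would narrow it down a lot.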
xFormers is finally working after this update, though, and training seems to be about twice as fast, but that doesn't really help me if the results are trash.
Has anyone had a similar issue and resolved it?