Merge LoRA - does not work #119
Unanswered
SwatMessiah asked this question in Q&A
Dreambooth LoRA and Extract LoRA are working fine, but whenever I try to merge two LoRAs I get two different errors:
```
usage: merge_lora.py [-h] [--v2] [--save_precision {None,float,fp16,bf16}] [--precision {float,fp16,bf16}]
                     [--sd_model SD_MODEL] [--save_to SAVE_TO] [--models [MODELS ...]] [--ratios [RATIOS ...]]
merge_lora.py: error: unrecognized arguments: els_2/model/MyLora-Merge.pt
```
```
Traceback (most recent call last):
  File "G:\_Stable-Diffusion\Stable_Diffusion\stable-diffusion-TrainingKohya\kohya\kohya_ss\networks\merge_lora.py", line 179, in <module>
    merge(args)
  File "G:\_Stable-Diffusion\Stable_Diffusion\stable-diffusion-TrainingKohya\kohya\kohya_ss\networks\merge_lora.py", line 128, in merge
    assert len(args.models) == len(args.ratios), f"number of models must be equal to number of ratios / モデルの数と重みの数は合わせてください"
AssertionError: number of models must be equal to number of ratios / モデルの数と重みの数は合わせてください
```
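Both errors are consistent with a command-line problem rather than a bug in the script: `unrecognized arguments: els_2/model/MyLora-Merge.pt` looks like the tail of a path that argparse split at an unquoted space, and once the argument list is split that way, `--models` and `--ratios` can easily end up with different lengths, which trips the assertion at line 128. Below is a minimal sketch of an invocation that avoids both pitfalls, using only the flags from the usage message above; the paths are hypothetical placeholders, `^` is the cmd.exe line continuation, and it assumes merge_lora.py merges the listed LoRAs into a new file when `--sd_model` is omitted:

```
:: Merge two LoRAs into one file (no --sd_model, so the output stays a LoRA).
:: Quote every path that may contain spaces, and pass exactly one ratio per model.
python networks\merge_lora.py ^
  --save_to "G:\Testfolder\MyLora-Merge.pt" ^
  --models "G:\Testfolder\LoraA.safetensors" "G:\Testfolder\LoraB.safetensors" ^
  --ratios 0.5 0.5 ^
  --save_precision fp16
```

If the goal is instead to bake the LoRAs into a checkpoint, `--sd_model` would take the base model path; either way, the number of `--models` entries must equal the number of `--ratios` values.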
Replies: 1 comment

I found that when I create a folder just on G:/Testfolder, but interestingly, for Dreambooth LoRA and Extract LoRA this is no problem.