Loading popular anime LoRA causes incompatible keys error #10562

Open
bghira opened this issue Jan 13, 2025 · 1 comment
Labels
bug Something isn't working

Comments

@bghira
Contributor

bghira commented Jan 13, 2025

Describe the bug

My understanding is that this LoRA also targets the text encoder layers, which is where the key matching is getting thrown off.

The model I've reuploaded from CivitAI is available on the hub here.

Maybe this makes retrieval easier for you.

Reproduction

import torch
from diffusers import FluxPipeline

# Load the FLUX.1-dev base pipeline in bfloat16
base_model = 'black-forest-labs/FLUX.1-dev'
pipe = FluxPipeline.from_pretrained(base_model, torch_dtype=torch.bfloat16)

# Loading the reuploaded CivitAI LoRA (local file) triggers the incompatible-keys error
print("Loading Anime Variant 2 LoRA")
pipe.load_lora_weights('Anime v1.3.safetensors')
print("Loaded LoRA successfully.")
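
As a quick way to see which sub-models the checkpoint targets, the rejected keys can be grouped by their kohya-style prefix. A minimal sketch (the key names below are copied from the log; the `prefix` helper is illustrative, not part of diffusers):

```python
from collections import Counter

# A few of the incompatible keys reported in the log
keys = [
    "lora_te1_text_model_encoder_layers_0_mlp_fc1.alpha",
    "lora_te1_text_model_encoder_layers_0_self_attn_q_proj.lora_down.weight",
    "lora_transformer_single_transformer_blocks_0_attn_to_k.lora_up.weight",
    "lora_transformer_single_transformer_blocks_10_attn_to_q.alpha",
]

def prefix(key: str) -> str:
    # kohya-style keys start with lora_<component>_..., e.g. "lora_te1"
    # (text encoder 1) or "lora_transformer" (the Flux transformer)
    return "_".join(key.split("_")[:2])

counts = Counter(prefix(k) for k in keys)
print(counts)  # shows how many entries hit the text encoder vs. the transformer
```

In the full log, both `lora_te1_*` (text encoder) and `lora_transformer_single_transformer_blocks_*` entries appear, which is consistent with the loader rejecting the text-encoder portion.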

Logs

lora_te1_text_model_encoder_layers_0_mlp_fc1.alpha, lora_te1_text_model_encoder_layers_0_mlp_fc1.lora_down.weight, lora_te1_text_model_encoder_layers_0_mlp_fc1.lora_up.weight, lora_te1_text_model_encoder_layers_0_mlp_fc2.alpha, lora_te1_text_model_encoder_layers_0_mlp_fc2.lora_down.weight, lora_te1_text_model_encoder_layers_0_mlp_fc2.lora_up.weight, lora_te1_text_model_encoder_layers_0_self_attn_k_proj.alpha, lora_te1_text_model_encoder_layers_0_self_attn_k_proj.lora_down.weight, lora_te1_text_model_encoder_layers_0_self_attn_k_proj.lora_up.weight, lora_te1_text_model_encoder_layers_0_self_attn_out_proj.alpha, lora_te1_text_model_encoder_layers_0_self_attn_out_proj.lora_down.weight, lora_te1_text_model_encoder_layers_0_self_attn_out_proj.lora_up.weight, lora_te1_text_model_encoder_layers_0_self_attn_q_proj.alpha, lora_te1_text_model_encoder_layers_0_self_attn_q_proj.lora_down.weight, lora_te1_text_model_encoder_layers_0_self_attn_q_proj.lora_up.weight, lora_te1_text_model_encoder_layers_0_self_attn_v_proj.alpha, lora_te1_text_model_encoder_layers_0_self_attn_v_proj.lora_down.weight, lora_te1_text_model_encoder_layers_0_self_attn_v_proj.lora_up.weight, lora_te1_text_model_encoder_layers_10_mlp_fc1.alpha, lora_te1_text_model_encoder_layers_10_mlp_fc1.lora_down.weight, lora_te1_text_model_encoder_layers_10_mlp_fc1.lora_up.weight, lora_te1_text_model_encoder_layers_10_mlp_fc2.alpha, lora_te1_text_model_encoder_layers_10_mlp_fc2.lora_down.weight, lora_te1_text_model_encoder_layers_10_mlp_fc2.lora_up.weight, lora_te1_text_model_encoder_layers_10_self_attn_k_proj.alpha, lora_te1_text_model_encoder_layers_10_self_attn_k_proj.lora_down.weight, lora_te1_text_model_encoder_layers_10_self_attn_k_proj.lora_up.weight, lora_te1_text_model_encoder_layers_10_self_attn_out_proj.alpha, lora_te1_text_model_encoder_layers_10_self_attn_out_proj.lora_down.weight, lora_te1_text_model_encoder_layers_10_self_attn_out_proj.lora_up.weight, 
lora_te1_text_model_encoder_layers_10_self_attn_q_proj.alpha, lora_te1_text_model_encoder_layers_10_self_attn_q_proj.lora_down.weight, lora_te1_text_model_encoder_layers_10_self_attn_q_proj.lora_up.weight, lora_te1_text_model_encoder_layers_10_self_attn_v_proj.alpha, lora_te1_text_model_encoder_layers_10_self_attn_v_proj.lora_down.weight, lora_te1_text_model_encoder_layers_10_self_attn_v_proj.lora_up.weight, lora_te1_text_model_encoder_layers_11_mlp_fc1.alpha, lora_te1_text_model_encoder_layers_11_mlp_fc1.lora_down.weight, lora_te1_text_model_encoder_layers_11_mlp_fc1.lora_up.weight, lora_te1_text_model_encoder_layers_11_mlp_fc2.alpha, lora_te1_text_model_encoder_layers_11_mlp_fc2.lora_down.weight, lora_te1_text_model_encoder_layers_11_mlp_fc2.lora_up.weight, lora_te1_text_model_encoder_layers_11_self_attn_k_proj.alpha, lora_te1_text_model_encoder_layers_11_self_attn_k_proj.lora_down.weight, lora_te1_text_model_encoder_layers_11_self_attn_k_proj.lora_up.weight, lora_te1_text_model_encoder_layers_11_self_attn_out_proj.alpha, lora_te1_text_model_encoder_layers_11_self_attn_out_proj.lora_down.weight, lora_te1_text_model_encoder_layers_11_self_attn_out_proj.lora_up.weight, lora_te1_text_model_encoder_layers_11_self_attn_q_proj.alpha, lora_te1_text_model_encoder_layers_11_self_attn_q_proj.lora_down.weight, lora_te1_text_model_encoder_layers_11_self_attn_q_proj.lora_up.weight, lora_te1_text_model_encoder_layers_11_self_attn_v_proj.alpha, lora_te1_text_model_encoder_layers_11_self_attn_v_proj.lora_down.weight, lora_te1_text_model_encoder_layers_11_self_attn_v_proj.lora_up.weight, lora_te1_text_model_encoder_layers_1_mlp_fc1.alpha, lora_te1_text_model_encoder_layers_1_mlp_fc1.lora_down.weight, lora_te1_text_model_encoder_layers_1_mlp_fc1.lora_up.weight, lora_te1_text_model_encoder_layers_1_mlp_fc2.alpha, lora_te1_text_model_encoder_layers_1_mlp_fc2.lora_down.weight, lora_te1_text_model_encoder_layers_1_mlp_fc2.lora_up.weight, 
lora_te1_text_model_encoder_layers_1_self_attn_k_proj.alpha, lora_te1_text_model_encoder_layers_1_self_attn_k_proj.lora_down.weight, lora_te1_text_model_encoder_layers_1_self_attn_k_proj.lora_up.weight, lora_te1_text_model_encoder_layers_1_self_attn_out_proj.alpha, lora_te1_text_model_encoder_layers_1_self_attn_out_proj.lora_down.weight, lora_te1_text_model_encoder_layers_1_self_attn_out_proj.lora_up.weight, lora_te1_text_model_encoder_layers_1_self_attn_q_proj.alpha, lora_te1_text_model_encoder_layers_1_self_attn_q_proj.lora_down.weight, lora_te1_text_model_encoder_layers_1_self_attn_q_proj.lora_up.weight, lora_te1_text_model_encoder_layers_1_self_attn_v_proj.alpha, lora_te1_text_model_encoder_layers_1_self_attn_v_proj.lora_down.weight, lora_te1_text_model_encoder_layers_1_self_attn_v_proj.lora_up.weight, lora_te1_text_model_encoder_layers_2_mlp_fc1.alpha, lora_te1_text_model_encoder_layers_2_mlp_fc1.lora_down.weight, lora_te1_text_model_encoder_layers_2_mlp_fc1.lora_up.weight, lora_te1_text_model_encoder_layers_2_mlp_fc2.alpha, lora_te1_text_model_encoder_layers_2_mlp_fc2.lora_down.weight, lora_te1_text_model_encoder_layers_2_mlp_fc2.lora_up.weight, lora_te1_text_model_encoder_layers_2_self_attn_k_proj.alpha, lora_te1_text_model_encoder_layers_2_self_attn_k_proj.lora_down.weight, lora_te1_text_model_encoder_layers_2_self_attn_k_proj.lora_up.weight, lora_te1_text_model_encoder_layers_2_self_attn_out_proj.alpha, lora_te1_text_model_encoder_layers_2_self_attn_out_proj.lora_down.weight, lora_te1_text_model_encoder_layers_2_self_attn_out_proj.lora_up.weight, lora_te1_text_model_encoder_layers_2_self_attn_q_proj.alpha, lora_te1_text_model_encoder_layers_2_self_attn_q_proj.lora_down.weight, lora_te1_text_model_encoder_layers_2_self_attn_q_proj.lora_up.weight, lora_te1_text_model_encoder_layers_2_self_attn_v_proj.alpha, lora_te1_text_model_encoder_layers_2_self_attn_v_proj.lora_down.weight, lora_te1_text_model_encoder_layers_2_self_attn_v_proj.lora_up.weight, 
lora_te1_text_model_encoder_layers_3_mlp_fc1.alpha, lora_te1_text_model_encoder_layers_3_mlp_fc1.lora_down.weight, lora_te1_text_model_encoder_layers_3_mlp_fc1.lora_up.weight, lora_te1_text_model_encoder_layers_3_mlp_fc2.alpha, lora_te1_text_model_encoder_layers_3_mlp_fc2.lora_down.weight, lora_te1_text_model_encoder_layers_3_mlp_fc2.lora_up.weight, lora_te1_text_model_encoder_layers_3_self_attn_k_proj.alpha, lora_te1_text_model_encoder_layers_3_self_attn_k_proj.lora_down.weight, lora_te1_text_model_encoder_layers_3_self_attn_k_proj.lora_up.weight, lora_te1_text_model_encoder_layers_3_self_attn_out_proj.alpha, lora_te1_text_model_encoder_layers_3_self_attn_out_proj.lora_down.weight, lora_te1_text_model_encoder_layers_3_self_attn_out_proj.lora_up.weight, lora_te1_text_model_encoder_layers_3_self_attn_q_proj.alpha, lora_te1_text_model_encoder_layers_3_self_attn_q_proj.lora_down.weight, lora_te1_text_model_encoder_layers_3_self_attn_q_proj.lora_up.weight, lora_te1_text_model_encoder_layers_3_self_attn_v_proj.alpha, lora_te1_text_model_encoder_layers_3_self_attn_v_proj.lora_down.weight, lora_te1_text_model_encoder_layers_3_self_attn_v_proj.lora_up.weight, lora_te1_text_model_encoder_layers_4_mlp_fc1.alpha, lora_te1_text_model_encoder_layers_4_mlp_fc1.lora_down.weight, lora_te1_text_model_encoder_layers_4_mlp_fc1.lora_up.weight, lora_te1_text_model_encoder_layers_4_mlp_fc2.alpha, lora_te1_text_model_encoder_layers_4_mlp_fc2.lora_down.weight, lora_te1_text_model_encoder_layers_4_mlp_fc2.lora_up.weight, lora_te1_text_model_encoder_layers_4_self_attn_k_proj.alpha, lora_te1_text_model_encoder_layers_4_self_attn_k_proj.lora_down.weight, lora_te1_text_model_encoder_layers_4_self_attn_k_proj.lora_up.weight, lora_te1_text_model_encoder_layers_4_self_attn_out_proj.alpha, lora_te1_text_model_encoder_layers_4_self_attn_out_proj.lora_down.weight, lora_te1_text_model_encoder_layers_4_self_attn_out_proj.lora_up.weight, lora_te1_text_model_encoder_layers_4_self_attn_q_proj.alpha, 
lora_te1_text_model_encoder_layers_4_self_attn_q_proj.lora_down.weight, lora_te1_text_model_encoder_layers_4_self_attn_q_proj.lora_up.weight, lora_te1_text_model_encoder_layers_4_self_attn_v_proj.alpha, lora_te1_text_model_encoder_layers_4_self_attn_v_proj.lora_down.weight, lora_te1_text_model_encoder_layers_4_self_attn_v_proj.lora_up.weight, lora_te1_text_model_encoder_layers_5_mlp_fc1.alpha, lora_te1_text_model_encoder_layers_5_mlp_fc1.lora_down.weight, lora_te1_text_model_encoder_layers_5_mlp_fc1.lora_up.weight, lora_te1_text_model_encoder_layers_5_mlp_fc2.alpha, lora_te1_text_model_encoder_layers_5_mlp_fc2.lora_down.weight, lora_te1_text_model_encoder_layers_5_mlp_fc2.lora_up.weight, lora_te1_text_model_encoder_layers_5_self_attn_k_proj.alpha, lora_te1_text_model_encoder_layers_5_self_attn_k_proj.lora_down.weight, lora_te1_text_model_encoder_layers_5_self_attn_k_proj.lora_up.weight, lora_te1_text_model_encoder_layers_5_self_attn_out_proj.alpha, lora_te1_text_model_encoder_layers_5_self_attn_out_proj.lora_down.weight, lora_te1_text_model_encoder_layers_5_self_attn_out_proj.lora_up.weight, lora_te1_text_model_encoder_layers_5_self_attn_q_proj.alpha, lora_te1_text_model_encoder_layers_5_self_attn_q_proj.lora_down.weight, lora_te1_text_model_encoder_layers_5_self_attn_q_proj.lora_up.weight, lora_te1_text_model_encoder_layers_5_self_attn_v_proj.alpha, lora_te1_text_model_encoder_layers_5_self_attn_v_proj.lora_down.weight, lora_te1_text_model_encoder_layers_5_self_attn_v_proj.lora_up.weight, lora_te1_text_model_encoder_layers_6_mlp_fc1.alpha, lora_te1_text_model_encoder_layers_6_mlp_fc1.lora_down.weight, lora_te1_text_model_encoder_layers_6_mlp_fc1.lora_up.weight, lora_te1_text_model_encoder_layers_6_mlp_fc2.alpha, lora_te1_text_model_encoder_layers_6_mlp_fc2.lora_down.weight, lora_te1_text_model_encoder_layers_6_mlp_fc2.lora_up.weight, lora_te1_text_model_encoder_layers_6_self_attn_k_proj.alpha, lora_te1_text_model_encoder_layers_6_self_attn_k_proj.lora_down.weight, 
lora_te1_text_model_encoder_layers_6_self_attn_k_proj.lora_up.weight, lora_te1_text_model_encoder_layers_6_self_attn_out_proj.alpha, lora_te1_text_model_encoder_layers_6_self_attn_out_proj.lora_down.weight, lora_te1_text_model_encoder_layers_6_self_attn_out_proj.lora_up.weight, lora_te1_text_model_encoder_layers_6_self_attn_q_proj.alpha, lora_te1_text_model_encoder_layers_6_self_attn_q_proj.lora_down.weight, lora_te1_text_model_encoder_layers_6_self_attn_q_proj.lora_up.weight, lora_te1_text_model_encoder_layers_6_self_attn_v_proj.alpha, lora_te1_text_model_encoder_layers_6_self_attn_v_proj.lora_down.weight, lora_te1_text_model_encoder_layers_6_self_attn_v_proj.lora_up.weight, lora_te1_text_model_encoder_layers_7_mlp_fc1.alpha, lora_te1_text_model_encoder_layers_7_mlp_fc1.lora_down.weight, lora_te1_text_model_encoder_layers_7_mlp_fc1.lora_up.weight, lora_te1_text_model_encoder_layers_7_mlp_fc2.alpha, lora_te1_text_model_encoder_layers_7_mlp_fc2.lora_down.weight, lora_te1_text_model_encoder_layers_7_mlp_fc2.lora_up.weight, lora_te1_text_model_encoder_layers_7_self_attn_k_proj.alpha, lora_te1_text_model_encoder_layers_7_self_attn_k_proj.lora_down.weight, lora_te1_text_model_encoder_layers_7_self_attn_k_proj.lora_up.weight, lora_te1_text_model_encoder_layers_7_self_attn_out_proj.alpha, lora_te1_text_model_encoder_layers_7_self_attn_out_proj.lora_down.weight, lora_te1_text_model_encoder_layers_7_self_attn_out_proj.lora_up.weight, lora_te1_text_model_encoder_layers_7_self_attn_q_proj.alpha, lora_te1_text_model_encoder_layers_7_self_attn_q_proj.lora_down.weight, lora_te1_text_model_encoder_layers_7_self_attn_q_proj.lora_up.weight, lora_te1_text_model_encoder_layers_7_self_attn_v_proj.alpha, lora_te1_text_model_encoder_layers_7_self_attn_v_proj.lora_down.weight, lora_te1_text_model_encoder_layers_7_self_attn_v_proj.lora_up.weight, lora_te1_text_model_encoder_layers_8_mlp_fc1.alpha, lora_te1_text_model_encoder_layers_8_mlp_fc1.lora_down.weight, 
lora_te1_text_model_encoder_layers_8_mlp_fc1.lora_up.weight, lora_te1_text_model_encoder_layers_8_mlp_fc2.alpha, lora_te1_text_model_encoder_layers_8_mlp_fc2.lora_down.weight, lora_te1_text_model_encoder_layers_8_mlp_fc2.lora_up.weight, lora_te1_text_model_encoder_layers_8_self_attn_k_proj.alpha, lora_te1_text_model_encoder_layers_8_self_attn_k_proj.lora_down.weight, lora_te1_text_model_encoder_layers_8_self_attn_k_proj.lora_up.weight, lora_te1_text_model_encoder_layers_8_self_attn_out_proj.alpha, lora_te1_text_model_encoder_layers_8_self_attn_out_proj.lora_down.weight, lora_te1_text_model_encoder_layers_8_self_attn_out_proj.lora_up.weight, lora_te1_text_model_encoder_layers_8_self_attn_q_proj.alpha, lora_te1_text_model_encoder_layers_8_self_attn_q_proj.lora_down.weight, lora_te1_text_model_encoder_layers_8_self_attn_q_proj.lora_up.weight, lora_te1_text_model_encoder_layers_8_self_attn_v_proj.alpha, lora_te1_text_model_encoder_layers_8_self_attn_v_proj.lora_down.weight, lora_te1_text_model_encoder_layers_8_self_attn_v_proj.lora_up.weight, lora_te1_text_model_encoder_layers_9_mlp_fc1.alpha, lora_te1_text_model_encoder_layers_9_mlp_fc1.lora_down.weight, lora_te1_text_model_encoder_layers_9_mlp_fc1.lora_up.weight, lora_te1_text_model_encoder_layers_9_mlp_fc2.alpha, lora_te1_text_model_encoder_layers_9_mlp_fc2.lora_down.weight, lora_te1_text_model_encoder_layers_9_mlp_fc2.lora_up.weight, lora_te1_text_model_encoder_layers_9_self_attn_k_proj.alpha, lora_te1_text_model_encoder_layers_9_self_attn_k_proj.lora_down.weight, lora_te1_text_model_encoder_layers_9_self_attn_k_proj.lora_up.weight, lora_te1_text_model_encoder_layers_9_self_attn_out_proj.alpha, lora_te1_text_model_encoder_layers_9_self_attn_out_proj.lora_down.weight, lora_te1_text_model_encoder_layers_9_self_attn_out_proj.lora_up.weight, lora_te1_text_model_encoder_layers_9_self_attn_q_proj.alpha, lora_te1_text_model_encoder_layers_9_self_attn_q_proj.lora_down.weight, 
lora_te1_text_model_encoder_layers_9_self_attn_q_proj.lora_up.weight, lora_te1_text_model_encoder_layers_9_self_attn_v_proj.alpha, lora_te1_text_model_encoder_layers_9_self_attn_v_proj.lora_down.weight, lora_te1_text_model_encoder_layers_9_self_attn_v_proj.lora_up.weight, lora_transformer_single_transformer_blocks_0_attn_to_k.alpha, lora_transformer_single_transformer_blocks_0_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_0_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_0_attn_to_q.alpha, lora_transformer_single_transformer_blocks_0_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_0_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_0_attn_to_v.alpha, lora_transformer_single_transformer_blocks_0_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_0_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_10_attn_to_k.alpha, lora_transformer_single_transformer_blocks_10_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_10_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_10_attn_to_q.alpha, lora_transformer_single_transformer_blocks_10_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_10_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_10_attn_to_v.alpha, lora_transformer_single_transformer_blocks_10_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_10_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_11_attn_to_k.alpha, lora_transformer_single_transformer_blocks_11_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_11_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_11_attn_to_q.alpha, lora_transformer_single_transformer_blocks_11_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_11_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_11_attn_to_v.alpha, 
lora_transformer_single_transformer_blocks_11_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_11_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_12_attn_to_k.alpha, lora_transformer_single_transformer_blocks_12_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_12_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_12_attn_to_q.alpha, lora_transformer_single_transformer_blocks_12_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_12_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_12_attn_to_v.alpha, lora_transformer_single_transformer_blocks_12_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_12_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_13_attn_to_k.alpha, lora_transformer_single_transformer_blocks_13_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_13_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_13_attn_to_q.alpha, lora_transformer_single_transformer_blocks_13_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_13_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_13_attn_to_v.alpha, lora_transformer_single_transformer_blocks_13_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_13_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_14_attn_to_k.alpha, lora_transformer_single_transformer_blocks_14_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_14_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_14_attn_to_q.alpha, lora_transformer_single_transformer_blocks_14_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_14_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_14_attn_to_v.alpha, lora_transformer_single_transformer_blocks_14_attn_to_v.lora_down.weight, 
lora_transformer_single_transformer_blocks_14_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_15_attn_to_k.alpha, lora_transformer_single_transformer_blocks_15_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_15_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_15_attn_to_q.alpha, lora_transformer_single_transformer_blocks_15_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_15_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_15_attn_to_v.alpha, lora_transformer_single_transformer_blocks_15_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_15_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_16_attn_to_k.alpha, lora_transformer_single_transformer_blocks_16_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_16_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_16_attn_to_q.alpha, lora_transformer_single_transformer_blocks_16_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_16_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_16_attn_to_v.alpha, lora_transformer_single_transformer_blocks_16_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_16_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_17_attn_to_k.alpha, lora_transformer_single_transformer_blocks_17_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_17_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_17_attn_to_q.alpha, lora_transformer_single_transformer_blocks_17_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_17_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_17_attn_to_v.alpha, lora_transformer_single_transformer_blocks_17_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_17_attn_to_v.lora_up.weight, 
lora_transformer_single_transformer_blocks_18_attn_to_k.alpha, lora_transformer_single_transformer_blocks_18_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_18_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_18_attn_to_q.alpha, lora_transformer_single_transformer_blocks_18_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_18_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_18_attn_to_v.alpha, lora_transformer_single_transformer_blocks_18_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_18_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_19_attn_to_k.alpha, lora_transformer_single_transformer_blocks_19_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_19_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_19_attn_to_q.alpha, lora_transformer_single_transformer_blocks_19_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_19_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_19_attn_to_v.alpha, lora_transformer_single_transformer_blocks_19_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_19_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_1_attn_to_k.alpha, lora_transformer_single_transformer_blocks_1_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_1_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_1_attn_to_q.alpha, lora_transformer_single_transformer_blocks_1_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_1_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_1_attn_to_v.alpha, lora_transformer_single_transformer_blocks_1_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_1_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_20_attn_to_k.alpha, 
lora_transformer_single_transformer_blocks_20_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_20_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_20_attn_to_q.alpha, lora_transformer_single_transformer_blocks_20_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_20_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_20_attn_to_v.alpha, lora_transformer_single_transformer_blocks_20_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_20_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_21_attn_to_k.alpha, lora_transformer_single_transformer_blocks_21_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_21_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_21_attn_to_q.alpha, lora_transformer_single_transformer_blocks_21_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_21_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_21_attn_to_v.alpha, lora_transformer_single_transformer_blocks_21_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_21_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_22_attn_to_k.alpha, lora_transformer_single_transformer_blocks_22_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_22_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_22_attn_to_q.alpha, lora_transformer_single_transformer_blocks_22_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_22_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_22_attn_to_v.alpha, lora_transformer_single_transformer_blocks_22_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_22_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_23_attn_to_k.alpha, lora_transformer_single_transformer_blocks_23_attn_to_k.lora_down.weight, 
lora_transformer_single_transformer_blocks_23_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_23_attn_to_q.alpha, lora_transformer_single_transformer_blocks_23_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_23_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_23_attn_to_v.alpha, lora_transformer_single_transformer_blocks_23_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_23_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_24_attn_to_k.alpha, lora_transformer_single_transformer_blocks_24_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_24_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_24_attn_to_q.alpha, lora_transformer_single_transformer_blocks_24_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_24_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_24_attn_to_v.alpha, lora_transformer_single_transformer_blocks_24_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_24_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_25_attn_to_k.alpha, lora_transformer_single_transformer_blocks_25_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_25_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_25_attn_to_q.alpha, lora_transformer_single_transformer_blocks_25_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_25_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_25_attn_to_v.alpha, lora_transformer_single_transformer_blocks_25_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_25_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_26_attn_to_k.alpha, lora_transformer_single_transformer_blocks_26_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_26_attn_to_k.lora_up.weight, 
lora_transformer_single_transformer_blocks_26_attn_to_q.alpha, lora_transformer_single_transformer_blocks_26_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_26_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_26_attn_to_v.alpha, lora_transformer_single_transformer_blocks_26_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_26_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_27_attn_to_k.alpha, lora_transformer_single_transformer_blocks_27_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_27_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_27_attn_to_q.alpha, lora_transformer_single_transformer_blocks_27_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_27_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_27_attn_to_v.alpha, lora_transformer_single_transformer_blocks_27_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_27_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_28_attn_to_k.alpha, lora_transformer_single_transformer_blocks_28_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_28_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_28_attn_to_q.alpha, lora_transformer_single_transformer_blocks_28_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_28_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_28_attn_to_v.alpha, lora_transformer_single_transformer_blocks_28_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_28_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_29_attn_to_k.alpha, lora_transformer_single_transformer_blocks_29_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_29_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_29_attn_to_q.alpha, 
lora_transformer_single_transformer_blocks_29_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_29_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_29_attn_to_v.alpha, lora_transformer_single_transformer_blocks_29_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_29_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_2_attn_to_k.alpha, lora_transformer_single_transformer_blocks_2_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_2_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_2_attn_to_q.alpha, lora_transformer_single_transformer_blocks_2_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_2_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_2_attn_to_v.alpha, lora_transformer_single_transformer_blocks_2_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_2_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_30_attn_to_k.alpha, lora_transformer_single_transformer_blocks_30_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_30_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_30_attn_to_q.alpha, lora_transformer_single_transformer_blocks_30_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_30_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_30_attn_to_v.alpha, lora_transformer_single_transformer_blocks_30_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_30_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_31_attn_to_k.alpha, lora_transformer_single_transformer_blocks_31_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_31_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_31_attn_to_q.alpha, lora_transformer_single_transformer_blocks_31_attn_to_q.lora_down.weight, 
lora_transformer_single_transformer_blocks_31_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_31_attn_to_v.alpha, lora_transformer_single_transformer_blocks_31_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_31_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_32_attn_to_k.alpha, lora_transformer_single_transformer_blocks_32_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_32_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_32_attn_to_q.alpha, lora_transformer_single_transformer_blocks_32_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_32_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_32_attn_to_v.alpha, lora_transformer_single_transformer_blocks_32_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_32_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_33_attn_to_k.alpha, lora_transformer_single_transformer_blocks_33_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_33_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_33_attn_to_q.alpha, lora_transformer_single_transformer_blocks_33_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_33_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_33_attn_to_v.alpha, lora_transformer_single_transformer_blocks_33_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_33_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_34_attn_to_k.alpha, lora_transformer_single_transformer_blocks_34_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_34_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_34_attn_to_q.alpha, lora_transformer_single_transformer_blocks_34_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_34_attn_to_q.lora_up.weight, 
lora_transformer_single_transformer_blocks_34_attn_to_v.alpha, lora_transformer_single_transformer_blocks_34_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_34_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_35_attn_to_k.alpha, lora_transformer_single_transformer_blocks_35_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_35_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_35_attn_to_q.alpha, lora_transformer_single_transformer_blocks_35_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_35_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_35_attn_to_v.alpha, lora_transformer_single_transformer_blocks_35_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_35_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_36_attn_to_k.alpha, lora_transformer_single_transformer_blocks_36_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_36_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_36_attn_to_q.alpha, lora_transformer_single_transformer_blocks_36_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_36_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_36_attn_to_v.alpha, lora_transformer_single_transformer_blocks_36_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_36_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_37_attn_to_k.alpha, lora_transformer_single_transformer_blocks_37_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_37_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_37_attn_to_q.alpha, lora_transformer_single_transformer_blocks_37_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_37_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_37_attn_to_v.alpha, 
lora_transformer_single_transformer_blocks_37_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_37_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_3_attn_to_k.alpha, lora_transformer_single_transformer_blocks_3_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_3_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_3_attn_to_q.alpha, lora_transformer_single_transformer_blocks_3_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_3_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_3_attn_to_v.alpha, lora_transformer_single_transformer_blocks_3_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_3_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_4_attn_to_k.alpha, lora_transformer_single_transformer_blocks_4_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_4_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_4_attn_to_q.alpha, lora_transformer_single_transformer_blocks_4_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_4_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_4_attn_to_v.alpha, lora_transformer_single_transformer_blocks_4_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_4_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_5_attn_to_k.alpha, lora_transformer_single_transformer_blocks_5_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_5_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_5_attn_to_q.alpha, lora_transformer_single_transformer_blocks_5_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_5_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_5_attn_to_v.alpha, lora_transformer_single_transformer_blocks_5_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_5_attn_to_v.lora_up.weight, 
lora_transformer_single_transformer_blocks_6_attn_to_k.alpha, lora_transformer_single_transformer_blocks_6_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_6_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_6_attn_to_q.alpha, lora_transformer_single_transformer_blocks_6_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_6_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_6_attn_to_v.alpha, lora_transformer_single_transformer_blocks_6_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_6_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_7_attn_to_k.alpha, lora_transformer_single_transformer_blocks_7_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_7_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_7_attn_to_q.alpha, lora_transformer_single_transformer_blocks_7_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_7_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_7_attn_to_v.alpha, lora_transformer_single_transformer_blocks_7_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_7_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_8_attn_to_k.alpha, lora_transformer_single_transformer_blocks_8_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_8_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_8_attn_to_q.alpha, lora_transformer_single_transformer_blocks_8_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_8_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_8_attn_to_v.alpha, lora_transformer_single_transformer_blocks_8_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_8_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_9_attn_to_k.alpha, lora_transformer_single_transformer_blocks_9_attn_to_k.lora_down.weight, 
lora_transformer_single_transformer_blocks_9_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_9_attn_to_q.alpha, lora_transformer_single_transformer_blocks_9_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_9_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_9_attn_to_v.alpha, lora_transformer_single_transformer_blocks_9_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_9_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_0_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_0_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_0_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_0_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_0_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_0_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_0_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_0_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_0_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_0_attn_to_add_out.alpha, lora_transformer_transformer_blocks_0_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_0_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_0_attn_to_k.alpha, lora_transformer_transformer_blocks_0_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_0_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_0_attn_to_out_0.alpha, lora_transformer_transformer_blocks_0_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_0_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_0_attn_to_q.alpha, lora_transformer_transformer_blocks_0_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_0_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_0_attn_to_v.alpha, lora_transformer_transformer_blocks_0_attn_to_v.lora_down.weight, 
lora_transformer_transformer_blocks_0_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_10_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_10_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_10_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_10_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_10_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_10_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_10_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_10_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_10_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_10_attn_to_add_out.alpha, lora_transformer_transformer_blocks_10_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_10_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_10_attn_to_k.alpha, lora_transformer_transformer_blocks_10_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_10_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_10_attn_to_out_0.alpha, lora_transformer_transformer_blocks_10_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_10_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_10_attn_to_q.alpha, lora_transformer_transformer_blocks_10_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_10_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_10_attn_to_v.alpha, lora_transformer_transformer_blocks_10_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_10_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_11_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_11_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_11_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_11_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_11_attn_add_q_proj.lora_down.weight, 
lora_transformer_transformer_blocks_11_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_11_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_11_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_11_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_11_attn_to_add_out.alpha, lora_transformer_transformer_blocks_11_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_11_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_11_attn_to_k.alpha, lora_transformer_transformer_blocks_11_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_11_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_11_attn_to_out_0.alpha, lora_transformer_transformer_blocks_11_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_11_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_11_attn_to_q.alpha, lora_transformer_transformer_blocks_11_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_11_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_11_attn_to_v.alpha, lora_transformer_transformer_blocks_11_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_11_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_12_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_12_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_12_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_12_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_12_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_12_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_12_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_12_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_12_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_12_attn_to_add_out.alpha, lora_transformer_transformer_blocks_12_attn_to_add_out.lora_down.weight, 
lora_transformer_transformer_blocks_12_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_12_attn_to_k.alpha, lora_transformer_transformer_blocks_12_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_12_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_12_attn_to_out_0.alpha, lora_transformer_transformer_blocks_12_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_12_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_12_attn_to_q.alpha, lora_transformer_transformer_blocks_12_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_12_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_12_attn_to_v.alpha, lora_transformer_transformer_blocks_12_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_12_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_13_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_13_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_13_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_13_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_13_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_13_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_13_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_13_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_13_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_13_attn_to_add_out.alpha, lora_transformer_transformer_blocks_13_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_13_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_13_attn_to_k.alpha, lora_transformer_transformer_blocks_13_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_13_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_13_attn_to_out_0.alpha, lora_transformer_transformer_blocks_13_attn_to_out_0.lora_down.weight, 
lora_transformer_transformer_blocks_13_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_13_attn_to_q.alpha, lora_transformer_transformer_blocks_13_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_13_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_13_attn_to_v.alpha, lora_transformer_transformer_blocks_13_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_13_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_14_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_14_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_14_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_14_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_14_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_14_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_14_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_14_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_14_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_14_attn_to_add_out.alpha, lora_transformer_transformer_blocks_14_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_14_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_14_attn_to_k.alpha, lora_transformer_transformer_blocks_14_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_14_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_14_attn_to_out_0.alpha, lora_transformer_transformer_blocks_14_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_14_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_14_attn_to_q.alpha, lora_transformer_transformer_blocks_14_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_14_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_14_attn_to_v.alpha, lora_transformer_transformer_blocks_14_attn_to_v.lora_down.weight, 
lora_transformer_transformer_blocks_14_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_15_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_15_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_15_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_15_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_15_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_15_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_15_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_15_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_15_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_15_attn_to_add_out.alpha, lora_transformer_transformer_blocks_15_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_15_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_15_attn_to_k.alpha, lora_transformer_transformer_blocks_15_attn_to_k.lora_down.weight, ...snip...

System Info

Latest git main diffusers branch.

Who can help?

@DN6 @a-r-r-o-w

@bghira bghira added the bug Something isn't working label Jan 13, 2025

bghira commented Jan 14, 2025

@hlky I'm curious about your thoughts on this one. We may want to just ignore the T5 layers and continue loading the model, but I think the overall key layout of this checkpoint isn't being properly detected. I didn't have enough time to dig into why that is, but it was hitting the sd-scripts-to-ai-toolkit conversion function.
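As a stopgap along the lines of "just ignore the text-encoder layers", one could filter the state dict before handing it to `load_lora_weights` (which also accepts a dict). This is a hypothetical workaround sketch, not a diffusers API; it assumes kohya naming, where text-encoder keys start with `lora_te` (`lora_te1`, `lora_te2`, ...):

```python
def strip_text_encoder_keys(state_dict):
    """Drop kohya-style text-encoder LoRA keys (lora_te* prefix),
    keeping only the transformer keys. Workaround sketch only."""
    return {k: v for k, v in state_dict.items() if not k.startswith("lora_te")}

# Tiny dummy state dict standing in for the real checkpoint tensors:
sd = {
    "lora_te1_text_model_encoder_layers_0_mlp_fc1.alpha": 1.0,
    "lora_transformer_transformer_blocks_0_attn_to_q.alpha": 8.0,
}
filtered = strip_text_encoder_keys(sd)
print(sorted(filtered))
```

With the real file you'd load via `safetensors.torch.load_file`, filter, and pass the result to `pipe.load_lora_weights(filtered)`; whether the remaining transformer keys then convert cleanly is exactly what the detection bug above would still need to answer.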
