[DoRA] Fix TypeError for LoRA config when use_dora is False in DreamBooth SDXL #9842

Open
adhiiisetiawan wants to merge 1 commit into main
Conversation

adhiiisetiawan

This PR fixes a TypeError that occurs in DreamBooth SDXL LoRA training when use_dora=False. Currently, the script always passes the use_dora parameter to LoraConfig, which raises an error on PEFT versions that don't support DoRA. This PR changes the configuration so that use_dora is only included when it is explicitly enabled.

Changes made:

  1. Added conditional logic for the DoRA parameter in the LoRA configuration
  2. Created a helper function get_lora_config to handle configuration creation (a sketch follows this list)
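A minimal sketch of what the helper could look like, assuming the usual DreamBooth SDXL LoraConfig arguments; the exact signature and parameter set here are assumptions, not the PR's verbatim code:

```python
from peft import LoraConfig

def get_lora_config(rank, use_dora, target_modules):
    """Build a LoraConfig, passing use_dora only when it is enabled.

    Older PEFT releases do not accept a use_dora kwarg, so omitting it
    entirely when it is False keeps the script compatible with them.
    """
    base_config = {
        "r": rank,
        "lora_alpha": rank,
        "init_lora_weights": "gaussian",
        "target_modules": target_modules,
    }
    if use_dora:
        base_config["use_dora"] = True
    return LoraConfig(**base_config)

# Example usage mirroring the script's attention projections:
unet_lora_config = get_lora_config(
    rank=4, use_dora=False, target_modules=["to_k", "to_q", "to_v", "to_out.0"]
)
```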

Fixes #9841

@sayakpaul

Comment on lines +1194 to +1195
if use_dora:
    base_config["use_dora"] = True
Member

This would fail for lower versions of peft where use_dora isn't present in the call args.

Author

Thanks for pointing that out! Could you clarify which version of peft lacks use_dora in the call args?

Member

if lora_config_kwargs["use_dora"] and is_peft_version("<", "0.9.0"):
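For context, the line quoted above shows that diffusers already gates DoRA on the installed PEFT version via the real is_peft_version helper from diffusers.utils. A sketch of how the same guard could be applied in the training script (the surrounding variable names are assumptions):

```python
from diffusers.utils import is_peft_version

# Fail fast with a clear message instead of an opaque TypeError from
# LoraConfig when the installed peft predates DoRA support (added in 0.9.0).
if use_dora and is_peft_version("<", "0.9.0"):
    raise ValueError(
        "use_dora=True requires peft >= 0.9.0. Upgrade with `pip install -U peft`."
    )

# Only include the kwarg when enabled, so older peft never sees it.
if use_dora:
    base_config["use_dora"] = True
```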
