Fix wrapped transformer config access in Flux2 Klein training #13219

Merged

sayakpaul merged 1 commit into huggingface:main from tcaimm:fix-unwrap-guidance-config-flux2-klein on Mar 6, 2026

Conversation

@tcaimm (Contributor) commented on Mar 6, 2026

What does this PR do?

This PR fixes wrapped transformer config access in the Flux2 Klein DreamBooth LoRA training scripts.

When the transformer is wrapped by Accelerate (e.g., under DDP or FSDP), accessing `transformer.config.guidance_embeds` directly may not reliably reach the underlying model, because the wrapper does not necessarily forward attribute access. This change uses `unwrap_model(transformer).config.guidance_embeds` instead, so the guidance config is always read from the actual transformer module.

The fix is applied to:

  • examples/dreambooth/train_dreambooth_lora_flux2_klein.py
  • examples/dreambooth/train_dreambooth_lora_flux2_klein_img2img.py
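A minimal, self-contained sketch of the pattern (the `TinyTransformer` stand-in and the single-process setup are illustrative, not the scripts' actual code; `unwrap_model` here mirrors the helper the training scripts define on top of `accelerator.unwrap_model`):

```python
import torch
from accelerate import Accelerator


class TinyTransformer(torch.nn.Module):
    """Hypothetical stand-in for the Flux2 transformer, exposing a config."""

    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 4)
        # Minimal config object carrying the attribute the scripts read.
        self.config = type("Config", (), {"guidance_embeds": True})()

    def forward(self, x):
        return self.linear(x)


accelerator = Accelerator()
# Under multi-GPU training, prepare() may wrap the module in DDP or FSDP.
transformer = accelerator.prepare(TinyTransformer())


def unwrap_model(model):
    # Peel off the Accelerate/DDP/FSDP wrapper to reach the real module,
    # then the torch.compile wrapper (_orig_mod) if one is present.
    model = accelerator.unwrap_model(model)
    return getattr(model, "_orig_mod", model)


# Fragile: when transformer is wrapped, the wrapper may not expose .config.
# guidance_embeds = transformer.config.guidance_embeds

# Robust: always read the config from the underlying transformer module.
guidance_embeds = unwrap_model(transformer).config.guidance_embeds
print(guidance_embeds)  # True
```

In single-process runs the direct access happens to work because no wrapper is added, which is why the bug only surfaces under distributed wrappers.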

Who can review?

@sayakpaul

@sayakpaul (Member) left a comment


Thanks!

@sayakpaul merged commit e747fe4 into huggingface:main on Mar 6, 2026
26 checks passed