FLUX Attention map height calculation #26
Hi, I have a question about the height calculation in the `FluxPipeline_call` method of `attention_map_diffusers/modules.py`.

The code computes `height` as:

```python
height = 2 * (int(height) // (self.vae_scale_factor * 2)) // 2
```

But the saved attention map appears rectangular rather than square. In `FluxPipeline_call`, `hidden_states` seems to have shape `(bsz, h * w, num_channels_latents * 4)`, where `h` and `w` are `height // vae_scale_factor` and `width // vae_scale_factor`, respectively. Given that, the extra ×2 factors in the height calculation may be unnecessary.
Am I missing anything? Thanks.
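To make the discrepancy concrete, here is a small arithmetic sketch. The numbers (`vae_scale_factor = 8`, a 1024×1024 input) are hypothetical examples, not values taken from the repo; the two formulas are the one quoted above and the one implied by the `hidden_states` shape described in this issue:

```python
# Hypothetical values for illustration; vae_scale_factor = 8 is a
# common default for SD/FLUX-style VAEs but is an assumption here.
vae_scale_factor = 8
height = 1024  # square input image, so width would be 1024 as well

# Height as computed in attention_map_diffusers/modules.py (per this issue):
h_code = 2 * (int(height) // (vae_scale_factor * 2)) // 2

# Height implied by hidden_states of shape (bsz, h * w, C * 4),
# with h = height // vae_scale_factor:
h_latent = height // vae_scale_factor

print(h_code, h_latent)
```

For a square input the two formulas differ by a factor of 2 (`h_code = 64` vs `h_latent = 128` in this example), which would explain a 2:1 rectangular map if the width axis is derived the other way.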