
FLUX Attention map height calculation #26

@mung3477

Description


Hi, I have a question about the height calculation in the `FluxPipeline_call` method of `attention_map_diffusers/modules.py`.

The code passes `height` as:

```python
height = 2 * (int(height) // (self.vae_scale_factor * 2)) // 2
```


But the saved attention map appears rectangular rather than square. In `FluxPipeline_call`, `hidden_states` seems to have shape `(bsz, h * w, num_channels_latents * 4)`, where `h` and `w` are `height / vae_scale_factor` and `width / vae_scale_factor`, respectively. Given that, the extra ×2 factors in the height calculation may be unnecessary.
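For concreteness, here is a quick arithmetic sketch. The values are assumptions for illustration only: `vae_scale_factor = 8` (the default in diffusers' `FluxPipeline`) and a 1024-pixel image height. It shows that the leading `2 *` and trailing `// 2` cancel for heights divisible by `vae_scale_factor * 2`, which is why the factors look redundant:

```python
# Assumed values for illustration; vae_scale_factor = 8 is the
# diffusers FluxPipeline default, 1024 is an arbitrary image height.
vae_scale_factor = 8
height = 1024

# Formula as quoted from attention_map_diffusers/modules.py:
h_quoted = 2 * (int(height) // (vae_scale_factor * 2)) // 2

# Simpler form without the x2 / //2 factors:
h_simple = int(height) // (vae_scale_factor * 2)

print(h_quoted, h_simple)  # both evaluate to 64 here
```

With these inputs both expressions give 64, so the extra factors only matter when `height` is not a multiple of `vae_scale_factor * 2`.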

Am I missing anything? Thanks.
