Fix batch > 1 in HunyuanVideo (#10548)
hlky authored Jan 14, 2025
1 parent aa79d7d commit 4a4afd5
Showing 1 changed file with 2 additions and 1 deletion.
@@ -727,7 +727,8 @@ def forward(

        for i in range(batch_size):
            attention_mask[i, : effective_sequence_length[i]] = True
-        attention_mask = attention_mask.unsqueeze(1)  # [B, 1, N], for broadcasting across attention heads
+        # [B, 1, 1, N], for broadcasting across attention heads
+        attention_mask = attention_mask.unsqueeze(1).unsqueeze(1)

        # 4. Transformer blocks
        if torch.is_grad_enabled() and self.gradient_checkpointing:
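The shape fix above can be illustrated with NumPy, whose broadcasting rules match PyTorch's. All sizes and names below (`batch_size`, `num_heads`, `seq_len`, the example `effective_sequence_length` values) are illustrative, not taken from the HunyuanVideo model; the point is that a `[B, 1, N]` mask cannot broadcast against `[B, H, N, N]` attention scores once `B > 1` (the `B` would have to align with `H`), while `[B, 1, 1, N]` broadcasts over both heads and query positions:

```python
import numpy as np

# Illustrative sizes: batch of 2, 4 attention heads, sequence length 5.
batch_size, num_heads, seq_len = 2, 4, 5
effective_sequence_length = np.array([3, 5])  # valid tokens per batch item

# Build the [B, N] boolean mask, as in the loop from the diff.
attention_mask = np.zeros((batch_size, seq_len), dtype=bool)
for i in range(batch_size):
    attention_mask[i, : effective_sequence_length[i]] = True

# Before the fix, a single unsqueeze gave [B, 1, N]. Right-aligning that
# against [B, H, N, N] scores pairs B with H, which only works when B == 1.
# After the fix, two unsqueezes give [B, 1, 1, N], which broadcasts cleanly.
mask_4d = attention_mask[:, None, None, :]  # NumPy equivalent of unsqueeze(1).unsqueeze(1)
assert mask_4d.shape == (batch_size, 1, 1, seq_len)

# The [B, 1, 1, N] mask broadcasts over heads and query positions:
scores = np.random.randn(batch_size, num_heads, seq_len, seq_len)
masked = np.where(mask_4d, scores, -np.inf)
assert masked.shape == (batch_size, num_heads, seq_len, seq_len)
```

With `batch_size = 1` the old `[1, 1, N]` shape happened to broadcast against `[1, H, N, N]` because the leading 1 stretched over `H`, which is why the bug only surfaced for batch sizes greater than 1.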

1 comment on commit 4a4afd5

@ITerydh

What is the difference between this version and the one from Jan 7, 2025?
