[ROCM] fix native attention function call (#13650)

Author: Gordon Wong
Date: 2025-02-22 14:07:04 +08:00
Committed by: GitHub
parent 68d535ef44
commit 68d630a0c7


@@ -717,7 +717,6 @@ class ROCmFlashAttentionImpl(AttentionImpl):
                     self.num_heads,
                     self.head_size,
                     self.scale,
-                    causal_mask,
                     attn_masks,
                 )
             else:
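
The hunk removes one stale positional argument: the callee's signature no longer takes causal_mask, so passing it shifted the remaining arguments out of place. Below is a minimal, self-contained sketch of the failure mode; native_attention and all of its argument values are hypothetical stand-ins, not vLLM's actual helper:

def native_attention(query, key, value, num_heads, head_size, scale, attn_masks=None):
    """Stub with the post-refactor signature: no causal_mask parameter."""
    return None

q = k = v = object()  # placeholders for tensors

try:
    # Old call site: still passes causal_mask positionally, one argument
    # too many for the new signature, so the call raises TypeError.
    native_attention(q, k, v, 8, 64, 0.125, True, None)
except TypeError as err:
    print("stale call fails:", err)

# Fixed call site, mirroring the diff above: causal_mask dropped.
native_attention(q, k, v, 8, 64, 0.125, None)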