fbbab794ef
[ONNX] Implement Attention-23 (#156431)
Implement the ONNX Attention-23 operator using PyTorch's scaled_dot_product_attention (SDPA) and FlexAttention (see the SDPA sketch after the notes below).
- I used Copilot for this change.
- Also updated the conversion logic to remove trailing None inputs (see the trimming sketch below).
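
A minimal sketch of the SDPA mapping, assuming a simplified signature and illustrative tensor layouts; the actual converter handles more attributes and its function names differ:

```python
import torch
import torch.nn.functional as F

def attention_23_via_sdpa(
    q: torch.Tensor,                        # (batch, num_heads, q_len, head_dim)
    k: torch.Tensor,                        # (batch, num_heads, kv_len, head_dim)
    v: torch.Tensor,                        # (batch, num_heads, kv_len, head_dim)
    attn_mask: torch.Tensor | None = None,  # optional boolean or additive mask
    is_causal: bool = False,
    scale: float | None = None,             # None -> default 1/sqrt(head_dim)
) -> torch.Tensor:
    # SDPA fuses softmax(Q @ K^T * scale) @ V into one call, so the common
    # case of the ONNX Attention op maps onto a single SDPA invocation.
    return F.scaled_dot_product_attention(
        q, k, v, attn_mask=attn_mask, is_causal=is_causal, scale=scale
    )
```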
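
A hypothetical sketch of the trailing-None trimming; the helper name and placement are illustrative, not the converter's actual API:

```python
def trim_trailing_nones(inputs: list) -> list:
    """Drop optional inputs that are None at the tail of the input list.

    ONNX permits omitting trailing optional inputs entirely, so a node
    with inputs [q, k, v, None, None] can be emitted as [q, k, v].
    """
    end = len(inputs)
    while end > 0 and inputs[end - 1] is None:
        end -= 1
    return inputs[:end]

# Interior Nones (empty-string inputs in ONNX) must be kept as placeholders.
assert trim_trailing_nones(["q", "k", "v", None, None]) == ["q", "k", "v"]
assert trim_trailing_nones(["q", None, "v", None]) == ["q", None, "v"]
```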
@gramalingam @kunal-vaishnavi @titaiwangms
Pull Request resolved: https://github.com/pytorch/pytorch/pull/156431
Approved by: https://github.com/titaiwangms
Co-authored-by: kunal-vaishnavi <115581922+kunal-vaishnavi@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-20 23:54:57 +00:00