.. role:: hidden
    :class: hidden-section

torch.nn.attention
==================

.. automodule:: torch.nn.attention

Utils
-------------------

.. autosummary::
    :toctree: generated
    :nosignatures:

    sdpa_kernel
    SDPBackend

Submodules
----------

.. autosummary::
    :nosignatures:

    flex_attention
    bias
    experimental

.. toctree::
    :hidden:

    nn.attention.flex_attention
    nn.attention.bias
    nn.attention.experimental