pytorch/docs/source/nn.attention.rst
Boyuan Feng 68134a320e [Flex Attention] Paged Attention (#137164)
This PR adds paged attention for flex attention.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/137164
Approved by: https://github.com/drisspg
2024-10-29 17:05:22 +00:00
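
For orientation, a minimal sketch of the public flex_attention API that the
paged KV cache plugs into. Shapes and the causal mask are illustrative
assumptions; the PagedAttention helper this PR adds lives under
torch.nn.attention.experimental and its interface is experimental, so it is
not shown here.

    import torch
    from torch.nn.attention.flex_attention import flex_attention, create_block_mask

    def causal(b, h, q_idx, kv_idx):
        # Attend only to keys at or before the query position.
        return q_idx >= kv_idx

    B, H, S, D = 2, 4, 256, 64  # illustrative sizes; a CUDA device is assumed
    q, k, v = (torch.randn(B, H, S, D, device="cuda") for _ in range(3))

    # Block mask over logical KV indices; the paged layer remaps these
    # logical blocks onto a physical, page-allocated KV cache.
    block_mask = create_block_mask(causal, B, H, S, S, device="cuda")
    out = flex_attention(q, k, v, block_mask=block_mask)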

.. role:: hidden
    :class: hidden-section

torch.nn.attention
==================

.. automodule:: torch.nn.attention

Utils
-------------------

.. autosummary::
    :toctree: generated
    :nosignatures:

    sdpa_kernel
    SDPBackend
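
``sdpa_kernel`` is a context manager that restricts which fused backends
``scaled_dot_product_attention`` may dispatch to, with ``SDPBackend``
enumerating the choices. A minimal sketch (the backend choice, shapes, and
CUDA/fp16 setup are illustrative assumptions):

.. code-block:: python

    import torch
    import torch.nn.functional as F
    from torch.nn.attention import sdpa_kernel, SDPBackend

    q, k, v = (torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)
               for _ in range(3))

    # Restrict dispatch to the flash-attention kernel inside this block;
    # SDPA raises an error if no allowed backend can serve the inputs.
    with sdpa_kernel(SDPBackend.FLASH_ATTENTION):
        out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
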
Submodules
----------

.. autosummary::
    :nosignatures:

    flex_attention
    bias
    experimental
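
As an example of the ``bias`` submodule, ``causal_lower_right`` builds a
causal bias that SDPA consumes as its attention mask. A minimal sketch
(shapes are illustrative assumptions):

.. code-block:: python

    import torch
    import torch.nn.functional as F
    from torch.nn.attention.bias import causal_lower_right

    # Lower-right causal masking for q_len != kv_len, e.g. decoding a few
    # new queries against a longer cached KV sequence.
    q = torch.randn(1, 8, 4, 64)
    k = torch.randn(1, 8, 128, 64)
    v = torch.randn(1, 8, 128, 64)

    attn_bias = causal_lower_right(4, 128)  # (q_len, kv_len)
    out = F.scaled_dot_product_attention(q, k, v, attn_mask=attn_bias)
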
.. toctree::
    :hidden:

    nn.attention.flex_attention
    nn.attention.bias
    nn.attention.experimental