

torch.accelerator
=================

.. automodule:: torch.accelerator
.. currentmodule:: torch.accelerator
.. autosummary::
    :toctree: generated
    :nosignatures:

    device_count
    is_available
    current_accelerator
    set_device_index
    set_device_idx
    current_device_index
    current_device_idx
    set_stream
    current_stream
    synchronize
    device_index

.. automodule:: torch.accelerator.memory
.. currentmodule:: torch.accelerator.memory

Memory management
-----------------

.. autosummary::
    :toctree: generated
    :nosignatures:

    empty_cache
    max_memory_allocated
    max_memory_reserved
    memory_allocated
    memory_reserved
    memory_stats
    reset_accumulated_memory_stats
    reset_peak_memory_stats