pytorch/docs/source/accelerator.md
Yu, Guangye 84f7e88aef Add unified memory APIs for torch.accelerator (#152932)
# Motivation
The following APIs will be added under torch.accelerator:
- empty_cache
- max_memory_allocated
- max_memory_reserved
- memory_allocated
- memory_reserved
- memory_stats
- reset_accumulated_memory_stats
- reset_peak_memory_stats

Pull Request resolved: https://github.com/pytorch/pytorch/pull/152932
Approved by: https://github.com/albanD
ghstack dependencies: #138222
2025-08-08 17:41:22 +00:00

# torch.accelerator
```{eval-rst}
.. automodule:: torch.accelerator
```
```{eval-rst}
.. currentmodule:: torch.accelerator
```
```{eval-rst}
.. autosummary::
    :toctree: generated
    :nosignatures:

    device_count
    is_available
    current_accelerator
    set_device_index
    set_device_idx
    current_device_index
    current_device_idx
    set_stream
    current_stream
    synchronize
    device_index
```
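As a rough illustration of the device-level APIs listed above, the sketch below queries the current accelerator and synchronizes it. It assumes a PyTorch build that ships `torch.accelerator` and degrades to a no-op when `torch` or an accelerator backend is unavailable; it is an example, not part of the documented API surface.

```python
# Hedged usage sketch of the torch.accelerator device APIs.
# Guarded so it runs even without torch or an accelerator backend.
try:
    import torch

    if torch.accelerator.is_available():
        device = torch.accelerator.current_accelerator()  # e.g. device(type='cuda')
        count = torch.accelerator.device_count()          # number of visible devices
        torch.accelerator.synchronize()                   # wait for pending work
    else:
        count = 0
except (ImportError, AttributeError):
    # torch missing, or an older torch without torch.accelerator
    count = 0

assert isinstance(count, int) and count >= 0
```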
```{eval-rst}
.. automodule:: torch.accelerator.memory
```
```{eval-rst}
.. currentmodule:: torch.accelerator.memory
```
## Memory management
```{eval-rst}
.. autosummary::
    :toctree: generated
    :nosignatures:

    empty_cache
    max_memory_allocated
    max_memory_reserved
    memory_allocated
    memory_reserved
    memory_stats
    reset_accumulated_memory_stats
    reset_peak_memory_stats
```
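The memory APIs above are device-generic counterparts of the familiar `torch.cuda` memory helpers. The sketch below is a hedged example of how they might be exercised; it assumes an accelerator backend is present and falls back to zeros otherwise.

```python
# Hedged sketch of the unified memory APIs: allocate a tensor, inspect
# allocation counters, then release cached memory and reset peak stats.
# Guarded so it runs even without torch or an accelerator backend.
try:
    import torch

    if torch.accelerator.is_available():
        x = torch.empty(1024, device=torch.accelerator.current_accelerator())
        allocated = torch.accelerator.memory_allocated()  # bytes currently allocated
        peak = torch.accelerator.max_memory_allocated()   # allocation high-water mark
        del x
        torch.accelerator.empty_cache()                   # release cached blocks
        torch.accelerator.reset_peak_memory_stats()       # restart peak tracking
    else:
        allocated, peak = 0, 0
except (ImportError, AttributeError):
    # torch missing, or an older torch without torch.accelerator.memory
    allocated, peak = 0, 0

assert allocated >= 0 and peak >= allocated
```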