pytorch/torch/_C/_aoti.pyi
Bin Bao 6680a83e89 [AOTI XPU] Support AOT Inductor for Intel GPU. (#140269)
This PR adds XPU support for AOT Inductor and reuses the corresponding unit tests.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/140269
Approved by: https://github.com/desertfire, https://github.com/EikanWang
ghstack dependencies: #140268

Co-authored-by: Bin Bao <binbao@meta.com>
2024-12-10 05:05:08 +00:00

25 lines
720 B
Python

from ctypes import c_void_p

from torch import Tensor

# Defined in torch/csrc/inductor/aoti_runner/pybind.cpp

# Tensor to AtenTensorHandle
def unsafe_alloc_void_ptrs_from_tensors(tensors: list[Tensor]) -> list[c_void_p]: ...
def unsafe_alloc_void_ptr_from_tensor(tensor: Tensor) -> c_void_p: ...

# AtenTensorHandle to Tensor
def alloc_tensors_by_stealing_from_void_ptrs(
    handles: list[c_void_p],
) -> list[Tensor]: ...
def alloc_tensor_by_stealing_from_void_ptr(
    handle: c_void_p,
) -> Tensor: ...

class AOTIModelContainerRunnerCpu: ...
class AOTIModelContainerRunnerCuda: ...
class AOTIModelContainerRunnerXpu: ...

# Defined in torch/csrc/inductor/aoti_package/pybind.cpp
class AOTIModelPackageLoader: ...
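
For reference, a minimal round-trip sketch of the handle-conversion helpers declared above could look like the following. The behavior is assumed from the names and comments in the stub: a tensor is wrapped as an AtenTensorHandle exposed through a raw c_void_p, and "stealing" is taken to transfer ownership back to a torch.Tensor, after which the handle should not be reused.

import torch
from torch._C import _aoti

# Wrap an eager tensor as an AtenTensorHandle (exposed as a raw c_void_p).
t = torch.randn(2, 3)
handle = _aoti.unsafe_alloc_void_ptr_from_tensor(t)

# Reclaim the handle as a regular torch.Tensor; ownership is "stolen",
# so the handle must not be used again after this call (assumed semantics).
t2 = _aoti.alloc_tensor_by_stealing_from_void_ptr(handle)

assert torch.equal(t, t2)  # the round trip should preserve the data

These helpers appear to be the conversion layer the AOTIModelContainerRunner* bindings use when passing inputs and outputs across the AOT Inductor C ABI; the XPU runner added by this PR mirrors the existing CPU and CUDA runner classes.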