This is the entrypoint for defining an opaque/blackbox (i.e., PyTorch will never peek into it) custom op. In this PR, you can specify backend impls and the abstract impl for this op.

NB: most of this PR is docstrings, please don't be intimidated by the line count.

There are a number of interesting features:
- We infer the schema from type hints. In a follow-up we add the ability to manually specify a schema.
- Name inference. The user needs to manually specify an op name for now. In a follow-up we add the ability to automatically infer a name (this is a little tricky).
- custom_op registrations can override each other. This makes them more pleasant to work with in environments like colab.
- We require that the outputs of the custom_op do not alias any inputs or each other. We enforce this via a runtime check, but can relax this into an opcheck test if it really matters in the future.

Test Plan:
- new tests

Pull Request resolved: https://github.com/pytorch/pytorch/pull/122344
Approved by: https://github.com/ezyang, https://github.com/albanD
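As a rough sketch of the workflow described above, assuming the entrypoint is exposed as `torch.library.custom_op` and the abstract impl is registered via `.register_fake` (the names in current PyTorch releases; the exact spellings at the time of this PR may differ):

```python
import numpy as np
import torch

# A minimal sketch, not the exact API from this PR: decorator name, mutates_args,
# and register_fake are assumptions based on the later public torch.library API.
@torch.library.custom_op("mylib::numpy_sin", mutates_args=())
def numpy_sin(x: torch.Tensor) -> torch.Tensor:
    # Backend impl: PyTorch treats this body as a black box.
    # The op's schema is inferred from the type hints above.
    x_np = x.cpu().numpy()
    return torch.from_numpy(np.sin(x_np)).to(x.device)

# Abstract impl: describes output metadata without running the black-box body.
# Note the output is a fresh tensor; it must not alias the input.
@numpy_sin.register_fake
def _(x):
    return torch.empty_like(x)

y = numpy_sin(torch.randn(3))
```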
15 lines · 602 B · Python
# Allows one to expose an API in a private submodule publicly as per the definition
# in PyTorch's public api policy.
#
# It is a temporary solution while we figure out if it should be the long-term solution
# or if we should amend PyTorch's public api policy. The concern is that this approach
# may not be very robust because it's not clear what __module__ is used for.
# However, both numpy and jax overwrite the __module__ attribute of their APIs
# without problem, so it seems fine.
def exposed_in(module):
    def wrapper(fn):
        fn.__module__ = module
        return fn

    return wrapper
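A short usage sketch of the decorator above (the module path "torch.library" and the decorated function are illustrative, not taken from this file):

```python
# Hypothetical example: the function and target module are illustrative only.
@exposed_in("torch.library")
def custom_op(*args, **kwargs):
    ...

# exposed_in rewrites __module__, so introspection and generated docs report the
# public location ("torch.library") rather than the private submodule where the
# function is actually defined.
assert custom_op.__module__ == "torch.library"
```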