[custom_ops] extend impl_abstract to work with existing torch.library ops (#106088)

This PR extends impl_abstract to work with existing
torch.library/TORCH_LIBRARY ops.
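
For illustration, here is a minimal sketch of the workflow this enables. The op name `mylib::my_sin`, the kernels, and the library handle are hypothetical, and the snippet assumes the `torch._custom_ops.impl_abstract` API that this stack builds on:

```python
import torch
import torch._custom_ops as custom_ops
from torch.library import Library

# An op defined the "existing torch.library" way, with only a CPU kernel.
# (Hypothetical example; a TORCH_LIBRARY block in C++ would work the same.)
lib = Library("mylib", "DEF")
lib.define("my_sin(Tensor x) -> Tensor")

def my_sin_cpu(x):
    return torch.sin(x)

lib.impl("my_sin", my_sin_cpu, "CPU")

# With this PR, impl_abstract can attach an abstract (meta/fake) impl to
# that pre-existing op, not only to ops created through the custom-op API.
@custom_ops.impl_abstract("mylib::my_sin")
def my_sin_abstract(x):
    # Abstract impls only describe output metadata (shape/dtype/device);
    # they must not look at real data.
    return torch.empty_like(x)
```

Under FakeTensorMode (e.g. while torch.compile traces a graph), calls to `torch.ops.mylib.my_sin` would then be answered by `my_sin_abstract`.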

There's a question of what to do if the user calls impl_abstract
and the op already has a kernel registered for one of these dispatch keys:
- DispatchKey::Meta. We raise.
- DispatchKey::CompositeImplicitAutograd. We raise.
- DispatchKey::CompositeExplicitAutograd. To be pragmatic, we don't
raise: the user's CompositeExplicitAutograd kernel might work for every
backend except Meta (see the sketch below).
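
As a hedged sketch of that third, pragmatic case (continuing in the same hypothetical style; the op `mylib2::my_cos` and its kernel are made up):

```python
import torch
import torch._custom_ops as custom_ops
from torch.library import Library

lib2 = Library("mylib2", "DEF")
lib2.define("my_cos(Tensor x) -> Tensor")

# One CompositeExplicitAutograd kernel serves every backend...
def my_cos_cea(x):
    return torch.cos(x)

lib2.impl("my_cos", my_cos_cea, "CompositeExplicitAutograd")

# ...but it might still misbehave for Meta/fake tensors (e.g. if it called
# into data-dependent C++ code), so registering an abstract impl on top of
# it is allowed rather than raising.
@custom_ops.impl_abstract("mylib2::my_cos")
def my_cos_abstract(x):
    return torch.empty_like(x)

# By contrast, if "my_cos" already had a Meta or CompositeImplicitAutograd
# registration, the impl_abstract call above would raise.
```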

Test Plan:
- new tests
Pull Request resolved: https://github.com/pytorch/pytorch/pull/106088
Approved by: https://github.com/soulitzer
ghstack dependencies: #106075, #106076
Author: Richard Zou
Date: 2023-08-07 08:00:17 -07:00
Committed by: PyTorch MergeBot
Parent commit: cebff39fad
This commit: 16b6873885
3 changed files with 99 additions and 1 deletion


```diff
@@ -250,7 +250,7 @@ def impl_abstract(qualname, *, func=None):
     """
     def inner(func):
-        custom_op = _find_custom_op(qualname)
+        custom_op = _find_custom_op(qualname, also_check_torch_library=True)
         custom_op.impl_abstract(_stacklevel=3)(func)
         return func
```