[autograd.Function] Kill the extension feature flag (#92026)
This PR removes the autograd.Function extension feature flag. It was previously used during development of the functorch <> autograd.Function interaction. The flag has been in master for long enough with the default set to True, so it's time to remove it.

Test Plan:
- existing tests

Pull Request resolved: https://github.com/pytorch/pytorch/pull/92026
Approved by: https://github.com/soulitzer
Committed by: PyTorch MergeBot
Parent: 7aaad0b832
Commit: 81cc9bba5e
@@ -912,8 +912,7 @@ PyObject* THPFunction_apply(PyObject* cls, PyObject* inputs) {
   // autograd.Function may optionally contain a setup_context staticmethod.
   // In this case, autograd.Function.forward does NOT accept a ctx object.
   bool has_separate_setup_context_fn =
-      (isAutogradFunctionExtensionEnabled() &&
-       PyObject_HasAttrString(cls, "setup_context"));
+      PyObject_HasAttrString(cls, "setup_context");
 
   auto num_args = PyTuple_GET_SIZE(inputs);
 
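For context, here is a minimal sketch (not taken from this PR) of the Python-side pattern that the C++ check above detects: an autograd.Function that defines a separate setup_context staticmethod, in which case forward does not receive a ctx argument. The Square class and the example tensors are illustrative only.

import torch

class Square(torch.autograd.Function):
    # forward does NOT take ctx when a separate setup_context is defined.
    @staticmethod
    def forward(x):
        return x * x

    # setup_context receives ctx plus the forward inputs and output, and is
    # responsible for saving anything that backward will need.
    @staticmethod
    def setup_context(ctx, inputs, output):
        (x,) = inputs
        ctx.save_for_backward(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return 2 * x * grad_output

# Usage: gradients flow through apply() as usual.
x = torch.randn(3, requires_grad=True)
Square.apply(x).sum().backward()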