Commit Graph

13 Commits

f10c3f4184 Fix module backward pre-hooks when input doesn't require grad but gradients are changed by the user (#116454)
As per title.

FYI @vkuzo
Pull Request resolved: https://github.com/pytorch/pytorch/pull/116454
Approved by: https://github.com/mikaylagawarecki
2023-12-28 18:32:50 +00:00
6f3cd046ab [BE] remove skipIfDynamo for some module hook tests (#114387)
As titled.

Test Plan:
existing tests.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/114387
Approved by: https://github.com/ezyang
2023-11-22 22:15:34 +00:00
2f51b9223c Make sure namedtuples are preserved when adding backward hooks on Module (#112433)
As per title.
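
A minimal sketch of the behavior this guards (the module and hook body are illustrative): registering a full backward hook wraps a module's forward outputs, and the fix keeps namedtuple outputs from being flattened into plain tuples.

```python
import collections
import torch
import torch.nn as nn

Output = collections.namedtuple("Output", ["logits", "aux"])

class Net(nn.Module):
    def forward(self, x):
        return Output(logits=x * 2, aux=x + 1)

net = Net()
# Full backward hooks wrap the forward outputs; the fix ensures the
# namedtuple subclass survives that wrapping.
net.register_full_backward_hook(lambda m, grad_in, grad_out: None)

out = net(torch.randn(3, requires_grad=True))
assert isinstance(out, Output)  # still a namedtuple, not a plain tuple
```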
Pull Request resolved: https://github.com/pytorch/pytorch/pull/112433
Approved by: https://github.com/mikaylagawarecki
2023-10-31 18:40:35 +00:00
79c5e33349 [BE] Enable ruff's UP rules and autoformat nn/ mps/ and torch/ (#105436)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/105436
Approved by: https://github.com/malfet, https://github.com/albanD
2023-07-21 07:38:46 +00:00
1ad435772b Added option to always call nn.Module global/non-global forward hooks (#104278)
Fix #103997
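
A hedged sketch of the new option (module and hook are illustrative): passing `always_call=True` to `register_forward_hook` makes the hook fire even when forward raises, which is the gap reported in #103997.

```python
import torch
import torch.nn as nn

class Failing(nn.Module):
    def forward(self, x):
        raise RuntimeError("forward failed")

def cleanup_hook(module, args, output):
    # With always_call=True this runs even though forward raised;
    # output is expected to be None on that path.
    print("hook ran, output:", output)

mod = Failing()
mod.register_forward_hook(cleanup_hook, always_call=True)

try:
    mod(torch.randn(2))
except RuntimeError:
    pass  # the hook still ran before the exception propagated
```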

Pull Request resolved: https://github.com/pytorch/pytorch/pull/104278
Approved by: https://github.com/albanD
2023-07-10 18:58:07 +00:00
ee1c539ecf Fix module backward pre-hooks to actually update gradient (#97983)
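A short example of the API this fixes (the scaling hook is illustrative): a full backward pre-hook can return replacement grad_outputs, and after this fix the returned values actually feed the module's backward.

```python
import torch
import torch.nn as nn

lin = nn.Linear(2, 2)

def scale_grad(module, grad_output):
    # Returning a new tuple replaces the gradients flowing into the
    # module's backward; this fix makes the replacement take effect.
    return tuple(g * 2 for g in grad_output)

lin.register_full_backward_pre_hook(scale_grad)

out = lin(torch.randn(4, 2))
out.sum().backward()  # weight/bias grads now reflect the doubled grad_output
```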
Pull Request resolved: https://github.com/pytorch/pytorch/pull/97983
Approved by: https://github.com/albanD
2023-03-30 20:33:44 +00:00
2f6a371ae9 Revert "Optimize nn.Module __call__ fast path for dynamo (#95931)" (#96242)
Reverting due to concerns over silent unsoundness (skipped hooks) if users have added hooks to the hook dicts directly without using the official torch APIs.

This reverts commit 26045336ca323fd27cff2a7340fe896117d5fb6e.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/96242
Approved by: https://github.com/albanD
2023-03-10 01:05:01 +00:00
26045336ca Optimize nn.Module __call__ fast path for dynamo (#95931)
This PR reduces the guard overhead introduced when dynamo traces module forward hooks.

It can, and maybe should, be followed by the wider change proposed by @voznesenskym: optimize specialized nn.Modules by 'observing' any user mutations and directly invalidating the root guard, obviating the need to install other nn.Module guards. (But that observer change seems more involved...)

Idea: maintain a flag, and keep it up to date whenever hooks are added or removed. Check the flag rather than the hook dicts to enter the call fast path (a minimal sketch follows below).
  - RemovableHandle needs to be extended to keep a ref to the nn.Module so it can update the flag on removal.
  - The flag also needs handling in ScriptModule, which still uses the Python call impl when called from Python.
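
A minimal illustrative sketch of the flag idea, assuming one boolean summarizes all hook dicts; the names here (`_has_hooks`, `_refresh_hooks_flag`) are invented for illustration and are not torch's actual internals.

```python
import weakref

class RemovableHandle:
    """Sketch: keeps a weak ref to the module so removal can refresh the flag."""
    def __init__(self, hooks_dict, hook_id, module):
        self.hooks_dict, self.id = hooks_dict, hook_id
        self._module_ref = weakref.ref(module)

    def remove(self):
        self.hooks_dict.pop(self.id, None)
        module = self._module_ref()
        if module is not None:
            module._refresh_hooks_flag()    # keep the flag in sync on removal

class Module:
    def __init__(self):
        self._forward_hooks = {}
        self._has_hooks = False             # the fast-path flag
        self._next_id = 0

    def _refresh_hooks_flag(self):
        self._has_hooks = bool(self._forward_hooks)

    def register_forward_hook(self, hook):
        self._next_id += 1
        self._forward_hooks[self._next_id] = hook
        self._refresh_hooks_flag()          # keep the flag in sync on add
        return RemovableHandle(self._forward_hooks, self._next_id, self)

    def forward(self, x):
        return x + 1

    def __call__(self, x):
        if not self._has_hooks:             # one boolean check, no dict lookups
            return self.forward(x)
        result = self.forward(x)
        for hook in self._forward_hooks.values():
            out = hook(self, x, result)
            if out is not None:
                result = out
        return result

m = Module()
handle = m.register_forward_hook(lambda mod, inp, out: out * 10)
assert m(1) == 20   # hooks present: slow path runs the hook
handle.remove()
assert m(1) == 2    # flag cleared: fast path skips hook machinery
```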

Pull Request resolved: https://github.com/pytorch/pytorch/pull/95931
Approved by: https://github.com/ezyang, https://github.com/voznesenskym
2023-03-04 15:09:40 +00:00
046e88a291 [BE] [3/3] Rewrite super() calls in test (#94592)
Rewrite calls to the Python built-in `super()`. Only non-semantic changes are applied.

- #94587
- #94588
- #94592

Also, methods with only a `super()` call are removed:

```diff
class MyModule(nn.Module):
-   def __init__(self):
-       super().__init__()
-
    def forward(self, ...):
        ...
```

Cases where the rewrite would change the semantics are left unchanged. E.g.:

f152a79be9/caffe2/python/net_printer.py (L184-L190)

f152a79be9/test/test_jit_fuser_te.py (L2628-L2635)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/94592
Approved by: https://github.com/ezyang, https://github.com/seemethere
2023-02-12 22:20:53 +00:00
387ca598a1 [nn] full_backward{_pre}_hook: warning for Module returning dict, list, etc (#87547)
Fixes https://github.com/pytorch/pytorch/issues/87540
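
An illustrative trigger for the new warning (the module is made up): full backward hooks only track Tensor (or tuple-of-Tensor) outputs, so a dict return would otherwise be silently ignored.

```python
import torch
import torch.nn as nn

class DictOut(nn.Module):
    def forward(self, x):
        return {"y": x * 2}  # containers like dicts are not tracked

mod = DictOut()
mod.register_full_backward_hook(lambda m, grad_in, grad_out: None)

# After this change, the forward call below emits a warning that the
# dict output is not visible to the full backward hook.
out = mod(torch.randn(3, requires_grad=True))
```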

Pull Request resolved: https://github.com/pytorch/pytorch/pull/87547
Approved by: https://github.com/albanD
2023-01-18 06:28:00 +00:00
c651944f92 [test_nn] split hooks test from test_nn (#89201)
Ref: https://github.com/pytorch/pytorch/issues/63085

Note: No corresponding XLA PR is needed, as the migrated tests were not run on XLA (they weren't in TestNNDeviceType).

Pull Request resolved: https://github.com/pytorch/pytorch/pull/89201
Approved by: https://github.com/albanD
2022-11-23 08:39:45 +00:00
f5d18574a3 Allow Module forward-pre and forward hooks to take kwargs (#89389)
closes #35643

This PR is mostly borrowed from #82042. Thanks @Padarn for implementing
the first version and debugging the errors.

Based on the discussion in #82042, this PR adds a `with_kwargs`
argument to the `register_forward_pre_hook` and `register_forward_hook`
methods. When the arg is set to true, the provided hook must accept
kwargs. Under the hood, this PR adds `_forward_pre_hooks_with_kwargs`
and `_forward_hooks_with_kwargs` sets to keep track of which hooks
accept kwargs.
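
A small usage sketch, assuming the hook signature gains a kwargs dict when the flag is set (the Scale module is illustrative):

```python
import torch
import torch.nn as nn

class Scale(nn.Module):
    def forward(self, x, *, factor=1.0):
        return x * factor

def log_kwargs(module, args, kwargs, output):
    # With with_kwargs=True the hook also receives the kwargs dict.
    print("factor =", kwargs.get("factor"))

mod = Scale()
mod.register_forward_hook(log_kwargs, with_kwargs=True)
mod(torch.randn(2), factor=3.0)  # prints: factor = 3.0
```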

Differential Revision: [D41431111](https://our.internmc.facebook.com/intern/diff/D41431111)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/89389
Approved by: https://github.com/soulitzer
2022-11-23 02:43:32 +00:00
82698b8954 Add prepend argument to nn.Module hooks (#87370)
cc @ezyang @gchanan
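A quick sketch of the new argument (hooks are illustrative): `prepend=True` pushes a hook to the front of the firing order regardless of registration order.

```python
import torch
import torch.nn as nn

mod = nn.Linear(2, 2)
order = []

mod.register_forward_hook(lambda m, args, out: order.append("registered first"))
# prepend=True makes this hook fire first even though it was added last.
mod.register_forward_hook(lambda m, args, out: order.append("prepended"),
                          prepend=True)

mod(torch.randn(1, 2))
assert order == ["prepended", "registered first"]
```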
Pull Request resolved: https://github.com/pytorch/pytorch/pull/87370
Approved by: https://github.com/soulitzer
2022-10-25 19:18:04 +00:00