very small typo in fsdp2 comment (#163155)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/163155
Approved by: https://github.com/awgu, https://github.com/Skylion007
Author: Hanchen Zhang
Date: 2025-09-17 20:19:38 +00:00
Committed by: PyTorch MergeBot
Parent: 876824f174
Commit: dfda2dfd53


@@ -230,7 +230,7 @@ class FSDPState(_State):
         self, module: nn.Module, args: tuple[Any, ...], kwargs: dict[str, Any]
     ) -> tuple[tuple[Any, ...], dict[str, Any]]:
         # When composing with module-hook-based activation checkpointing, the
-        # the pre-backward hook is responsible for the unshard
+        # pre-backward hook is responsible for the unshard
         if self._training_state == TrainingState.PRE_BACKWARD:
             return args, kwargs
         self._training_state = TrainingState.FORWARD
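
For context, below is a minimal sketch (not part of the diff) of the composition the fixed comment describes: module-hook-based activation checkpointing layered with FSDP2's fully_shard. The use of the composable checkpoint API, the fully_shard import path, and the single-rank gloo setup are all illustrative assumptions, not part of this commit. During backward, activation checkpointing re-runs the module's forward, which re-enters FSDPState's pre-forward hook while the state is PRE_BACKWARD; the early return shown in the hunk skips a redundant unshard, since the pre-backward hook has already unsharded the parameters.

# Illustrative sketch only. Assumes a recent PyTorch where
# torch.distributed.fsdp.fully_shard (FSDP2) and the composable,
# module-hook-based torch.distributed._composable.checkpoint exist,
# and that a single-process CPU/gloo group is acceptable for demo
# purposes (real runs typically use torchrun with NCCL on GPUs).
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.distributed._composable import checkpoint  # hook-based AC
from torch.distributed.fsdp import fully_shard

dist.init_process_group(
    "gloo", init_method="tcp://localhost:29500", rank=0, world_size=1
)

class Block(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.lin = nn.Linear(16, 16)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.lin(x))

block = Block()
checkpoint(block)    # installs activation checkpointing via module hooks
fully_shard(block)   # installs FSDP2's _pre_forward as a forward pre-hook

x = torch.randn(4, 16)
loss = block(x).sum()
# Backward triggers AC recomputation: forward re-enters _pre_forward while
# _training_state == PRE_BACKWARD, hitting the early return in the hunk above.
loss.backward()
dist.destroy_process_group()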