Issue warning with reference to user code rather than torch (#155112)

Re-raising #129959, since that PR was closed.

Warning message before:
```
/home/admin/.local/share/hatch/env/virtual/toms-project-1/Qv9k_r_5/dev/lib/python3.10/site-packages/torch/cuda/amp/grad_scaler.py:120: UserWarning: torch.cuda.amp.GradScaler is enabled, but CUDA is not available.  Disabling.
```

Warning message after:
```
/path/to/my/code:91: UserWarning: torch.cuda.amp.GradScaler is enabled, but CUDA is not available.  Disabling.
```

This helps the user find where the issue stems from in their own code. What do you think?

(Looks like `skip_file_prefixes` isn't available until Python 3.12 at the earliest, so `stacklevel` is used instead...)
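
For context, `stacklevel` tells `warnings.warn` which stack frame to attribute the warning to: the default `stacklevel=1` points at the `warn` call itself, while `stacklevel=2` points at that function's caller. A minimal, self-contained sketch of the idea (the `init_scaler` helper is hypothetical, standing in for `GradScaler.__init__`):

```
import warnings


def init_scaler(enabled: bool = True) -> None:
    """Hypothetical stand-in for GradScaler.__init__."""
    if enabled:
        # stacklevel=2 attributes the warning to the caller of init_scaler,
        # not to this line inside the "library".
        warnings.warn(
            "torch.cuda.amp.GradScaler is enabled, but CUDA is not available.  Disabling.",
            stacklevel=2,
        )


init_scaler()  # the UserWarning is now reported at this line, i.e. in "user code"
```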

Pull Request resolved: https://github.com/pytorch/pytorch/pull/155112
Approved by: https://github.com/Skylion007, https://github.com/cyyever
Author: Tom McTiernan
Date: 2025-07-14 05:24:20 +00:00
Committed by: PyTorch MergeBot
Parent: 9ca080db87
Commit: aa11628576

```
@@ -134,7 +134,8 @@ class GradScaler:
         if self._device == "cuda":
             if enabled and torch.cuda.amp.common.amp_definitely_not_available():
                 warnings.warn(
-                    "torch.cuda.amp.GradScaler is enabled, but CUDA is not available.  Disabling."
+                    "torch.cuda.amp.GradScaler is enabled, but CUDA is not available.  Disabling.",
+                    stacklevel=2,
                 )
                 self._enabled = False
```
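
With this change, a CPU-only run that constructs a scaler should see the warning attributed to its own call site rather than to the library, roughly as follows (the script name and line number are illustrative; the deprecated `torch.cuda.amp.GradScaler()` spelling behaves the same here):

```
# train.py (illustrative)
import torch

scaler = torch.amp.GradScaler("cuda")
# Before: UserWarning reported from inside torch's grad_scaler.py
# After:  UserWarning reported at the line above, in the user's script
```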