[dynamo] do not issue lru_cache warning for functions in the top-level torch namespace (#157598)

The `lru_cache` usage warning was being incorrectly raised for `torch.get_device_module()`, which lives in the top-level `torch` namespace.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/157598
Approved by: https://github.com/Sidharth123-cpu
This commit is contained in:
William Wen
2025-07-03 21:46:19 -07:00
committed by PyTorch MergeBot
parent 64f2ec77f8
commit 52e4e41cbc
2 changed files with 2 additions and 1 deletion


@@ -1589,7 +1589,7 @@ class WrapperUserFunctionVariable(VariableTracker):
         target_fn = getattr(self.wrapper_obj, self.attr_to_trace, None)
         module_name = getattr(target_fn, "__module__", "") or ""
-        if not module_name.startswith("torch."):
+        if module_name.split(".", maxsplit=1)[0] != "torch":
             msg = (
                 "Dynamo detected a call to a `functools.lru_cache`-wrapped "
                 "function. Dynamo ignores the cache wrapper and directly "
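The behavioral difference fixed by this diff can be sketched in isolation. The helper names below are hypothetical (the real code inlines the checks); they only contrast the old prefix test against the new first-component test:

```python
# Sketch of the guard change, not the actual Dynamo code.

def is_torch_module_old(module_name: str) -> bool:
    # Old check: matches submodules like "torch._dynamo",
    # but NOT the top-level "torch" module itself.
    return module_name.startswith("torch.")

def is_torch_module_new(module_name: str) -> bool:
    # New check: the first dotted component must be exactly "torch",
    # which matches both "torch" and "torch.<anything>".
    return module_name.split(".", maxsplit=1)[0] == "torch"

# torch.get_device_module has __module__ == "torch":
print(is_torch_module_old("torch"))        # False -> warning was raised
print(is_torch_module_new("torch"))        # True  -> warning suppressed
print(is_torch_module_new("torch.nn"))     # True in both versions
print(is_torch_module_new("torchvision"))  # False: not the torch namespace
```

Note that `split(".", maxsplit=1)[0]` also correctly rejects look-alike package names such as `torchvision`, which a naive `startswith("torch")` (without the dot) would have matched.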