Mirror of https://github.com/pytorch/pytorch.git (synced 2025-11-11 22:34:53 +08:00)
[draft_export] add LOC for data-dep error logging (#145443)
Summary:
Maybe this is too much info, but it's difficult to read old draft export reports whose stack traces are out of sync with the current codebase. Data-dependent errors now look like:
```
2. Data dependent error.
When exporting, we were unable to evaluate the value of `u306`.
This occurred at the following stacktrace:
File /data/users/pianpwk/fbsource/buck-out/v2/gen/fbcode/78204cab86e8a0fb/sigmoid/inference/ts_migration/__pt2i_readiness_main__/pt2i_readiness_main#link-tree/caffe2/torch/fb/training_toolkit/common/proxy_module_thrift/embedding_bag_proxy.py, lineno 109, in _forward_impl:
`if offsets[-1] > len(input):`
As a result, it was specialized to evaluate to `261`, and asserts were inserted into the graph.
Please add `torch._check(...)` to the original code to assert this data-dependent assumption.
Please refer to https://docs.google.com/document/d/1kZ_BbB3JnoLbUZleDT6635dHs88ZVYId8jT-yTFgf3A/edit#heading=h.boi2xurpqa0o for more details.
```
This would be even more helpful for reports on torch-packaged models, but that requires some additional work on PT2I-specific stack trace processing.
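For context, a minimal sketch (not part of this PR) of how a user might follow the report's suggestion and add `torch._check(...)` near the flagged line. The function and tensor names (`_forward_impl`, `input`, `offsets`) are taken from the example report above; the exact bound asserted is an assumption for illustration.

```python
import torch

# Hypothetical fix for the reported branch `if offsets[-1] > len(input):`.
# The asserted bound is an assumption, not taken from the original model code.
def _forward_impl(input: torch.Tensor, offsets: torch.Tensor) -> torch.Tensor:
    last_offset = offsets[-1].item()  # becomes an unbacked symbol (e.g. `u306`) under export
    # Assert the data-dependent assumption so export does not silently
    # specialize the value to a constant.
    torch._check(last_offset <= input.size(0))
    # ... original forward logic continues here ...
    return input
```

Under export, the check records the assumption on the unbacked value, so the branch can be resolved instead of being specialized without the user's knowledge.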
Test Plan: .
Differential Revision: D68534017
Pull Request resolved: https://github.com/pytorch/pytorch/pull/145443
Approved by: https://github.com/angelayi
Committed by: PyTorch MergeBot
Parent: c32bafeb0b
Commit: cbc4094298
```diff
@@ -55,6 +55,17 @@ def hash_stack(stack: list[dict[str, str]]) -> str:
     return ";".join(f'line: {s["line"]} filename: {s["filename"]}' for s in stack)
 
 
+def get_loc(filename: str, lineno: int) -> Optional[str]:
+    try:
+        with open(filename) as f:
+            for i, line in enumerate(f):
+                if i == lineno - 1:
+                    return line.strip()
+    except FileNotFoundError:
+        pass
+    return None
+
+
 class FailureReport:
     def __init__(
         self, failure_type: FailureType, data: dict[str, Any], xfail: bool = False
@@ -90,10 +101,18 @@ class FailureReport:
         """
 
         elif self.failure_type == FailureType.DATA_DEPENDENT_ERROR:
+            loc = None
+            if self.data["stack"]:
+                frame = self.data["stack"][-1]
+                loc = (
+                    f"`{get_loc(str_to_filename[frame['filename']], frame['line'])}`"
+                    or ""
+                )
             return f"""Data dependent error.
-When exporting, we were unable to figure out if the expression `{self.data["expr"]}` always holds.
+When exporting, we were unable to evaluate the value of `{self.data["expr"]}`.
 This was encountered {self.data["occurrences"]} times.
-This occurred at the following stacktrace: {prettify_stack(self.data["stack"], str_to_filename)}.
+This occurred at the following stacktrace: {prettify_stack(self.data["stack"], str_to_filename)}:
+{loc}
 As a result, it was specialized to a constant (e.g. `{self.data["result"]}` in the 1st occurrence), and asserts were inserted into the graph.
 
 Please add `torch._check(...)` to the original code to assert this data-dependent assumption.
```
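For illustration, a self-contained sketch of the `get_loc` helper added above, using a hypothetical temporary file to show what it returns on a hit and that a missing file quietly yields `None`.

```python
from typing import Optional
import tempfile

def get_loc(filename: str, lineno: int) -> Optional[str]:
    # Return the stripped source line at 1-based `lineno`, or None if unavailable.
    try:
        with open(filename) as f:
            for i, line in enumerate(f):
                if i == lineno - 1:
                    return line.strip()
    except FileNotFoundError:
        pass
    return None

# Hypothetical file standing in for the module named in the report.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("def _forward_impl(input, offsets):\n    if offsets[-1] > len(input):\n        ...\n")
    path = f.name

print(get_loc(path, 2))                 # -> "if offsets[-1] > len(input):"
print(get_loc("does_not_exist.py", 1))  # -> None
```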