4 Commits

711c8c821e shape guards (#161178)
Summary: This PR introduces shape guards to export. Previously, only value ranges, equalities, and specializations were tracked for symbolic expressions, and a forward hook checked them. Now we generate a function that checks the shape guards and call it in the exported program.
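
A minimal sketch of the idea, with an illustrative guard function and illustrative guards (not the actual generated code): instead of relying on a forward hook, the exported program calls a guard-checking function on its inputs up front.

```python
import torch

def _check_shape_guards(x):
    # Illustrative guards derived from symbolic expressions, e.g. s0 = x.size(0);
    # pretend export recorded s0 >= 2 and s0 % 2 == 0.
    if not (x.size(0) >= 2 and x.size(0) % 2 == 0):
        raise RuntimeError("shape guard failed for s0 = x.size(0)")

def exported_forward(x):
    _check_shape_guards(x)                # guard check called inside the program
    return x.reshape(x.size(0) // 2, -1)  # relies on s0 being even

print(exported_forward(torch.randn(4, 3)).shape)  # torch.Size([2, 6])
# exported_forward(torch.randn(3, 3)) would raise the guard error instead
```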

Test Plan:
updated several tests

Rollback Plan:

Differential Revision: D80713603

Pull Request resolved: https://github.com/pytorch/pytorch/pull/161178
Approved by: https://github.com/tugsbayasgalan
2025-09-08 22:44:09 +00:00
0a0a73a9a9 [cond] don't trace fw and bw graph in autograd key (#148930)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/148930
Approved by: https://github.com/zou3519
2025-03-24 17:07:29 +00:00
f28bac76f5 [AOTI Minifier] Save EP instead of graphs (#141159)
Summary:
`repro.py` can have nested graph modules, e.g.

```
class Repro(torch.nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.true_graph_0 = GraphModule()

    def forward(self):
        true_graph_0 = self.true_graph_0
        return (true_graph_0,)
```

So dumping the graph module as a string does not always work.

Therefore,
1) we save an exported program in repro.py instead (see the sketch below), and
2) we still dump the graph module string, but only in comments.

We also added two flags to `minifier_launcher.py`:
- `minifier-export-mode`: whether strict or non-strict export is used in the minifier
- `skip-export-error`: whether intermediate graphs that cannot be exported are skipped
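
A minimal sketch of what saving the repro as an exported program can look like, using the public `torch.export.save` / `torch.export.load` serialization entry points (the module and file name here are illustrative):

```python
import torch

class M(torch.nn.Module):
    def forward(self, x):
        return x + 1

# Export once, then serialize the ExportedProgram instead of printing graph code.
ep = torch.export.export(M(), (torch.randn(2),))
torch.export.save(ep, "repro.pt2")

# repro.py reloads a faithful program, including any nested graph modules.
ep2 = torch.export.load("repro.pt2")
print(ep2.module()(torch.randn(2)))
```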

Test Plan:
```
buck2 run  fbcode//caffe2/test/inductor:minifier_utils_cpu  -- -r string
python test/inductor/test_minifier.py
```

Differential Revision: D66175257

Pull Request resolved: https://github.com/pytorch/pytorch/pull/141159
Approved by: https://github.com/henrylhtsang
2024-11-22 01:51:10 +00:00
c05813d2a9 [AOTI Minifier] Exclude illegal graphs from minifier search (#140999)
Summary:
Some graphs produced by the minifier graph cutter cannot be used for AOTI/export at all (illegal graphs); these should be considered graphs that don't fail in the minifier, so the minifier keeps searching.

One example is the following graph, where `true_graph_0` is an fx.GraphModule. Here, `torch.export.export()` would raise a `UserError` with error type `UserErrorType.INVALID_OUTPUT`.

```
graph():
    %true_graph_0 : [num_users=1] = get_attr[target=true_graph_0]
    return (true_graph_0,)
```

This graph could be obtained from the module below:

```python
class M(torch.nn.Module):
    def forward(self, x, flag):
        flag = flag.item()

        def true_fn(x):
            return x.clone()

        return torch.cond(flag > 0, true_fn, true_fn, [x])
```

So we detect such errors and exclude these graphs from the minifier's search (i.e., treat them as graphs that didn't fail).

This is okay and won't miss any actual errors, since the AOTI minifier is only designed to catch errors in the AOTI phase; it is not responsible for catching export bugs.
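
A minimal sketch of the filtering idea (the `graph_fails` name and the omitted AOTI step are illustrative, not the actual minifier code), assuming `UserError` exposes its error type as in `torch._dynamo.exc`:

```python
import torch
from torch._dynamo.exc import UserError, UserErrorType

def graph_fails(gm, args):
    """Return True only for real AOTI-phase failures, never for un-exportable graphs."""
    try:
        ep = torch.export.export(gm, args)
    except UserError as e:
        # Illegal graph for export (e.g. it only returns a graph-module attribute):
        # report "did not fail" so the minifier keeps searching elsewhere.
        if e.error_type == UserErrorType.INVALID_OUTPUT:
            return False
        raise
    # Placeholder: the real AOTI compile-and-compare check on `ep` is omitted here.
    return False
```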

Test Plan:
```
buck2 run  fbcode//caffe2/test/inductor:test_minifier_utils  -- -r invalid_output
```

Differential Revision: D66143487

Pull Request resolved: https://github.com/pytorch/pytorch/pull/140999
Approved by: https://github.com/henrylhtsang
2024-11-20 03:20:06 +00:00