Commit Graph

19 Commits

f77a9a585c Add shape function for movedim op (#91696)
Signed-Off By: Vivek Khandelwal <vivek@nod-labs.com>
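
For context, shape functions in this registry map input shapes (as `List[int]`) to output shapes. A minimal sketch of what a movedim shape function might look like (hypothetical, not the exact upstream code, which also handles the list-of-dims variant):

```python
from typing import List

def movedim(self: List[int], source: int, destination: int) -> List[int]:
    ndim = len(self)
    # Normalize negative dims.
    src = source + ndim if source < 0 else source
    dst = destination + ndim if destination < 0 else destination
    # Remove the moved dimension, then re-insert it at its destination.
    out = [s for i, s in enumerate(self) if i != src]
    out.insert(dst, self[src])
    return out

assert movedim([2, 3, 4], 0, 2) == [3, 4, 2]
```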

Pull Request resolved: https://github.com/pytorch/pytorch/pull/91696
Approved by: https://github.com/davidberard98
2023-01-06 18:24:52 +00:00
8695f0cced Rectify native_batch_norm schema by splitting it into two legit schemas (#88697)
Using the same repro from the issue (but with BatchNorm2D)

Rectifies the native_batch_norm schema by splitting it into two:
1. one with non-optional, aliasable running_mean and running_var inputs
2. one without those parameters at all (the no_stats variation)
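
For illustration, the two behaviors already differ at the functional API level; below is a hedged example using `torch.nn.functional.batch_norm` (the exact new schema names and signatures live in native_functions.yaml and are not reproduced here):

```python
import torch
import torch.nn.functional as F

x = torch.randn(2, 3, 4, 4)
weight, bias = torch.ones(3), torch.zeros(3)

# Variant 1: running stats are real (aliasable) tensors, mutated in training mode.
running_mean, running_var = torch.zeros(3), torch.ones(3)
out = F.batch_norm(x, running_mean, running_var, weight, bias,
                   training=True, momentum=0.1, eps=1e-5)

# Variant 2 ("no_stats"): no running stats at all; batch statistics are used directly.
out_no_stats = F.batch_norm(x, None, None, weight, bias,
                            training=True, momentum=0.1, eps=1e-5)
```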

**Calling for name suggestions!**

## Test plan
I've added tests in test_functionalization.py as well as an entry in common_method_invocations.py for `native_batch_norm_legit`.
CI should pass.

## Next steps
For backward/forward compatibility (BC/FC) reasons, we reroute native_batch_norm to call the new schemas ONLY through the Python dispatcher; in two weeks or so, we should make `native_batch_norm_legit` the official batch_norm.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/88697
Approved by: https://github.com/albanD
2022-11-23 23:23:17 +00:00
7ddf167ba5 Move the asserts in the upsample_nearest_2d shape function (#85801)
The assert checks are moved to the top, and the function now returns `out`. This is needed by the downstream Torch-MLIR project to correctly determine the output type.
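
A simplified sketch of the described structure (hypothetical; the real function also handles the scale-factor path): asserts up front, then a single `return out`.

```python
from typing import List, Optional

def upsample_nearest2d(input: List[int], output_size: Optional[List[int]]) -> List[int]:
    # Validation moved to the top of the function.
    assert len(input) == 4
    assert output_size is not None and len(output_size) == 2

    out: List[int] = [input[0], input[1], output_size[0], output_size[1]]
    return out  # a single `return out` makes the output type easy to infer
```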

Pull Request resolved: https://github.com/pytorch/pytorch/pull/85801
Approved by: https://github.com/eellison
2022-09-30 18:30:06 +00:00
35d4fa444b Fix for transposed convolution shape functions (#83557)
This fixes an issue with #80860 that occurs when the input and output channel counts differ.
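
For reference (standard PyTorch semantics, not the patch itself): a transposed convolution's weight is laid out as (in_channels, out_channels / groups, kH, kW), so the output channel count comes from the weight's second dimension (times groups) rather than the first, which is exactly where differing in/out channel counts matter for a shape function.

```python
import torch

conv_t = torch.nn.ConvTranspose2d(in_channels=4, out_channels=8, kernel_size=3)
print(conv_t.weight.shape)  # torch.Size([4, 8, 3, 3]) -- out channels live in dim 1
```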
Pull Request resolved: https://github.com/pytorch/pytorch/pull/83557
Approved by: https://github.com/Gamrix
2022-08-22 19:05:41 +00:00
652fb03355 Symbolic Shape Analysis: Add Generalized List of Tensor Shape Support (#78679)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/78679
Approved by: https://github.com/davidberard98
2022-08-17 19:13:26 +00:00
59fccab857 [Shape Fns] Fix handling of empty dim list in sum_mean_dim shape fn (#83357)
The current implementation of the `sum_mean_dim` shape function
takes `dim=[]` and `dim=None` to mean "no reduction". However, in the
ops `torch.sum` and `torch.mean`, both `dim=[]` and `dim=None` are
equivalent to "reduce along all dimensions". This commit fixes the
handling of `dim` in the `sum_mean_dim` shape function.
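
A minimal sketch of the corrected behavior (hypothetical, in the spirit of the shape function rather than a copy of it):

```python
from typing import List, Optional

def sum_mean_dim(self: List[int], dim: Optional[List[int]], keep_dim: bool) -> List[int]:
    # Both dim=None and dim=[] now mean "reduce over all dimensions".
    if dim is None or len(dim) == 0:
        dim = list(range(len(self)))
    out: List[int] = []
    for i, size in enumerate(self):
        reduced = i in dim or (i - len(self)) in dim  # accept negative dims too
        if reduced:
            if keep_dim:
                out.append(1)
        else:
            out.append(size)
    return out

assert sum_mean_dim([2, 3], [], False) == []      # full reduction
assert sum_mean_dim([2, 3], [1], True) == [2, 1]  # reduce dim 1, keep it
```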
Pull Request resolved: https://github.com/pytorch/pytorch/pull/83357
Approved by: https://github.com/Gamrix
2022-08-16 17:13:21 +00:00
c177a7124c Adding additional debug logging and documentation for shape functions (#77115)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/77115
Approved by: https://github.com/eellison
2022-08-15 23:39:28 +00:00
a7e7fbab82 Add shape functions for conv_transpose2d.input and convolution (#80860)
As @silvasean requested in [this issue](https://github.com/llvm/torch-mlir/pull/917#discussion_r896154545), here is the shape code from Torch-MLIR for conv_transpose2d.input and convolution (updated for the transposed case).
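
For reference, these are the standard per-dimension output-size formulas such shape functions encode (simplified; the upstream code also handles channels, groups, and arbitrary spatial rank):

```python
def conv_out_dim(in_size: int, kernel: int, stride: int, pad: int, dilation: int) -> int:
    # Regular convolution, per spatial dimension.
    return (in_size + 2 * pad - dilation * (kernel - 1) - 1) // stride + 1

def conv_transpose_out_dim(in_size: int, kernel: int, stride: int, pad: int,
                           dilation: int, output_padding: int) -> int:
    # Transposed convolution, per spatial dimension.
    return (in_size - 1) * stride - 2 * pad + dilation * (kernel - 1) + output_padding + 1
```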
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80860
Approved by: https://github.com/Gamrix
2022-08-13 01:19:59 +00:00
c8eae2de52 [Shape Fns] Fix optional None for the actual shape functions (#83092)
It appears the `mean.dim` entry was edited manually in two different ways in the Python source and the associated TorchScript version. This diff reconciles the two. It also fixes the inconsistencies in the generated file for `torch.nonzero`.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/83092
Approved by: https://github.com/eellison
2022-08-10 18:20:18 +00:00
2bfae07a79 Enable dim=None for torch.mean (#81286)
Part of #79525

This will require coordination with XLA before merging, just like #79881
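
Illustrative usage this enables (assuming a PyTorch build that includes the change):

```python
import torch

x = torch.randn(2, 3)
print(torch.mean(x, dim=None).shape)  # torch.Size([]) -- reduces over all dimensions
```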
Pull Request resolved: https://github.com/pytorch/pytorch/pull/81286
Approved by: https://github.com/albanD
2022-07-28 22:34:56 +00:00
23bdb570cf Reland: Enable dim=None for torch.sum (#79881)
Part of #29137

Reland of #75845
Pull Request resolved: https://github.com/pytorch/pytorch/pull/79881
Approved by: https://github.com/albanD, https://github.com/kulinseth
2022-07-09 00:54:42 +00:00
ee6ebfc06b Revert "Enable dim=None for torch.sum (#75845)"
This reverts commit e79a51f7db181be2e6e196d6d9d90403022bc465.

Reverted https://github.com/pytorch/pytorch/pull/75845 on behalf of https://github.com/malfet due to Breaks MacOS builds, see e79a51f7db
2022-06-16 22:01:41 +00:00
e79a51f7db Enable dim=None for torch.sum (#75845)
Part of #29137

Pull Request resolved: https://github.com/pytorch/pytorch/pull/75845
Approved by: https://github.com/ezyang
2022-06-16 20:17:07 +00:00
dbee7e5499 Adding SSA support for convolution_backward
Pull Request resolved: https://github.com/pytorch/pytorch/pull/77283

Approved by: https://github.com/Krovatkin
2022-05-20 18:39:47 +00:00
2a99018147 Adding a way to register both upper and lower bound functions
Pull Request resolved: https://github.com/pytorch/pytorch/pull/77388

Approved by: https://github.com/eellison
2022-05-18 17:34:07 +00:00
1136965aa1 Upstream remaining shape functions from Torch-MLIR. (#76889)
Follow-on to https://github.com/pytorch/pytorch/pull/76592 adding the
rest.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/76889
Approved by: https://github.com/eellison
2022-05-13 17:46:58 +00:00
db21e22b4b [EASY] Quick Fix for broken shape function autogen.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/76703

Approved by: https://github.com/eellison
2022-05-03 17:34:05 +00:00
6b6c63ce5e Upstream argmax shape function.
Keeping this first commit simple to test out the flow. Will bulk-add the
rest once this one goes through.

Shape function taken from:
5192a4e9f3/python/torch_mlir/dialects/torch/importer/jit_ir/build_tools/shape_lib_gen.py (L488)
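
A minimal sketch of an argmax shape function in this style (hypothetical; see the linked shape_lib_gen.py for the actual upstreamed version):

```python
from typing import List, Optional

def argmax(self: List[int], dim: Optional[int] = None, keepdim: bool = False) -> List[int]:
    if dim is None:
        return []  # full reduction produces a 0-d result
    d = dim + len(self) if dim < 0 else dim
    out: List[int] = []
    for i, size in enumerate(self):
        if i == d:
            if keepdim:
                out.append(1)
        else:
            out.append(size)
    return out
```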

Pull Request resolved: https://github.com/pytorch/pytorch/pull/76592
Approved by: https://github.com/eellison
2022-05-03 16:52:09 +00:00
f65eb09d6b [JIT] Move shape function definitions to Python
Moves JIT shape function registration to Python. As with JIT decompositions, a script must be run after adding new definitions to serialize them into a C++ file.

This was requested so that Torch-MLIR could define shape functions in Python and upstream them. cc @silvasean @makslevental
Pull Request resolved: https://github.com/pytorch/pytorch/pull/75546
Approved by: https://github.com/davidberard98
2022-04-19 20:59:44 +00:00