In #163455, the `reshape` was not a pure view op: the `permute` before it produced a non-contiguous tensor, which triggered a data copy during the reshape. This PR improves the implementation by removing the `urtensor` intermediate tensor entirely; simply expanding `xtensor` achieves the `repeat` effect. Before this PR there were two data copies (in `urtensor.copy_` and `urtensor.reshape`); now there is only one, in the `.copy_()`. The reshape no longer copies data because it operates on a contiguous tensor. Note that we do want exactly one copy, since the elements must be duplicated for the repeats: users can then modify single elements in place without affecting the others. (A sketch of this one-copy structure appears below.)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/163842
Approved by: https://github.com/Skylion007

Co-authored-by: Aaron Gokaslan <aaronGokaslan@gmail.com>
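
As a rough illustration of the structure described above (this is a minimal sketch, not the actual decomposition touched by this PR; the 2-D restriction and the helper name `repeat_2d_sketch` are assumptions for the example), a repeat can be built from a pure-view `expand`, a single `.copy_()` into a contiguous buffer, and a copy-free `reshape`:

```python
import torch

# Hypothetical sketch (not the PyTorch decomposition itself): achieve `repeat`
# with a pure-view expand, exactly one data copy, and a copy-free reshape.
def repeat_2d_sketch(x: torch.Tensor, r0: int, r1: int) -> torch.Tensor:
    a, b = x.shape
    # Pure views: reshape of a contiguous tensor + expand never copy data.
    xtensor = x.reshape(1, a, 1, b).expand(r0, a, r1, b)
    # The single data copy: duplicate the elements into a contiguous buffer.
    result = torch.empty(r0, a, r1, b, dtype=x.dtype, device=x.device)
    result.copy_(xtensor)
    # Reshape on the contiguous result is again a pure view, so no second copy.
    return result.reshape(a * r0, b * r1)

x = torch.arange(6).reshape(2, 3)
assert torch.equal(repeat_2d_sketch(x, 2, 3), x.repeat(2, 3))
```

Because the expanded view is materialized exactly once, the result owns its own storage, so modifying a single element afterwards does not affect the other repeated copies.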