fix DTensor doc link (#162494)

Small fix for the DTensor link on https://docs.pytorch.org/docs/main/distributed.tensor.parallel.html, which previously rendered as:
<img width="890" height="274" alt="image" src="https://github.com/user-attachments/assets/6ee7fc7c-e0fe-4f5e-ab7e-a895bb3fa79f" />

After the fix it renders as:

<img width="909" height="320" alt="image" src="https://github.com/user-attachments/assets/8b2c41ef-1684-4597-8dae-144b49723796" />

Pull Request resolved: https://github.com/pytorch/pytorch/pull/162494
Approved by: https://github.com/XilunWu
Author: Howard Huang
Date: 2025-09-09 11:13:59 -07:00
Committed by: PyTorch MergeBot
Parent: e2545487de
Commit: 4d66a3b894


@@ -5,7 +5,7 @@
 # Tensor Parallelism - torch.distributed.tensor.parallel
 
 Tensor Parallelism(TP) is built on top of the PyTorch DistributedTensor
-(DTensor)[https://github.com/pytorch/pytorch/blob/main/torch/distributed/tensor/README.md]
+([DTensor](https://github.com/pytorch/pytorch/blob/main/torch/distributed/tensor/README.md))
 and provides different parallelism styles: Colwise, Rowwise, and Sequence Parallelism.
 
 :::{warning}
@@ -89,4 +89,4 @@ Parallelized cross-entropy loss computation (loss parallelism), is supported via
 ```
 :::{warning}
 The loss_parallel API is experimental and subject to change.
-:::
+:::
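
For context, a minimal sketch of the API surface the edited page documents: `parallelize_module` with the Colwise/Rowwise styles, and the experimental `loss_parallel` context manager that the warning above refers to. The device mesh setup, layer sizes, and plan keys below are illustrative assumptions and are not part of this PR.

```python
import torch
import torch.nn as nn
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.tensor.parallel import (
    ColwiseParallel,
    RowwiseParallel,
    loss_parallel,
    parallelize_module,
)

# Assumes a torchrun launch with one process per GPU; the 1-D mesh spans them all.
mesh = init_device_mesh("cuda", (torch.cuda.device_count(),))

# Toy MLP; children of an nn.Sequential are addressed by their index names ("0", "2").
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 16)).cuda()
model = parallelize_module(
    model,
    mesh,
    # Shard the first linear layer column-wise and the last one row-wise.
    {"0": ColwiseParallel(), "2": RowwiseParallel()},
)

out = model(torch.randn(8, 16, device="cuda"))

# loss_parallel() is the experimental context manager the doc's warning refers to.
# It expects the logits as a DTensor sharded along the class dimension, e.g. from a
# final ColwiseParallel(use_local_output=False) layer:
# with loss_parallel():
#     loss = torch.nn.functional.cross_entropy(logits, labels)
```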