frozenleaves/pytorch
mirror of https://github.com/pytorch/pytorch.git synced 2025-10-21 05:34:18 +08:00
pytorch/torch/distributed/algorithms at commit dda071587f0522a16b237f92cbe27fd13a1a1c11
Edward Yang dda071587f Revert "Make distributed modules importable even when backend not built (#159889)" (#162568)
This reverts commit a0d026688cd69583d5a4e0c6f3e5fda141a7f4a9.

Revert "Always build USE_DISTRIBUTED. (#160449)"

This reverts commit d80297a6846f1f2c36fd4f19e22919f2abe8fcea.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/162568
Approved by: https://github.com/huydhn
2025-09-10 04:29:42 +00:00
_checkpoint | PEP585 update - torch/distributed (#145164) | 2025-01-21 04:23:29 +00:00
_comm_hooks | [BE]: Update mypy to 1.11.2 (#133816) | 2024-09-16 19:44:11 +00:00
_optimizer_overlap | PEP585 update - torch/distributed (#145164) | 2025-01-21 04:23:29 +00:00
_quantization | [BE][Easy] enable UFMT for torch/distributed/{algorithms,autograd,benchmarks,checkpoint,elastic}/ (#128866) | 2024-06-18 13:51:53 +00:00
ddp_comm_hooks | Support ddp zero hook XCCL path (#159240) | 2025-08-13 12:37:33 +00:00
model_averaging | Revert "Make distributed modules importable even when backend not built (#159889)" (#162568) | 2025-09-10 04:29:42 +00:00
__init__.py | [BE][Easy] enable UFMT for torch/distributed/{algorithms,autograd,benchmarks,checkpoint,elastic}/ (#128866) | 2024-06-18 13:51:53 +00:00
join.py | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00