`Sparsity` as a term doesn't reflect the tools developed by the AO team. The `torch/ao/sparsity` package also contains utilities for structured pruning, which internally we have always referred to as simply "pruning". To avoid confusion, we renamed `Sparsity` to `Prune`. We will not be introducing backwards compatibility, as this toolset has so far been kept under silent development. The documentation will be updated to reflect this change as well.

**TODO:**
- [ ] Update the tutorials
- [ ] Confirm there are no BC breakages
- [ ] Reflect the changes in the trackers and RFC docs

Fixes #ISSUE_NUMBER

Pull Request resolved: https://github.com/pytorch/pytorch/pull/84867
Approved by: https://github.com/supriyar
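For downstream code, the rename amounts to a change of import path (no compatibility alias is kept, per the note above). A minimal sketch of the before and after:

```python
# Old import path, removed by this change:
# from torch.ao import sparsity

# New import path after the rename:
from torch.ao import pruning
```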
The `torch/ao` package `__init__` (17 lines · 398 B · Python):
# torch.ao is a package with a lot of interdependencies.
# We will use lazy import to avoid cyclic dependencies here.

__all__ = [
    "nn",
    "ns",
    "quantization",
    "pruning",
]


def __getattr__(name):
    if name in __all__:
        import importlib
        return importlib.import_module("." + name, __name__)
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
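The `__getattr__` hook above is the module-level `__getattr__` from PEP 562: submodules listed in `__all__` are imported only on first attribute access, which avoids the cyclic-dependency problem mentioned in the header comment. A minimal usage sketch, assuming a PyTorch build that ships this `__init__`; the first printed value depends on what `import torch` has already loaded eagerly:

```python
import sys

import torch.ao

# May already be True if something pulled the submodule in eagerly.
print("torch.ao.quantization" in sys.modules)

# Attribute access falls through to torch.ao.__getattr__, which imports the
# submodule on demand; the import system then caches it in sys.modules.
quant = torch.ao.quantization
print(quant is sys.modules["torch.ao.quantization"])  # True

# Names not listed in __all__ raise AttributeError, as usual.
try:
    torch.ao.not_a_submodule
except AttributeError as exc:
    print(exc)
```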