Summary: Ensure that we create deterministic zip archives for the same inputs to make builds deterministic.
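For illustration, a minimal Python sketch of the general technique (sorted entry order, pinned timestamps, fixed permissions); this is not the implementation in this PR, and the function name and paths are hypothetical:

```python
import zipfile

def write_deterministic_zip(out_path, input_paths):
    """Write a zip whose bytes depend only on the input contents.

    Sketch of the general technique: sort entries, pin each entry's
    timestamp and permissions, and avoid OS-specific metadata so the
    same inputs always produce the same archive bytes.
    """
    fixed_date = (1980, 1, 1, 0, 0, 0)  # earliest timestamp the zip format supports
    with zipfile.ZipFile(out_path, "w") as zf:
        for path in sorted(input_paths):  # deterministic entry order
            info = zipfile.ZipInfo(path, date_time=fixed_date)
            info.external_attr = 0o644 << 16          # fixed unix permissions
            info.compress_type = zipfile.ZIP_DEFLATED  # stable for a given zlib version
            with open(path, "rb") as f:
                zf.writestr(info, f.read())
```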
Test Plan: CI
Reviewed By: StanislavGlebik
Differential Revision: D46417033
Pull Request resolved: https://github.com/pytorch/pytorch/pull/102903
Approved by: https://github.com/malfet
Prefer dashes over underscores in command-line options. Add `--command-arg-name` forms to the argument parsers; the old underscore forms (`--command_arg_name`) are kept for backward compatibility.
Both dashes and underscores are used in the PyTorch codebase: some argument parsers accept only dashes, others only underscores. For example, the `torchrun` utility for distributed training accepts only underscore arguments (e.g., `--master_port`). Dashes are more common in other command-line tools, and they appear to be the default choice in the Python standard library:
`argparse.BooleanOptionalAction`: 4a9dff0e5a/Lib/argparse.py (L893-L895)
```python
class BooleanOptionalAction(Action):
    def __init__(...):
        if option_string.startswith('--'):
            option_string = '--no-' + option_string[2:]
            _option_strings.append(option_string)
```
It adds `--no-argname`, not `--no_argname`. Also, typing `_` requires pressing the Shift key, whereas `-` does not.
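As a hedged sketch of how both spellings can coexist (the option name and `dest` below are illustrative, not taken from this PR), `argparse` allows registering the dashed and underscored forms as aliases of the same argument:

```python
import argparse

parser = argparse.ArgumentParser()
# The dashed spelling is canonical; the old underscored spelling is kept
# as an alias so existing scripts keep working. Both map to the same dest.
parser.add_argument(
    "--command-arg-name", "--command_arg_name",
    dest="command_arg_name", default=None,
    help="example option that accepts both spellings",
)

args = parser.parse_args(["--command_arg_name", "value"])  # old spelling still parses
print(args.command_arg_name)  # -> "value"
```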
Pull Request resolved: https://github.com/pytorch/pytorch/pull/94505
Approved by: https://github.com/ezyang, https://github.com/seemethere
Summary:
This change points `multipy::runtime` to use `multipy.package` instead of `torch.package` by copying `_deploy.py` (which is used to pass objects in and out of interpreters) into multipy, and making the necessary changes so that `multipy::runtime` can access `multipy.package` and `_deploy.py`.
X-link: https://github.com/pytorch/multipy/pull/111
Reviewed By: d4l3k
Differential Revision: D38337551
Pulled By: PaliC
Pull Request resolved: https://github.com/pytorch/pytorch/pull/82690
Approved by: https://github.com/d4l3k
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/71072
This PR replaces the old logic of loading frozen torch through CPython by loading zipped torch modules directly into the deploy interpreter. We use an ELF file to embed the zip file as one of its sections and load it back from the interpreter executable. Then we insert the zip file directly into `sys.path` of each initialized interpreter. Python's built-in zip importer can load modules from a zip file as long as the file is on `sys.path`.
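For context, a small standalone sketch of the zip-import behaviour this relies on (the file and module names are made up, and the actual change embeds the zip in an ELF section rather than writing it to disk):

```python
import sys
import zipfile

# Build a tiny zip containing a module, then import it via sys.path.
with zipfile.ZipFile("frozen_modules.zip", "w") as zf:
    zf.writestr("hello.py", "MESSAGE = 'loaded from a zip'\n")

sys.path.insert(0, "frozen_modules.zip")  # zip entries on sys.path are handled by zipimport
import hello

print(hello.MESSAGE)  # -> "loaded from a zip"
```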
Test Plan: buck test //caffe2/torch/csrc/deploy:test_deploy
Reviewed By: shunting314
Differential Revision: D32442552
fbshipit-source-id: 627f0e91e40e72217f3ceac79002e1d8308735d5