pytorch/caffe2/python/optimizer_context.py
Tristan Rice dc7d8a889e caffe2: refactor context to allow being typed (#48340)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/48340

This changes the context-managed classes from being defined via a decorator to using inheritance. Inheritance allows Python static type checking to work correctly.

```
@context.define_context()
class Foo(object): ...

@context.define_context(allow_default=True)
class Bar(object): ...
```

becomes
```
class Foo(context.Managed): ...

class Bar(context.DefaultManaged): ...
```

Behavior differences:
* `arg_name` has been removed since it is not used anywhere
* classes that override `__enter__`/`__exit__` need to call `super()` in those methods (none currently do); see the sketch below
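
For illustration only, a minimal sketch (not part of this PR) of what such an override looks like after the refactor; `TracingContext` is a hypothetical class:

```
from caffe2.python import context


class TracingContext(context.Managed):
    """Hypothetical managed class that counts how often it is entered."""

    def __init__(self):
        self.enter_count = 0

    def __enter__(self):
        self.enter_count += 1
        # The registry bookkeeping now lives in context.Managed, so an
        # overriding class must delegate to it via super().
        return super(TracingContext, self).__enter__()

    def __exit__(self, *args):
        return super(TracingContext, self).__exit__(*args)


with TracingContext() as ctx:
    assert ctx is TracingContext.current()
assert ctx.enter_count == 1
```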

This also adds a `context.pyi` stub file to provide types for Python 3. Python 2 support should not be affected.
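
As a rough idea of what such a stub can look like (the signatures below are assumptions for illustration, not the actual file from the commit):

```
# Illustrative sketch only -- the real context.pyi is not reproduced here,
# and the current() argument list is an assumption.
from typing import Any, Optional, Type, TypeVar

_T = TypeVar("_T", bound="Managed")

class Managed:
    @classmethod
    def current(cls: Type[_T], value: Optional[_T] = ..., required: bool = ...) -> _T: ...
    def __enter__(self: _T) -> _T: ...
    def __exit__(self, *args: Any) -> None: ...

class DefaultManaged(Managed): ...
```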

Test Plan:
ci

  buck test //caffe2/caffe2/python:context_test //caffe2/caffe2/python:checkpoint_test

Reviewed By: dongyuzheng

Differential Revision: D25133469

fbshipit-source-id: 16368bf723eeb6ce3308d6827f5ac5e955b4e29a
2020-11-30 18:31:14 -08:00

54 lines
1.4 KiB
Python

## @package optimizer_context
# Module caffe2.python.optimizer_context

from caffe2.python import context
from caffe2.python.modifier_context import (
    ModifierContext, UseModifierBase)

DEFAULT_OPTIM = 'DEFAULT'


class OptimizerContext(ModifierContext, context.DefaultManaged):
    """
    provide context to allow param_info to have different optimizers
    """

    def has_optimizer(self, name):
        return self._has_modifier(name)

    def get_optimizer(self, name):
        assert self.has_optimizer(name), (
            "{} optimizer is not provided!".format(name))
        return self._get_modifier(name)


class UseOptimizer(UseModifierBase):
    '''
    context class to allow setting the current context.
    Example usage with brew:
        - with UseOptimizer(optim):
            brew.func
        - with UseOptimizer({'WEIGHT': weight_optim}):
            brew.func
        - with UseOptimizer({'DEFAULT': optim, 'BIAS': bias_optim,
                            'WEIGHT': weight_optim}):
            brew.func
        - with UseOptimizer(optim1):
            brew.func
            with UseOptimizer(optim2):
                brew.func
    Example usage with layer:
        optimizers = {'optim1': optim1, 'optim2': optim2}
        with Optimizers(optimizers):
            optim = OptimizerContext.current().get_optimizer('optim1')
            layer(optim=optim)
    '''

    def _context_class(self):
        return OptimizerContext
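
To make the intended use concrete, here is a minimal sketch following the `UseOptimizer` docstring; `weight_optim` and `bias_optim` are placeholder stand-ins, not real optimizer objects:

```
from caffe2.python.optimizer_context import OptimizerContext, UseOptimizer

# Placeholder objects; in real code these would be caffe2.python.optimizer
# instances (e.g. separate optimizers for weights and biases).
weight_optim = object()
bias_optim = object()

with UseOptimizer({'WEIGHT': weight_optim, 'BIAS': bias_optim}):
    # OptimizerContext inherits from context.DefaultManaged, so current()
    # yields a usable default instance even without an explicit
    # `with OptimizerContext():` block.
    ctx = OptimizerContext.current()
    assert ctx.has_optimizer('WEIGHT')
    assert ctx.get_optimizer('WEIGHT') is weight_optim
```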