Commit Graph

279 Commits

1bba0eb35b Add clone_instance for Module (#30168)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/30168

The previous implementation of `clone` in `script::Module` copies both the module instance and the
class type. After we enabled type sharing in https://github.com/pytorch/pytorch/pull/26666, we also
need a function that clones the instance only and shares the underlying class type.

Test Plan:
tbd

Imported from OSS

Differential Revision: D18631324

fbshipit-source-id: dbadcf19695faee0f755f45093b24618c047b9d1
2019-11-21 13:00:34 -08:00
93db2b86d1 Fix type sharing on loaded ScriptModules (#29826)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/29826

After save/load, we lose concrete type information. So if you tried to
script something that contained a loaded ScriptModule as a submodule,
the following sequence happened:
1. During ConcreteType inference, the loaded submodule got a new
inferred type.
2. But it already had a type, so there was a type mismatch.

To fix this, we should generate a ConcreteType directly from the loaded
submodule type (similar to what we do for interfaces). This makes sense
too--the ConcreteModuleType should be empty, since all the "sugaredness"
was stripped out during the save/load process.
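
As a minimal sketch of the scenario this fixes (module names and the file path are illustrative, not from the PR):

```python
import torch
import torch.nn as nn

class Inner(nn.Module):
    def forward(self, x):
        return x + 1

# Save and reload a ScriptModule; the loaded module has lost its concrete
# ("sugared") type information.
torch.jit.script(Inner()).save("inner.pt")
loaded = torch.jit.load("inner.pt")

class Outer(nn.Module):
    def __init__(self):
        super().__init__()
        self.inner = loaded  # loaded ScriptModule used as a submodule

    def forward(self, x):
        return self.inner(x)

# Previously this inferred a fresh type for `inner` and hit a type mismatch;
# with this fix the ConcreteModuleType is generated from the loaded type.
scripted = torch.jit.script(Outer())
```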

Test Plan: Imported from OSS

Differential Revision: D18575009

Pulled By: suo

fbshipit-source-id: 4d329b7e9b7e7624f459e50092e35ab0ab813791
2019-11-20 01:13:09 -08:00
558a777615 Re-unify module and interface in ConcreteModuleType (#29825)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/29825

We made `ModuleInfo` a union initially to represent the idea that a
submodule could either be a regular module or a module interface.

This PR represents module interfaces as a ConcreteModuleType with no
info (e.g. no "sugaredness"), and with the interface type as the
underlying `jitType_`. This has the effect of reducing the special
casing around adding/maintaining module info.

Test Plan: Imported from OSS

Differential Revision: D18575011

Pulled By: suo

fbshipit-source-id: 53e297b39aa1a03bcdadd795ff225aa68fec9d70
2019-11-20 01:13:06 -08:00
63e66fd267 Split ConcreteModuleType into two types (#29824)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/29824

We have two distinct phases/uses for ConcreteModuleType:
1. We are building it up and using it to check whether we can
reuse JIT types. (RawConcreteModuleType)
2. We are using it to satisfy ModuleValue::attr queries.
(ConcreteModuleType)

These types share an underlying `ConcreteModuleTypeData` which
actually stores the relevant info.

Previously they were the same type because I was lazy, but it's been the
source of a bug. So split them to formalize the differing invariants for
the two phases.

Test Plan: Imported from OSS

Differential Revision: D18575010

Pulled By: suo

fbshipit-source-id: 3e4ebcd36e78b947150d8f0dbb74ecccad23e7c4
2019-11-20 01:13:02 -08:00
18bdf97dbb Factor Module into Object and Module
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/29500

Test Plan: Imported from OSS

Differential Revision: D18463064

Pulled By: jamesr66a

fbshipit-source-id: d37bef242a8626593d4b8754042152cfc0f0acb2
2019-11-17 22:58:50 -08:00
902c1f9ef1 Check for mutable default parameters (#29833)
Summary:
Fix for https://github.com/pytorch/pytorch/issues/21545

We were silently giving the wrong semantics previously:

Python behavior:
```
def test(x=[]):
   x.append(1)
   return len(x)

print(test()) # 1
print(test()) # 2
```

By checking at the python layer, we prevent any new models from serializing this behavior but do not break existing serialized models.
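
A hedged sketch of the new behavior (the exact exception type and message are assumptions):

```python
import torch

def test(x=[]):
    x.append(1)
    return len(x)

# With this check in place, scripting a function with a mutable default
# argument is rejected at the Python layer instead of being compiled with
# semantics that differ from eager Python.
try:
    torch.jit.script(test)
except Exception as e:
    print("rejected:", e)
```
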
Pull Request resolved: https://github.com/pytorch/pytorch/pull/29833

Differential Revision: D18513168

Pulled By: eellison

fbshipit-source-id: 6fe73f28e1f9d39dedeaf67a04718089d14401a1
2019-11-14 18:28:48 -08:00
627f2823e0 remove _register_* bindings from python (#29499)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/29499

This changes how DataParallel and trace module creation work so that
we no longer need to mutate the Module class after it has been created.

The only remaining usages of register_* functions are now inside C++
tests.

Test Plan: Imported from OSS

Differential Revision: D18413652

Pulled By: zdevito

fbshipit-source-id: f039e5400cd016632768be4547892f6a69645c20
2019-11-11 13:52:46 -08:00
4e4e29a511 Simplify ScriptModule bindings. (#29432)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/29432

This removes a lot of the private methods on torch._C.ScriptModule,
and instead uses slot_dict_impl views to implement _parameters,
_buffers, and _modules in nn.Module.

A followup PR should also remove the _register_attribute,
_register_module, and _register_parameter methods, but this requires
more refactoring of the way tracing creates modules and replication
for data parallel works.

Test Plan: Imported from OSS

Differential Revision: D18387963

Pulled By: zdevito

fbshipit-source-id: f10d47afeb30c1e05d704ae5ac4166830933125c
2019-11-11 13:52:36 -08:00
b14c5943d4 Handle warning in torchscript (#27154)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/27154

Fix for #25859

* #28283 Fix clang-tidy errors in csrc/Module.cpp
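
A hedged sketch of emitting a warning from TorchScript, which this commit relates to (the exact failure mode fixed in #25859 is not restated here, so treat this purely as an illustration):

```python
import warnings
import torch

@torch.jit.script
def relu_with_warning(x: torch.Tensor) -> torch.Tensor:
    # warnings.warn is compiled into the graph and emitted at runtime.
    if bool((x < 0).any()):
        warnings.warn("input contained negative values")
    return torch.relu(x)

print(relu_with_warning(torch.tensor([-1.0, 2.0])))
```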

Test Plan: Imported from OSS

Differential Revision: D18249631

Pulled By: albanD

fbshipit-source-id: 4e9bbad07cc39e7c7f0546ef7587bd4ab2dd644e
2019-11-07 08:35:16 -08:00
796363147f Implement more of of the nn.Module API (#28828)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/28828

This updates torch::script::Module to more closely match the behavior
of nn.Module. In particular, it implements the (optionally recursive)
iterators that retrieve submodules, parameters, and buffers and makes
their names match the python versions.

This also removes the individual accessors for Parameter, Module, Buffer, etc.
and replaces them with a single `attr` function which is equivalent to
writing `a.foo` in Python (`setattr` emulates `a.foo = v`).
As we build out the user-facing API for TorchScript values this will end
up matching how an attribute is accessed on general objects.

This PR preserves the python bindings for script::Module by emulating the
old API at the binding level. A followup will clean up the usage to more
directly match the C++ API.

Test Plan: Imported from OSS

Differential Revision: D18197611

Pulled By: zdevito

fbshipit-source-id: 7ee4dcbb258605d1c988314b05d938423f1ccee5
2019-11-06 22:58:25 -08:00
309b28ee3a Trace module calls
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/29261

Test Plan: Imported from OSS

Differential Revision: D18343363

Pulled By: jamesr66a

fbshipit-source-id: 0c6394205e2c0ea8708028d20df83fe17b466ff4
2019-11-06 15:05:49 -08:00
9492994feb submodule swapping via module interface (#28409)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/28409

This PR enables submodule swapping via module interfaces. The user can
declare a submodule as a module interface type in the ScriptModule;
during compilation we record the module interface type in the
ModuleInfo of the ConcreteModuleType, the associated JIT type has the
correct ModuleInterfaceType, and the CppModule gets the correct module list.

Given that we still keep the module interface type in the type system,
the graph is not inlined when we call Module::attr; instead it uses
prim::CallMethod to call the method. This allows us to swap the submodule
for any ScriptModule that meets the same module interface type, and
module swapping is only allowed through the module interface approach.
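
A hedged sketch of the usage, assuming the `torch.jit.interface` decorator spelling and illustrative module names; the swap mechanics shown here are an interpretation of the PR description, not copied from it:

```python
import torch
import torch.nn as nn

@torch.jit.interface
class ModuleInterface(nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        pass

class ImplA(nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + 1

class ImplB(nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * 2

class Container(nn.Module):
    proxy: ModuleInterface  # submodule declared with a module interface type

    def __init__(self):
        super().__init__()
        self.proxy = ImplA()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Not inlined: compiled as a prim::CallMethod on the interface type.
        return self.proxy.forward(x)

scripted = torch.jit.script(Container())
# Swap the submodule for another ScriptModule that meets the same interface.
scripted.proxy = torch.jit.script(ImplB())
print(scripted(torch.ones(2)))
```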

Test Plan: Imported from OSS

Reviewed By: driazati

Differential Revision: D18284309

fbshipit-source-id: 2cb843e4b75fa3fcd8c6020832a81014dbff4f03
2019-11-05 11:31:40 -08:00
e95dc9814e introduce module interface declaration (#28408)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/28408

This enables an interface to be defined on an nn.Module, and InterfaceType
now has an is_module_ field to distinguish a module interface from a
normal interface (similar to how ClassType distinguishes modules from
TorchScript classes).

A module interface can be assigned any ScriptModule that has compatible
signatures on its schemas. A normal object that is not a ScriptModule
cannot be assigned to a module interface and will error out when the
user explicitly tries to do so. Assigning a ScriptModule to a class
interface makes it available only in the attribute list, not the module
list. More details on the subtyping relationship are documented in
jit_type.h.

If you declare a module interface inside an nn.Module that is being
compiled to a ScriptModule, our internal compilation behaves as follows:

1. ConcreteModuleType records it as a module attribute and adds it to
   the attributes_ list.
2. The JIT type created from the ConcreteModuleType records it as an
   attribute and pre-generates the slot. The slot is still marked as
   EntityType::MODULE to make sure the JIT type records it as a Module
   slot.
3. cpp_module also registers it as a Module, since the slot type is the
   source of truth.

Since the JIT type records it as an attribute and stores its type, it
behaves the same way a class interface attribute behaves today. This means
the submodule assigned to this module interface is not inlined into the
graph the way a normal `Module::attr` access is; instead we generate an
interface prim::CallMethod, which lets us later swap this submodule with
another ScriptModule that implicitly implements the module interface.

Test Plan: Imported from OSS

Differential Revision: D18284311

fbshipit-source-id: e0b8f6e8c34b2087fab337a969e5ea3fb37ec209
2019-11-02 16:39:00 -07:00
f782500ee0 Abstract tracer::enter and tracer::exit into a function
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/28473

Test Plan: Imported from OSS

Differential Revision: D18121007

Pulled By: jamesr66a

fbshipit-source-id: 4c4a4344ad9bcc4630b945d2a645a0b05928933c
2019-10-26 18:41:14 -07:00
2181dd516e fix handling of function attributes. (#28569)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/28569

Previously, the inclusion of function attributes would "poison" a
ConcreteModuleType, because we did not have a way of checking whether
they are actually the same function. This PR uses the Python function
object to perform that check. This improves our ability to reuse JIT
types between modules.

Also this PR fixes a bug where we weren't properly adding modules as
attributes when converting from ConcreteType -> JIT type (we were adding
them after the fact--another reason to switch from using `register_x` to
`set_x` during module construction, which is on my to-do list after
this).

Fixes https://github.com/pytorch/pytorch/issues/28559
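
A hedged sketch of the kind of module this affects (names are illustrative); the type reuse itself is internal, so only the setup is shown:

```python
import torch
import torch.nn as nn

def scale(x: torch.Tensor) -> torch.Tensor:
    return x * 2

class M(nn.Module):
    def __init__(self):
        super().__init__()
        self.fn = scale  # a plain Python function stored as an attribute

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fn(x)

# With this fix, two instances holding the same function object no longer
# "poison" the ConcreteModuleType, so their JIT type can be shared.
a = torch.jit.script(M())
b = torch.jit.script(M())
print(a(torch.ones(2)), b(torch.ones(2)))
```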

Test Plan: Imported from OSS

Differential Revision: D18111331

Pulled By: suo

fbshipit-source-id: ec2cccf832d3ddd4cd4d28fe19cb265f1275325a
2019-10-24 22:23:37 -07:00
6d689e27c7 clean up NamedTuple creation API (#28189)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/28189

This makes NamedTuple creation a separate createNamed function. The existing API resulted
in poor usage in fbcode, which in turn caused bugs in TorchScript programs.

Test Plan: Imported from OSS

Differential Revision: D17970220

Pulled By: zdevito

fbshipit-source-id: 59b082a726f56bec1c8d10d410db829f4aa271ea
2019-10-22 10:18:07 -07:00
0aa694ebe5 Move Method::lowered_graph to a separate pass out of the Method class. (#28242)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/28242

There is no reason to have it in a general API of Module/Method - it's
just another graph pass. It was there because some time ago modules were
not first class and all graphs were lowered. After that changed, this
API was added for easier transition, but now we don't need it anymore.

Test Plan: Imported from OSS

Differential Revision: D17986724

Pulled By: ZolotukhinM

fbshipit-source-id: 279a1ec450cd8fac8164ee581515b09f1d755630
2019-10-18 12:48:40 -07:00
58ed8ca9e1 clean up exported source format (#28129)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/28129

The previous PR in the stack removed the need to order classes/functions
or have correct import statements. This resolved circular dependency issues
that can arise when class constructors like ModuleList put new instances
of themselves in a common namespace.

This PR changes our export format to no longer produce this information.
By doing so we can make the logic significantly simpler, since we just
keep track of an individual PythonPrint object per file.

Notes:
* PythonPrint was changed to manage its own stream/list of ranges. It
was doing this anyway internally, this just makes the API more clear.
* Since we are changing the serialization format, I also removed op_version_set.
It is now replaced with the VERSION number that is written in the zip archive.
This further simplifies the code emission process.
* A test of op_version_set was removed since there is no longer any behavior
to test.

Test Plan: Imported from OSS

Differential Revision: D17961610

Pulled By: zdevito

fbshipit-source-id: ada362c4ca34d05393a1a7e799c94785ab9d9825
2019-10-16 22:47:24 -07:00
2265cddbd2 Cleanup torch::jit::script::Module API for accessing attributes/parameters/submodules. (#27260)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/27260

This PR has the following changes:
- Slot class is removed. In all use cases except `lower_graph` we really
just needed the attribute name and thus having an extra layer of
abstraction through Slot only made the code harder to understand.
- get_parameters, get_attributes, get_modules, and get_slots now return
a list of <name, item> pairs instead of a list of Slots.

Differential Revision: D17728910

Test Plan: Imported from OSS

Pulled By: ZolotukhinM

fbshipit-source-id: 94781611752dd88e7fddfe8b8e0252d6ec32ba68
2019-10-16 21:32:08 -07:00
fb4517132f Allow 'Any' to appear as a type argument. (#26572)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/26572

Combined with isinstance specialization, this allows a degree of polymorphism
in functions without needing to use our weirder overload hacks.

We do not define any operators on Any, so the only thing you can do with it
is to put it in containers or type refine it using an isinstance check.
Any is restricted from appearing in non-argument position because we
cannot restore type tags if it ends up as a field in a class.
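
A hedged sketch of the feature (the function name is illustrative): `Any` appears only in argument position and is refined with an isinstance check before use.

```python
import torch
from typing import Any

@torch.jit.script
def first_or_zero(x: Any) -> int:
    # No operators are defined on Any; isinstance refines it to a usable type.
    if isinstance(x, int):
        return x
    return 0

print(first_or_zero(5))     # 5
print(first_or_zero("hi"))  # 0
```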

Test Plan: Imported from OSS

Differential Revision: D17530643

Pulled By: zdevito

fbshipit-source-id: f06f78ce84819f7773953a492f3d4c49219ee94c
2019-10-16 11:07:08 -07:00
3de34744b3 Make PythonPrint a class (#26787)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/26787

A follow-up PR will remove the need to issue import statements
or write classes in order, since they are no longer needed.
This change allows the same PythonPrint class
to be used for an entire file, which will be needed in that patch.

Test Plan: Imported from OSS

Differential Revision: D17566440

Pulled By: zdevito

fbshipit-source-id: 1ee896da0cdfe6a003298e1d4b0238403b9ed6dd
2019-10-15 16:00:34 -07:00
341262754f module dedupe (#26666)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/26666

Changes:
- Introduce a `ConcreteModuleType` concept. This acts both as the key into the type
  cache, and as the source of truth for `ModuleValue::attr` queries. It needs
  to do both jobs because that's how we ensure correctness (if the types are
  different, it's because `ModuleValue::attr` would return different things).
- Now `recursive_script` will first construct a `ConcreteModuleType` and search for a
  pre-existing type before starting compilation.
- All previous paths to creating a `ScriptModule` (including inheriting from
  `ScriptModule`) are now rewritten to go through `create_script_module`, so
  that we have only a single place where construction happens.

Behavioral changes:
- Big change to `torch.jit.ScriptModule` inheritance: all attributes are now
  recursively scripted if possible, matching recursive scripting semantics.
  This makes it hard to keep something from being scripted (for example, a
  Python submodule). Possibly we'll need an `ignore()` type thing for
  attributes. In particular, this adds `self.training` to *every* ScriptModule, since
  it's present on every `nn.Module`.
- I believe this change to be transparent to existing users of the inheritance API, since if you had an attribute that is unscriptable that you never used, there is no error. In some cases, we will create new attributes (even if they are unused), which will increase serialized model size from before.
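
A hedged sketch of the inheritance-API behavior described above (attribute names are illustrative):

```python
import torch

class M(torch.jit.ScriptModule):
    def __init__(self):
        super().__init__()
        # Plain attributes on a ScriptModule subclass are now recursively
        # scripted if possible, matching recursive scripting semantics.
        self.scale = 2.0
        self.sizes = [1, 2, 3]

    @torch.jit.script_method
    def forward(self, x):
        return x * self.scale

m = M()
print(m.scale, m.sizes, m.training)  # training is added to every ScriptModule
```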

Test Plan: Imported from OSS

Differential Revision: D17551196

Pulled By: suo

fbshipit-source-id: b476d1c9feb3ddfd63406d90989aaf9dfe890591
2019-10-12 09:51:57 -07:00
c27853fbba Expose torch::jit::script::Module::dump_to_str to python as module._c.dump_to_str.
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/27556

Test Plan: Imported from OSS

Differential Revision: D17814331

Pulled By: ZolotukhinM

fbshipit-source-id: a25fc853897d37c6a703373838b522c64ad3aa78
2019-10-08 16:32:23 -07:00
6cf189512c Remove underscore from pybind of module._c.dump (#27555)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/27555

It is already under '_c' anyway.

Test Plan: Imported from OSS

Differential Revision: D17814333

Pulled By: ZolotukhinM

fbshipit-source-id: ca21649d553f6601be12828958a8077867d0e30e
2019-10-08 16:32:19 -07:00
0046092178 Reduce special casing around 'training' (#27109)
Summary:
Most of this was old cruft left over from special handling of `training` before we had a `bool` type. This makes all modules have a `training` attribute that is true by default and removes all other special handling.

Fixes #26884
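
A hedged sketch of the resulting behavior (the dropout body is illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class M(nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # `training` is now an ordinary bool attribute, True by default.
        return F.dropout(x, p=0.5, training=self.training)

m = torch.jit.script(M())
print(m.training)  # True
m.eval()
print(m.training)  # False
```
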
Pull Request resolved: https://github.com/pytorch/pytorch/pull/27109

Pulled By: driazati

Differential Revision: D17728129

fbshipit-source-id: 8ddc9fbb07a953dd05529538bfdd01ed88b5cb57
2019-10-07 13:52:59 -07:00
17b1faa2bf Rename jit Function to ScriptFunction
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/27219

Test Plan: Imported from OSS

Differential Revision: D17715306

Pulled By: albanD

fbshipit-source-id: d11a7634dbee6a885c7177b240958e5aed2544f3
2019-10-03 08:28:32 -07:00
0e3389dced Fix circular deps in loading (#26758)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/26758

This PR changes the order in which we import classes and functions so
that it is no longer necessary for them to be defined in order in a file,
or for there to be proper import statements in the exported file.

Actually importing a function/class now is driven by the need to resolve
the entity during unpickling, type resolution, or value resolution.

While this should allow significant simplification to the code that
serializes classes, this work has not been done yet in order to avoid
inevitable forward compat issues in the transition period.

Notes:
* Individual functions have been replaced with a SourceImporter object
  that exposes a resolveType method. This method loads the type if
  it has not been loaded yet, potentially parsing (but not loading)
  the file it exists in if that file hasn't been parsed yet.
* Some legacy functionality needed to be added as a method to this object
  since the old format still used some of this logic for class resolution.

Test Plan: Imported from OSS

Differential Revision: D17558989

Pulled By: zdevito

fbshipit-source-id: 7eae3470bcbd388c4de463e3462d527776ed46c6
2019-09-26 11:39:16 -07:00
4c40dbcb75 Resolve NamedTuple types in Python (#26443)
Summary:
When used as annotations on Python functions, `NamedTuple`s go through our Python annotation -> type mapping, which previously had no way of looking up `NamedTuple`s (they are created lazily by checking if the type has certain properties, so the lookup creates the `TupleType` from scratch). This PR threads through the necessary data to make them work.

Fixes #26437
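
A hedged sketch of the now-working pattern (the NamedTuple itself is illustrative):

```python
import torch
from typing import NamedTuple

class Point(NamedTuple):
    x: torch.Tensor
    y: torch.Tensor

# The NamedTuple annotation is resolved through the Python annotation -> type
# mapping and becomes a TorchScript TupleType.
@torch.jit.script
def total(p: Point) -> torch.Tensor:
    return p.x + p.y

print(total(Point(torch.ones(2), torch.ones(2))))
```
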
Pull Request resolved: https://github.com/pytorch/pytorch/pull/26443

Pulled By: driazati

Differential Revision: D17486441

fbshipit-source-id: a6bbb543ff05a5abe61f1a7f68db9ecdb652b358
2019-09-20 10:53:25 -07:00
12762cd586 Use static type information to restore type tags (#25447)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/25447

When we unpickle IValues, we lose type information for List[T]
and Dict[K, V]. We can restore this information using the static
type information contained in the top-level Module/Class type.

This ensures that even after serialization we can always get the
dynamic type of an ivalue using its type() method.
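
A hedged sketch of the observable effect after save/load (attribute and file names are illustrative):

```python
import torch
import torch.nn as nn
from typing import Dict

class M(nn.Module):
    table: Dict[str, torch.Tensor]

    def __init__(self):
        super().__init__()
        self.table = {"w": torch.ones(2)}

    def forward(self, key: str) -> torch.Tensor:
        return self.table[key]

m = torch.jit.script(M())
m.save("m.pt")
loaded = torch.jit.load("m.pt")
# The unpickled dict attribute gets its Dict[str, Tensor] tag restored from
# the module's static type instead of staying untagged.
print(loaded("w"))
```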

Test Plan: Imported from OSS

Differential Revision: D17127872

Pulled By: zdevito

fbshipit-source-id: 1ffb5e37a7c35c71ac9d3fb7b2edbc7ce3fbec72
2019-09-18 16:07:01 -07:00
c749be9e9f Make arguments of Module::dump easier to remember. (#25740)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/25740

Previously we had `omit_method_bodies`, `omit_attr_values` and
`omit_param_values`. They were called the same in the python bindings
and it was hard to remember their proper spelling. This PR changes them
to `code`, `attrs`, and `params`, which are much easier to remember. It
also flips their meaning - now they enable printing instead of disabling
it. I also changed the default values to 'print all' from 'print
nothing', as that's the most usual way of using it.
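
A hedged sketch from the Python side, assuming the binding is exposed as `dump` on `_c` (as in the later commit above) and takes the three booleans positionally in the order code, attrs, params:

```python
import torch
import torch.nn as nn

class M(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(2, 2)

    def forward(self, x):
        return self.linear(x)

m = torch.jit.script(M())
# Assumed flag order per the commit message: code, attrs, params.
# Print only method bodies, omitting attribute and parameter values.
m._c.dump(True, False, False)
```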

Test Plan: Imported from OSS

Differential Revision: D17217517

Pulled By: ZolotukhinM

fbshipit-source-id: fa56e478a732ffd685d885f11c9da0457cd03d16
2019-09-10 11:42:26 -07:00
fba107f18e add serialization of interface (#25227)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/25227

Adds cases to NamedType serialization so that interfaces are written,
with an implementation similar to NamedTuples.

Test Plan: Imported from OSS

Differential Revision: D17066674

Pulled By: zdevito

fbshipit-source-id: fda5419260fad29e8c4ddb92de1d3447d621d982
2019-08-27 22:54:46 -07:00
61818b8986 Add interface declarations to JIT (#25258)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/25258

this is the first commit in a series to add interfaces to JIT.
Interfaces allow an abstract interface to be specified through a blank
Python class and used in type annotations for Script functions.
If a TorchScript class implements all the methods in the interface with
the appropriate types, then it is implicitly considered to implement
that interface.

Follow-ups required:
* implementation of serialization
* implementation in the parser frontend
* better error reporting for explaining why a class does not meet an
  interface specification.
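
A hedged sketch of the intended usage; per the follow-ups above, parser-frontend support was still to come, so the decorator-based spelling here is an assumption about the eventual Python API:

```python
import torch

@torch.jit.interface
class Incrementer:
    def incr(self, x: int) -> int:
        pass

@torch.jit.script
class AddOne:
    def __init__(self):
        self.step = 1

    def incr(self, x: int) -> int:
        return x + self.step

@torch.jit.script
def run(obj: Incrementer) -> int:
    # Any TorchScript class implementing the interface's methods with the
    # appropriate types is implicitly accepted here.
    return obj.incr(41)

print(run(AddOne()))  # 42
```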

Test Plan: Imported from OSS

Differential Revision: D17079963

Pulled By: zdevito

fbshipit-source-id: a9986eeba2d4fdedd0064ce7d459c0251480a5a0
2019-08-27 22:54:37 -07:00
9340b155bc Revert D15901930: Add interface declarations to JIT
Test Plan: revert-hammer

Differential Revision:
D15901930

Original commit changeset: 22c82d12c9c2

fbshipit-source-id: 4009a3ce7af245d7e0f4924824ece59cdc774180
2019-08-27 06:41:32 -07:00
4b22cf6bd5 Add interface declarations to JIT (#21972)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/21972
ghimport-source-id: 280f89ca678615f915be2139d1c05cb6bc39eefc

Test Plan: Imported from OSS

Differential Revision: D15901930

Pulled By: zdevito

fbshipit-source-id: 22c82d12c9c2600e569d7083e2771fd6ec3de2b1
2019-08-26 16:57:59 -07:00
8e3c0210a5 extend torch.jit._overload to module methods (#24259)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/24259

Follow-up to https://github.com/pytorch/pytorch/pull/23886: this adds the same overload API specified in PEP 484 to module methods, to reduce the friction of adding method overloads that was brought up in #23266.

The usage is:
```
@torch.jit._overload_method
def add(self, y: int) -> int: ...
@torch.jit._overload_method
def add(self, y: float) -> float: ...

def add(self, y):
    ...
```

Test Plan: Imported from OSS

Differential Revision: D16921304

Pulled By: eellison

fbshipit-source-id: 784e2f26f7ca9a330a434a603c86b53725c3dc71
2019-08-20 16:47:35 -07:00
bdc57d3833 Merge ProfiledTensorType and TensorType (#24284)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/24284

This PR finishes the unification of all Tensor types into a single object.
ProfiledTensorType is renamed to TensorType and the old TensorType is
deleted.

Notes:
* Fixes bug in merge for VaryingShape by changing its representation to an
 optional list of optional ints.
* Removes ProfiledTensorType::create(type) invocations that can now
  simply be expect calls on tensor type.

Test Plan: Imported from OSS

Differential Revision: D16794034

Pulled By: zdevito

fbshipit-source-id: 10362398d0bb166d0d385d74801e95d9b87d9dfc
2019-08-20 13:01:28 -07:00
cf57f73c11 Module: add dump function that recursively prints contents of the module. (#24356)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/24356

Test Plan: Imported from OSS

Differential Revision: D16864133

Pulled By: ZolotukhinM

fbshipit-source-id: 1af757334bc8e156427783bc37500de3c934378b
2019-08-16 15:13:02 -07:00
0cbd7fa46f remove CompleteTensorType
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/24169

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D16765329

Pulled By: zdevito

fbshipit-source-id: 88560cefba635c3d586a3e4dee67f9b1d901a642
2019-08-15 13:31:34 -07:00
8a7e57c416 clean up import_source (#24282)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/24282

This moves a test from Python to cpp, and in doing so lets us clean up a
bunch of otherwise unused code.

Test Plan: Imported from OSS

Differential Revision: D16800562

Pulled By: suo

fbshipit-source-id: ebc29bb81f4fb2538081fa309ead1739980f1093
2019-08-14 11:26:26 -07:00
c158848abe class_table_ to deps_table_ (#24281)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/24281

These are not just classes anymore, rename

Test Plan: Imported from OSS

Differential Revision: D16800564

Pulled By: suo

fbshipit-source-id: 8b8d508944c26a8916fc7642df43f22583dfcf82
2019-08-14 11:26:22 -07:00
f36c3e9e4a Revert D16684391: [jit] class_table_ to deps_table_
Differential Revision:
D16684391

Original commit changeset: af0024c0b7fb

fbshipit-source-id: c9b98ac60b460963dc50f4837100909ff8f6c3ea
2019-08-13 13:27:03 -07:00
94aae71ba9 Revert D16684390: [jit] clean up import_source
Differential Revision:
D16684390

Original commit changeset: fca81ca14d1a

fbshipit-source-id: eb229097560ab1ead43756175e552764c8a14703
2019-08-13 13:26:59 -07:00
c2549cb8d3 Remove DimensionedTensorType (#24077)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/24077

This replaces all uses of DimensionedTensorType with ProfiledTensorType.
For places where we propagate shape information, we still follow the
dimension-only propagation rules, meaning that even if full size information
is known on inputs the outputs will only have dimension information.

This fixes several bugs in existing implementations that this change uncovered:
* requires_grad was not propagated correctly across loops
* requires_grad on ProfiledTensorType returned false when requires_grad information
  is unknown but the conservative result is true
* some equality code on ProfiledTensorType contained bugs.

Test Plan: Imported from OSS

Reviewed By: suo

Differential Revision: D16729581

Pulled By: zdevito

fbshipit-source-id: bd9f823c1c6b1d06a236a1b5b2b2fcdf0245edce
2019-08-13 10:05:47 -07:00
bb4f4e4d03 clean up import_source (#23846)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/23846

This moves a test from Python to cpp, and in doing so lets us clean up a
bunch of otherwise unused code.

Test Plan: Imported from OSS

Differential Revision: D16684390

Pulled By: suo

fbshipit-source-id: fca81ca14d1ac9e4d6b47ae5eecaa42b38d69147
2019-08-12 20:30:06 -07:00
2dbd36b384 class_table_ to deps_table_ (#23845)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/23845

These are not just classes anymore, rename

Test Plan: Imported from OSS

Differential Revision: D16684391

Pulled By: suo

fbshipit-source-id: af0024c0b7fbcca68785ec3fc6dc288ec46a1b84
2019-08-12 20:30:01 -07:00
77c08aa46c serialize modules as classes
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/23098

Test Plan: Imported from OSS

Differential Revision: D16383328

Pulled By: suo

fbshipit-source-id: 36389b8e45c3febb7f224cd9c630fe643fa90bef
2019-08-11 15:50:29 -07:00
d3f6d5885d Replace Module::copy_into with Module::clone. (#24068)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/24068

The new method has a simpler interface (no arguments).

Resolves #23915.

Differential Revision: D16736379

Test Plan: Imported from OSS

Pulled By: ZolotukhinM

fbshipit-source-id: 1c1f397ce9cdaa5467fd7da3025cf44d1436ae6b
2019-08-09 18:25:38 -07:00
61d0624803 [jit] make sure NamedTuples have unique qualified names
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/23798

Test Plan: Imported from OSS

Differential Revision: D16652818

Pulled By: suo

fbshipit-source-id: c824f26427105ed5f0c553a67ab61c69a1f89655
2019-08-09 00:52:02 -07:00
7d207363bf Fix master - (#24003)
Summary:
I accidentally removed this in a merge, breaking a test. Fix for master
Pull Request resolved: https://github.com/pytorch/pytorch/pull/24003

Differential Revision: D16707108

Pulled By: eellison

fbshipit-source-id: 8b59f46e7932b88a7ae246a261c4daf17f23995f
2019-08-08 00:00:53 -07:00
451fc51d8d add support for overloading functions (#23886)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/23886

This is a series of PRs that will allow us to support adding [padding to conv](https://github.com/pytorch/pytorch/pull/22484) and also reduce the friction of adding method overloads that was brought up in  https://github.com/pytorch/pytorch/pull/23266.

Support for overloaded functions following the specification in [PEP 484](https://www.python.org/dev/peps/pep-0484/#function-method-overloading).

The usage is:
```
@torch.jit._overload
def add(x: int, y: int) -> int: ...
@torch.jit._overload
def add(x: float, y: float) -> float: ...

def add(x, y):
    return x + y
```

Follow up PRs:

- Add same API for methods
- A couple of cleanups for functions:
     - don't require default params specified on the overload as well
     - potentially error if invocation could be matched to multiple overloads. now it just chooses the first one, mypy does the same thing currently

Test Plan: Imported from OSS

Differential Revision: D16694863

Pulled By: eellison

fbshipit-source-id: f94f2933bc1c97fa58f31846acfe962b0630068c
2019-08-07 19:18:19 -07:00