Change activation modules in C++ from using Tensor& to Tensor (#28501)
Summary: Sequential does not accept modules whose forward takes Tensor& (const Tensor& and Tensor are both OK). Functional and other modules take Tensor by value when they may modify the input in place, so this changes ReLU and friends to do the same.

Unfortunately, this appears to be BC-breaking at the ABI level. On the other hand, use of the ReLU module seems rare enough outside Sequential (in particular, in C++ models the convention seems to be to call torch::relu instead). Is the BC break OK here? (yf225 or anyone else)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/28501

Differential Revision: D18089978

Pulled By: yf225

fbshipit-source-id: ac9aba6dc2081117dece57cd8a15bafe14ec8f51
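For illustration, here is a minimal sketch of the constraint the summary describes, assuming current libtorch conventions; ByValueImpl/ByValue are hypothetical names, not part of this patch:

```cpp
#include <torch/torch.h>

// Hypothetical module whose forward takes Tensor by value, matching the
// convention this commit adopts for ReLU and friends. Per the summary,
// Sequential's type-erased dispatch accepts forward(Tensor) and
// forward(const Tensor&), but not forward(Tensor&).
struct ByValueImpl : torch::nn::Module {
  torch::Tensor forward(torch::Tensor input) {  // OK inside Sequential
    return torch::relu(input);
  }
};
TORCH_MODULE(ByValue);

int main() {
  // Compiles and runs; had forward taken Tensor&, Sequential would
  // reject the module (the failure mode this commit works around).
  torch::nn::Sequential seq(torch::nn::Linear(3, 4), ByValue());
  auto out = seq->forward(torch::randn({2, 3}));
}
```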
Committed by: Facebook Github Bot
Parent: 1c53a74e26
Commit: 09ad464d68
@@ -13,6 +13,10 @@ using namespace torch::test;
 
 struct SequentialTest : torch::test::SeedingFixture {};
 
+TEST_F(SequentialTest, CanContainThings) {
+  Sequential sequential(Linear(3, 4), ReLU(), BatchNorm(3));
+}
+
 TEST_F(SequentialTest, ConstructsFromSharedPointer) {
   struct M : torch::nn::Module {
     explicit M(int value_) : value(value_) {}
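The summary's point that taking Tensor by value still permits in-place modification follows from Tensor being a reference-counted handle to shared storage. A minimal sketch (the function name is illustrative, not from the patch):

```cpp
#include <torch/torch.h>
#include <iostream>

// A by-value Tensor parameter is a cheap copy of the handle, not of the
// data, so an in-place op still mutates the storage the caller sees.
torch::Tensor relu_in_place(torch::Tensor input) {
  return input.relu_();  // mutates the storage shared with the caller
}

int main() {
  auto x = torch::tensor({-1.0, 2.0, -3.0});
  relu_in_place(x);
  std::cout << x << std::endl;  // prints 0, 2, 0: the caller's tensor changed
}
```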