Add CELU activation to pytorch (#8551)

Summary:
Also fuse input scale multiplication into ELU

Paper:
https://arxiv.org/pdf/1704.07483.pdf
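For context, the paper defines CELU(x, α) = max(0, x) + min(0, α·(exp(x/α) − 1)). Below is a minimal sketch checking that definition against the functional form this PR adds (assuming the final torch.nn.functional.celu signature, with alpha defaulting to 1.0):

```python
import torch
import torch.nn.functional as F

x = torch.randn(5)
alpha = 1.0

# Reference definition from the paper:
# CELU(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
reference = torch.where(x > 0, x, alpha * (torch.exp(x / alpha) - 1))

# Functional form added by this PR (assumed signature: celu(input, alpha=1.0))
out = F.celu(x, alpha=alpha)

print(torch.allclose(out, reference))  # expected: True
```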
Pull Request resolved: https://github.com/pytorch/pytorch/pull/8551

Differential Revision: D9088477

Pulled By: SsnL

fbshipit-source-id: 877771bee251b27154058f2b67d747c9812c696b
Author: Xiang Gao
Date: 2018-08-01 07:46:03 -07:00
Committed by: Facebook Github Bot
Parent: 6f6a1f2d63
Commit: 6fc75eadf0
21 changed files with 229 additions and 34 deletions


@@ -36,6 +36,7 @@ functions = [
 'ReLU6',
 'RReLU',
 'SELU',
+'CELU',
 'Sigmoid',
 'Softplus',
 'Softshrink',
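The hunk above adds CELU to an existing list of activation functions. For completeness, a short usage sketch of the module form (assuming the nn.CELU interface this PR introduces, alpha defaulting to 1.0):

```python
import torch
import torch.nn as nn

# Module form of the new activation; alpha is assumed to default to 1.0
m = nn.CELU(alpha=0.5)
y = m(torch.randn(2, 3))
print(y.shape)  # torch.Size([2, 3])
```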