mirror of
https://github.com/pytorch/pytorch.git
synced 2025-10-20 21:14:14 +08:00
Use "length of the RNN input" instead of "length of the RNN"
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/20873

Differential Revision: D15495570

Pulled By: ezyang

fbshipit-source-id: e3b4cd67ccf97d0053ac053c3bcb74415b928c0a
committed by Facebook Github Bot
parent 3e4f213e82
commit a5c90aaf47
@@ -64,7 +64,7 @@ earlier, you should ``del intermediate`` when you are done with it.
 **Don't run RNNs on sequences that are too large.**
 The amount of memory required to backpropagate through an RNN scales
-linearly with the length of the RNN; thus, you will run out of memory
+linearly with the length of the RNN input; thus, you will run out of memory
 if you try to feed an RNN a sequence that is too long.
 
 The technical term for this phenomenon is `backpropagation through time
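The note being edited above concerns memory growth during backpropagation through time: autograd must retain every intermediate hidden state of the unrolled RNN, so memory scales with the input sequence length. A minimal sketch of the issue and the common workaround, truncated BPTT via `detach()` (layer sizes and the chunk length `k` are illustrative, not from the commit):

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=8, hidden_size=16)

def run(seq_len):
    # Full BPTT: the autograd graph spans all seq_len steps, so the
    # memory needed for backward() grows linearly with seq_len.
    x = torch.randn(seq_len, 1, 8)   # (seq_len, batch, input_size)
    out, h = rnn(x)
    out.sum().backward()             # backpropagation through time
    return tuple(out.shape)          # (seq_len, batch, hidden_size)

def truncated_run(seq_len, k=10):
    # Truncated BPTT: detach the hidden state every k steps so each
    # backward() only spans a k-step graph; peak memory stays bounded
    # regardless of the total sequence length.
    h = torch.zeros(1, 1, 16)        # (num_layers, batch, hidden_size)
    for _ in range(0, seq_len, k):
        x = torch.randn(k, 1, 8)
        out, h = rnn(x, h)
        out.sum().backward()
        h = h.detach()               # cut the graph here
    return tuple(h.shape)
```

Truncated BPTT trades exact gradients for bounded memory: gradients no longer flow across chunk boundaries, which is why simply shortening the effective sequence is the usual remedy for the out-of-memory behavior the note describes.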