Boxuan Yue, Junwei Fu and Jun Liang
Recurrent neural networks (RNNs) are effective for modeling sequences in generation and classification tasks, but their training is hindered by the vanishing- and exploding-gradient problems. In this paper, we reformulate the RNN unit to learn the residual function...
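The core idea the abstract describes, learning a residual function on top of the previous hidden state so that gradients can flow through an identity path, can be sketched as follows. This is a minimal illustration under assumptions, not the paper's exact formulation: the `tanh` nonlinearity, the weight names, and the plain additive update are all placeholders.

```python
import numpy as np

def residual_rnn_step(h_prev, x, W_x, W_h, b):
    # Residual update: the unit learns only the residual f(h_prev, x);
    # the identity term h_prev gives gradients a direct path backward,
    # mitigating vanishing gradients over long sequences.
    return h_prev + np.tanh(x @ W_x + h_prev @ W_h + b)

rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 8  # illustrative sizes
W_x = rng.normal(scale=0.1, size=(input_dim, hidden_dim))
W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)
for t in range(5):  # unroll over a short dummy sequence
    x_t = rng.normal(size=input_dim)
    h = residual_rnn_step(h, x_t, W_x, W_h, b)
print(h.shape)  # → (8,)
```

Compare with a conventional RNN step, `h = tanh(x @ W_x + h_prev @ W_h + b)`, where every timestep multiplies the gradient by the Jacobian of the full transition and long products of such Jacobians shrink or blow up.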