Default activation function in Keras
https://github.com/keras-team/keras/blob/master/keras/layers/recurrent.py#L2081
It mentions tanh here for version 2.3.0 :-)
Keras Recurrent is an abstract class for recurrent layers. In Keras 2.0 all default activations are linear for all implemented RNNs (LSTM, GRU and SimpleRNN). In previous versions you had: linear for SimpleRNN, tanh for LSTM and GRU.
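Since the default has changed between releases, it may be easiest to check it directly in your own environment. A minimal sketch, assuming a standalone Keras 2.x install (the layer classes, keras.__version__ and get_config() are standard Keras API; the units=4 argument and the loop are just for illustration, and the printed values depend on your installed version):

```python
import keras
from keras.layers import SimpleRNN, LSTM, GRU

# Print the installed Keras version and the default activation
# that each recurrent layer reports in its config.
print("Keras version:", keras.__version__)
for layer_cls in (SimpleRNN, LSTM, GRU):
    layer = layer_cls(units=4)  # rely on the layer's default arguments
    print(layer_cls.__name__, "->", layer.get_config()["activation"])
```

If you want a particular behaviour regardless of version, pass the activation explicitly, e.g. LSTM(32, activation='tanh').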