Using leaky ReLU in TensorFlow
A leaky ReLU function has been included since release 1.4.0-rc1 as `tf.nn.leaky_relu`.
Documentation page: https://www.tensorflow.org/versions/master/api_docs/python/tf/nn/leaky_relu .
You could also write one yourself based on `tf.nn.relu` (note: it lives in `tf.nn`, not at the top level), something like:

```python
import tensorflow as tf

def lrelu(x, alpha):
    # tf.nn.relu(x) keeps the positive part of x;
    # alpha * tf.nn.relu(-x) reconstructs a scaled copy of the negative part,
    # which is subtracted back in, giving alpha * x for x < 0.
    return tf.nn.relu(x) - alpha * tf.nn.relu(-x)
```
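To see why that identity gives a leaky ReLU, here is the same formula as a plain-Python sketch on scalars (no TensorFlow required; `relu` and `lrelu` here are illustrative helpers, not library functions):

```python
def relu(x):
    # ReLU on a scalar: max(0, x)
    return max(0.0, x)

def lrelu(x, alpha):
    # For x > 0: relu(x) = x and relu(-x) = 0, so the result is x.
    # For x < 0: relu(x) = 0 and relu(-x) = -x, so the result is alpha * x.
    return relu(x) - alpha * relu(-x)

print(lrelu(2.0, 0.5))   # positive input passes through unchanged
print(lrelu(-2.0, 0.5))  # negative input is scaled by alpha
```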
**EDIT:** TensorFlow 1.4 now has a native `tf.nn.leaky_relu`.
If `alpha < 1` (as it should be), you can also use `tf.maximum(x, alpha * x)`.
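The `maximum` form works because with `alpha < 1`, `alpha * x <= x` whenever `x >= 0` (so the max picks `x`), and `alpha * x > x` whenever `x < 0` (so the max picks `alpha * x`). A plain-Python sketch of the same element-wise logic (the `lrelu_max` name is illustrative):

```python
def lrelu_max(x, alpha):
    # Scalar analogue of tf.maximum(x, alpha * x).
    # Correct as a leaky ReLU only when alpha < 1.
    return max(x, alpha * x)

print(lrelu_max(3.0, 0.5))   # x >= 0: max picks x itself
print(lrelu_max(-3.0, 0.5))  # x < 0: max picks the scaled value alpha * x
```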