How can I use "leaky_relu" as an activation in TensorFlow's "tf.layers.dense"?
At least as of TensorFlow version 2.3.0.dev20200515, a LeakyReLU activation with an arbitrary alpha parameter can be passed as the activation parameter of a Dense layer:
output = tf.keras.layers.Dense(n_units, activation=tf.keras.layers.LeakyReLU(alpha=0.01))(x)
LeakyReLU computes f(x) = x for x >= 0 and f(x) = alpha * x for x < 0, i.e. instead of clamping negative inputs to zero it keeps them linear with slope alpha.
More information: Wikipedia - Rectifier (neural networks)
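As a quick sanity check (a small sketch of my own, not from the answer above), applying the layer to a few values shows the formula in action: negative inputs are scaled by alpha, non-negative inputs pass through unchanged:

import tensorflow as tf

act = tf.keras.layers.LeakyReLU(alpha=0.01)
x = tf.constant([-2.0, -0.5, 0.0, 1.0, 3.0])
print(act(x).numpy())  # [-0.02  -0.005  0.  1.  3.]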
I wanted to do something similar in TensorFlow 2.0, and I used a lambda, as in:
output = tf.layers.dense(input, n_units, activation=lambda x: tf.nn.leaky_relu(x, alpha=0.01))
It can be a good way to fit it all in one line.
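A caveat worth adding (my note, not part of the original answer): current TF 2.x releases no longer expose tf.layers.dense at the top level, so the same lambda trick would go through the Keras layer instead. A sketch, with a made-up unit count:

import tensorflow as tf

# Same one-liner via the Keras API; 64 units chosen arbitrarily.
dense = tf.keras.layers.Dense(64, activation=lambda x: tf.nn.leaky_relu(x, alpha=0.01))
print(dense(tf.random.normal([8, 16])).shape)  # (8, 64)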
If you're really adamant about a one-liner for this, you could use partial() from the functools module, as follows:
import tensorflow as tf
from functools import partial
output = tf.layers.dense(input, n_units, activation=partial(tf.nn.leaky_relu, alpha=0.01))
It should be noted that partial() does not work for all operations, and you might have to try your luck with partialmethod() from the same module.
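One concrete way this can bite (an aside of mine, not from the original answer): a partial object has no __name__ attribute, and some Keras utilities, such as the ones that record or serialize an activation by name, expect it to have one. A plain forward pass, however, works fine:

import tensorflow as tf
from functools import partial

leaky = partial(tf.nn.leaky_relu, alpha=0.01)
layer = tf.keras.layers.Dense(64, activation=leaky)  # unit count made up
print(layer(tf.zeros([1, 8])).shape)  # (1, 64) -- calling the layer is fine
# leaky.__name__                      # AttributeError: partial has no __name__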
Hope this helps you in your endeavour.
You are trying to do partial evaluation, and the easiest way to do this is to define a new function and use it:
import tensorflow as tf

def my_leaky_relu(x):
    return tf.nn.leaky_relu(x, alpha=0.01)
and then you can run
output = tf.layers.dense(input, n_units, activation=my_leaky_relu)
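A side benefit of the named function over the lambda and partial variants (my observation, not from the original answer) is that it has a real __name__, which keeps Keras happy when it records or serializes the activation. Continuing from the definition above, it drops straight into the Keras API as well; the layer size here is made up:

out = tf.keras.layers.Dense(32, activation=my_leaky_relu)(tf.zeros([2, 16]))
print(out.shape)  # (2, 32)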