Keras - using activation function with a parameter
relu is a function, not a class, and it takes the input to the activation as its parameter x. The Activation layer accepts any callable as its argument, so you can wrap relu in a lambda that fixes the extra parameter, for example:
from keras.activations import relu
from keras.layers import Activation

model.add(Activation(lambda x: relu(x, alpha=0.1)))
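
To show this in context, here is a minimal runnable sketch. The model architecture (layer sizes, input shape, the final Dense layer) is made up purely for illustration; only the Activation line reflects the technique above. Depending on your install, the imports may need the tensorflow.keras prefix instead of keras.

from keras.activations import relu
from keras.layers import Activation, Dense
from keras.models import Sequential

# Hypothetical model: the layer sizes and input shape are arbitrary examples.
model = Sequential()
model.add(Dense(32, input_shape=(10,)))
# Leaky ReLU: relu with a small slope (alpha=0.1) for negative inputs.
model.add(Activation(lambda x: relu(x, alpha=0.1)))
model.add(Dense(1))
model.summary()

One caveat with this approach: a lambda is anonymous, so a model saved with it may not reload cleanly without passing the function via custom_objects; if you need serialization, defining a named function (e.g. def leaky_relu(x): return relu(x, alpha=0.1)) and passing that instead tends to be more robust.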