PReLU Keras code example

Example 1: keras.layers.PReLU layer signature (default arguments)

keras.layers.PReLU(alpha_initializer='zeros', alpha_regularizer=None, alpha_constraint=None, shared_axes=None)
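
For context, keras.layers.PReLU learns a slope alpha that is applied to negative inputs, and shared_axes controls which axes share a single alpha. The short sketch below is illustrative only; the input shape and values are assumptions, not part of the original snippet.

import numpy as np
from tensorflow.keras.layers import PReLU

# shared_axes=[1, 2] shares one learnable alpha per channel across the
# spatial axes, the usual choice after a Conv2D layer.
prelu = PReLU(shared_axes=[1, 2])
x = np.random.randn(1, 8, 8, 16).astype("float32")  # (batch, height, width, channels)
y = prelu(x)                                         # negative values are scaled by alpha
print(y.shape)  # (1, 8, 8, 16)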

Example 2: PReLU in a Keras convolutional model

# Create the model (imports added; input_shape and no_classes are placeholders
# that must match your data, e.g. (28, 28, 1) and 10 for MNIST)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense, PReLU
from tensorflow.keras.initializers import Constant

model = Sequential()
# First convolution block: Conv2D followed by PReLU with alpha initialised to 0.25
model.add(Conv2D(32, kernel_size=(3, 3), input_shape=input_shape))
model.add(PReLU(alpha_initializer=Constant(value=0.25)))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
# Second convolution block
model.add(Conv2D(64, kernel_size=(3, 3)))
model.add(PReLU(alpha_initializer=Constant(value=0.25)))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
# Classifier head: flatten, dense + PReLU, then softmax output
model.add(Flatten())
model.add(Dense(256))
model.add(PReLU(alpha_initializer=Constant(value=0.25)))
model.add(Dense(no_classes, activation='softmax'))
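
The Constant(value=0.25) initializer matches the initial slope used for PReLU in He et al. (2015). As a follow-up, compiling and training the model might look like the minimal sketch below; the optimizer, loss, and training data are assumptions rather than part of the original snippet.

# Minimal compile/train sketch (optimizer, loss, and data are assumed)
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
# model.fit(x_train, y_train, batch_size=64, epochs=10, validation_split=0.2)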