How to use LeakyReLU in Keras?

Asked by ColemanGarvin in Data Science on Feb 13, 2023

How do you use LeakyReLU as an activation function in a sequential DNN in Keras? I want to write something similar to:

model = Sequential()
model.add(Dense(90, activation='LeakyRelu'))
What is the solution? Is LeakyReLU similar to ReLU?

A second question: what is a good general setting for tuning the LeakyReLU parameter, and when is its performance significantly better than ReLU?

Answered by David Edwards

You can write something like this to use LeakyReLU in Keras:


import tensorflow as tf

keras = tf.keras
layer1 = keras.layers.Dense(units=90, activation=keras.layers.LeakyReLU(alpha=0.01))
model = keras.Sequential([layer1])

or

model = keras.Sequential([
    keras.layers.Dense(units=90,
                       activation=keras.layers.LeakyReLU(alpha=0.01))
])
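
To the first question: LeakyReLU is indeed a close relative of ReLU. ReLU outputs max(0, x), while LeakyReLU outputs x for positive inputs and alpha * x for negative ones, so negative values "leak" through scaled by alpha instead of being zeroed. A quick numerical check, assuming TensorFlow 2.x:

import tensorflow as tf

x = tf.constant([-2.0, -0.5, 0.0, 1.5])
# LeakyReLU lets negative inputs through, scaled by alpha
print(tf.keras.layers.LeakyReLU(alpha=0.01)(x).numpy())  # [-0.02  -0.005  0.  1.5]
# Plain ReLU zeroes all negative inputs
print(tf.nn.relu(x).numpy())                             # [0.  0.  0.  1.5]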

However, passing "advanced activation" layers such as LeakyReLU through the activation argument of another layer is not considered good practice and is best avoided.
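
Instead, leave the Dense layer linear and stack LeakyReLU as its own layer right after it. A minimal sketch of this pattern, assuming TensorFlow 2.x (the layer size is illustrative):

import tensorflow as tf

keras = tf.keras

model = keras.Sequential([
    keras.layers.Dense(units=90),        # no activation argument here
    keras.layers.LeakyReLU(alpha=0.01),  # applied as a standalone layer
])

On the tuning question: Keras's own default is alpha=0.3, while 0.01 is the value commonly used in the literature, so values in the 0.01 to 0.3 range are reasonable starting points. LeakyReLU tends to help most when plain ReLU suffers from "dying" units that get stuck outputting zero. Note that newer Keras versions (Keras 3) rename the alpha argument to negative_slope.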


