A leaky ReLU is a rectified linear unit (ReLU) with a small, non-zero slope in the negative part of its input space. The motivation for leaky ReLUs is that vanilla ReLUs have a gradient of zero in the negative part of their input space, which can harm learning. Corresponds to the Keras Leaky ReLU Layer.
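A minimal sketch of the underlying Keras layer, assuming TensorFlow's bundled Keras API (`tf.keras`); the node itself configures the layer through its dialog rather than code, and newer Keras releases name the slope parameter `negative_slope` instead of `alpha`:

```python
import numpy as np
import tensorflow as tf

# LeakyReLU(x) = x for x >= 0, and alpha * x for x < 0,
# so the gradient is alpha (not zero) for negative inputs.
leaky = tf.keras.layers.LeakyReLU(alpha=0.1)

x = np.array([-2.0, -0.5, 0.0, 1.5], dtype=np.float32)
print(leaky(x).numpy())  # [-0.2  -0.05  0.    1.5 ]
```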
- Type: Keras Deep Learning Network. Keras Network: the Keras deep learning network to which to add a Leaky ReLU layer.