Keras PReLU Layer
Like the leaky ReLU, the parametric ReLU (PReLU) introduces a slope in the negative part of the input space to improve learning dynamics compared to the ordinary ReLU. Unlike the leaky ReLU, the slope alpha is treated as a parameter that is trained alongside the rest of the network's weights. Alpha is usually a vector containing a dedicated slope for each feature of the input (also see the Shared axes option). Corresponds to the Keras PReLU Layer.
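The activation applied by the layer can be sketched in plain Python. This is a minimal illustration of the PReLU formula only, with a hypothetical fixed alpha vector; in the actual layer, alpha is a trainable weight updated during training.

```python
def prelu(x, alpha):
    """Apply PReLU element-wise: f(x_i) = x_i if x_i > 0 else alpha_i * x_i.

    Each feature i gets its own negative-side slope alpha[i]; in a real
    network these slopes are learned alongside the other weights.
    """
    return [xi if xi > 0 else ai * xi for xi, ai in zip(x, alpha)]

# Hypothetical feature vector and per-feature slopes:
print(prelu([2.0, -1.0, -4.0], [0.1, 0.25, 0.25]))  # → [2.0, -0.25, -1.0]
```

With a shared axis (the Shared axes option), a single slope would be reused across the features of that axis instead of one slope per feature.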
- Type: Keras Deep Learning Network. The Keras deep learning network to which to add a PReLU layer.
- Type: Keras Deep Learning Network. The Keras deep learning network with an added PReLU layer.