Keras PReLU Layer


Like the leaky ReLU, the parametric ReLU introduces a slope in the negative part of the input space to improve learning dynamics compared to ordinary ReLUs. The difference from leaky ReLUs is that here the slope alpha is treated as a parameter that is trained alongside the rest of the network's weights. Alpha is usually a vector containing a dedicated slope for each feature of the input (see also the Shared axes option). Corresponds to the Keras PReLU Layer.
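The function this layer computes can be sketched in plain NumPy (in Keras itself it corresponds to <tt>keras.layers.PReLU</tt>). The alpha values below are arbitrary placeholders for illustration; in a real network they start from an initializer and are updated during training:

```python
import numpy as np

def prelu(x, alpha):
    # Parametric ReLU: identity for positive inputs,
    # per-feature trainable slope alpha for negative inputs.
    return np.where(x > 0, x, alpha * x)

# One slope per input feature (here set to 0.25 for illustration).
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
alpha = np.full(5, 0.25)
print(prelu(x, alpha))  # [-0.5 -0.25 0. 1. 2.]
```

With alpha fixed at a small constant this reduces to a leaky ReLU; making alpha trainable is what distinguishes the PReLU.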

Input Ports

  1. Type: DLKerasNetworkPortObjectBase The Keras deep learning network to which to add a <tt>PReLU</tt> layer.

Output Ports

  1. Type: DLKerasNetworkPortObjectBase The Keras deep learning network with an added <tt>PReLU</tt> layer.

Find here

KNIME Labs > Deep Learning > Keras > Layers > Advanced Activations

Make sure you have this extension installed:

KNIME Deep Learning - Keras Integration

Update site for KNIME Analytics Platform 3.7:
KNIME Analytics Platform 3.7 Update Site

How to install extensions