Keras PReLU Layer


Like the leaky ReLU, the parametric ReLU (PReLU) introduces a slope in the negative part of the input space to improve learning dynamics compared to ordinary ReLUs. The difference from the leaky ReLU is that the slope alpha is treated as a parameter that is trained alongside the rest of the network's weights. Alpha is usually a vector containing a dedicated slope for each feature of the input (see also the Shared axes option). Corresponds to the Keras PReLU Layer.
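
For reference, a minimal sketch of the equivalent layer in standalone Keras. The input shape and the shared_axes setting are illustrative assumptions, not this node's defaults:

    # PReLU computes f(x) = x for x > 0 and f(x) = alpha * x for x <= 0,
    # where alpha is learned during training.
    import tensorflow as tf

    model = tf.keras.Sequential([
        # Hypothetical input: 28x28 feature maps with 16 channels.
        tf.keras.layers.InputLayer(input_shape=(28, 28, 16)),
        tf.keras.layers.PReLU(
            alpha_initializer="zeros",  # alpha starts at 0, i.e. an ordinary ReLU
            shared_axes=[1, 2],         # share one alpha per channel across height and width
        ),
    ])

Without shared_axes, each of the 28 * 28 * 16 input features gets its own trainable slope; sharing over the spatial axes reduces this to one slope per channel, which is the common choice for convolutional networks.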

Input Ports

  1. Type: Keras Deep Learning Network
    The Keras deep learning network to which to add a PReLU layer.

Output Ports

  1. Type: Keras Deep Learning Network
    The Keras deep learning network with an added PReLU layer.

Extension

This node is part of the KNIME Deep Learning - Keras Integration extension (v4.0.0).
