Keras ELU Layer


Exponential linear units (ELUs) were introduced to alleviate drawbacks of ReLU and LeakyReLU units: they push the mean activation closer to zero while still saturating to a negative value, which increases robustness against noise when the unit is in an off state (i.e. the input is very negative). The activation is f(x) = alpha * (exp(x) - 1) for x < 0 and f(x) = x for x >= 0. For the exact details see the corresponding paper (Clevert et al., 2015, "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)"). Corresponds to the Keras ELU Layer.
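For illustration, the following is a minimal standalone sketch in Python using the tf.keras API (outside of KNIME) of what this node appends to a network: an ELU activation layer. The layer sizes and the default alpha of 1.0 are arbitrary assumptions for the example.

```python
import numpy as np
import tensorflow as tf

# Small example network with an ELU activation layer appended after a Dense layer.
# Input size, Dense width, and alpha are illustrative choices, not prescribed by the node.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),
    tf.keras.layers.Dense(32),
    tf.keras.layers.ELU(alpha=1.0),  # f(x) = alpha * (exp(x) - 1) for x < 0, f(x) = x for x >= 0
])

# Quick check of the ELU formula on a few sample values.
x = np.array([-2.0, -0.5, 0.0, 1.5], dtype=np.float32)
print(tf.keras.activations.elu(x, alpha=1.0).numpy())
```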

Input Ports

  1. Type: Keras Deep Learning Network
    The Keras deep learning network to which an ELU layer should be added.

Output Ports

  1. Type: Keras Deep Learning Network
    The Keras deep learning network with an added ELU layer.

Extension

This node is part of the extension KNIME Deep Learning - Keras Integration (v4.0.0).
