Keras ELU Layer


Exponential linear units (ELUs) were introduced to alleviate the disadvantages of ReLU and Leaky ReLU units: they push the mean activation closer to zero while still saturating to a negative value, which increases robustness against noise when the unit is in the off state (i.e. the input is very negative). The formula is f(x) = alpha * (exp(x) - 1) for x < 0 and f(x) = x for x >= 0. For the exact details, see the corresponding paper (Clevert et al., 2015, "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)"). This node corresponds to the Keras ELU layer.
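The piecewise formula is easy to check numerically. The following is a minimal NumPy sketch of the activation above; the function name elu and the sample inputs are illustrative, not part of the node:

    import numpy as np

    def elu(x, alpha=1.0):
        # f(x) = x            for x >= 0
        # f(x) = alpha*(e^x-1) for x <  0 (saturates toward -alpha)
        return np.where(x >= 0, x, alpha * (np.exp(x) - 1))

    x = np.array([-3.0, -1.0, 0.0, 2.0])
    print(elu(x))  # [-0.9502 -0.6321  0.      2.    ]

Note how the negative branch approaches -alpha for very negative inputs, which is the saturation behavior described above.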

Input Ports

  1. Type: DLKerasNetworkPortObjectBase The Keras deep learning network to which to add an ELU layer.

Output Ports

  1. Type: DLKerasNetworkPortObjectBase The Keras deep learning network with an added ELU layer.
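In plain Keras terms, the node appends an ELU advanced-activation layer to the incoming network. A minimal sketch of the equivalent Keras API call follows; the layer sizes and the alpha value are illustrative assumptions, and depending on your Keras installation the import may instead be from keras.layers import ELU:

    from tensorflow import keras

    # Build a small network and append an ELU advanced-activation layer,
    # analogous to what this node does to the incoming Keras network.
    model = keras.Sequential([
        keras.Input(shape=(16,)),
        keras.layers.Dense(32),
        keras.layers.ELU(alpha=1.0),  # alpha scales the negative saturation value
    ])
    model.summary()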

Find here

KNIME Labs > Deep Learning > Keras > Layers > Advanced Activations

Make sure you have this extension installed:

KNIME Deep Learning - Keras Integration

Update site for KNIME Analytics Platform 3.7:
KNIME Analytics Platform 3.7 Update Site

How to install extensions