Keras ELU Layer

Category: Analytics / Integrations / Deep Learning / Keras / Layers

Exponential linear units (ELUs) were introduced to alleviate the disadvantages of ReLU and LeakyReLU units, namely to push the mean activation closer to zero while still saturating to a negative value, which increases robustness against noise when the unit is in an off state (i.e. the input is very negative). The formula is f(x) = alpha * (exp(x) - 1) for x < 0 and f(x) = x for x >= 0. For the exact details, see the corresponding paper (Clevert et al., 2015). This node corresponds to the Keras ELU layer.
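
The formula is easy to check numerically; here is a minimal NumPy sketch (the function name elu and the default alpha = 1.0 are illustrative assumptions, not taken from the node):

    import numpy as np

    def elu(x, alpha=1.0):
        # f(x) = alpha * (exp(x) - 1) for x < 0, f(x) = x for x >= 0
        return np.where(x < 0, alpha * np.expm1(x), x)

    print(elu(np.array([-2.0, -0.5, 0.0, 1.5])))
    # -> approximately [-0.8647, -0.3935, 0.0, 1.5] with alpha = 1.0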

External resources

  • KNIME Deep Learning Keras Integration

Node details

Input ports
  1. Type: Keras Deep Learning Network
    Keras Network
The Keras deep learning network to which to add an ELU layer.
Output ports
  1. Type: Keras Deep Learning Network
    Keras Network
    The Keras deep learning network with an added ELU layer.
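
Conceptually, the node appends the activation layer to the network arriving at the input port and forwards the extended network downstream. A rough plain-Keras equivalent of that operation (the layer sizes, input shape, and alpha = 1.0 are assumptions for illustration, not read from the node):

    from tensorflow import keras

    # Stand-in for the network arriving at the input port.
    model = keras.Sequential([
        keras.Input(shape=(16,)),
        keras.layers.Dense(64),
    ])

    # What this node adds: an ELU activation layer.
    model.add(keras.layers.ELU(alpha=1.0))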

Extension

The Keras ELU Layer node is part of this extension:

  1. KNIME Deep Learning Keras Integration
