
Keras ELU Layer

Exponential linear units (ELUs) were introduced to alleviate the disadvantages of ReLU and LeakyReLU units, namely to push the mean activation closer to zero while still saturating to a negative value, which increases robustness against noise when the unit is in an off state (i.e. the input is very negative). The formula is f(x) = alpha * (exp(x) - 1) for x < 0 and f(x) = x for x >= 0. For the exact details see the corresponding paper. Corresponds to the Keras ELU Layer.
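
The formula above can be sketched directly in Python with NumPy; the function name elu and the default alpha = 1.0 below are illustrative choices, not part of the node itself.

    import numpy as np

    def elu(x, alpha=1.0):
        # f(x) = alpha * (exp(x) - 1) for x < 0, f(x) = x for x >= 0
        return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

    # Negative inputs saturate towards -alpha; non-negative inputs pass through.
    print(elu(np.array([-2.0, -0.5, 0.0, 1.5])))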

External resources

  • KNIME Deep Learning Keras Integration

Node details

Input ports
  1. Type: Keras Deep Learning Network
    Keras Network
The Keras deep learning network to which to add an ELU layer.
Output ports
  1. Type: Keras Deep Learning Network
    Keras Network
The Keras deep learning network with an added ELU layer (a rough Keras code sketch follows below).
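
As a rough illustration of what the node does in Keras terms (assuming TensorFlow's Keras API), the sketch below takes an existing network and appends an ELU activation layer; the layer sizes and shapes are made up for the example.

    import tensorflow as tf

    inputs = tf.keras.Input(shape=(10,))
    hidden = tf.keras.layers.Dense(32)(inputs)        # incoming Keras network
    outputs = tf.keras.layers.ELU(alpha=1.0)(hidden)  # added ELU layer
    model = tf.keras.Model(inputs, outputs)
    model.summary()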

Extension

The Keras ELU Layer node is part of the KNIME Deep Learning Keras Integration extension.

Related workflows & nodes

  1. Go to item
  2. Go to item
  3. Go to item
