Similar to an ordinary ReLU but shifted by theta: f(x) = x for x > theta, f(x) = 0 otherwise. Corresponds to the Keras Thresholded ReLU layer.
- Type: Keras Deep Learning Network — Keras Network: The Keras deep learning network to which to add a Thresholded ReLU layer.
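The activation applied by this layer can be sketched directly from the formula above. The following is a minimal NumPy illustration of the thresholded ReLU itself (the node adds the equivalent Keras layer to the network for you); the function name and example values are illustrative, not part of the node's API:

```python
import numpy as np

def thresholded_relu(x, theta=1.0):
    # f(x) = x for x > theta, f(x) = 0 otherwise
    return np.where(x > theta, x, 0.0)

x = np.array([-1.0, 0.5, 1.5, 3.0])
print(thresholded_relu(x))  # inputs at or below theta are zeroed: [0. 0. 1.5 3.]
```

Unlike a standard ReLU, values between 0 and theta are also mapped to 0, so only activations strictly above the threshold pass through unchanged.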