Sentiment Analysis with(out) attention - First working solution
This workflow shows how to train neural networks for text classification, in this case sentiment analysis, with and without attention. Each network learns a 128-dimensional word embedding, followed by either a simple RNN, an LSTM, or an attention-enhanced RNN.
This example was initially adapted from the Keras example script at https://github.com/keras-team/keras/blob/master/examples/imdb_lstm.py and then enhanced with attention, following the discussion at https://towardsdatascience.com/create-your-own-custom-attention-layer-understand-all-flavours-2201b5e8be9e
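The attention enhancement described in the linked article pools the RNN's per-timestep hidden states into a single fixed-size sentence vector using learned weights, instead of keeping only the last hidden state. A minimal NumPy sketch of that pooling step is shown below; all names, shapes, and the random inputs are illustrative assumptions, not values taken from this workflow:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(hidden_states, W, b):
    # hidden_states: (timesteps, units) outputs of the RNN/LSTM
    # one alignment score per timestep: tanh(h_t @ W + b)
    scores = np.tanh(hidden_states @ W + b).squeeze(-1)   # (timesteps,)
    weights = softmax(scores)                             # sum to 1
    # weighted sum over time -> fixed-size sentence vector
    context = weights @ hidden_states                     # (units,)
    return context, weights

# illustrative example: 80 timesteps, 128 hidden units
rng = np.random.default_rng(0)
h = rng.normal(size=(80, 128))
W = rng.normal(size=(128, 1))     # learned in the real layer
b = np.zeros(1)
context, weights = attention_pool(h, W, b)
```

In the actual workflow this computation is wrapped in a trainable Keras layer, so `W` and `b` are fitted by backpropagation; the attention weights also make the model more interpretable, since they show which timesteps drive the prediction.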
A thorough description of the solution can be found in the KNIME Community Journal, "Low Code for Data Science" (see link below).
To run this example, please make sure you have the following KNIME extension installed:
* KNIME Deep Learning - Keras Integration (Labs)
You also need a local Python installation that includes Keras. Please refer to https://www.knime.com/deeplearning#keras for installation recommendations and further information.