This workflow trains a character-level neural machine translation model using an encoder-decoder LSTM network.
The encoder network reads the input sentence character by character and summarizes it in its internal state. This state is then used as the initial state of the decoder network, which produces the translated sentence one character at a time. During prediction, the decoder receives its previous output as the input at the next time step. For training we use a technique called "teacher forcing": instead of the previous prediction, we feed the decoder the actual previous character of the target sentence, which greatly benefits training.
This example is an adaptation of the following Keras blog post to KNIME: https://blog.keras.io/a-ten-minute-introduction-to-sequence-to-sequence-learning-in-keras.html
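As a rough orientation, a minimal Keras sketch of the training model described above (following that blog post) might look like the snippet below. The vocabulary sizes and the latent dimension are illustrative assumptions, not values taken from the KNIME workflow.

```python
from keras.layers import Input, LSTM, Dense
from keras.models import Model

num_encoder_tokens = 70   # assumed size of the source character set
num_decoder_tokens = 90   # assumed size of the target character set
latent_dim = 256          # assumed LSTM state size

# Encoder: read the source sentence and keep only the final LSTM state.
encoder_inputs = Input(shape=(None, num_encoder_tokens))
_, state_h, state_c = LSTM(latent_dim, return_state=True)(encoder_inputs)

# Decoder with teacher forcing: the actual previous target character is fed
# as input, and the encoder state initializes the decoder state.
decoder_inputs = Input(shape=(None, num_decoder_tokens))
decoder_lstm = LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs,
                                     initial_state=[state_h, state_c])
decoder_outputs = Dense(num_decoder_tokens,
                        activation="softmax")(decoder_outputs)

model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer="rmsprop", loss="categorical_crossentropy")
```

At inference time the teacher-forcing input is no longer available, so the decoder would instead be run one step at a time, feeding its own previous prediction back in, as described above.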
Workflow
Neural Machine Translation from English to German: Training Workflow
Used extensions & nodes
Created with KNIME Analytics Platform version 4.3.0