Generate Text Using a Many-To-One LSTM Network (Training)

Workflow preview
The workflow builds, trains, and saves an RNN with an LSTM layer to generate new fictional fairy tales. The brown nodes define the network structure. The "Pre-Processing" metanode reads fairy tales, index-encodes them, and creates semi-overlapping sequences. The Keras Network Learner node trains the network on the index-encoded fairy tales. Finally, the trained network is converted into a TensorFlow model and saved to a file.
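The pre-processing step described above (index-encoding plus semi-overlapping sequence creation) can be sketched in plain Python. This is a minimal illustration of the general technique, not the KNIME metanode's actual implementation; the function names, the window parameters `seq_len` and `step`, and the sample text are all assumptions.

```python
def index_encode(text):
    """Map each distinct character to an integer index (index-encoding)."""
    vocab = sorted(set(text))
    char_to_idx = {ch: i for i, ch in enumerate(vocab)}
    return [char_to_idx[ch] for ch in text], char_to_idx

def make_sequences(encoded, seq_len=10, step=3):
    """Cut an index-encoded text into semi-overlapping training pairs.

    Each input is a window of `seq_len` indices; the target is the single
    index that follows it (many-to-one). Choosing step < seq_len makes
    consecutive windows overlap.
    """
    xs, ys = [], []
    for start in range(0, len(encoded) - seq_len, step):
        xs.append(encoded[start:start + seq_len])
        ys.append(encoded[start + seq_len])
    return xs, ys

# Hypothetical sample text standing in for the fairy-tale corpus.
text = "once upon a time there was a little prince"
encoded, char_to_idx = index_encode(text)
xs, ys = make_sequences(encoded, seq_len=10, step=3)
```

Each `(xs[i], ys[i])` pair is one training example for the many-to-one LSTM: the network sees a fixed-length sequence of indices and learns to predict the next index.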

CC-BY-4.0