Generate Text Using a Many-To-One LSTM Network (Deployment)
This workflow shows two options for using the previously trained TensorFlow network to generate text in fairy-tale style.
Both options read the previously trained TensorFlow network and predict a sequence of index-encoded characters within a loop.
The difference between the two options lies in the Extract Index metanode.
The metanode uses the probability distribution over all possible indexes to make the predictions. In Deployment Workflow I, the index with the highest probability is extracted. In Deployment Workflow II, the next index is picked by sampling from the given probability distribution.
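The two index-extraction strategies can be sketched in plain Python with NumPy (a minimal illustration of the idea, not the actual metanode implementation; the function names are made up for this example):

```python
import numpy as np

def extract_index_greedy(probs):
    """Deployment Workflow I: pick the index with the highest probability."""
    return int(np.argmax(probs))

def extract_index_sampled(probs, rng=None):
    """Deployment Workflow II: sample the next index from the distribution."""
    rng = rng or np.random.default_rng()
    probs = np.asarray(probs, dtype=float)
    probs = probs / probs.sum()  # renormalize to guard against rounding error
    return int(rng.choice(len(probs), p=probs))

# Example: a distribution over three index-encoded characters
probs = [0.1, 0.6, 0.3]
print(extract_index_greedy(probs))   # always index 1
print(extract_index_sampled(probs))  # usually 1, sometimes 0 or 2
```

Greedy extraction always produces the same text for a given seed sequence, while sampling introduces variation, which typically yields more diverse generated fairy tales.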