Term Document Entropy

Learner

This node computes the informational entropy of each term in each document. The node requires a bag of words table as input and appends an additional column to the output table containing the entropy values. If a term occurs exactly once in every document, its entropy for each document is 0. Any other combination of frequencies yields an entropy weight between 0 and 1. Please note that the computational complexity of the entropy calculation is greater than the number of terms times the number of documents, so for large bag of words input tables the computation can be quite time consuming.
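The exact formula used by the node is not stated here, but a common entropy weighting that matches the properties described above (a term spread uniformly over all documents gets weight 0, any other distribution falls between 0 and 1) is the log-entropy global weight e_t = 1 + Σ_d (p_td · log₂ p_td) / log₂ N, where p_td = tf_td / gf_t and N is the number of documents. The sketch below illustrates that weighting on a small bag-of-words table; the function name and input layout are illustrative assumptions, not the node's API.

```python
import math
from collections import defaultdict

def entropy_weights(bow):
    """Compute a per-term entropy weight from a bag-of-words table.

    bow: list of (term, document, frequency) rows.
    Returns {term: weight} using the log-entropy global weight
    (an assumed formula consistent with the behavior described above):
        e_t = 1 + sum_d (p_td * log2(p_td)) / log2(N),  p_td = tf_td / gf_t
    """
    docs = {d for _, d, _ in bow}
    n = len(docs)

    # Aggregate term frequencies per (term, document) pair.
    tf = defaultdict(lambda: defaultdict(int))
    for term, doc, freq in bow:
        tf[term][doc] += freq

    weights = {}
    for term, freqs in tf.items():
        gf = sum(freqs.values())  # global frequency of the term
        if n <= 1:
            weights[term] = 1.0  # single document: no spread to measure
            continue
        s = sum((f / gf) * math.log2(f / gf) for f in freqs.values())
        weights[term] = 1.0 + s / math.log2(n)
    return weights

# Example: "apple" occurs once in every document (weight 0),
# "rare" occurs in only one document (weight 1).
bow = [("apple", "d1", 1), ("apple", "d2", 1), ("apple", "d3", 1),
       ("rare", "d1", 3)]
w = entropy_weights(bow)
```

Note that a weight near 0 marks a term that is evenly spread across the corpus (and thus carries little discriminating information), while a weight near 1 marks a term concentrated in few documents.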

Input Ports

  1. Type: Data
    The input table which contains terms and documents.

Output Ports

  1. Type: Data
    The output table with terms, documents and a corresponding entropy value.

Extension

This node is part of the extension

KNIME Textprocessing

v4.0.0
