This node loads a KNIME data table into Hive. Hive requires imported data to be present on the Hive server, so this node first copies the data onto the server using any of the protocols supported by the file handling nodes, e.g. SSH/SCP or FTP. The data is then loaded into a Hive table and the uploaded file is deleted.
Additionally, the data can be partitioned by selecting one or more compatible columns (e.g. integer or string columns). The node relies on Hive's dynamic partitioning.
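The exact statements the node issues are internal to the node, but the load-and-partition sequence it describes can be sketched in HiveQL roughly as follows. Table and column names (`sales_tmp`, `sales`, `year`) and the upload path are hypothetical placeholders, not names used by the node:

```sql
-- Illustrative sketch only; names and path are assumptions.
-- 1. The uploaded file is loaded into an unpartitioned staging table:
LOAD DATA INPATH '/tmp/knime_upload' INTO TABLE sales_tmp;

-- 2. Dynamic partitioning must be enabled before the insert:
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- 3. Rows are inserted into the target table; Hive derives each row's
--    partition from the value of the selected partition column:
INSERT INTO TABLE sales PARTITION (year)
SELECT price, product, year FROM sales_tmp;
```

With dynamic partitioning, Hive creates one partition directory per distinct value of the partition column, which is why only columns with a manageable number of distinct values (e.g. integer or string categories) are suitable choices.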
- Type: Remote Connection. A connection to the remote Hive server.
- Type: Data. The data table that should be loaded into Hive.
- Type: Database Connection. A connection to a Hive database.
- Type: Database Query. A database connection with the imported table.
Make sure to have this extension installed: KNIME Big Data Connectors.
It is available from the KNIME Analytics Platform 3.7 Update Site.