Creates a fully functional local big data environment including Apache Hive, Apache Spark and HDFS.
The Spark WebUI of the created local Spark context is available via the Spark context outport view. Simply click the Click here to open link and the Spark WebUI opens in the internal web browser.
Note: Executing this node only creates a new Spark context when no local Spark context with the same Context name currently exists. Resetting the node does not destroy the context. Whether closing the KNIME workflow destroys the context depends on the configured Action to perform on dispose. Spark contexts created by this node can be shared between KNIME workflows.
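For readers who want to see what such a local Spark-with-Hive context corresponds to outside of KNIME, the sketch below shows one way to create a comparable session with PySpark. The local[*] master, the application name localBigDataEnv, and the builder calls are illustrative assumptions, not the node's internal implementation; getOrCreate() merely mirrors the node's reuse of an existing context rather than replicating it.

```python
# Illustrative sketch only: approximates a local Spark context with Hive
# support in plain PySpark; this is NOT the KNIME node's internal code.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[*]")            # run Spark locally on all available cores
    .appName("localBigDataEnv")    # hypothetical context name
    .enableHiveSupport()           # enable HiveQL and a Hive metastore
    .getOrCreate()                 # reuses an existing session if one is running
)

# Address of the Spark WebUI for the running local context (None if disabled)
print(spark.sparkContext.uiWebUrl)

# Stopping the session is loosely analogous to a "destroy context" dispose action
# spark.stop()
```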