
Create Local Big Data Environment (Legacy) (deprecated)

Category: Tools & Services > Apache Spark

Creates a fully functional local big data environment including Apache Hive, Apache Spark and HDFS.

The Spark WebUI of the created local Spark context is available via the Spark context outport view. Simply click the "Click here to open" link and the Spark WebUI opens in the internal web browser.

Note: Executing this node creates a new Spark context only if no local Spark context with the same Context name currently exists. Resetting the node does not destroy the context. Whether closing the KNIME workflow destroys the context depends on the configured Action to perform on dispose. Spark contexts created by this node can be shared between KNIME workflows.
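The context-reuse semantics described in the note can be sketched as a name-keyed registry. This is an illustration in plain Python only, not KNIME's actual implementation; the class and function names are hypothetical:

```python
# Illustrative sketch (not KNIME's implementation) of the reuse semantics
# above: a context is created only when no context with the same name
# exists; asking for the same name again returns the shared instance.
_contexts = {}

class LocalSparkContext:
    """Hypothetical stand-in for a local Spark context."""
    def __init__(self, name):
        self.name = name

def get_or_create_context(name):
    # Create a new context only if none with this name exists yet.
    if name not in _contexts:
        _contexts[name] = LocalSparkContext(name)
    return _contexts[name]

first = get_or_create_context("knimeSparkContext")
second = get_or_create_context("knimeSparkContext")
assert first is second  # same Context name -> same shared context
```

Under this model, two workflows requesting the same Context name share one context, matching the sharing behavior described above, and a node reset simply leaves the registry untouched.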

Note: This node uses the old database-connection-based Hive output port.
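Outside of KNIME, HiveServer2 instances are commonly addressed through JDBC URLs of the form jdbc:hive2://host:port/database. The helper below shows that shape; the default host, port, and database are illustrative assumptions, since the embedded local Hive instance is configured internally by KNIME:

```python
def hive_jdbc_url(host="localhost", port=10000, database="default"):
    """Build a HiveServer2 JDBC URL of the common jdbc:hive2:// form.

    The defaults are illustrative assumptions; the local Hive instance
    created by this node manages its own connection details.
    """
    return f"jdbc:hive2://{host}:{port}/{database}"

print(hive_jdbc_url())  # jdbc:hive2://localhost:10000/default
```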

Node details

Output ports
  1. Type: Database Connection
    Hive Connection
    JDBC connection to a local Hive instance. This port can be connected to the KNIME database nodes.
  2. Type: Remote Connection
    HDFS Connection
    HDFS connection that points to the local file system. This port can be connected, for example, to the Spark nodes that read and write files.
  3. Type: Spark Context
    Spark Context
    Local Spark context that can be connected to all Spark nodes.

Extension

The Create Local Big Data Environment (Legacy) (deprecated) node is part of a KNIME extension.

Related workflows & nodes

  1. 2.Modelling
    iris > Public > KNIME_Workshop_ECB_Analytics > workflows > 2.Modelling
  2. add fields to Hive table
    Tags: Hive, Sql, Big data
    mlauber71 > Public > kn_example_bigdata_hive_add_column
  3. Hive - how to get from DB-Connectors to Hive (or Impala) tables - Legacy nodes up to vers. 3.7.x
    Tags: Big data, Hive, Impala
    Hive - how to get from DB-Connectors to Hive (or Impala) tables
    mlauber71 > Public > kn_example_hive_db_loader_37
  4. 2.Modelling - solution
    iris > Public > KNIME_Workshop_ECB_Analytics > solutions > 2.Modelling - solution
  5. Fetch and Transform PubChem Data
    Tags: BigData, Spark, JSON
    This workflow prepares a data set using Local Big Data Environment for Data Chefs Battle:…
    b_eslami > Public > 02_Chemistry_and_Life_Sciences > 02_Fetch_And_Transform_PubChem_Data > 02_Fetch_And_Transform_PubChem_Data
  6. Interactive Big Data Exploration and Visualization
    Tags: Big data, Data exploration, Visualization
    This workflow shows how to perform data exploration and visualization on a large dataset …
    knime > Examples > 50_Applications > 49_NYC_Taxi_Visualization > Taxi_Visualization
