Hive to Spark and Spark to Hive

Tags: Spark, Hive, Hadoop, Big Data, SQL

This workflow demonstrates the Hive to Spark and Spark to Hive nodes, which transfer data between Apache Hive and Apache Spark. To run the workflow on a remote cluster, replace the Create Local Big Data Environment node with an HDFS Connection node, a Hive Connector node, and a Create Spark Context (Livy) node, all available in the KNIME Big Data Connectors extension.

Used extensions & nodes

Created with KNIME Analytics Platform version 4.1.0
  • KNIME Core (trusted extension), KNIME AG, Zurich, Switzerland, version 4.1.0
  • KNIME Database (trusted extension), KNIME AG, Zurich, Switzerland, version 4.1.0
  • KNIME Extension for Apache Spark (trusted extension), KNIME AG, Zurich, Switzerland, version 4.1.0
  • KNIME Extension for Local Big Data Environments (trusted extension), KNIME AG, Zurich, Switzerland, version 4.1.0

Legal

By downloading the workflow, you agree to our terms and conditions.

License (CC-BY-4.0)

© 2021 KNIME AG. All rights reserved.