
Hive to Spark and Spark to Hive

Spark, Hive, Hadoop, Big Data, SQL
gosia
Draft · Latest edits on Dec 6, 2024 9:17 AM
This workflow demonstrates the Hive to Spark and Spark to Hive nodes, which transfer data between Apache Hive and Apache Spark.
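
For readers who want to see the equivalent in code, here is a minimal PySpark sketch of the round trip the two nodes perform. The database and table names (default.example_table, default.example_result) are illustrative placeholders, not part of the workflow:

from pyspark.sql import SparkSession

# Start a Spark session with Hive support (inside KNIME, the big data
# environment node provides this context for you).
spark = (
    SparkSession.builder
    .appName("hive-spark-roundtrip")
    .enableHiveSupport()
    .getOrCreate()
)

# "Hive to Spark": read a Hive table into a Spark DataFrame.
df = spark.sql("SELECT * FROM default.example_table")

# ... arbitrary Spark transformations would go here ...

# "Spark to Hive": persist the DataFrame back as a Hive table.
df.write.mode("overwrite").saveAsTable("default.example_result")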

To run this workflow on a remote cluster, use an HDFS Connection node, Hive Connector node, and Create Spark Context (Livy) node (available in the KNIME Big Data Connectors Extension) in place of the Create Local Big Data Environment node.
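
The Create Spark Context (Livy) node communicates with an Apache Livy server over its REST API. As a rough illustration (the host name livy-host is a placeholder; 8998 is Livy's default port), creating a Spark session looks like this:

import requests

LIVY_URL = "http://livy-host:8998"  # placeholder; point at your Livy server

# Ask Livy to start a new Spark session; this is roughly the request the
# Create Spark Context (Livy) node issues on your behalf.
resp = requests.post(
    f"{LIVY_URL}/sessions",
    json={"kind": "spark"},
    headers={"Content-Type": "application/json"},
)
resp.raise_for_status()
session = resp.json()
print("Livy session id:", session["id"], "state:", session["state"])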


Used extensions & nodes

Created with KNIME Analytics Platform version 4.4.0
  • KNIME Base nodes (Trusted extension), KNIME AG, Zurich, Switzerland, Version 4.4.0
  • KNIME Database (Trusted extension), KNIME AG, Zurich, Switzerland, Version 4.4.0
  • KNIME Extension for Apache Spark (Trusted extension), KNIME AG, Zurich, Switzerland, Version 4.4.0
  • KNIME Extension for Local Big Data Environments (Trusted extension), KNIME AG, Zurich, Switzerland, Version 4.4.0

