
Create Databricks Environment

Tools & Services › Apache Spark

Creates a Databricks Environment connected to an existing Databricks cluster. See the AWS or Azure Databricks documentation for more information.

Note: To avoid an accidental cluster startup, this node creates a dummy DB and Spark port if it is loaded in an executed state from a stored workflow. Reset and execute the node to start the cluster and create a Spark execution context.

Cluster access control: KNIME uploads additional libraries to the cluster. This requires cluster-level manage permissions if your cluster is secured with access control. See the Databricks documentation on how to set up the permissions.
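As an illustration of what granting those permissions can look like outside KNIME, the request below assembles (but does not send) a Databricks Permissions REST API call that gives a user manage rights on a cluster. The endpoint path, the `CAN_MANAGE` permission level, and all host/user values are assumptions for this sketch, not taken from this page:

```python
import json
import urllib.request

# Sketch: build a Permissions API request granting CAN_MANAGE on a cluster.
# Host, cluster ID, user, and token below are placeholders.
host = "https://adb-123.4.azuredatabricks.net"
cluster_id = "0123-456789-abc123"
payload = {
    "access_control_list": [
        {"user_name": "analyst@example.com", "permission_level": "CAN_MANAGE"}
    ]
}
req = urllib.request.Request(
    url=f"{host}/api/2.0/permissions/clusters/{cluster_id}",
    data=json.dumps(payload).encode(),
    method="PATCH",
    headers={
        "Authorization": "Bearer <personal-access-token>",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would submit the change; omitted here,
# since the request targets a placeholder workspace.
print(req.get_method(), req.full_url)
```

A PATCH merges the new entry into the cluster's existing access-control list, so other users' permissions are left untouched.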

Node details

Output ports
  1. Type: DB Session
    DB Connection
    JDBC connection that can be used with the KNIME database nodes.
  2. Type: File System
    DBFS Connection
    DBFS connection that can be used with the Spark nodes to read/write files.
  3. Type: Spark Context
    Spark Context
    Spark context that can be used with all Spark nodes.

Input ports (dynamic)
  1. Type: Credential (org.knime.credentials.base.CredentialPortObject)
    Databricks Workspace Connection
    Databricks workspace connection, as provided by the Databricks Workspace Connector node.
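For orientation, the DB Session port wraps a JDBC connection to the cluster. Outside KNIME, an equivalent connection URL could be assembled roughly as below; the hostname, HTTP path, and token are placeholders, and the exact driver parameters are an assumption based on the Databricks JDBC driver, not something this page specifies:

```python
# Sketch: assemble a Databricks JDBC URL similar to what the node's
# DB Session port encapsulates. All concrete values are placeholders.
def databricks_jdbc_url(host: str, http_path: str, token: str) -> str:
    # AuthMech=3 selects user/password auth; Databricks conventionally uses
    # the literal user "token" with a personal access token as the password.
    params = {
        "transportMode": "http",
        "ssl": "1",
        "httpPath": http_path,
        "AuthMech": "3",
        "UID": "token",
        "PWD": token,
    }
    query = ";".join(f"{k}={v}" for k, v in params.items())
    return f"jdbc:databricks://{host}:443/default;{query}"

url = databricks_jdbc_url(
    "adb-123.4.azuredatabricks.net",
    "sql/protocolv1/o/0/0123-456789-abc123",
    "<personal-access-token>",
)
print(url)
```

Inside KNIME none of this is written by hand: the node builds the connection from its configuration and hands it downstream through the DB Session port.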


© 2025 KNIME AG. All rights reserved.