HDFS Connector

Node / Source
IO Connectors

This node connects to a Hadoop Distributed File System (HDFS) using the native HDFS protocol, WebHDFS, or HttpFS. The resulting output port allows downstream nodes to access the files of the remote file system, e.g. to read or write files, or to perform other file system operations (browse/list files, copy, move, ...).
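
The node is configured through its dialog, but the connection it establishes is conceptually similar to what the plain Hadoop Java client does. A minimal sketch, assuming the Hadoop client library is on the classpath; host, port, and folder are placeholders, not values from this page:

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsListExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // "hdfs://" addresses the native RPC protocol; "webhdfs://" addresses
            // WebHDFS and HttpFS endpoints. Host and port are placeholders.
            URI cluster = URI.create("hdfs://namenode.example.com:8020");

            // Browse/list files -- one of the operations downstream nodes can
            // perform through the connection.
            try (FileSystem fs = FileSystem.get(cluster, conf)) {
                for (FileStatus status : fs.listStatus(new Path("/myfolder"))) {
                    System.out.println(status.getPath().getName()
                            + (status.isDirectory() ? "/" : ""));
                }
            }
        }
    }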

Path syntax: Paths for HDFS are specified with a UNIX-like syntax, e.g. /myfolder/myfile. An absolute path for HDFS consists of:

  1. A leading slash ("/").
  2. Followed by the path to the file ("myfolder/myfile" in the above example).
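
For illustration, Hadoop's own Path class follows the same convention (the file name below is a placeholder):

    import org.apache.hadoop.fs.Path;

    public class HdfsPathExample {
        public static void main(String[] args) {
            Path file = new Path("/myfolder/myfile");
            System.out.println(file.isAbsolute()); // true: leading slash
            System.out.println(file.getParent());  // /myfolder
            System.out.println(file.getName());    // myfile
        }
    }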

SSL: This node uses the JVM SSL settings.
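
When the connection uses TLS (e.g. WebHDFS or HttpFS over https), the trust store can therefore be set with the standard javax.net.ssl system properties. In KNIME Analytics Platform, JVM options typically go below the -vmargs line in knime.ini; the path and password below are placeholders:

    -Djavax.net.ssl.trustStore=/path/to/truststore.jks
    -Djavax.net.ssl.trustStorePassword=changeit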

Node details

Output ports
  1. Type: File System
    HDFS File System Connection

Extension

The HDFS Connector node is part of a KNIME extension.


Related workflows & nodes

  1. Spark Java snippet nodes
     Tags: Spark, Hadoop, Big Data
     This workflow demonstrates the usage of the different Spark Java Snippet nodes to read a …
     knime > Education > Courses > L4-BD Introduction to Big Data with KNIME Analytics Platform > 3_Spark > 4_Examples > 06_Modularized_Spark_Scripting
  2. EC2_connection
     mraraman > Public > IPF > EC2_connection
  3. EC2_connection
     alamisrar > Public > IPF > EC2_connection
  4. Working with Utility Nodes
     Tags: File handling, Zip, Unzip
     Download compressed file, extract it, read extracted file and finally delete extracted fil…
     knime > Examples > 01_Data_Access > 01_Common_Type_Files > 11_Working_with_Utility_Nodes
  5. Incremental Data Processing with Parquet
     Tags: Parquet, Incremental loading, NYC taxi dataset
     In this workflow, we will use the NYC taxi dataset to showcase a continuous preprocessing…
     knime > Examples > 01_Data_Access > 01_Common_Type_Files > 12_Incremental_processing_Parquet_file
  6. Data Transfer between Clouds
     Tags: File handling, Google, Sharepoint
     This workflow demonstrates the use of the new file system connection nodes within…
     knime > Examples > 01_Data_Access > 06_ZIP_and_Remote_Files > 09_Data_Transfer_between_Clouds

No related nodes available.
