
Azure Blob Storage Connector

IO Connectors

This node connects to Azure Blob Storage. The resulting output port allows downstream nodes to access the Azure Blob Storage data as a file system, e.g. to read or write files and folders, or to perform other file system operations (browse/list files, copy, move, ...).

This node requires the Microsoft Authentication node to perform authentication. The following authentication modes are supported (an illustrative sketch follows the list):

  • Interactive Authentication
  • Username/password Authentication
  • Shared key authentication (Azure Storage only)
  • Shared access signature (SAS) authentication (Azure Storage only)
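
The node itself obtains these credentials from the connected Microsoft Authentication port. Purely as an illustration of what the listed modes correspond to outside of KNIME, the following sketch uses the Azure SDK for Python (azure-storage-blob and azure-identity); the account URL, client ID, key, and SAS token are placeholders, not values used by the node.

    # Illustrative sketch only: how the authentication modes listed above map onto
    # the Azure SDK for Python. All account names, keys, and tokens are placeholders.
    from azure.core.credentials import AzureNamedKeyCredential
    from azure.identity import InteractiveBrowserCredential, UsernamePasswordCredential
    from azure.storage.blob import BlobServiceClient

    ACCOUNT_URL = "https://myaccount.blob.core.windows.net"  # placeholder account

    # 1. Interactive authentication: opens a browser window for the Azure AD sign-in.
    interactive_client = BlobServiceClient(
        ACCOUNT_URL, credential=InteractiveBrowserCredential()
    )

    # 2. Username/password authentication (Azure AD username/password flow).
    user_pass_client = BlobServiceClient(
        ACCOUNT_URL,
        credential=UsernamePasswordCredential(
            client_id="<app-client-id>", username="<user>", password="<password>"
        ),
    )

    # 3. Shared key authentication: the storage account name plus its access key.
    shared_key_client = BlobServiceClient(
        ACCOUNT_URL,
        credential=AzureNamedKeyCredential("myaccount", "<account-key>"),
    )

    # 4. Shared access signature (SAS) authentication: a pre-signed, time-limited token.
    sas_client = BlobServiceClient(ACCOUNT_URL, credential="<sas-token>")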

Path syntax: Paths for Azure Blob Storage are specified with a UNIX-like syntax, for example /mycontainer/myfolder/myfile. An absolute path for Azure Blob Storage consists of (see also the sketch after the list):

  1. A leading slash (/).
  2. Followed by the name of a container (mycontainer in the above example), followed by a slash.
  3. Followed by the name of an object within the container (myfolder/myfile in the above example).
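
As an illustration (not part of the node), the following Python sketch splits such an absolute path into its container and object parts, using the example path from the text:

    # Illustrative sketch: split an absolute Azure Blob Storage path into
    # its container name and the object name within that container.
    def split_blob_path(path: str) -> tuple[str, str]:
        if not path.startswith("/"):
            raise ValueError("an absolute path must start with a leading slash")
        container, _, blob = path.lstrip("/").partition("/")
        return container, blob

    # Example from the text: container "mycontainer", object "myfolder/myfile".
    print(split_blob_path("/mycontainer/myfolder/myfile"))
    # -> ('mycontainer', 'myfolder/myfile')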

URI formats: When you apply the Path to URI node to paths coming from this connector, you can create URIs with the following formats (see the sketch after the list):

  1. Shared Access Signature (SAS) URLs, which contain credentials that allow access to files for a limited amount of time (see the Azure documentation).
  2. wasbs:// URLs to access Azure Blob Storage from inside Hadoop environments.
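
For illustration only, the following sketch shows the general shape of both URI formats for the example path; the storage account name and SAS token are placeholders, and in KNIME these URIs are produced by the Path to URI node rather than built by hand:

    # Illustrative sketch of the two URI formats listed above.
    ACCOUNT = "myaccount"          # placeholder storage account name
    SAS_TOKEN = "sv=...&sig=..."   # placeholder; normally generated by Azure
    container, blob = "mycontainer", "myfolder/myfile"

    # 1. SAS URL: a plain HTTPS blob URL with the time-limited SAS token appended.
    sas_url = f"https://{ACCOUNT}.blob.core.windows.net/{container}/{blob}?{SAS_TOKEN}"

    # 2. wasbs:// URL, as understood by Hadoop's Azure Blob Storage connector.
    wasbs_url = f"wasbs://{container}@{ACCOUNT}.blob.core.windows.net/{blob}"

    print(sas_url)
    print(wasbs_url)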

Node details

Input ports
  1. Type: Microsoft Credential
    Microsoft Connection
    Microsoft Connection that provides user authentication.
Output ports
  1. Type: File System
    Azure Blob Storage File System Connection

Extension

The Azure Blob Storage Connector node is provided by a KNIME extension; see its listing on the KNIME Hub.

Related workflows & nodes

  1. Read Data from Microsoft Azure Cloud
     Tags: Azure, Azure Cloud, Microsoft Azure
     This workflow demonstrates how to read data files (Text, Excel, KNIME Table and CSV) from…
     knime > Beginners Space > 01_Read > 05_Read_Data_from_Microsoft_Azure_Cloud
  2. Read Data from Microsoft Azure Cloud
     Tags: Azure, Azure Cloud, Microsoft Azure
     This workflow demonstrates how to read data files (Text, Excel, KNIME Table and CSV) from…
     scebesoy > Public > 01_Read > 05_Read_Data_from_Microsoft_Azure_Cloud
  3. Will They Blend? Amazon S3 meets MS Blob Storage plus Excel
     Tags: Data blending, S3, Microsoft
     The challenge here is to blend S3 formatted data from the Amazon Cloud with Blob Storage …
     knime > Examples > 01_Data_Access > 06_ZIP_and_Remote_Files > 04_AmazonS3-MSBlobStorage_Census_Data
  4. Will They Blend? Amazon S3 meets MS Blob Storage plus Excel
     Tags: Data blending, S3, Microsoft
     The challenge here is to blend S3 formatted data from the Amazon Cloud with Blob Storage …
     haoran > Public > 01_Data_Access > 06_ZIP_and_Remote_Files > 04_AmazonS3-MSBlobStorage_Census_Data
  5. Working with Utility Nodes
     Tags: File handling, Zip, Unzip
     Download compressed file, extract it, read extracted file and finally delete extracted fil…
     knime > Examples > 01_Data_Access > 01_Common_Type_Files > 11_Working_with_Utility_Nodes
  6. Working with Utility Nodes
     Tags: File handling, Zip, Unzip
     Download compressed file, extract it, read extracted file and finally delete extracted fil…
     haoran > Public > 01_Data_Access > 01_Common_Type_Files > 11_Working_with_Utility_Nodes
  7. Incremental Data Processing with Parquet
     Tags: Parquet, Incremental loading, NYC taxi dataset
     In this workflow, we will use the NYC taxi dataset to showcase a continuous preprocessing…
     knime > Examples > 01_Data_Access > 01_Common_Type_Files > 12_Incremental_processing_Parquet_file
  8. Incremental Data Processing with Parquet
     Tags: Parquet, Incremental loading, NYC taxi dataset
     In this workflow, we will use the NYC taxi dataset to showcase a continuous preprocessing…
     haoran > Public > 01_Data_Access > 01_Common_Type_Files > 12_Incremental_processing_Parquet_file
  9. Data Transfer between Clouds
     Tags: File handling, Google, Sharepoint
     This workflow demonstrates the utilization of the new file system connection nodes within…
     knime > Examples > 01_Data_Access > 06_ZIP_and_Remote_Files > 09_Data_Transfer_between_Clouds
  10. Data Transfer between Clouds
     Tags: File handling, Google, Sharepoint
     This workflow demonstrates the utilization of the new file system connection nodes within…
     haoran > Public > 01_Data_Access > 06_ZIP_and_Remote_Files > 09_Data_Transfer_between_Clouds

