Connect to Azure Blob Storage using a Storage Account and an Access Key
Folders created at the root level become containers on Azure Blob Storage. The virtual file system functionality is fully supported by the file handling nodes.
In the first step, a container is created, followed by a directory inside that container. Next, a file (iris.csv) is uploaded to the directory inside the container (path: /examplebucket***/exampledirectory). Finally, all the files and folders inside the example container are listed. The result is a list of URIs that can be used to download files and directories.
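The workflow itself performs these steps with KNIME nodes, but a rough scripted equivalent using the Azure Python SDK (azure-storage-blob, v12) might look like the sketch below. The container name "examplecontainer" and the connection string are placeholders, not values taken from the workflow.

```python
def upload_and_list(connection_string: str):
    """Create an example container, upload iris.csv, and list its contents."""
    # Assumes the azure-storage-blob package (v12 SDK) is installed;
    # the import is kept local so the sketch can be defined without it.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(connection_string)
    # Creating a container corresponds to creating a root-level folder
    container = service.create_container("examplecontainer")
    with open("iris.csv", "rb") as data:
        # Blob names with "/" act as a virtual directory structure
        container.upload_blob(name="exampledirectory/iris.csv", data=data)
    # list_blobs() enumerates all blobs in the container recursively
    return [blob.name for blob in container.list_blobs()]
```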
In the second step we download the iris.csv file that was uploaded in the previous step. This saves the file to disk, from where it can be read using a CSV Reader. Since caching the file to disk may be unnecessary, the Azure Blob Store File Picker can instead create URLs from which files can be read directly. This works with every file reader that supports reading from a URL. These URLs have an expiration date, which can be set in the node dialog.
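Such expiring URLs are shared access signature (SAS) URLs. Azure's actual string-to-sign contains additional fields (signed permissions, service version, and so on), so the sketch below only illustrates the general mechanism behind an HMAC-signed URL with an expiry timestamp; the account name and key are hypothetical.

```python
import base64
import hashlib
import hmac
from urllib.parse import quote, urlencode


def make_signed_url(account: str, key_b64: str, container: str,
                    blob: str, expiry_epoch: int) -> str:
    """Build a simplified expiring, signed blob URL (not Azure's exact SAS format)."""
    path = f"/{container}/{blob}"
    # Sign the resource path together with the expiry time, so neither
    # can be changed without invalidating the signature
    string_to_sign = f"{path}\n{expiry_epoch}"
    key = base64.b64decode(key_b64)
    signature = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode("ascii")
    query = urlencode({"se": expiry_epoch, "sig": signature})
    return f"https://{account}.blob.core.windows.net{quote(path)}?{query}"
```

The server recomputes the HMAC on each request and rejects the URL once the expiry time has passed.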
Stream reading is also useful in conjunction with the KNIME Streaming Executor: the (possibly large) data is read from the remote side and processed immediately, without being downloaded to the local machine.
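Outside of KNIME, the same idea of reading rows directly from a URL without caching the file to disk can be sketched with the Python standard library; the URL would typically be a signed blob URL such as the one the File Picker produces.

```python
import csv
import io
import urllib.request


def stream_csv_rows(url: str):
    """Yield CSV rows directly from a URL without saving the file to disk."""
    with urllib.request.urlopen(url) as response:
        # Wrap the byte stream so csv.reader sees text; rows are parsed
        # as they arrive rather than after a full download
        text = io.TextIOWrapper(response, encoding="utf-8")
        yield from csv.reader(text)
```

Because the function is a generator, downstream processing starts as soon as the first row is available, which mirrors how the Streaming Executor pipelines data.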
After uploading and downloading files and using the File Picker to read files directly, we can delete our example container from Azure Blob Storage.
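The cleanup step, done in the workflow with a file handling node, has a one-call equivalent in the Azure Python SDK; as above, azure-storage-blob and the container name are assumptions for illustration.

```python
def delete_example_container(connection_string: str) -> None:
    """Delete the example container and everything inside it."""
    # Assumes azure-storage-blob (v12 SDK); deleting a container also
    # removes all blobs it contains
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(connection_string)
    service.delete_container("examplecontainer")
```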
Workflow
Azure Blob Store Remote File Example
Used extensions & nodes
Created with KNIME Analytics Platform version 4.1.1
KNIME Azure Cloud Connectors
KNIME AG, Zurich, Switzerland
Version 4.1.0