Create Temp Folder

This node supports the path flow variable to specify the base location. To convert the created path variables to string variables, which are required by some nodes that have not been migrated yet, you can use the Path to String (Variable) node. For further information about file handling in general, see the File Handling Guide.

Creates a temporary folder upon execution and exposes its path as a flow variable. This can be useful in (a) demo applications, where the actual output path is not relevant, e.g. on the KNIME public workflow server, and (b) KNIME WebPortal and Quickform flows, where data is written and later downloaded by means of, e.g., a web link. The folder is deleted when the workflow is closed. The node can also be configured so that the created temporary folder is deleted upon reset.
Note: By default, the temporary folder is created directly in the workflow data area, as indicated by "." in the Folder field.
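Outside KNIME, the node's lifecycle (create a temporary folder, expose its path, delete it when the enclosing scope ends) can be sketched in plain Python. This is an illustrative analogue only, not the node's implementation; the prefix and file name below are arbitrary.

```python
import tempfile
from pathlib import Path

# Illustrative analogue of the Create Temp Folder node:
# a temporary folder is created, its path is made available as a
# variable, and the folder is removed when the scope ends -- mirroring
# deletion on workflow close/reset.
with tempfile.TemporaryDirectory(prefix="knime_temp_") as tmp:
    temp_folder = Path(tmp)             # analogue of the exported path flow variable
    temp_folder_str = str(temp_folder)  # analogue of Path to String (Variable)
    # workflow nodes would write their output into the folder here:
    (temp_folder / "output.csv").write_text("id,value\n1,42\n")
    print(temp_folder.exists())         # True while the scope is active
# on leaving the scope, the folder and its contents are deleted
```

As with the node's delete-on-close behavior, nothing written into the folder survives the end of the scope.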

Node details

Input ports
  1. Type: Flow Variable
    Input variables (optional).
Output ports
  1. Type: Flow Variable
    Flow Variables with path information.
File System Connection (dynamic input port)
  1. Type: File System
    The file system connection.

Extension

The Create Temp Folder node is part of this extension:

Related workflows & nodes

  1. Zenodo GET
    stelfrich > Public > Zenodo GET
  2. Column Expression Example
    willem > Public > Column Expressions > Column Expression Example
  3. Images to Movie
    kevin_sturm > Public > Images to Movie > Images to Movie
  4. Big Data preprocessing
    Hive Hadoop Big Data +2
    This workflow demonstrates the usage of the DB nodes in conjunction with the Create Local Big Data Environment node, wh…
    knime > Education > Courses > L4-BD Introduction to Big Data with KNIME Analytics Platform > 2_Hadoop > 4_Examples > 01_Big_Data_Preprocessing_Example
  5. SpeedTestKNIMEvsKNIME
    iris > Public > SpeedTestKNIMEvsKNIME
  6. 05_WebPortal_Data_Mining
    knime > Education > Courses > L2-DS KNIME Analytics Platform for Data Scientists - Advanced > Supplementary Workflows > 05_WebPortal_Data_Mining
  7. Read one or more files widgets
    iris > Public > ExamplesForTheForum > 2021_03_08_Read one or more files widgets
  8. Chunk-based file compression
    julian.bunzel > Public > Forum > Examples > CompressChunks
  9. Get Images from SiLA2 Server and Send Back Results
    Laboratory Data SiLA2 Life Sciences +1
    This workflow demonstrates how to get raw images from a (custom) SiLA2 server, how to process the data with KNIME nodes…
    knime > Life Sciences > Laboratory Data > SiLA Prototype > SiLA_Images
  10. HDFS file handling
    HDFS Hadoop Big Data
    This workflow demonstrates the HDFS file handling capabilities using the file handling nodes in conjunction with an HDFS…
    knime > Education > Courses > L4-BD Introduction to Big Data with KNIME Analytics Platform > 2_Hadoop > 4_Examples > 02_HDFS_and_File_Handling_Example


KNIME AG
Hardturmstrasse 66
8005 Zurich, Switzerland
© 2021 KNIME AG. All rights reserved.