
Call Workflow Service

KNIME Labs Workflow Abstraction

Calls another workflow and obtains the results for further processing in this workflow. The workflow can receive inputs via Workflow Service Input nodes and return outputs using Workflow Service Output nodes.

Each Workflow Service Input node in the workflow to be called creates an input port on this node once the configuration of the Call Workflow Service node is finished. Likewise, each Workflow Service Output node creates an output port on this node.

The called workflow can be located on a KNIME Server or in the Local Workspace. If the workflow is located on a KNIME Server, it is executed on that server. The Workflow Executor node, in contrast, always executes the called workflow locally, in the workflow that contains it.

This node differs from the Call Workflow (Table Based) node in the set of supported workflow input and output nodes. It supports the Workflow Service Input and Workflow Service Output nodes, which accept arbitrary port types and are more efficient than the various Container Input and Container Output nodes. The Container Input and Output nodes, on the other hand, expect and produce data in a format that third-party software can easily consume and produce, whereas the Workflow Service Input and Output nodes are designed exclusively for use by other KNIME workflows.
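
The following minimal sketch illustrates why the Container nodes are the third-party-friendly option: they exchange plain JSON over HTTP, which any client can produce and consume. This is not the Call Workflow Service mechanism, and everything here — the endpoint URL, credentials, and the "input-table" parameter — is a hypothetical placeholder rather than a documented KNIME Server API call; consult the KNIME Server REST API documentation for the actual interface.

    # Hedged sketch: a third-party client calling a workflow built with
    # Container Input/Output nodes. Workflow Service nodes instead use a
    # KNIME-internal representation, which is why they can only be used
    # from other KNIME workflows. The URL below is a made-up placeholder.
    import requests

    payload = {"input-table": [{"customer": "A", "revenue": 100.0}]}  # JSON for a Container Input node
    response = requests.post(
        "https://<knime-server>/rest/v4/repository/my-workflow:execution",  # placeholder endpoint
        json=payload,
        auth=("user", "password"),
    )
    response.raise_for_status()
    print(response.json())  # JSON emitted by a Container Output node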

To define which Workflow Service Input node receives data from which of this node's input ports, each input node defines a parameter identifier. Parameter identifiers are supposed to be unique, but a workflow may contain multiple input nodes that define the same parameter name. In this case, KNIME makes the parameter names unique by appending the input node's node ID, e.g., "input-table" becomes "input-table-7", as sketched below.
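
The following minimal Python sketch (illustrative only, not KNIME's implementation; the function name and node IDs are invented for this example) shows the effect of that disambiguation rule:

    # Sketch of the rule described above: parameter names that occur more
    # than once get the defining node's ID appended to make them unique.
    from collections import Counter

    def disambiguate(params):
        """params: list of (parameter_name, node_id) pairs, one per input node."""
        counts = Counter(name for name, _ in params)
        unique = {}
        for name, node_id in params:
            key = name if counts[name] == 1 else f"{name}-{node_id}"
            unique[key] = node_id
        return unique

    print(disambiguate([("input-table", 7), ("input-table", 9), ("input-json", 3)]))
    # {'input-table-7': 7, 'input-table-9': 9, 'input-json': 3}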

Node details

Input ports
  1. Type: File System
    KNIME Server connection
    Optional connection to a KNIME Server on which the workflow to be executed is located. If connected, the workflow is executed on that server.
Inputs (Dynamic Inport)
One input port for each Workflow Service Input node in the workflow to be executed. The ports are created automatically when the configuration of the node is finished.
  1. Type: Google Sheets
  2. Type: Shape
  3. Type: FilterDefinition
  4. Type: Weka Cluster
  5. Type: Spark Context Legacy
  6. Type: Gradient Boosting Model
  7. Type: Weka 3.7 Cluster
  8. Type: Cluster Tree
  9. Type: Spark Context
  10. Type: Word Vector Model
  11. Type: Word Vector Model (deprecated)
  12. Type: ONNX Deep Learning Network
  13. Type: H2O Model
  14. Type: KNFST Model
  15. Type: Naive Bayes
  16. Type: Google Analytics
  17. Type: Spark Model Legacy
  18. Type: DLNetworkPortObject
  19. Type: DL4J Model
  20. Type: Keras Deep Learning Network
  21. Type: DB Session
  22. Type: PMML
  23. Type: Google API
  24. Type: OPTICS Clustering Port
  25. Type: Spark Data Legacy
  26. Type: Lucene
  27. Type: AWS Comprehend Connection
  28. Type: Database Query
  29. Type: Inactive branch
  30. Type: AWSConnection
  31. Type: LibSVM
  32. Type: Density Scorer Model
  33. Type: PMML Preprocessing
  34. Type: Regression Tree
  35. Type: Python
  36. Type: Radial Basis Function
  37. Type: Semantic Web Connection
  38. Type: Spark Data
  39. Type: URI Object
  40. Type: Weka 3.7 Classifier
  41. Type: Fuzzy Basis Function
  42. Type: Spark Model (legacy com)
  43. Type: Normalizer
  44. Type: VectorHashingPortObject
  45. Type: Rowset Filter
  46. Type: Tree Ensembles
  47. Type: H2O Frame
  48. Type: Color
  49. Type: R Workspace
  50. Type: TensorFlow 2 Model
  51. Type: Weka 3.7 Classifier
  52. Type: Flow Variable
  53. Type: Weka 3.7 Classifier
  54. Type: Weka 3.6 Classifier
  55. Type: Regression Tree
  56. Type: Salesforce OAuth2 Connection
  57. Type: Weka 3.6 Cluster
  58. Type: H2O Context
  59. Type: Weka 3.7 Cluster
  60. Type: Size
  61. Type: Large Filestore
  62. Type: Correlation
  63. Type: Keras Deep Learning Network
  64. Type: org.knime.mongodb.connection.port.MongoDBConnectionPortObject
  65. Type: Spark Data (legacy com)
  66. Type: Network
  67. Type: Workflow Port Object
  68. Type: DLPythonNetworkPortObject
  69. Type: CAIM
  70. Type: Transformation
  71. Type: Sota
  72. Type: PCA
  73. Type: OpenNlpNerTaggerModelPortObject
  74. Type: Fingerprint Bayes
  75. Type: KnoxHdfsConnection
  76. Type: Google Cloud Storage Connection
  77. Type: Remote Connection
  78. Type: AzureConnection
  79. Type: URI Object
  80. Type: Google Drive
  81. Type: Arima
  82. Type: Twitter
  83. Type: Python
  84. Type: Microsoft Credential
  85. Type: XGBoostModel
  86. Type: DB Data
  87. Type: Weka Classifier
  88. Type: Compiled PMML
  89. Type: PMML Discretization
  90. Type: Keras Deep Learning Network
  91. Type: Spark ML Model
  92. Type: Database Connection
  93. Type: Feature Elimination
  94. Type: Tree Ensembles
  95. Type: Weak Label Model
  96. Type: DocumentVectorPortObject
  97. Type: Neighbourgram
  98. Type: StanfordNERModelPortObject
  99. Type: TensorFlow Deep Learning Network
  100. Type: org.knime.python3.nodes.PythonBinaryBlobFileStorePortObject
  101. Type: PortObject
  102. Type: Spark Context (legacy com)
  103. Type: Python (deprecated)
  104. Type: KnimeConnection
  105. Type: Distance Measure
  106. Type: MOJO
  107. Type: KafkaConnection
  108. Type: Spark MLlib Model
  109. Type: Image
  110. Type: Universe
  111. Type: Weka 3.7 Cluster
  112. Type: File System
  113. Type: Feature Selection Model
  114. Type: Outlier
  115. Type: Table
Outputs (Dynamic Outport)
One output port for each Workflow Service Output node in the workflow to be executed. The ports are created automatically when the configuration of the node is finished.
  1. Type: Google Sheets
  2. Type: Shape
  3. Type: FilterDefinition
  4. Type: Weka Cluster
  5. Type: Spark Context Legacy
  6. Type: Gradient Boosting Model
  7. Type: Weka 3.7 Cluster
  8. Type: Cluster Tree
  9. Type: Spark Context
  10. Type: Word Vector Model
  11. Type: Word Vector Model (deprecated)
  12. Type: ONNX Deep Learning Network
  13. Type: H2O Model
  14. Type: KNFST Model
  15. Type: Naive Bayes
  16. Type: Google Analytics
  17. Type: Spark Model Legacy
  18. Type: DLNetworkPortObject
  19. Type: DL4J Model
  20. Type: Keras Deep Learning Network
  21. Type: DB Session
  22. Type: PMML
  23. Type: Google API
  24. Type: OPTICS Clustering Port
  25. Type: Spark Data Legacy
  26. Type: Lucene
  27. Type: AWS Comprehend Connection
  28. Type: Database Query
  29. Type: Inactive branch
  30. Type: AWSConnection
  31. Type: LibSVM
  32. Type: Density Scorer Model
  33. Type: PMML Preprocessing
  34. Type: Regression Tree
  35. Type: Python
  36. Type: Radial Basis Function
  37. Type: Semantic Web Connection
  38. Type: Spark Data
  39. Type: URI Object
  40. Type: Weka 3.7 Classifier
  41. Type: Fuzzy Basis Function
  42. Type: Spark Model (legacy com)
  43. Type: Normalizer
  44. Type: VectorHashingPortObject
  45. Type: Rowset Filter
  46. Type: Tree Ensembles
  47. Type: H2O Frame
  48. Type: Color
  49. Type: R Workspace
  50. Type: TensorFlow 2 Model
  51. Type: Weka 3.7 Classifier
  52. Type: Flow Variable
  53. Type: Weka 3.7 Classifier
  54. Type: Weka 3.6 Classifier
  55. Type: Regression Tree
  56. Type: Salesforce OAuth2 Connection
  57. Type: Weka 3.6 Cluster
  58. Type: H2O Context
  59. Type: Weka 3.7 Cluster
  60. Type: Size
  61. Type: Large Filestore
  62. Type: Correlation
  63. Type: Keras Deep Learning Network
  64. Type: org.knime.mongodb.connection.port.MongoDBConnectionPortObject
  65. Type: Spark Data (legacy com)
  66. Type: Network
  67. Type: Workflow Port Object
  68. Type: DLPythonNetworkPortObject
  69. Type: CAIM
  70. Type: Transformation
  71. Type: Sota
  72. Type: PCA
  73. Type: OpenNlpNerTaggerModelPortObject
  74. Type: Fingerprint Bayes
  75. Type: KnoxHdfsConnection
  76. Type: Google Cloud Storage Connection
  77. Type: Remote Connection
  78. Type: AzureConnection
  79. Type: URI Object
  80. Type: Google Drive
  81. Type: Arima
  82. Type: Twitter
  83. Type: Python
  84. Type: Microsoft Credential
  85. Type: XGBoostModel
  86. Type: DB Data
  87. Type: Weka Classifier
  88. Type: Compiled PMML
  89. Type: PMML Discretization
  90. Type: Keras Deep Learning Network
  91. Type: Spark ML Model
  92. Type: Database Connection
  93. Type: Feature Elimination
  94. Type: Tree Ensembles
  95. Type: Weak Label Model
  96. Type: DocumentVectorPortObject
  97. Type: Neighbourgram
  98. Type: StanfordNERModelPortObject
  99. Type: TensorFlow Deep Learning Network
  100. Type: org.knime.python3.nodes.PythonBinaryBlobFileStorePortObject
  101. Type: PortObject
  102. Type: Spark Context (legacy com)
  103. Type: Python (deprecated)
  104. Type: KnimeConnection
  105. Type: Distance Measure
  106. Type: MOJO
  107. Type: KafkaConnection
  108. Type: Spark MLlib Model
  109. Type: Image
  110. Type: Universe
  111. Type: Weka 3.7 Cluster
  112. Type: File System
  113. Type: Feature Selection Model
  114. Type: Outlier
  115. Type: Table

Extension

The Call Workflow Service node is part of this extension:

  1. KNIME Labs Workflow Abstraction

Related workflows & nodes

  1. 06.3 Execute deployed Workflow - solution
    Tags: Remote workflow, Integrated deployment, Call workflow
    Solution to the exercise 6.3 for KNIME User Training - Call deployed workflow - Inspect t…
    martyna > Training > L2-DS Training > solutions > 06.3 Execute deployed Workflow - solution
  2. Cell Segmentation (Master)
    Tags: Cell segmentation, Image processing, Microscopy
    The workflow aims to segment cell nuclei from cytoplasm. As a first step the images need …
    knime > Life Sciences > Image_Processing > KNIME Executors on HPC > cell segmentation usecase > cell-segmentation-master
  3. Consumer of Anonymization Workflow Service
    Tags: Integrated Deployment, Call Workflow, Workflow Service
    This workflow makes use of the workflow 01_Anonymizer. It sends a SQL query and a workflo…
    knime > Examples > 06_Control_Structures > 07_Workflow_Orchestration > 03_Process_Anonymized_Data > 02_Consumer
  4. 04.3 Orchestration
    Tags: Best practices, Data engineer, Data engineering
    Once the workflow performing the ETL on Customers data is executed successfully, i.e., customer data…
    knime > Education > Courses > L4-DE Best Practices for Data Engineering > solutions > Session_4_Orchestration > 04.3_Orchestration
  5. Prediction Service Consumer
    Tags: Workflow Service, Random Forest
    This workflow calls a workflow service which applies a model to data. Data are generated …
    knime > Examples > 06_Control_Structures > 07_Workflow_Orchestration > 04_Call_Workflow_Service > 02_Prediction_Service_Consumer
  6. Prediction Service Consumer
    carlwitt > Public > Call Workflow > Prediction Service Consumer
  7. Orchestration Testflows
    Tags: Best practices, Data engineer, Data engineering
    In this workflow, we call all the testflows, import their statuses and metadata, and create…
    lada > Public > Testing Framework > testflows > report > 03_Orchestration_Testflows
  8. 01 Automatic Retraining Check
    Tags: Data drift, Synthetic data, Model monitoring
    This workflow calls a workflow service to perform data quality assessment on the data. It…
    emilio_s > Blogposts > Data Drift > 01 Automatic Retraining Check
  9. 02 Stock Prediction - Deployment
    Tags: Stock prediction, Finance, Time series
    This workflow calls one workflow that accesses historical stock price data and another wo…
    knime > Codeless Time Series Analysis with KNIME > Chapter 14 > 03 Stock Prediction - Deployment

No related nodes are known.
