
Call Workflow Service (deprecated)

Category: Workflow Abstraction > Workflow Invocation

This node has been deprecated and its use is not recommended. Please search for updated nodes instead.

Calls another workflow and obtains the results for further processing in this workflow. The called workflow can receive inputs via Workflow Service Input nodes and return outputs via Workflow Service Output nodes.

Each Workflow Service Input node in the called workflow creates an input port on this node once the configuration of the Call Workflow Service node has been completed. Similarly, each Workflow Service Output node creates an output port on this node.

The called workflow can be located on a KNIME Server or in the local workspace. If the workflow is located on a KNIME Server, it will be executed on that server. In contrast, the Workflow Executor node always performs the execution within the workflow that contains it.

This node differs from the Call Workflow (Table Based) node in the set of workflow input and output nodes it supports. This node works with the Workflow Service Input and Workflow Service Output nodes, which accept arbitrary port types and are more efficient than the various Container Input and Container Output nodes. The Container Input and Output nodes, on the other hand, expect and produce data in a format that third-party software can easily generate, whereas the Workflow Service Input and Output nodes are designed exclusively for use by other KNIME workflows.
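
For illustration, the snippet below sketches the kind of plain, self-describing JSON that a third-party client could assemble for a Container Input node; the field names are made-up examples, not a prescribed KNIME schema. A Workflow Service Input port, by contrast, carries a native KNIME port object that only another KNIME workflow can produce.

    import json

    # Made-up example records: the kind of plain JSON a third-party client
    # could produce for a Container Input node (no KNIME installation needed).
    records = [
        {"customer": "ACME", "revenue": 1200.5},
        {"customer": "Globex", "revenue": 980.0},
    ]
    payload = json.dumps(records, indent=2)
    print(payload)

    # A Workflow Service Input port instead receives a native KNIME port object
    # (e.g. a PMML model or a Spark context), which is not serialized to a text
    # format and can only be produced by another KNIME workflow.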

To define which Workflow Service Input node receives data from which of this node's input ports, each input node defines a parameter identifier. Parameter identifiers should be unique, but a workflow may contain multiple input nodes that define the same parameter name. In this case, KNIME makes the parameter names unique by appending the input node's node ID, e.g., "input-table" becomes "input-table-7".
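
The following minimal Python sketch illustrates that disambiguation rule. It is not KNIME's implementation; the node IDs and parameter names are invented for the example.

    from collections import Counter

    def unique_parameter_names(input_nodes):
        """input_nodes: list of (node_id, parameter_name) pairs."""
        counts = Counter(name for _, name in input_nodes)
        unique = {}
        for node_id, name in input_nodes:
            # Only names used by more than one input node get the node-ID suffix.
            unique[node_id] = f"{name}-{node_id}" if counts[name] > 1 else name
        return unique

    print(unique_parameter_names([(5, "input-table"), (7, "input-table"), (9, "model")]))
    # -> {5: 'input-table-5', 7: 'input-table-7', 9: 'model'}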

Node details

Input ports
  1. Type: File System
    KNIME server connection
    Optional connection to a KNIME Server where the workflow to be executed is located. If connected, the execution of the workflow will be performed on that server.
Inputs (Dynamic Inport)
One input port for each Workflow Service Input node in the workflow to be executed. The ports are created automatically once the node's configuration is finished.
  1. Type: Workflow Port Object
  2. Type: VectorHashingPortObject
  3. Type: DL4J Model
  4. Type: FilterDefinition
  5. Type: Normalizer
  6. Type: Database Query
  7. Type: Weka 3.7 Classifier
  8. Type: Large Filestore
  9. Type: Remote Connection
  10. Type: Cluster Tree
  11. Type: Lucene
  12. Type: Radial Basis Function
  13. Type: Distance Measure
  14. Type: Spark ML Model
  15. Type: Flow Variable
  16. Type: Spark Context (legacy com)
  17. Type: TensorFlow 2 Model
  18. Type: Compiled PMML
  19. Type: Weka 3.6 Cluster
  20. Type: Word Vector Model
  21. Type: Python
  22. Type: Density Scorer Model
  23. Type: Spark Data
  24. Type: URI Object
  25. Type: Database Connection
  26. Type: File System
  27. Type: Regression Tree
  28. Type: Network
  29. Type: KnoxHdfsConnection
  30. Type: Keras Deep Learning Network
  31. Type: Neighbourgram
  32. Type: Weak Label Model
  33. Type: Fingerprint Bayes
  34. Type: Rowset Filter
  35. Type: PortObject
  36. Type: Weka 3.7 Classifier
  37. Type: LibSVM
  38. Type: Gradient Boosting Model
  39. Type: ONNX Deep Learning Network
  40. Type: Word Vector Model (deprecated)
  41. Type: H2O Driverless AI Dataset Connection
  42. Type: DocumentVectorPortObject
  43. Type: Google Cloud Storage Connection
  44. Type: H2O Model
  45. Type: Google Sheets
  46. Type: Weka 3.7 Cluster
  47. Type: Twitter
  48. Type: PMML Discretization
  49. Type: Spark Context
  50. Type: Python
  51. Type: Image
  52. Type: Weka Classifier
  53. Type: Spark MLlib Model
  54. Type: Feature Elimination
  55. Type: TensorFlow Deep Learning Network
  56. Type: Weka Cluster
  57. Type: KafkaConnection
  58. Type: DLNetworkPortObject
  59. Type: Correlation
  60. Type: Google API
  61. Type: KNFST Model
  62. Type: URI Object
  63. Type: AWS Comprehend Connection
  64. Type: Spark Model Legacy
  65. Type: DLPythonNetworkPortObject
  66. Type: Regression Tree
  67. Type: AzureConnection
  68. Type: Spark Context Legacy
  69. Type: org.knime.python3.nodes.PythonBinaryBlobFileStorePortObject
  70. Type: Inactive branch
  71. Type: Weka 3.7 Cluster
  72. Type: Universe
  73. Type: Size
  74. Type: R Workspace
  75. Type: Tree Ensembles
  76. Type: Spark Data Legacy
  77. Type: Spark Data (legacy com)
  78. Type: KnimeConnection
  79. Type: Table
  80. Type: Salesforce OAuth2 Connection
  81. Type: OPTICS Clustering Port
  82. Type: AWSConnection
  83. Type: OpenNlpNerTaggerModelPortObject
  84. Type: XGBoostModel
  85. Type: Shape
  86. Type: H2O Driverless AI MOJO
  87. Type: PMML Preprocessing
  88. Type: DB Session
  89. Type: Fuzzy Basis Function
  90. Type: Keras Deep Learning Network
  91. Type: MOJO
  92. Type: Google Drive
  93. Type: Weka 3.7 Cluster
  94. Type: Sota
  95. Type: Semantic Web Connection
  96. Type: H2O Context
  97. Type: Outlier
  98. Type: Arima
  99. Type: DB Data
  100. Type: Weka 3.6 Classifier
  101. Type: Python (deprecated)
  102. Type: Microsoft Credential
  103. Type: StanfordNERModelPortObject
  104. Type: Weka 3.7 Classifier
  105. Type: Feature Selection Model
  106. Type: Naive Bayes
  107. Type: PCA
  108. Type: Tree Ensembles
  109. Type: PMML
  110. Type: H2O Frame
  111. Type: Keras Deep Learning Network
  112. Type: Google Analytics
  113. Type: KNIME Hub Credentials
  114. Type: Color
  115. Type: org.knime.mongodb.connection.port.MongoDBConnectionPortObject
  116. Type: Transformation
  117. Type: Spark Model (legacy com)
  118. Type: CAIM
Outputs (Dynamic Outport)
One output port for each Workflow Service Output node in the workflow to be executed. The ports are created automatically once the node's configuration is finished.
  1. Type: Workflow Port Object
  2. Type: VectorHashingPortObject
  3. Type: DL4J Model
  4. Type: FilterDefinition
  5. Type: Normalizer
  6. Type: Database Query
  7. Type: Weka 3.7 Classifier
  8. Type: Large Filestore
  9. Type: Remote Connection
  10. Type: Cluster Tree
  11. Type: Lucene
  12. Type: Radial Basis Function
  13. Type: Distance Measure
  14. Type: Spark ML Model
  15. Type: Flow Variable
  16. Type: Spark Context (legacy com)
  17. Type: TensorFlow 2 Model
  18. Type: Compiled PMML
  19. Type: Weka 3.6 Cluster
  20. Type: Word Vector Model
  21. Type: Python
  22. Type: Density Scorer Model
  23. Type: Spark Data
  24. Type: URI Object
  25. Type: Database Connection
  26. Type: File System
  27. Type: Regression Tree
  28. Type: Network
  29. Type: KnoxHdfsConnection
  30. Type: Keras Deep Learning Network
  31. Type: Neighbourgram
  32. Type: Weak Label Model
  33. Type: Fingerprint Bayes
  34. Type: Rowset Filter
  35. Type: PortObject
  36. Type: Weka 3.7 Classifier
  37. Type: LibSVM
  38. Type: Gradient Boosting Model
  39. Type: ONNX Deep Learning Network
  40. Type: Word Vector Model (deprecated)
  41. Type: H2O Driverless AI Dataset Connection
  42. Type: DocumentVectorPortObject
  43. Type: Google Cloud Storage Connection
  44. Type: H2O Model
  45. Type: Google Sheets
  46. Type: Weka 3.7 Cluster
  47. Type: Twitter
  48. Type: PMML Discretization
  49. Type: Spark Context
  50. Type: Python
  51. Type: Image
  52. Type: Weka Classifier
  53. Type: Spark MLlib Model
  54. Type: Feature Elimination
  55. Type: TensorFlow Deep Learning Network
  56. Type: Weka Cluster
  57. Type: KafkaConnection
  58. Type: DLNetworkPortObject
  59. Type: Correlation
  60. Type: Google API
  61. Type: KNFST Model
  62. Type: URI Object
  63. Type: AWS Comprehend Connection
  64. Type: Spark Model Legacy
  65. Type: DLPythonNetworkPortObject
  66. Type: Regression Tree
  67. Type: AzureConnection
  68. Type: Spark Context Legacy
  69. Type: org.knime.python3.nodes.PythonBinaryBlobFileStorePortObject
  70. Type: Inactive branch
  71. Type: Weka 3.7 Cluster
  72. Type: Universe
  73. Type: Size
  74. Type: R Workspace
  75. Type: Tree Ensembles
  76. Type: Spark Data Legacy
  77. Type: Spark Data (legacy com)
  78. Type: KnimeConnection
  79. Type: Table
  80. Type: Salesforce OAuth2 Connection
  81. Type: OPTICS Clustering Port
  82. Type: AWSConnection
  83. Type: OpenNlpNerTaggerModelPortObject
  84. Type: XGBoostModel
  85. Type: Shape
  86. Type: H2O Driverless AI MOJO
  87. Type: PMML Preprocessing
  88. Type: DB Session
  89. Type: Fuzzy Basis Function
  90. Type: Keras Deep Learning Network
  91. Type: MOJO
  92. Type: Google Drive
  93. Type: Weka 3.7 Cluster
  94. Type: Sota
  95. Type: Semantic Web Connection
  96. Type: H2O Context
  97. Type: Outlier
  98. Type: Arima
  99. Type: DB Data
  100. Type: Weka 3.6 Classifier
  101. Type: Python (deprecated)
  102. Type: Microsoft Credential
  103. Type: StanfordNERModelPortObject
  104. Type: Weka 3.7 Classifier
  105. Type: Feature Selection Model
  106. Type: Naive Bayes
  107. Type: PCA
  108. Type: Tree Ensembles
  109. Type: PMML
  110. Type: H2O Frame
  111. Type: Keras Deep Learning Network
  112. Type: Google Analytics
  113. Type: KNIME Hub Credentials
  114. Type: Color
  115. Type: org.knime.mongodb.connection.port.MongoDBConnectionPortObject
  116. Type: Transformation
  117. Type: Spark Model (legacy com)
  118. Type: CAIM

