121 results
- This workflow demonstrates the HDFS file handling capabilities using the file handling nodes in conjunction with an HDFS connectio…
- This workflow demonstrates how to connect to various Azure services such as HDInsight clusters, Azure Blob Storage, and AzureSQL …
- This workflow shows how using a no-code/low-code tool like KNIME Analytics Platform can substitute, expand and improve considerab…
- This workflow demonstrates the usage of the Create Databricks Environment node which allows you to connect to a Databricks Cluste…
- This workflow demonstrates how to connect to various Google Cloud Services such as Google BigQuery, Google Dataproc, and Google C…
- This workflow handles the preprocessing of the NYC taxi dataset (loading, cleaning, filtering, etc). The NYC taxi dataset contain…
- This workflow demonstrates the usage of the Spark Compiled Model Predictor node which converts a given PMML model into machine co…
- This workflow performs classification on data sets that were reduced using the following dimensionality reduction techniques: - L…
- This workflow sets up a local big data environment for the next exercise. It creates a local big data environment and loads the u…
- This workflow resets the database by overwriting the customers and statistics tables and sets up a local big data environment and…
- L4-BD SELF-PACED COURSE exercise: - Train a ML model in Spark - Read the prediction results into KNIME
- Hive/Big Data - a simple "self-healing" (automated) ETL or analytics system on a big data cluster. The scenario: you have a table …
- Use KNIME and Hive/SQL syntax to add columns to a Big Data table (migrated to KNIME 4.0 and new DB nodes)
- Hive - upload data in several ORC files to HDFS and bring them together as an EXTERNAL table. You have several ORC files with the …
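The pattern this item describes can be sketched in HiveQL: copy the ORC files into one HDFS directory, then register that directory as a single EXTERNAL table. Table and column names below are illustrative assumptions; the column types must match the schema inside the ORC files.

```sql
-- Assumed layout: all ORC files with the same schema uploaded to /data/sales/
CREATE EXTERNAL TABLE IF NOT EXISTS sales_orc (
  id     INT,
  amount DOUBLE,
  ts     TIMESTAMP
)
STORED AS ORC
LOCATION '/data/sales/';  -- Hive reads every ORC file in this directory
```

Because the table is EXTERNAL, dropping it removes only the metadata; the ORC files in HDFS stay untouched.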
- Load data into a (local) Big Data environment - load data into Spark context - load data into H2O.ai Sparkling Water context - bu…
- Hive - how to get from DB-Connectors to Hive (or Impala) tables - KNIME 4.5+. There does not seem to be a direct way to get from t…
- Hive - how to get from DB-Connectors to Hive (or Impala) tables. There does not seem to be a direct way to get from the comfortabl…
- Create a Big Data Hive/Parquet table with a partition based on an existing KNIME table and add more partitions later. You can crea…
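The technique named in this item can be sketched in HiveQL (table, column, and path names are illustrative assumptions, not taken from the workflow): create a Parquet table partitioned by a key column, then register further partitions as new data arrives.

```sql
-- Partitioned Parquet table; the partition column is NOT part of the column list
CREATE TABLE IF NOT EXISTS sales_part (
  id     INT,
  amount DOUBLE
)
PARTITIONED BY (sale_date STRING)
STORED AS PARQUET;

-- Later: add another partition pointing at data already written to HDFS
ALTER TABLE sales_part
  ADD IF NOT EXISTS PARTITION (sale_date = '2024-01-02')
  LOCATION '/data/sales_part/sale_date=2024-01-02';
```

Each `ALTER TABLE ... ADD PARTITION` only updates the metastore, so appending a day of data does not rewrite the existing partitions.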
- s_620 - this is what your daily production workflow could look like. You have the stored lists and rules for how to prepare the data a…