KNIME Community Hub

121 results

Filter: Big data
Related tags: Education, Hive, Spark, Hadoop, Best practices, Data engineer, Data engineering
  1. Workflow: 04 DB WritingToDB Exercise
    Tags: Big Data, Education
    Big Data Course DB Exercise #4
    knime > Education > Courses > L4-BD Introduction to Big Data with KNIME Analytics Platform > 1_DB > 2_Exercises > 04_DB_WritingToDB
  2. Workflow: HDFS file handling
    Tags: HDFS, Hadoop, Big Data
    This workflow demonstrates the HDFS file handling capabilities using the file handling nodes in conjunction with an HDFS connectio…
    knime > Education > Courses > L4-BD Introduction to Big Data with KNIME Analytics Platform > 2_Hadoop > 4_Examples > 02_HDFS_and_File_Handling_Example
  3. Workflow: Working with Azure services
    Tags: Azure, Microsoft, HDInsight (+6 more)
    This workflow demonstrates how to connect to various Azure services such as HDInsight clusters, Azure Blob Storage, and AzureSQL …
    andisa.dewi > Public > 09_AzureExample
  4. Workflow: Tool Migration: From Excel to Value with KNIME
    Tags: Automation, Excel, Machine Learning (+4 more)
    This workflow shows how using a no-code/low-code tool like KNIME Analytics Platform can substitute, expand and improve considerab…
    roberto_cadili > Public > Tool Migration - From Excel to Value with KNIME
  5. Workflow: Working with Databricks
    Tags: Big data, Databricks, Spark (+2 more)
    This workflow demonstrates the usage of the Create Databricks Environment node which allows you to connect to a Databricks Cluste…
    knime > Examples > 10_Big_Data > 01_Big_Data_Connectors > 03_DatabricksExample
  6. Workflow: Working with Google cloud services
    Tags: Google, BigQuery, Cloud storage (+2 more)
    This workflow demonstrates how to connect to various Google Cloud Services such as Google BigQuery, Google Dataproc, and Google C…
    knime > Examples > 10_Big_Data > 01_Big_Data_Connectors > 04_GoogleCloudExample
  7. Workflow: Cleaning the NYC taxi dataset on Spark
    Tags: Big data, Exploration, Visualization (+4 more)
    This workflow handles the preprocessing of the NYC taxi dataset (loading, cleaning, filtering, etc.). The NYC taxi dataset contain…
    knime > Examples > 50_Applications > 49_NYC_Taxi_Visualization > Data_Preparation
  8. Workflow: Spark Compiled Model Predictor
    Tags: Spark, Hadoop, Big Data
    This workflow demonstrates the usage of the Spark Compiled Model Predictor node which converts a given PMML model into machine co…
    knime > Examples > 10_Big_Data > 02_Spark_Executor > 03_PMML_to_Spark_Comprehensive_Mode_Learning_Mass_Prediction
  9. Workflow: Techniques for Dimensionality Reduction
    Tags: ETL, Big data, Data preprocessing (+11 more)
    This workflow performs classification on data sets that were reduced using the following dimensionality reduction techniques: - L…
    knime > Examples > 04_Analytics > 01_Preprocessing > 02_Techniques_for_Dimensionality_Reduction > 02_Techniques_for_Dimensionality_Reduction
  10. Workflow: 03.0_Setup_Local_Big_Data_Environment
    Tags: Education, Data engineering, Data engineer (+3 more)
    This workflow sets up a local big data environment for the next exercise. It creates a local big data environment and loads the u…
    chemgirl36 > Public Space > L4-DE Best Practices for Data Engineering > solutions > Session_3_ELT_on_Big_Data > 03.0_Setup_Local_Big_Data_Environment
  11. Workflow: 04.0_Reset_DB&Big_Data_Environment
    Tags: Education, Data engineering, Data engineer (+3 more)
    This workflow resets the database by overwriting the customers and statistics tables and sets up a local big data environment and…
    chemgirl36 > Public Space > L4-DE Best Practices for Data Engineering > exercises > Session_4_Orchestration > 04.0_Reset_DB&Big_Data_Environment
  12. Workflow: 04 Model Building on Big Data
    Tags: Big data, Spark
    L4-BD SELF-PACED COURSE exercise: train an ML model in Spark and read the prediction results into KNIME.
    mferdous2012 > Public > 04 Model Building on Big Data
  13. Workflow: Hive/Big Data - a simple "self-healing" (automated) ETL or analytics system on a big data cluster
    Tags: Hive, Big data, ETL (+2 more)
    The scenario: you have a table …
    mlauber71 > Public > kn_example_bigdata_hive_self_healing_jobs
  14. Workflow: add fields to Hive table
    Tags: Hive, SQL, Big data (+2 more)
    Use KNIME and Hive/SQL syntax to add columns to a Big Data table (migrated to KNIME 4.0 and the new DB nodes); see the first HiveQL sketch after this list.
    mlauber71 > Public > kn_example_bigdata_hive_add_column_db_40
  15. Workflow: Hive - upload data in several ORC files to HDFS and bring them together as an EXTERNAL table
    Tags: Big data, Hive, Impala (+5 more)
    You have several ORC files with the … (the EXTERNAL table pattern is sketched in HiveQL after this list)
    mlauber71 > Public > kn_example_hive_orc_external_table
  16. Workflow: Combine Big Data, Spark and H2O.ai Sparkling Water
    Tags: KNIME, H2O, H2O.ai (+8 more)
    Load data into a (local) Big Data environment; load data into a Spark context; load data into an H2O.ai Sparkling Water context; bu…
    mlauber71 > Public > kn_example_h2o_sparkling_water
  17. Workflow: Hive - how to get from DB-Connectors to Hive (or Impala) tables - KNIME 4.5+
    Tags: Big data, Hive, Impala (+3 more)
    There does not seem to be a direct way to get from t…
    mlauber71 > Public > kn_example_hive_db_loader_45
  18. Workflow: Hive - how to get from DB-Connectors to Hive (or Impala) tables - KNIME 4.3+
    Tags: Big data, Hive, Impala (+2 more)
    There does not seem to be a direct way to get from the comfortabl…
    mlauber71 > Public > kn_example_hive_db_loader_43
  19. Workflow: Create a Big Data Hive/Parquet table with a partition based on an existing KNIME table and add more partitions later
    Tags: Hive, Partition, Big data (+3 more)
    You can crea… (the partitioning pattern is sketched in HiveQL after this list)
    mlauber71 > Public > kn_example_bigdata_hive_partitions > m_001_hive_partitions
  20. Workflow: s_620 - Apply H2O.ai model to Big Data with KNIME and Spark
    Tags: H2O.ai, ML Ops, Apply (+5 more)
    s_620 - this is how your daily production workflow could look. You have the stored lists and rules for how to prepare the data a…
    mlauber71 > Public > kn_example_bigdata_h2o_automl_spark_46 > s_620_spark_h2o_apply
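
The Hive items above reduce to a few lines of HiveQL. For item 14 (adding fields to an existing Hive table), a minimal sketch; the table and column names are hypothetical, not taken from the workflow:

    -- Append columns to an existing Hive table. Existing rows read
    -- back NULL for the new columns until they are populated.
    ALTER TABLE default.customers
      ADD COLUMNS (
        loyalty_score DOUBLE COMMENT 'score computed in KNIME',
        updated_at    TIMESTAMP
      );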
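For item 15, the EXTERNAL table pattern puts the ORC files in one HDFS directory and lets Hive read them in place. A sketch under an assumed schema and path:

    -- All ORC files under LOCATION share one schema; DROP TABLE on an
    -- EXTERNAL table removes only the metadata, not the files on HDFS.
    CREATE EXTERNAL TABLE sales_orc (
      order_id BIGINT,
      amount   DECIMAL(10,2),
      ts       TIMESTAMP
    )
    STORED AS ORC
    LOCATION '/user/knime/sales_orc/';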
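And for item 19, the idea is to create the partitioned Parquet table once, then register further partitions as new data arrives (names again illustrative):

    -- Each partition value maps to its own directory under the table.
    CREATE TABLE events (
      user_id BIGINT,
      action  STRING
    )
    PARTITIONED BY (load_date STRING)
    STORED AS PARQUET;

    -- Later loads add partitions without rewriting existing data.
    ALTER TABLE events ADD PARTITION (load_date = '2023-06-01');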
