KNIME Hub
Search results

16 results

Filter tags: Parquet, Knime, Arrow, Import, Library, Read (+3)
  1. Workflow: Meta Collection about KNIME and Python
    Tags: Knime, Python, Meta (+10)
    KNIME and Python - just good friends
    mlauber71 > Public > _knime_and_python_meta_collection
    3
  2. Workflow: Incremental Data Processing with Parquet
    Tags: Parquet, Incremental loading, NYC taxi dataset (+3)
    In this workflow, we will use the NYC taxi dataset to showcase continuous preprocessing and publishing of event data. Instead o…
    knime > Examples > 01_Data_Access > 01_Common_Type_Files > 12_Incremental_processing_Parquet_file
    2
  3. Workflow: KNIME and Hive - load multiple Parquet files at once via external table
    Tags: Knime, Hive, External (+5)
    This workflow demonstrates how to import several Parquet files at once without iteration using an external HIVE table. The initia…
    mlauber71 > Public > kn_example_bigdata_hive_parquet_loader > m_001_import_hive_parquet
    1
  4. Workflow: Connecting to Databricks
    Tags: Databricks, Databricks delta, Dbfs (+1)
    This workflow shows how to connect to a Databricks cluster and utilize various KNIME nodes to interact with Databricks from withi…
    knime > Examples > 10_Big_Data > 01_Big_Data_Connectors > 06_Connecting_to_Databricks
    1
  5. Workflow: Incremental Data Processing with Parquet
    Tags: Parquet, Incremental loading, NYC taxi dataset (+3)
    In this workflow, we will use the NYC taxi dataset to showcase continuous preprocessing and publishing of event data. Instead o…
    haoran > Public > 01_Data_Access > 01_Common_Type_Files > 12_Incremental_processing_Parquet_file
    0
  6. Workflow: use Python to read Parquet file into KNIME, export it again, put it into SQLite database and read it back
    Tags: Parquet, Arrow, Import (+5)
    mlauber71 > Public > kn_example_python_read_parquet_file
    0
  7. Workflow: use R library(readr) to read (messy) CSV file into KNIME
    Tags: Parquet, Arrow, Import (+7)
    use R library(readr) to read (messy) CSV file into KNIME. Export the data to SQLite, ARFF and Parquet and demonstrate how to read the …
    mlauber71 > Public > kn_example_r_read_single_csv_file
    0
  8. Workflow: use R library(readxl) to read XLSX/Excel file into KNIME
    Tags: Parquet, Arrow, Import (+7)
    use R library(readxl) to read XLSX/Excel file into KNIME. Export the data to SQLite, ARFF and Parquet and demonstrate how to read the …
    mlauber71 > Public > kn_example_r_read_single_xlsx_file
    0
  9. Workflow: H2O AutoML on Spark
    Tags: Spark, H2o, Automl (+4)
    This workflow trains classification models for the Airlines Delay dataset using H2O AutoML on Spark. The dataset is expected to b…
    knime > Examples > 10_Big_Data > 02_Spark_Executor > 13_H2O_AutoML_on_Spark
    0
  10. Workflow: Local big data Irish meter
    Tags: Hive, Spark, Spark PCA (+7)
    This workflow uses a portion of the Irish Energy Meter dataset, and presents a simple analysis based on the whitepaper "Big Data,…
    knime > Examples > 10_Big_Data > 02_Spark_Executor > 09_Big_Data_Irish_Meter_on_Spark_only
    0
  11. Workflow: use the new (KNIME 4.5) Python Script node to read Parquet file into KNIME, export it again, put it into SQLite database and read it back
    Tags: Parquet, Arrow, Import (+5)
    mlauber71 > Public > kn_example_python_read_parquet_file_2021
    0
  12. Workflow: use the new (KNIME 4.5) Python Script node to read and write ARFF file into KNIME, export it again as Parquet, put it into SQLite database and read it back
    Tags: Parquet, Arrow, Import (+6)
    mlauber71 > Public > kn_example_python_read_arff_file
    0
  13. Workflow: use R library(arrow) to read parquet file into KNIME
    Tags: Parquet, Arrow, Import (+5)
    use R library(arrow) to read parquet file into KNIME. Export the data to SQLite, ARFF and again to another Parquet file. Also: spl…
    mlauber71 > Public > kn_example_r_read_parquet_file
    0
  14. Workflow: use R library(readr) to read CSV file into KNIME with UTF-16LE encoding
    Tags: Parquet, Arrow, Import (+7)
    use R library(readr) to read CSV file into KNIME with UTF-16LE encoding: locale = locale(encoding = "UTF-16LE")
    mlauber71 > Public > kn_example_r_readr_locale_encoding
    0
  15. Workflow: Check out Date and Datetime variable types between KNIME and Python
    Tags: Python, Knime, Date (+5)
    Check out Date and Datetime variable types between KNIME and Python. A Jupyter notebook to toy around with data variables is in the…
    mlauber71 > Public > kn_example_python_date_time
    0
  16. Workflow: Local big data Irish meter
    Tags: Hive, Spark, Spark PCA (+7)
    This workflow uses a portion of the Irish Energy Meter dataset, and presents a simple analysis based on the whitepaper "Big Data,…
    knime > Education > Courses > L4-BD Introduction to Big Data with KNIME Analytics Platform > 3_Spark > 4_Examples > 09_Big_Data_Irish_Meter_on_Spark_only
    0
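Result 3 above loads many Parquet files "at once without iteration" by pointing an external Hive table at a directory (in Hive, roughly `CREATE EXTERNAL TABLE ... STORED AS PARQUET LOCATION '/path'`). The same directory-as-one-table idea can be tried locally with pyarrow; this is a hedged sketch, and all paths and column names below are illustrative, not taken from the workflow.

```python
# Sketch: read every Parquet file in a directory as one table, no per-file loop.
# Mirrors the external-Hive-table idea from result 3; data here is made up.
import tempfile
from pathlib import Path

import pyarrow as pa
import pyarrow.parquet as pq

data_dir = Path(tempfile.mkdtemp())
# Two Parquet "partitions", as a daily export job might produce.
pq.write_table(pa.table({"fare": [7.5, 12.0]}), data_dir / "day1.parquet")
pq.write_table(pa.table({"fare": [3.25]}), data_dir / "day2.parquet")

# One read over the directory returns the union of all files.
combined = pq.read_table(data_dir)
print(combined.num_rows)
```

Hive resolves the file list at query time, so new files dropped into the directory appear in the table automatically; `pq.read_table` does the same at each call.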
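Results 2 and 5 describe incremental processing: on each run, handle only event data that arrived since the last run. A minimal stdlib-only sketch of that bookkeeping, assuming a state file that records already-processed file names (the function and paths are illustrative, not part of the workflow):

```python
# Sketch of the incremental pattern: remember which files were already
# processed and return only the new ones on each run.
from pathlib import Path


def new_files(incoming: Path, state_file: Path) -> list[Path]:
    """Return Parquet files in `incoming` not yet listed in `state_file`."""
    seen = set(state_file.read_text().splitlines()) if state_file.exists() else set()
    fresh = sorted(p for p in incoming.glob("*.parquet") if p.name not in seen)
    # Record the newly seen files so the next run skips them.
    with state_file.open("a") as fh:
        fh.writelines(p.name + "\n" for p in fresh)
    return fresh
```

Calling `new_files` twice in a row returns the fresh files once, then an empty list until more files arrive.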
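Results 6, 11, and 12 all round-trip data: read Parquet, push it into SQLite, and read it back. Outside KNIME, the same trip can be sketched with pandas and pyarrow; the tiny DataFrame and file names are invented for illustration.

```python
# Sketch of the Parquet -> SQLite -> back round trip from results 6/11/12.
# Requires pandas plus pyarrow (pandas' default Parquet engine).
import sqlite3
import tempfile
from pathlib import Path

import pandas as pd

tmp = Path(tempfile.mkdtemp())
df = pd.DataFrame({"id": [1, 2, 3], "fare": [7.5, 12.0, 3.25]})
df.to_parquet(tmp / "sample.parquet")          # write a small Parquet file

df2 = pd.read_parquet(tmp / "sample.parquet")  # read it back in
with sqlite3.connect(tmp / "sample.db") as con:
    df2.to_sql("taxi", con, if_exists="replace", index=False)  # into SQLite
    df3 = pd.read_sql("SELECT * FROM taxi", con)               # and back out
```

For these column types the trip is lossless (`df3.equals(df2)`); SQLite's looser type system can change dtypes for dates or categoricals, which is part of what the KNIME workflows demonstrate.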

KNIME AG, Hardturmstrasse 66, 8005 Zurich, Switzerland
© 2022 KNIME AG. All rights reserved.