KNIME Community Hub: Search

19 results

Filter by tag: Parquet (other tags: Knime, Arrow, Import, Library, Read, Gzip, Python, R)
  1. Workflow: Meta Collection about KNIME and Python
    Tags: Knime, Python, Meta (+11)
    KNIME and Python - just good friends
    mlauber71 > Public > _knime_and_python_meta_collection
    3 · mlauber71
  2. Workflow: Connecting to Databricks
    Tags: Databricks, Databricks delta, Dbfs (+1)
    This workflow shows how to connect to a Databricks cluster and utilize various KNIME nodes to interact with Databricks from withi…
    knime > Examples > 10_Big_Data > 01_Big_Data_Connectors > 06_Connecting_to_Databricks
    2 · knime
  3. Workflow: Incremental Data Processing with Parquet
    Tags: Parquet, Incremental loading, NYC taxi dataset (+3)
    In this workflow, we will use the NYC taxi dataset to showcase a continuous preprocessing and publishing of event data. Instead o…
    knime > Examples > 01_Data_Access > 01_Common_Type_Files > 12_Incremental_processing_Parquet_file
    2 · knime
  4. Workflow: KNIME and Hive - load multiple Parquet files at once via external table
    Tags: Knime, Hive, External (+5)
    This workflow demonstrates how to import several Parquet files at once without iteration, using an external Hive table.
    mlauber71 > Public > kn_example_bigdata_hive_parquet_loader > m_001_import_hive_parquet
    1 · mlauber71
  5. Workflow: use the new (KNIME 4.5) Python Script node to read and write ARFF file into KNIME, export it again as Parquet, put it into SQLite database and read it back
    Tags: Parquet, Arrow, Import (+6)
    mlauber71 > Public > kn_example_python_read_arff_file
    0 · mlauber71
  6. Workflow: use R library(readxl) to read XLSX/Excel file into KNIME
    Tags: Parquet, Arrow, Import (+7)
    use R library(readxl) to read XLSX/Excel file into KNIME Export the data to SQLite, ARFF and Parquet and demonstrate to read the …
    mlauber71 > Public > kn_example_r_read_single_xlsx_file
    0 · mlauber71
  7. Workflow: Parquet file format, KNIME and Jupyter notebook
    Tags: Knime, Python, Jupyter (+4)
    Parquet file format, KNIME and Jupyter notebook - work in progress - /data/kn_example_python_parquet_jupyter.ipynb a jupyter work…
    mlauber71 > Public > kn_example_python_parquet_jupyter
    0 · mlauber71
  8. Workflow: Check out Date and Datetime variable types between KNIME and Python
    Tags: Python, Knime, Date (+5)
    Check out Date and Datetime variable types between KNIME and Python A Jupyter notebook to toy around with data variable is in the…
    mlauber71 > Public > kn_example_python_date_time
    0 · mlauber71
  9. Workflow: use R library (readr) to read CSV file into KNIME with UTF-16LE encoding
    Tags: Parquet, Arrow, Import (+7)
    use R library (readr) to read CSV file into KNIME with UTF-16LE encoding locale = locale(encoding = "UTF-16LE")
    mlauber71 > Public > kn_example_r_readr_locale_encoding
    0 · mlauber71
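The same encoding handling is available in pandas, as a rough equivalent of R's `locale(encoding = "UTF-16LE")`. The file and its contents below are invented for the demonstration:

```python
# Write and read a UTF-16LE encoded CSV with an explicit encoding argument.
import pathlib
import tempfile

import pandas as pd

path = pathlib.Path(tempfile.mkdtemp()) / "utf16.csv"
path.write_text("name,city\nJürgen,Zürich\n", encoding="utf-16-le")

df = pd.read_csv(path, encoding="utf-16-le")
print(df.loc[0, "city"])  # Zürich
```

Omitting the `encoding` argument would make the parser assume UTF-8 and fail on this file, which is the problem the workflow addresses.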
  10. Workflow: use R library(readr) to read (messy) CSV file into KNIME
    Tags: Parquet, Arrow, Import (+7)
    use R library(readr) to read (messy) CSV file into KNIME Export the data to SQLite, ARFF and Parquet and demonstrate to read the …
    mlauber71 > Public > kn_example_r_read_single_csv_file
    0 · mlauber71
  11. Workflow: Create a Big Data Hive/Parquet table with a partition based on an existing KNIME table and add more partitions later
    Tags: Hive, Partition, Big data (+3)
    Create a Big Data Hive/Parquet table with a partition based on an existing KNIME table and add more partitions later You can crea…
    mlauber71 > Public > kn_example_bigdata_hive_partitions > m_001_hive_partitions
    0 · mlauber71
  12. Workflow: use Python to read Parquet file into KNIME, export it again, put it into SQLite database and read it back
    Tags: Parquet, Arrow, Import (+5)
    mlauber71 > Public > kn_example_python_read_parquet_file
    0 · mlauber71
  13. Workflow: use R library(arrow) to read parquet file into KNIME
    Tags: Parquet, Arrow, Import (+5)
    mlauber71 > Public > kn_example_r_read_parquet_file
    0 · mlauber71
  14. Workflow: Connecting to Databricks
    Tags: Databricks, Databricks delta, Dbfs (+1)
    This workflow shows how to connect to a Databricks cluster and utilize various KNIME nodes to interact with Databricks from withi…
    geethanjali > Public > 06_Connecting_to_Databricks
    0 · geethanjali
  15. Workflow: Incremental Data Processing with Parquet
    Tags: Parquet, Incremental loading, NYC taxi dataset (+3)
    In this workflow, we will use the NYC taxi dataset to showcase a continuous preprocessing and publishing of event data. Instead o…
    haoran > Public > 01_Data_Access > 01_Common_Type_Files > 12_Incremental_processing_Parquet_file
    0 · haoran
  16. Workflow: use the new (KNIME 4.6+) Python Script node and bundled Python version to read Parquet file into KNIME, export it again, put it into SQLite database and read it back
    Tags: Parquet, Arrow, Import (+5)
    mlauber71 > Public > kn_example_python_read_parquet_file_2021
    0 · mlauber71
  17. Workflow: Local big data Irish meter
    Tags: Hive, Spark, Spark PCA (+7)
    This workflow uses a portion of the Irish Energy Meter dataset, and presents a simple analysis based on the whitepaper "Big Data,…
    knime > Education > Courses > L4-BD Introduction to Big Data with KNIME Analytics Platform > 3_Spark > 4_Examples > 09_Big_Data_Irish_Meter_on_Spark_only
    0 · knime
  18. Workflow: Local big data Irish meter
    Tags: Hive, Spark, Spark PCA (+7)
    This workflow uses a portion of the Irish Energy Meter dataset, and presents a simple analysis based on the whitepaper "Big Data,…
    knime > Examples > 10_Big_Data > 02_Spark_Executor > 09_Big_Data_Irish_Meter_on_Spark_only
    0 · knime
  19. Workflow: H2O AutoML on Spark
    Tags: Spark, H2o, Automl (+4)
    This workflow trains classification models for the Airlines Delay dataset using H2O AutoML on Spark. The dataset is expected to b…
    knime > Examples > 10_Big_Data > 02_Spark_Executor > 13_H2O_AutoML_on_Spark
    0 · knime

© 2023 KNIME AG. All rights reserved.