53 results
- This workflow demonstrates how to connect to various Azure services such as HDInsight clusters, Azure Blob Storage, and AzureSQL …
- This workflow demonstrates how to import several Parquet files at once without iteration using an external HIVE table.
- A meta collection of KNIME and databases (SQL, Big Data/Hive/Impala and Spark/PySpark). KNIME as a powerful data analytics platfo…
- L4-BD SELF-PACED COURSE exercise: - Create a local big data environment - Create and load data into a Hive table
- This workflow demonstrates several methods to import one or many CSV files into Hive. Demonstrated are direct uploads where you cre…
- L4-BD SELF-PACED COURSE exercise: - Create a local big data environment - Create and load data into a Hive table
- L4-BD SELF-PACED COURSE exercise: - Manipulate data on Hive with the DB nodes - Perform ETL operations in Spark with the Spark no…
- L4-BD SELF-PACED COURSE exercise: - Manipulate data on Hive with the DB nodes - Perform ETL operations in Spark with the Spark no…
- Solution to an L4-BD SELF-PACED COURSE exercise: - Manipulate data on Hive with the DB nodes - Perform ETL operations in Spark wi…
- Hive/Big Data - a simple "self-healing" (automated) ETL or analytics system on a big data cluster. The scenario: you have a table …
- Solution to an L4-BD SELF-PACED COURSE exercise: - Create a local big data environment - Create and load data into a Hive table
- Hive - how to get from DB-Connectors to Hive (or Impala) tables - KNIME 4.5+. There does not seem to be a direct way to get from t…
- L4-BD SELF-PACED COURSE exercise: - Create a local big data environment - Create and load data into a Hive table
- Solution to an L4-BD SELF-PACED COURSE exercise: - Create a local big data environment - Create and load data into a Hive table
- Solution to an L4-BD SELF-PACED COURSE exercise: - Manipulate data on Hive with the DB nodes - Perform ETL operations in Spark wi…
- Use KNIME and Hive/SQL syntax to add columns to a Big Data table (migrated to KNIME 4.0 and new DB nodes)
- Use try and catch for generic ports. The existing table will be at the resulting port. Try some generic SQL operation on your Hive…
- Hive - upload data in several ORC files to HDFS and bring them together as an EXTERNAL table. You have several ORC files with the …
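
The external-table technique several of these workflows rely on can be sketched in HiveQL. This is a minimal example, not taken from any of the listed workflows; the table name, columns, and HDFS path are hypothetical:

```sql
-- Register several ORC files that share one schema, already uploaded
-- to a single HDFS directory, as one EXTERNAL Hive table.
-- Hive only points at the files; dropping the table later does not
-- delete the data. (All names and the path below are placeholders.)
CREATE EXTERNAL TABLE IF NOT EXISTS sales_orc (
  order_id   BIGINT,
  amount     DOUBLE,
  order_date STRING
)
STORED AS ORC
LOCATION '/user/knime/upload/sales_orc/';
```

The same pattern works with `STORED AS PARQUET` for the Parquet workflow above; every file in the `LOCATION` directory becomes part of the table without any iteration.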
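
Adding columns to an existing Big Data table, as one of the workflows above does, comes down to a single HiveQL statement that can be sent from KNIME via a DB SQL Executor node. A minimal sketch with hypothetical table and column names:

```sql
-- Extend an existing Hive table in place; existing rows get NULL
-- in the new columns. (Table and column names are placeholders.)
ALTER TABLE sales_orc ADD COLUMNS (
  region  STRING COMMENT 'sales region',
  channel STRING COMMENT 'order channel'
);
```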