28 results
- This workflow shows how to perform in-database processing on Microsoft SQL Server. It starts with a dedicated SQL Server Connector…
- Generates SQL queries from PMML models. The following model types are currently supported: Regression (linear, polynomial, logist…
- Titanic: Machine Learning from Disaster https://www.kaggle.com/c/titanic Einführung in KNIME (deutsch) - https://hub.knime.com/ml…
- This workflow reads CENSUS data from a SQL Server database; it then performs some In-Database Processing on SQL Server; it trains…
- This node allows running R code on a Microsoft SQL Server (using a SQL query behind the scenes). Input and output data to the R s…
- School of Hive - with KNIME's local Big Data environment (SQL for Big Data) Demonstrates a collection of Hive functions using KNI…
- This workflow reads CENSUS data from a Hive database in HDInsight; it then moves to Spark where it performs some ETL operations; …
- An overview of KNIME based functions to access big data systems (with KNIME's local big data environment) Use SQL with Impala/Hiv…
- An overview of KNIME based functions to access big data systems - use it on your own big data system (including PySpark) Use SQL …
- KNIME and Hive - load multiple CSV files at once via external table. Also: toy around with internal and external tables and then …
- This workflow demonstrates several methods to import one or many CSV files into Hive. Demonstrated are direct uploads where you cre…
- This workflow implements a DWH operation, configuring and launching a Snowflake in-cloud data warehouse instance. Sales orders ar…
- Sparkling predictions and encoded labels - "the poor man's ML Ops" (on a Big Data System) Use Big Data Technologies like Spark to…
- s_401 - prepare label encoding with Spark - prepare data in a big data environment - label encode string variabl…
- Hive - how to get from DB-Connectors to Hive (or Impala) tables - KNIME 4.5+ There does not seem to be a direct way to get from t…
- This workflow demonstrates how to import several Parquet files at once, without iteration, using an external HIVE table (a minimal sketch of this approach follows below).
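
The external-table technique named in the last two items (and in the CSV entries above) amounts to pointing a Hive table definition at the directory that already holds the files, so every file is picked up in one statement and no per-file loop is needed. A minimal HiveQL sketch, assuming a hypothetical HDFS directory and invented column names (not taken from the workflow itself):

```sql
-- Minimal sketch: read many Parquet files at once via an external Hive table.
-- The path and columns below are illustrative assumptions only.
CREATE EXTERNAL TABLE IF NOT EXISTS sales_staging (
  order_id   BIGINT,
  amount     DOUBLE,
  order_date STRING
)
STORED AS PARQUET
LOCATION '/data/landing/sales/';  -- Hive reads every Parquet file in this folder

-- Optionally copy the data into a managed (internal) table afterwards:
CREATE TABLE sales STORED AS PARQUET AS
SELECT * FROM sales_staging;
```

In a KNIME workflow this DDL would typically be sent through a DB SQL Executor node after connecting to Hive; the point of the pattern is that adding more Parquet files to the directory requires no change to the table definition.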