Gradient Boosted Trees Learner (Regression)

Learner

Learns Gradient Boosted Trees with the objective of regression. The algorithm uses very shallow regression trees and a special form of boosting to build an ensemble of trees. The implementation follows the algorithm described in section 4.4 of the paper "Greedy Function Approximation: A Gradient Boosting Machine" by Jerome H. Friedman (1999).
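
The node itself is configured through its dialog, but the general idea behind the algorithm can be summarized in a few lines: start from a constant prediction (the mean of the target), then repeatedly fit a shallow regression tree to the current residuals and add its shrunken predictions to the ensemble. The Python sketch below illustrates this least-squares gradient boosting scheme using scikit-learn's DecisionTreeRegressor for the shallow trees; it is only an illustration of the technique, not the node's actual implementation, and all function and parameter names are placeholders.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def fit_gradient_boosted_trees(X, y, n_trees=100, learning_rate=0.1, max_depth=4):
        # Start from a constant model: the mean of the target.
        f0 = float(np.mean(y))
        prediction = np.full(len(y), f0)
        trees = []
        for _ in range(n_trees):
            # For squared-error loss the negative gradient is simply the residual.
            residuals = y - prediction
            tree = DecisionTreeRegressor(max_depth=max_depth)  # a very shallow tree
            tree.fit(X, residuals)
            # Add the new tree's shrunken contribution to the running prediction.
            prediction += learning_rate * tree.predict(X)
            trees.append(tree)
        return f0, trees

    def predict_gradient_boosted_trees(X, f0, trees, learning_rate=0.1):
        prediction = np.full(X.shape[0], f0)
        for tree in trees:
            prediction += learning_rate * tree.predict(X)
        return prediction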

Sampling

This node allows you to perform row sampling (bagging) and attribute sampling (attribute bagging), similar to the Random Forest and Tree Ensemble nodes. When sampling is used, the method is usually referred to as Stochastic Gradient Boosted Trees. The corresponding settings can be found in the Advanced Options tab.
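
To illustrate the effect of these options outside of KNIME, the sketch below uses scikit-learn's GradientBoostingRegressor, whose subsample and max_features parameters play a role comparable to the node's row-sampling and attribute-sampling settings. The exact option names, defaults, and sampling behavior in the node's dialog may differ; this is an analogy, not the node's implementation.

    from sklearn.datasets import make_regression
    from sklearn.ensemble import GradientBoostingRegressor

    X, y = make_regression(n_samples=1000, n_features=20, noise=0.5, random_state=0)

    # Stochastic gradient boosting: each tree is fit on a random 80% of the rows,
    # and only a random subset of attributes is considered for each split.
    model = GradientBoostingRegressor(
        n_estimators=200,
        learning_rate=0.1,
        max_depth=4,          # shallow trees, as in the node
        subsample=0.8,        # row sampling ("bagging")
        max_features="sqrt",  # attribute sampling ("attribute bagging")
        random_state=0,
    )
    model.fit(X, y)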

Input Ports

  1. Type: Data. The data to learn from. It must contain at least one numeric target column and either a fingerprint (bitvector) column or another numeric or nominal column.

Output Ports

  1. Type: Gradient Boosting Model. The trained model.

Find here

Analytics > Mining > Decision Tree Ensemble > Gradient Boosting > Regression

Make sure to have this extension installed: