This classifier generates a two-class kernel logistic regression model.
The model is fit by minimizing the negative log-likelihood with a quadratic penalty, using BFGS optimization as implemented in the Optimization class. Alternatively, conjugate gradient optimization can be applied.
The user can specify the kernel function and the value of lambda, the multiplier for the quadratic penalty. With a linear kernel (the default), this method should give the same result as the ridge logistic regression implemented in Logistic, provided the ridge parameter is set to the same value as lambda and is not too small.
By replacing the kernel function, we can learn non-linear decision boundaries.
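To illustrate the idea (this is a minimal sketch, not WEKA's implementation), the following Python code fits a two-class kernel logistic regression by minimizing the penalized negative log-likelihood with BFGS, using a hypothetical RBF kernel in place of the default linear one. The function names, the gamma parameter, and the toy XOR data are all assumptions chosen for illustration; WEKA applies its own filtering and optimization internals.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X1, X2, gamma=1.0):
    """RBF kernel matrix between the rows of X1 and X2 (illustrative choice)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_klr(X, y, lam=0.01, gamma=1.0):
    """Fit two-class kernel logistic regression (y in {0, 1}) by minimizing
    the negative log-likelihood plus a quadratic penalty, via BFGS."""
    K = rbf_kernel(X, X, gamma)  # full kernel matrix, stored as in the text
    n = len(y)

    def objective(w):
        alpha, b = w[:n], w[n]
        f = K @ alpha + b  # decision values for each training instance
        # negative log-likelihood: sum_i log(1 + e^{f_i}) - y_i * f_i
        nll = np.sum(np.logaddexp(0.0, f) - y * f)
        # quadratic penalty on the coefficients, weighted by lambda
        return nll + lam * alpha @ K @ alpha

    res = minimize(objective, np.zeros(n + 1), method="BFGS")
    alpha, b = res.x[:n], res.x[n]
    return lambda Xnew: 1.0 / (1.0 + np.exp(-(rbf_kernel(Xnew, X, gamma) @ alpha + b)))

# XOR-style data: not linearly separable, but an RBF kernel can fit it,
# demonstrating the non-linear decision boundary mentioned above.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0, 1, 1, 0])
predict = fit_klr(X, y, lam=1e-3, gamma=2.0)
probs = predict(X)
```

With a linear kernel (K = X X^T) and matching lambda, the same objective reduces to ridge-penalized logistic regression, which is the equivalence noted above.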
Note that the data is filtered using ReplaceMissingValues, RemoveUseless, NominalToBinary, and Standardize (in that order).
If a CachedKernel is used, this class will overwrite the manually specified cache size and use a full cache instead.
To apply this classifier to multi-class problems, use the MultiClassClassifier.
This implementation stores the full kernel matrix at training time for speed reasons.
(based on WEKA 3.7)
For further options, click the 'More' button in the dialog.
All WEKA dialogs have a panel where you can specify classifier-specific parameters.