AODE achieves highly accurate classification by averaging over all of a small space of alternative naive-Bayes-like models that have weaker (and hence less detrimental) independence assumptions than naive Bayes.
The resulting algorithm is computationally efficient while delivering highly accurate classification on many learning tasks.
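To make the averaging idea concrete, the following is a minimal, simplified sketch of AODE scoring for discrete attributes: each attribute in turn acts as a "super-parent", and the resulting one-dependence estimates are averaged. The function name and data layout are illustrative assumptions, not the Weka implementation (which also handles the frequency limit and smoothing options described below).

```python
def aode_scores(x, data, classes, m=1.0, freq_limit=1):
    """Score each class for a discrete instance x by averaging over
    one-dependence estimators (SPODEs), each treating one attribute
    as the super-parent. Illustrative sketch, not the Weka code.
    data: list of (attribute_tuple, label) training pairs."""
    n, d = len(data), len(x)
    scores = {}
    for y in classes:
        total, used = 0.0, 0
        for i in range(d):
            n_xi = sum(1 for a, _ in data if a[i] == x[i])
            if n_xi < freq_limit:      # skip infrequent super-parents
                continue
            n_y_xi = sum(1 for a, l in data if l == y and a[i] == x[i])
            # joint P(y, x_i), m-estimate smoothed with a uniform prior
            p = (n_y_xi + m / len(classes)) / (n + m)
            for j in range(d):
                if j == i:
                    continue
                n_j = sum(1 for a, l in data
                          if l == y and a[i] == x[i] and a[j] == x[j])
                vals_j = len({a[j] for a, _ in data})
                # conditional P(x_j | y, x_i), also m-estimate smoothed
                p *= (n_j + m / vals_j) / (n_y_xi + m)
            total += p
            used += 1
        scores[y] = total / used if used else 0.0
    return scores
```

The class with the highest score is predicted. Because each SPODE conditions on only one attribute besides the class, the independence assumption is weaker than naive Bayes's, yet training remains a single pass over the data.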
For more information, see
G. Webb, J. Boughton, Z. Wang (2005). Not So Naive Bayes: Aggregating One-Dependence Estimators. Machine Learning.
Further papers are available at http://www.csse.monash.edu.au/~webb/.
Use the m-estimate for smoothing base probability estimates, with a default of 1 (the m value can be changed via option -M).
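The m-estimate mentioned above can be sketched in a few lines; the function name and the uniform prior of 0.5 are illustrative assumptions:

```python
def m_estimate(count, total, m=1.0, prior=0.5):
    """m-estimate of probability: blends the raw frequency count/total
    with a prior, so events unseen in training never get probability
    zero. m controls the prior's weight (the node's -M option)."""
    return (count + m * prior) / (total + m)
```

With m = 0 this reduces to the raw relative frequency; larger m pulls the estimate more strongly toward the prior, which matters most for sparse counts.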
The default mode is non-incremental, that is, probabilities are computed at learning time. An incremental version can be used via option -I.
The default frequency limit is set to 1.
Subsumption Resolution can be enabled with the -S option. Weighting of SPODEs can be enabled with the -W option.
Weights are calculated based on the mutual information between each attribute and the class. The weighting scheme was developed by L. Jiang and H. Zhang.
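As a rough sketch of the weighting idea, mutual information between a super-parent attribute and the class can be estimated from paired samples; this helper is an illustrative assumption, not the Weka code:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Estimate I(X; Y) in nats from paired discrete samples.
    Higher values mean attribute X is more informative about class Y,
    so its SPODE would receive a larger weight in the average."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (xv, yv), c in pxy.items():
        # p(x,y) * log( p(x,y) / (p(x) * p(y)) )
        mi += (c / n) * math.log(c * n / (px[xv] * py[yv]))
    return mi
```

An attribute that is independent of the class gets mutual information near zero and thus contributes little; a perfectly predictive attribute gets the maximum weight.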
(based on WEKA 3.7)
For further options, click the 'More' button in the dialog.
All Weka dialogs have a panel where you can specify classifier-specific parameters.
- Input (Type: Data): Training data
- Output (Type: Weka 3.7 Classifier): Trained model
Analytics > Mining > Weka > Weka (3.7) > Classification Algorithms > bayes > AveragedNDependenceEstimators
Make sure to have this extension installed: