Class for building and using a decision table/naive Bayes hybrid classifier
A forward selection search is used. Initially, all attributes are modelled by the decision table; at each step of the search, the algorithm evaluates the merit of dividing the attributes into two disjoint subsets: the selected attributes are modelled by naive Bayes and the remainder by the decision table.
At each step, the algorithm also considers dropping an attribute entirely from the model.
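The greedy search described above can be sketched as follows. This is an illustrative skeleton only, not Weka's actual implementation (the real classifier is `weka.classifiers.rules.DTNB`): `merit` is a hypothetical stand-in for the cross-validated evaluation Weka performs on each candidate partition.

```python
def dtnb_search(attributes, merit):
    """Greedy forward selection over two candidate moves per attribute:
    move it from the decision table to naive Bayes, or drop it entirely.

    `merit(dt, nb)` is assumed to score a candidate partition (higher is
    better); in Weka this would be a cross-validated accuracy estimate.
    """
    dt = set(attributes)   # initially all attributes go to the decision table
    nb = set()             # attributes modelled by naive Bayes
    dropped = set()        # attributes removed from the model
    best = merit(dt, nb)
    improved = True
    while improved:
        improved = False
        for a in list(dt):
            # Candidate 1: move the attribute to the naive Bayes subset.
            m_nb = merit(dt - {a}, nb | {a})
            # Candidate 2: drop the attribute entirely.
            m_drop = merit(dt - {a}, nb)
            m_cand, move = max((m_nb, "nb"), (m_drop, "drop"))
            if m_cand > best:
                dt.discard(a)
                if move == "nb":
                    nb.add(a)
                else:
                    dropped.add(a)
                best = m_cand
                improved = True
    return dt, nb, dropped
```

The search stops when no single move improves the merit, yielding the final split between the decision table, naive Bayes, and the dropped attributes.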
For more information, see:
Mark Hall, Eibe Frank: Combining Naive Bayes and Decision Tables. In: Proceedings of the 21st Florida Artificial Intelligence Society Conference (FLAIRS), 318-319, 2008.
(based on WEKA 3.7)
For further options, click the 'More' button in the dialog.
All Weka dialogs have a panel where you can specify classifier-specific parameters.
- Training data (Type: Data)
- Trained model (Type: Weka 3.7 Classifier)