A classification ensemble is a predictive model composed of a weighted combination of multiple classification models. In general, combining multiple classification models increases predictive performance.
To explore classification ensembles interactively, use the Classification Learner app.
For greater flexibility, use fitcensemble in the command-line interface to boost or bag classification trees, or to grow a random forest.
For details on all supported ensembles, see Ensemble Algorithms. To reduce a multiclass problem to an ensemble of binary classification problems, train an error-correcting output codes (ECOC) model with fitcecoc.
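As a minimal sketch of the fitcensemble workflow, the following uses the `ionosphere` sample data set that ships with Statistics and Machine Learning Toolbox; the method and number of learning cycles are illustrative choices, not recommendations:

```matlab
% Load sample data: radar returns in X, labels 'g'/'b' in Y.
load ionosphere

% Boost 100 classification trees with AdaBoostM1 (a binary-class method).
Mdl = fitcensemble(X, Y, 'Method', 'AdaBoostM1', 'NumLearningCycles', 100);

% Estimate generalization error by 10-fold cross-validation.
cvMdl = crossval(Mdl);
kfoldLoss(cvMdl)

% Predict the label of an observation.
label = predict(Mdl, X(1,:));
```

For bagging instead of boosting, set `'Method'` to `'Bag'`.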
| Function or Class | Description |
|---|---|
| Classification Learner | Train models to classify data using supervised machine learning |
| templateDiscriminant | Discriminant analysis classifier template |
| templateECOC | Error-correcting output codes learner template |
| templateEnsemble | Ensemble learning template |
| templateKNN | k-nearest neighbor classifier template |
| templateLinear | Linear classification learner template |
| templateNaiveBayes | Naive Bayes classifier template |
| templateSVM | Support vector machine template |
| templateTree | Create decision tree template |
| ClassificationECOC | Multiclass model for support vector machines or other classifiers |
| CompactClassificationECOC | Compact multiclass model for support vector machines or other classifiers |
| ClassificationPartitionedECOC | Cross-validated multiclass ECOC model for support vector machines (SVMs) and other classifiers |
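The template functions above supply binary learners to ensemble and ECOC training functions. A minimal sketch, assuming the bundled `fisheriris` data set (the Gaussian kernel is an arbitrary illustrative choice):

```matlab
% Load Fisher's iris data: 150 observations, 3 species.
load fisheriris

% Define a binary SVM learner template with a Gaussian kernel.
t = templateSVM('KernelFunction', 'gaussian');

% Combine the binary SVM learners into a multiclass ECOC model.
Mdl = fitcecoc(meas, species, 'Learners', t);

% Resubstitution error (an optimistic estimate of accuracy).
resubLoss(Mdl)
```

A template defers training: it only records learner options, which fitcecoc (or fitcensemble) applies when it fits each binary learner.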
- Create and compare ensemble classifiers, and export trained models to make predictions for new data.
- Obtain highly accurate predictions by using many weak learners.
- Learn about different algorithms for ensemble learning.
- Train a simple classification ensemble.
- Learn methods to evaluate the predictive quality of an ensemble.
- Learn how to set prior class probabilities and misclassification costs.
- Use the RUSBoost algorithm for classification when one or more classes are over-represented in your data.
- Train an ensemble of classification trees using data containing predictors with many categorical levels.
- Create small ensembles by using the LPBoost and TotalBoost algorithms. (LPBoost and TotalBoost require Optimization Toolbox™.)
- Tune RobustBoost parameters for better predictive accuracy. (RobustBoost requires Optimization Toolbox.)
- Gain better predictions when you have missing data by using surrogate splits.
- Create a TreeBagger ensemble for classification.
- Build an automated credit rating tool.
- Increase the accuracy of classification by using a random subspace ensemble.
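The TreeBagger workflow mentioned above can be sketched as follows, again using the bundled `fisheriris` data; the number of trees is an arbitrary choice:

```matlab
% Load Fisher's iris data: measurements in meas, labels in species.
load fisheriris

% Bag 50 classification trees, keeping out-of-bag predictions so
% generalization error can be estimated without a holdout set.
Mdl = TreeBagger(50, meas, species, 'OOBPrediction', 'on');

% Plot out-of-bag classification error as trees are added.
err = oobError(Mdl);
plot(err)
xlabel('Number of grown trees')
ylabel('Out-of-bag classification error')
```

Out-of-bag error typically levels off as trees accumulate, which helps judge whether 50 trees is enough for your data.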