To interactively grow a classification tree, use the Classification Learner app. For greater flexibility, grow a classification tree using fitctree at the command line. After growing a classification tree, predict labels by passing the tree and new predictor data to predict.
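The workflow above can be sketched with the built-in fisheriris sample data (the dataset choice is an assumption for illustration; any predictor matrix and label vector work the same way):

```matlab
% Grow a classification tree, then predict labels for new predictor data.
load fisheriris                        % meas: 150-by-4 predictors, species: labels
tree = fitctree(meas, species);        % fit a tree with default settings
labels = predict(tree, meas(1:5, :));  % pass the tree and new data to predict
disp(labels)
```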
| Function | Description |
| --- | --- |
| `Classification Learner` | Train models to classify data using supervised machine learning |
| `cvloss` | Classification error by cross-validation |
| `partialDependence` | Compute partial dependence |
| `plotPartialDependence` | Create partial dependence plot (PDP) and individual conditional expectation (ICE) plots |
| `predictorImportance` | Estimates of predictor importance for classification tree |
| `surrogateAssociation` | Mean predictive measure of association for surrogate splits in classification tree |
| `view` | View classification tree |
| `crossval` | Cross-validated decision tree |
| `kfoldEdge` | Classification edge for observations not used for training |
| `kfoldLoss` | Classification loss for observations not used for training |
| `kfoldfun` | Cross-validate function |
| `kfoldMargin` | Classification margins for observations not used for training |
| `kfoldPredict` | Predict response for observations not used for training |
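Several of the functions above operate on a cross-validated model. A minimal sketch, again assuming the fisheriris data:

```matlab
% Estimate generalization error with 10-fold cross-validation.
load fisheriris
tree   = fitctree(meas, species);
cvtree = crossval(tree);        % cross-validated model (10 folds by default)
err    = kfoldLoss(cvtree);     % classification loss on held-out observations
fprintf('Cross-validated classification error: %.3f\n', err)
```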
Create and compare classification trees, and export trained models to make predictions for new data.
Understand the steps for supervised learning and the characteristics of nonparametric classification and regression functions.
Understand decision trees and how to fit them to data.
To grow decision trees, fitctree and fitrtree apply the standard CART algorithm by default to the training data.
Create and view a text or graphic description of a trained decision tree.
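A short sketch of both views, using a tree grown on the fisheriris data (an assumed example dataset):

```matlab
% View a trained classification tree as text and as a graph.
load fisheriris
tree = fitctree(meas, species);
view(tree)                    % text description in the Command Window
view(tree, 'Mode', 'graph')   % interactive graphical tree viewer
```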
This example shows how to visualize the decision surface for different classification algorithms.
Learn about the heuristic algorithms for optimally splitting categorical variables with many levels while growing decision trees.
Tune trees by setting name-value pair arguments in fitctree and fitrtree.
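For instance, MaxNumSplits and MinLeafSize control tree depth and leaf granularity (the comparison below is an illustrative sketch on the fisheriris data):

```matlab
% Compare a default tree with a deliberately shallower one.
load fisheriris
deepTree    = fitctree(meas, species);                    % default growth
shallowTree = fitctree(meas, species, ...
    'MaxNumSplits', 4, 'MinLeafSize', 10);                % coarser tree
fprintf('Resubstitution loss: %.3f (default) vs %.3f (shallow)\n', ...
    resubLoss(deepTree), resubLoss(shallowTree))
```

A shallower tree typically trades some training accuracy for better interpretability and less overfitting.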
Predict class labels or responses using trained classification and regression trees.
Predict responses for new data using a trained regression tree, and then plot the results.
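A minimal regression-tree sketch using the built-in carsmall data (an assumed example; rows with missing response values are removed during fitting):

```matlab
% Fit a regression tree, predict on a grid, and plot the step-like fit.
load carsmall                                  % Weight, MPG
rtree = fitrtree(Weight, MPG);                 % single-predictor regression tree
w = linspace(min(Weight), max(Weight), 200)';  % query grid
plot(Weight, MPG, '.', w, predict(rtree, w), '-')
xlabel('Weight'); ylabel('MPG')
```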