Resume training of classification ensemble model
ens1 = resume(___,Name=Value) specifies options using one or more name-value arguments. For example, you can track the number of weak learners as they are trained, and perform computations in parallel.
ens — Classification ensemble model
ClassificationEnsemble model object
nlearn — Number of additional training cycles
Number of additional training cycles for ens, specified as a positive integer.
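For instance, assuming ens is a ClassificationEnsemble already trained with fitcensemble (the names ens and ens1 here are illustrative), a call such as the following adds ten more weak learners:

```matlab
% Resume training for 10 additional cycles. ens1 contains every weak
% learner already in ens, plus 10 new ones trained on the same data.
ens1 = resume(ens,10);
```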
Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.
Before R2021a, use commas to separate each name and value, and enclose Name in quotes.
Example: resume(ens,10,NPrint=5,Options=statset(UseParallel=true)) specifies to train ens for an additional 10 cycles, display a message to the command line every time resume finishes training 5 weak learners, and perform computations in parallel.
NPrint — Printout frequency
"off" (default) | positive integer
Printout frequency, specified as a positive integer m or "off". If you specify m, resume displays a message to the command line every time it finishes training m weak learners. If you specify "off", resume does not display a message when it completes training weak learners.
For the fastest training of some boosted decision trees, set NPrint to "off" (the default value).
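As a sketch of the NPrint behavior, assuming ens is an existing classification ensemble in the workspace, the following call prints a progress message after every 25 of the 100 additional learners:

```matlab
% Resume for 100 more cycles, printing a message to the command line
% each time another 25 weak learners finish training.
ens1 = resume(ens,100,NPrint=25);
```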
Options — Options for computing in parallel and setting random number streams
Options for computing in parallel and setting random number streams, specified as a structure. Create the Options structure using statset. You need Parallel Computing Toolbox™ to run computations in parallel.
You can use the same parallel options for resume as you used for the original training. Use the Options argument to change the parallel options, as needed. This table describes the option fields and their values.
Field Name | Value | Default
UseParallel | Set this value to true to run computations in parallel. | false
UseSubstreams | Set this value to true to run computations in a reproducible manner. To compute reproducibly, set Streams to a type that allows substreams: "mlfg6331_64" or "mrg32k3a". | false
Streams | Specify this value as a RandStream object or cell array of such objects. Use a single object except when the UseParallel value is true and the UseSubstreams value is false. In that case, use a cell array that has the same size as the parallel pool. | If you do not specify Streams, then resume uses the default stream or streams.
For dual-core systems and above, resume parallelizes training using Intel® Threading Building Blocks (TBB). Therefore, setting UseParallel to true might not provide a significant increase in speed on a single computer. For details on Intel TBB, see https://www.intel.com/content/www/us/en/developer/tools/oneapi/onetbb.html.
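As one possible sketch of reproducible parallel resumption (assuming Parallel Computing Toolbox is installed and ens is an existing ensemble in the workspace):

```matlab
% Use a random number generator type that supports substreams so that
% parallel results are reproducible across runs.
s = RandStream("mlfg6331_64");
opts = statset(UseParallel=true,UseSubstreams=true,Streams={s});
ens1 = resume(ens,50,Options=opts);
```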
Train Classification Ensemble for Additional Cycles
Train a classification ensemble for three cycles, and compare the resubstitution error obtained after training the ensemble for more cycles.
Load the ionosphere data set.
load ionosphere
Train a classification ensemble for three cycles and examine the resubstitution error.
ens = fitcensemble(X,Y,Method="GentleBoost",NumLearningCycles=3);
L = resubLoss(ens)
L = 0.0085
Train for three more cycles and examine the new resubstitution error.
ens1 = resume(ens,3);
L = resubLoss(ens1)
L = 0
The resubstitution error is much lower in the new ensemble than in the original.
Automatic Parallel Support
Accelerate code by automatically running computation in parallel using Parallel Computing Toolbox™.
resume supports parallel training through the 'Options' name-value argument. Create options using statset, such as options = statset('UseParallel',true).
Parallel ensemble training requires you to set the 'Method' name-value argument to 'Bag' when training the original ensemble. Parallel training is available only for tree learners, the default type for 'Bag'.
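Putting these requirements together, a minimal parallel workflow might look like the following sketch (assuming a predictor matrix X and class labels Y are in the workspace):

```matlab
% Train a bagged tree ensemble in parallel, then resume it in parallel
% with the same options structure.
opts = statset(UseParallel=true);
ens  = fitcensemble(X,Y,Method="Bag",Options=opts);
ens1 = resume(ens,20,Options=opts);
```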
Accelerate code by running on a graphics processing unit (GPU) using Parallel Computing Toolbox™.
This function fully supports GPU arrays. For more information, see Run MATLAB Functions on a GPU (Parallel Computing Toolbox).
Introduced in R2011a