ClassificationDiscriminant
Discriminant analysis classification
Description
A ClassificationDiscriminant object encapsulates a discriminant analysis classifier, which is a Gaussian mixture model for data generation. A ClassificationDiscriminant object can predict responses for new data using the predict method. The object contains the data used for training, so it can compute resubstitution predictions.
Creation
Create a ClassificationDiscriminant object by using fitcdiscr.
Properties
Discriminant Analysis Properties
This property is read-only.
Between-class covariance, specified as a p-by-p matrix, where p is the number of predictors.
Data Types: double
This property is read-only.
Coefficient matrices, specified as a k-by-k structure, where k is the number of classes. If fitcdiscr had the FillCoeffs name-value pair set to 'off' when constructing the classifier, Coeffs is empty ([]).
Coeffs(i,j) contains coefficients of the linear or quadratic boundaries between classes i and j. Fields in Coeffs(i,j):
- DiscrimType
- Class1 — ClassNames(i)
- Class2 — ClassNames(j)
- Const — A scalar
- Linear — A vector with p components, where p is the number of columns in X
- Quadratic — p-by-p matrix, exists for quadratic DiscrimType
The equation of the boundary between class i and class j is
Const + Linear * x + x' * Quadratic * x = 0,
where x is a column vector of length p.
Data Types: struct
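For illustration only (not part of the reference page), the following sketch trains a linear model on Fisher's iris data and evaluates the boundary between classes 2 and 3 at an arbitrary point; the choice of point and the use of Linear(:)' to force a row orientation are assumptions of the example.

load fisheriris
Mdl = fitcdiscr(meas,species);        % linear discriminant by default
C = Mdl.Coeffs(2,3);                  % boundary between classes 2 and 3
x = mean(meas)';                      % any column vector of length p; here, the overall predictor mean
val = C.Const + C.Linear(:)'*x        % zero means x lies on the boundary
% For a quadratic model, also add x'*C.Quadratic*x.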
Value of the Delta threshold for a linear discriminant model, specified as a nonnegative scalar. If a coefficient of obj has magnitude smaller than Delta, obj sets this coefficient to 0, so you can eliminate the corresponding predictor from the model. Set Delta to a higher value to eliminate more predictors.
Delta must be 0 for quadratic discriminant models.
Change Delta using dot notation: obj.Delta = newDelta.
Data Types: double
This property is read-only.
Minimum value of Delta coefficient for predictor to be in model, specified as a row vector of length p, where p is the number of predictors in obj. If DeltaPredictor(i) < Delta, then coefficient i of the model is 0.
If obj is a quadratic discriminant model, all elements of DeltaPredictor are 0.
Data Types: double
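As an informal sketch (assuming Mdl is a trained linear discriminant model; the 0.2 threshold is arbitrary), Delta and DeltaPredictor relate as follows:

Mdl.Delta = 0.2;                        % coefficients with magnitude below 0.2 become 0
kept = find(Mdl.DeltaPredictor >= 0.2); % predictors still contributing to the model
nLinearCoeffs(Mdl,0.2)                  % number of nonzero linear coefficients at this threshold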
Discriminant type, specified as a character vector or string. Available values:
'linear'
'quadratic'
'diagLinear'
'diagQuadratic'
'pseudoLinear'
'pseudoQuadratic'
Change DiscrimType using dot notation: obj.DiscrimType = newDiscrimType. You can change between linear types or between quadratic types, but you cannot change between linear and quadratic types.
Data Types: char
| string
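A minimal sketch, assuming Mdl is a trained linear discriminant model; switching within the linear family is allowed, while switching to a quadratic type would error:

Mdl.DiscrimType = 'pseudoLinear';   % linear-family change: allowed
Mdl.DiscrimType = 'linear';         % back to the original linear type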
Value of the Gamma regularization parameter, specified as a scalar from 0 through 1. Change Gamma using dot notation: obj.Gamma = newGamma.
- If you set 1 for a linear discriminant, the discriminant sets its type to 'diagLinear'.
- If you set a value between MinGamma and 1 for a linear discriminant, the discriminant sets its type to 'linear'.
- You cannot set values below the value of the MinGamma property.
- For a quadratic discriminant, you can set either 0 (for DiscrimType 'quadratic') or 1 (for DiscrimType 'diagQuadratic').
Data Types: double
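A minimal sketch, assuming Mdl is a trained linear discriminant model; the value 0.1 is an arbitrary illustration:

Mdl.Gamma = max(Mdl.MinGamma,0.1);  % values below MinGamma are not allowed
Mdl.Gamma = 1;                      % DiscrimType becomes 'diagLinear'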
This property is read-only.
Logarithm of the determinant of the within-class covariance matrix, returned as a scalar or vector. The type of LogDetSigma depends on the discriminant type:
- Scalar for linear discriminant analysis
- Vector of length K for quadratic discriminant analysis, where K is the number of classes
Data Types: double
This property is read-only.
Minimal value of the Gamma parameter so that the correlation matrix is invertible, specified as a nonnegative scalar. If the correlation matrix is not singular, MinGamma is 0.
Data Types: double
This property is read-only.
Parameters used in training the model, returned as a DiscriminantParams object. The returned parameters have the following properties.

| Property | Value |
|---|---|
| DiscrimType | character vector |
| Gamma | scalar from 0 through 1 |
| Delta | nonnegative scalar |
| FillCoeffs | logical scalar |
| SaveMemory | logical scalar |
| Version | scalar |
| Method | 'Discriminant' |
| Type | 'classification' |
Predictor Properties
This property is read-only.
Categorical predictor indices, which are always empty ([]).
This property is read-only.
Class means, specified as a K-by-p matrix of real values. K is the number of classes, and p is the number of predictors. Each row of Mu represents the mean of the multivariate normal distribution of the corresponding class. The class indices are in the ClassNames attribute.
Data Types: double
This property is read-only.
Names of predictor variables, returned as a cell array. The names are in the order in
which they appear in the training data X
.
Data Types: cell
This property is read-only.
Within-class covariance, returned as a numeric array. The dimensions depend on DiscrimType:
- 'linear' (default) — Matrix of size p-by-p, where p is the number of predictors
- 'quadratic' — Array of size p-by-p-by-K, where K is the number of classes
- 'diagLinear' — Row vector of length p
- 'diagQuadratic' — Array of size 1-by-p-by-K
- 'pseudoLinear' — Matrix of size p-by-p
- 'pseudoQuadratic' — Array of size p-by-p-by-K
Data Types: double
This property is read-only.
Predictor values, returned as a real matrix. Each column of
X
represents one predictor (variable), and each
row represents one observation.
Data Types: single
| double
This property is read-only.
X data with class means subtracted, returned as a real matrix. If Y(i) is of class j, then
Xcentered(i,:) = X(i,:) – Mu(j,:),
where Mu is the class mean property.
Data Types: single
| double
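As an informal check (assuming Mdl was trained on numeric predictors with cell-array class labels, as in the fisheriris example below), you can verify this relation for one observation:

i = 1;
j = find(strcmp(Mdl.ClassNames,Mdl.Y{i}));                % class index of observation i
max(abs(Mdl.Xcentered(i,:) - (Mdl.X(i,:) - Mdl.Mu(j,:)))) % zero up to round-off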
Response Properties
This property is read-only.
Class names in the training data Y, with duplicates removed. ClassNames has the same data type as the argument Y. ClassNames can have the following data types:
Categorical array
Cell array of character vectors
Character array
Logical vector
Numeric vector
(The software treats string arrays as cell arrays of character vectors.)
Data Types: single
| double
| logical
| char
| string
| cell
| categorical
This property is read-only.
Name of the response variable Y
, returned as a character
vector.
Data Types: char
| string
This property is read-only.
Row classifications, returned as a categorical array, cell array of
character vectors, character array, logical vector, or numeric vector
with the same number of rows as X
. Each row of
Y
represents the classification of the
corresponding row of X
.
Data Types: single
| double
| logical
| char
| string
| cell
| categorical
Other Data Properties
This property is read-only.
Description of the cross-validation optimization of hyperparameters, returned as a
BayesianOptimization
object or a table of
hyperparameters and associated values. This property is nonempty if the
OptimizeHyperparameters
name-value argument is nonempty when you
create the model. The value of HyperparameterOptimizationResults
depends on the setting of the Optimizer
option in
HyperparameterOptimizationOptions
when you create the
model.
"bayesopt"
(default) — Object of classBayesianOptimization
"gridsearch"
or"randomsearch"
— Table of hyperparameters used, observed objective function values (cross-validation loss), and rank of observations from lowest (best) to highest (worst)
This property is read-only.
Number of observations in the training data, returned as a positive integer.
NumObservations
can be less than the number of rows of input data
when there are missing values in the input data or response data.
Data Types: double
This property is read-only.
Rows of the original predictor data X
used for fitting, returned as
an n
-element logical vector, where n
is the number
of rows of X
. If the software uses all rows of X
to create the object, then RowsUsed
is an empty array
([]
).
Data Types: logical
This property is read-only.
Scaled observation weights, returned as a numeric vector of length n, where n is the number of rows in X.
Data Types: double
Other Classification Properties
Cost of classifying a point, specified as a square matrix.
Cost(i,j)
is the cost of classifying a point into class
j
if its true class is i
(the rows correspond
to the true class and the columns correspond to the predicted class). The order of the
rows and columns of Cost
corresponds to the order of the classes in
ClassNames
. The number of rows and columns in
Cost
is the number of unique classes in the response.
Change a Cost
matrix using dot notation: obj.Cost =
costMatrix
.
Data Types: double
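A hedged sketch, assuming Mdl is a trained three-class model (for example, from fitcdiscr on fisheriris); the factor of 5 is an arbitrary illustration:

newCost = ones(3) - eye(3);   % default-style cost: 0 on the diagonal, 1 elsewhere
newCost(2,3) = 5;             % make predicting class 3 for a true class 2 observation costlier
Mdl.Cost = newCost;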
Prior probabilities for each class, returned as a numeric vector. The order of the
elements of Prior
corresponds to the order of the classes in
ClassNames
.
Add or change a Prior
vector using dot notation: obj.Prior
= priorVector
.
Data Types: double
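For example (a sketch; the uniform prior is just one possible choice), set equal priors in the order of ClassNames:

K = numel(Mdl.ClassNames);
Mdl.Prior = ones(1,K)/K;      % uniform prior over the classes in Mdl.ClassNames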
Score transformation function, specified as a character vector or string representing
a built-in transformation function, or as a function handle for transforming scores.
'none'
means no transformation; equivalently,
'none'
means @(x)x
. For a list of built-in
transformation functions and the syntax of custom transformation functions, see
fitcdiscr
.
Add or change a ScoreTransform function using dot notation in one of the following forms:
cobj.ScoreTransform = 'function'
cobj.ScoreTransform = @function
Data Types: char
| string
| function_handle
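A small sketch; 'logit' is one of the built-in transforms, and the anonymous handle is an illustrative equivalent (anonymous handles are not supported for code generation, per Extended Capabilities below):

Mdl.ScoreTransform = 'logit';                % built-in transform name
Mdl.ScoreTransform = @(s) 1./(1 + exp(-s));  % custom function handle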
Object Functions
| Function | Description |
|---|---|
| compact | Reduce size of machine learning model |
| compareHoldout | Compare accuracies of two classification models using new data |
| crossval | Cross-validate machine learning model |
| cvshrink | Cross-validate regularization of linear discriminant |
| edge | Classification edge for discriminant analysis classifier |
| lime | Local interpretable model-agnostic explanations (LIME) |
| logp | Log unconditional probability density for discriminant analysis classifier |
| loss | Classification loss for discriminant analysis classifier |
| mahal | Mahalanobis distance to class means of discriminant analysis classifier |
| margin | Classification margins for discriminant analysis classifier |
| nLinearCoeffs | Number of nonzero linear coefficients in discriminant analysis classifier |
| partialDependence | Compute partial dependence |
| plotPartialDependence | Create partial dependence plot (PDP) and individual conditional expectation (ICE) plots |
| predict | Predict labels using discriminant analysis classifier |
| resubEdge | Resubstitution classification edge for discriminant analysis classifier |
| resubLoss | Resubstitution classification loss for discriminant analysis classifier |
| resubMargin | Resubstitution classification margins for discriminant analysis classifier |
| resubPredict | Classify observations in discriminant analysis classifier by resubstitution |
| shapley | Shapley values |
| testckfold | Compare accuracies of two classification models by repeated cross-validation |
Examples
Load Fisher's iris data set.
load fisheriris
Train a discriminant analysis model using the entire data set.
Mdl = fitcdiscr(meas,species)
Mdl = 
  ClassificationDiscriminant
             ResponseName: 'Y'
    CategoricalPredictors: []
               ClassNames: {'setosa'  'versicolor'  'virginica'}
           ScoreTransform: 'none'
          NumObservations: 150
              DiscrimType: 'linear'
                       Mu: [3×4 double]
                   Coeffs: [3×3 struct]

  Properties, Methods
Mdl
is a ClassificationDiscriminant
model. To access its properties, use dot notation. For example, display the group means for each predictor.
Mdl.Mu
ans = 3×4
5.0060 3.4280 1.4620 0.2460
5.9360 2.7700 4.2600 1.3260
6.5880 2.9740 5.5520 2.0260
To predict labels for new observations, pass Mdl
and predictor data to predict
.
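For instance (a sketch that reuses a few rows of the training predictors, which amounts to resubstitution for those rows):

labels = predict(Mdl,meas(1:5,:))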
More About
The model for discriminant analysis is:
- Each class (Y) generates data (X) using a multivariate normal distribution. That is, the model assumes X has a Gaussian mixture distribution (gmdistribution).
- For linear discriminant analysis, the model has the same covariance matrix for each class; only the means vary.
- For quadratic discriminant analysis, both means and covariances of each class vary.
predict classifies so as to minimize the expected classification cost:

$$\hat{y} = \underset{y=1,\ldots,K}{\arg\min}\; \sum_{k=1}^{K} \hat{P}(k \mid x)\, C(y \mid k),$$

where
- $\hat{y}$ is the predicted classification.
- K is the number of classes.
- $\hat{P}(k \mid x)$ is the posterior probability of class k for observation x.
- $C(y \mid k)$ is the cost of classifying an observation as y when its true class is k.
For details, see Prediction Using Discriminant Analysis Models.
Regularization is the process of finding a small set of predictors
that yield an effective predictive model. For linear discriminant
analysis, there are two parameters, γ and δ,
that control regularization as follows. cvshrink
helps
you select appropriate values of the parameters.
Let $\Sigma$ represent the covariance matrix of the data X, and let $\hat{X}$ be the centered data (the data X minus the mean by class). Define

$$D = \operatorname{diag}\left(\hat{X}^{T}\hat{X}\right).$$

The regularized covariance matrix $\tilde{\Sigma}$ is

$$\tilde{\Sigma} = (1-\gamma)\Sigma + \gamma D.$$

Whenever γ ≥ MinGamma, $\tilde{\Sigma}$ is nonsingular.
Let $\mu_k$ be the mean vector for those elements of X in class k, and let $\mu_0$ be the global mean vector (the mean of the rows of X). Let C be the correlation matrix of the data X, and let $\tilde{C}$ be the regularized correlation matrix:

$$\tilde{C} = (1-\gamma)C + \gamma I,$$

where I is the identity matrix.
The linear term in the regularized discriminant analysis classifier for a data point x is

$$(x-\mu_0)^{T}\tilde{\Sigma}^{-1}(\mu_k-\mu_0) = \left[(x-\mu_0)^{T}D^{-1/2}\right]\left[\tilde{C}^{-1}D^{-1/2}(\mu_k-\mu_0)\right].$$

The parameter δ enters into this equation as a threshold on the final term in square brackets. Each component of the vector $\tilde{C}^{-1}D^{-1/2}(\mu_k-\mu_0)$ is set to zero if it is smaller in magnitude than the threshold δ. Therefore, for class k, if component j is thresholded to zero, component j of x does not enter into the evaluation of the posterior probability.
The DeltaPredictor property is a vector related to this threshold. When δ ≥ DeltaPredictor(i), all classes k have

$$\left|\tilde{C}^{-1}D^{-1/2}(\mu_k-\mu_0)\right|_i \le \delta.$$

Therefore, when δ ≥ DeltaPredictor(i), the regularized classifier does not use predictor i.
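A hedged sketch of exploring these parameters with cvshrink (the 5-by-5 grid is an arbitrary illustration; assumes Mdl is a trained linear discriminant model):

[err,gamma,delta,numpred] = cvshrink(Mdl,'NumGamma',5,'NumDelta',5);
% err contains cross-validated losses over the gamma/delta grid;
% numpred gives the number of predictors remaining at each grid point.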
References
[1] Guo, Y., T. Hastie, and R. Tibshirani. "Regularized linear discriminant analysis and its application in microarrays." Biostatistics, Vol. 8, No. 1, pp. 86–100, 2007.
Extended Capabilities
Usage notes and limitations:
- The predict function supports code generation.
- When you train a discriminant analysis model by using fitcdiscr or create a compact discriminant analysis model by using makecdiscr, the value of the 'ScoreTransform' name-value pair argument cannot be an anonymous function.
For more information, see Introduction to Code Generation.
Version History
Introduced in R2011b

Starting in R2023b, training observations with missing predictor values are included in the X, Xcentered, Y, and W data properties. The RowsUsed property indicates the training observations stored in the model, rather than those used for training. Observations with missing predictor values continue to be omitted from the model training process.
In previous releases, the software omitted training observations that contained missing predictor values from the data properties of the model.