Eigenvalues
Eigenvalue Decomposition
An eigenvalue and eigenvector of a square matrix A are, respectively, a scalar λ and a nonzero vector υ that satisfy
Aυ = λυ.
With the eigenvalues on the diagonal of a diagonal matrix Λ and the corresponding eigenvectors forming the columns of a matrix V, you have
AV = VΛ.
If V is nonsingular, this becomes the eigenvalue decomposition
A = VΛV^{-1}.
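The relationship Aυ = λυ can be checked numerically for any individual eigenpair. A minimal sketch, using a small hypothetical matrix B chosen only for illustration (not the coefficient matrix used below):
B = [2 1; 1 3];        % hypothetical example matrix
[V,D] = eig(B);        % eigenvectors in the columns of V, eigenvalues on the diagonal of D
v = V(:,1);            % one eigenvector
lambda = D(1,1);       % the corresponding eigenvalue
norm(B*v - lambda*v)   % near machine precision, confirming B*v = lambda*v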
A good example is the coefficient matrix of the differential equation dx/dt = Ax:
A = [ 0   -6   -1
      6    2  -16
     -5   20  -10 ]
The solution to this equation is expressed in terms of the matrix exponential x(t) = e^{tA}x(0). The statement
lambda = eig(A)
produces a column vector containing the eigenvalues of A. For this matrix, the eigenvalues are complex:
lambda =
  -3.0710
  -2.4645 + 17.6008i
  -2.4645 - 17.6008i
The real part of each of the eigenvalues is negative, so e^{λt} approaches zero as t increases. The nonzero imaginary part of two of the eigenvalues, ±ω, contributes the oscillatory component, sin(ωt), to the solution of the differential equation.
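One way to see this behavior numerically is to evaluate x(t) = e^{tA}x(0) with expm for a few values of t. A minimal sketch, assuming an arbitrary initial condition x0 chosen only for illustration:
A = [0 -6 -1; 6 2 -16; -5 20 -10];
x0 = [1; 1; 1];                          % hypothetical initial condition
for t = [0 0.5 1 2]
    x = expm(t*A)*x0;                    % x(t) = e^{tA} x(0)
    fprintf('t = %4.1f   norm(x) = %g\n', t, norm(x))
end
The decay of norm(x) is governed by the negative real parts of the eigenvalues, while the individual components oscillate at the frequency set by the imaginary parts.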
With two output arguments, eig computes the eigenvectors and stores the eigenvalues in a diagonal matrix:
[V,D] = eig(A)
V =
  -0.8326   0.2003 - 0.1394i   0.2003 + 0.1394i
  -0.3553  -0.2110 - 0.6447i  -0.2110 + 0.6447i
  -0.4248  -0.6930            -0.6930

D =
  -3.0710         0                    0
        0   -2.4645 + 17.6008i         0
        0         0             -2.4645 - 17.6008i
The first eigenvector is real and the other two vectors are complex conjugates of each other. All three vectors are normalized to have Euclidean length, norm(v,2), equal to one.
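This normalization is easy to confirm by taking the norm of each column; a minimal check:
norm(V(:,1))    % 1.0000
norm(V(:,2))    % 1.0000
norm(V(:,3))    % 1.0000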
The matrix V*D*inv(V), which can be written more succinctly as V*D/V, is within round-off error of A. And inv(V)*A*V, or V\A*V, is within round-off error of D.
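Both identities can be verified by measuring the size of the residuals; a minimal sketch:
norm(A - V*D/V)    % within round-off error of zero
norm(D - V\A*V)    % likewise small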
Multiple Eigenvalues
Some matrices do not have an eigenvector decomposition. These matrices are not diagonalizable. For example:
A = [ 1   -2    1
      0    1    4
      0    0    3 ]
For this matrix
[V,D] = eig(A)
produces
V =
    1.0000    1.0000   -0.5571
         0    0.0000    0.7428
         0         0    0.3714

D =
     1     0     0
     0     1     0
     0     0     3
There is a double eigenvalue at λ = 1. The first and second columns of V are the same. For this matrix, a full set of linearly independent eigenvectors does not exist.
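The loss of independence also shows up numerically: the repeated eigenvector makes V numerically singular. A minimal check:
norm(V(:,1) - V(:,2))   % essentially zero: the two columns coincide
cond(V)                 % very large, since V has no inverse to working precision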
Schur Decomposition
Many advanced matrix computations do not require eigenvalue decompositions. They are based, instead, on the Schur decomposition
A = USU′,
where U is an orthogonal matrix and S is a block upper-triangular matrix with 1-by-1 and 2-by-2 blocks on the diagonal. The eigenvalues are revealed by the diagonal elements and blocks of S, while the columns of U provide an orthogonal basis, which has much better numerical properties than a set of eigenvectors.
For example, compare the eigenvalue and Schur decompositions of this defective matrix:
A = [ 6   12   19
     -9  -20  -33
      4    9   15 ];
[V,D] = eig(A)
V =
  -0.4741 + 0.0000i  -0.4082 - 0.0000i  -0.4082 + 0.0000i
   0.8127 + 0.0000i   0.8165 + 0.0000i   0.8165 + 0.0000i
  -0.3386 + 0.0000i  -0.4082 + 0.0000i  -0.4082 - 0.0000i

D =
  -1.0000 + 0.0000i   0.0000 + 0.0000i   0.0000 + 0.0000i
   0.0000 + 0.0000i   1.0000 + 0.0000i   0.0000 + 0.0000i
   0.0000 + 0.0000i   0.0000 + 0.0000i   1.0000 - 0.0000i
[U,S] = schur(A)
U =
  -0.4741    0.6648    0.5774
   0.8127    0.0782    0.5774
  -0.3386   -0.7430    0.5774

S =
  -1.0000   20.7846  -44.6948
        0    1.0000   -0.6096
        0    0.0000    1.0000
The matrix A is defective since it does not have a full set of linearly independent eigenvectors (the second and third columns of V are the same). Since not all columns of V are linearly independent, V has a large condition number of about 1e8. However, schur is able to calculate three different basis vectors in U. Since U is orthogonal, cond(U) = 1.
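These observations are straightforward to confirm; a minimal sketch comparing the two bases and reconstructing A from its Schur form:
cond(V)             % very large: the eigenvector basis is nearly rank deficient
cond(U)             % 1, up to round-off, since U is orthogonal
norm(A - U*S*U')    % within round-off error of zero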
The matrix S has the real eigenvalue as the first entry on the diagonal and the repeated eigenvalue represented by the lower right 2-by-2 block. The eigenvalues of the 2-by-2 block are also eigenvalues of A:
eig(S(2:3,2:3))
ans =
   1.0000 + 0.0000i
   1.0000 - 0.0000i
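As a cross-check, these values agree with the spectrum computed directly from A, since S = U'*A*U is similar to A:
eig(A)    % returns -1 and the double eigenvalue 1, matching the diagonal blocks of S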