
Chapter 4: Eigenvalues and Eigenvectors

Throughout this discussion, we will be dealing with transformations from a space into itself, so all matrices will be square.

Recall that a linear transformation is a rule that makes one vector into another vector.

In special situations, when A operates on x, we get a scaled version of x back again:

    Ax = λx

This only happens for special x's and special λ's. When Ax = λx:

    λ : "eigenvalue"
    x : "eigenvector"
For an n x n matrix A, there are:

- n eigenvalues, some of which may be complex and/or repeated.
- At least one eigenvector corresponding to each distinct eigenvalue. These will be complex if the eigenvalues are complex.
- Sometimes repeated eigenvalues have associated with them generalized eigenvectors.
How to find eigenvalues and eigenvectors:

    Ax = λx

    (A − λI) x = 0

We know from the previous chapter that this system of linear equations will have a solution (other than zero) whenever

    x ∈ N(A − λI)

For this to happen, the degeneracy of A − λI must be at least one. Equivalently, the rank of the matrix A − λI must be less than full (n). When the rank of a square matrix is deficient, then the determinant of that matrix is zero.

So  |A − λI| = 0   (= |λI − A|)

If we do this computation and solve for λ, then we get the eigenvalues. We will always get an nth-order polynomial in λ, so its n roots will be the eigenvalues λi, i = 1, …, n.

We then solve the linear equation (A − λiI) x = 0 to get the corresponding eigenvectors.

Example: Find the eigenvalues and eigenvectors of

        [ 3  0  2 ]
    A = [ 0  3  2 ]
        [ 2  2  1 ]

    |A − λI| = | 3−λ   0    2  |
               |  0   3−λ   2  |  = (3−λ)[(3−λ)(1−λ) − 4] + 2[−2(3−λ)]
               |  2    2   1−λ |

             = −λ³ + 7λ² − 7λ − 15

             = (5 − λ)(3 − λ)(−1 − λ) = 0

This happens to be the characteristic equation of the system; the eigenvalues are the poles!

So the three eigenvalues are:  λ1 = 5,  λ2 = 3,  λ3 = −1

Now find the eigenvector associated with λ1:

                                  [ −2   0   2 ]
    (A − λ1I) x1 = (A − 5I) x1 =  [  0  −2   2 ] x1 = 0     (How would you solve this?)
                                  [  2   2  −4 ]

Using your favorite method, e.g., Gaussian elimination, echelon forms, etc., we can get:

         [ 1 ]
    x1 = [ 1 ]
         [ 1 ]

Because the equation Ax = λx can be multiplied on both sides by an arbitrary constant and still be true, any scalar multiple of an eigenvector is still an eigenvector. We often express eigenvectors with such a constant:

         [ c ]
    x1 = [ c ]
         [ c ]

Better yet, we normalize all eigenvectors so that they all have length 1:

         [ 1/√3 ]
    x1 = [ 1/√3 ]     (corres. to λ1 = 5)
         [ 1/√3 ]

Identical procedures give us the other two eigenvectors:

         [  1/√2 ]
    x2 = [ −1/√2 ]    (corres. to λ2 = 3)
         [   0   ]

         [  1/√6 ]
    x3 = [  1/√6 ]    (corres. to λ3 = −1)
         [ −2/√6 ]
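The eigenvalues and eigenvectors of this example can be checked numerically. The following sketch uses numpy (not part of the original notes); `np.linalg.eig` returns unit-length eigenvectors, which may differ from the hand computation by sign, as discussed above.

```python
import numpy as np

# The example matrix from the notes
A = np.array([[3.0, 0.0, 2.0],
              [0.0, 3.0, 2.0],
              [2.0, 2.0, 1.0]])

lam, V = np.linalg.eig(A)      # eigenvalues, and unit eigenvectors as columns of V
print(np.sort(lam.real))       # approximately [-1, 3, 5]

# Each column satisfies A v = lambda v (up to the arbitrary scale discussed above)
for i in range(3):
    assert np.allclose(A @ V[:, i], lam[i] * V[:, i])
```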
Pause here to explore an application for the
eigenvalues/vectors: Consider the previous example.
Note that the eigenvectors are linearly independent.
They therefore form a basis for a 3-D vector space.
Suppose we had a (homogeneous) system with no input:

    ẋ = Ax

What will happen if we use the eigenvectors as the basis of the vector space that x belongs to?

Let x̄ denote the state vector in the new basis (x denotes the old state vector). The relationship between x and x̄ is:

    x = M x̄

where M is a matrix whose columns are the new basis vectors. When M is formed by columns that are eigenvectors, it is called a modal matrix.

Proceeding, x = M x̄, so ẋ = M ẋ̄. Substitute:

    ẋ = Ax   →   M ẋ̄ = A M x̄

    ẋ̄ = M⁻¹AM x̄

Recall that M⁻¹AM is called a similarity transform on A.

    M⁻¹AM = [ 1/√6   1/√6  −2/√6 ] [ 3  0  2 ] [ 1/√6    1/√2   1/√3 ]
            [ 1/√2  −1/√2    0   ] [ 0  3  2 ] [ 1/√6   −1/√2   1/√3 ]
            [ 1/√3   1/√3   1/√3 ] [ 2  2  1 ] [ −2/√6    0     1/√3 ]

    (M happens to be orthonormal, so M⁻¹ = Mᵀ)

            [ −1  0  0 ]
          = [  0  3  0 ]
            [  0  0  5 ]

          = Ā      These are the eigenvalues of A!

The modal matrix has diagonalized the system.


"New" system is

x&1 x1 1 0 0 x1 x1
x& = A x = 0 3 0 x = 3 x
2 2 2 2
x& 3 x3 0 0 5 x3 5 x3

These are three "decoupled" first order linear


differential equations.
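The decoupling step above can be reproduced numerically. This is a quick numpy check (not in the original notes): form the modal matrix from the computed eigenvectors and apply the similarity transform.

```python
import numpy as np

A = np.array([[3.0, 0.0, 2.0],
              [0.0, 3.0, 2.0],
              [2.0, 2.0, 1.0]])

lam, V = np.linalg.eig(A)          # columns of V are unit eigenvectors
M = V                              # modal matrix: columns are the eigenvectors

A_bar = np.linalg.inv(M) @ A @ M   # similarity transform on A
assert np.allclose(A_bar, np.diag(lam))   # diagonal, eigenvalues on the diagonal
print(np.round(A_bar, 6))
```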
A more general system would transform as:

    ẋ = Ax + Bu     x = M x̄       ẋ̄ = M⁻¹AM x̄ + M⁻¹B u        Ā = M⁻¹AM,  B̄ = M⁻¹B
    y = Cx + Du     ẋ = M ẋ̄       y  = CM x̄ + Du               C̄ = CM,     D̄ = D
Something different happens when we have one or more
eigenvalues that are repeated (multiple roots) of the
characteristic equation.

Example: Find the eigenvalues and eigenvectors of

        [ 1   2  0 ]
    A = [ 0   1  0 ]
        [ 3  −3  5 ]

    0 = |A − λI| = (1 − λ)²(5 − λ)     So λ1 = 5, λ2 = λ3 = 1

(1 is an eigenvalue of algebraic multiplicity 2)

Find the eigenvectors:

Eigenvector corresponding to λ1 = 5:

                 [ −4   2  0 ]                [ 0 ]       [ 0 ]
    (A − 5I)x =  [  0  −4  0 ] x = 0  →  x1 = [ 0 ]  (or  [ 0 ])
                 [  3  −3  0 ]                [ c ]       [ 1 ]

Now for the eigenvector(s?) corresponding to λ = 1:

                [ 0   2  0 ]                [    a    ]
    (A − I)x =  [ 0   0  0 ] x = 0  →  x2 = [    0    ]
                [ 3  −3  4 ]                [ −(3/4)a ]

rank = 2, so there will be only 1 nontrivial solution.

Three eigenvalues, but only two eigenvectors??


Without three linearly independent eigenvectors, we cannot
diagonalize. We can do the next best thing by using
"generalized eigenvectors."
There are three common ways to compute them:

I. "Bottom-up": For repeated eigenvalue λi, find all solutions xi to:

    (A − λiI) xi = 0

(these will be the regular eigenvectors)

Then for each of these xi's, solve the equation

    (A − λiI) xi+1 = xi

If you can find xi+1's that are linearly independent of all previous vectors xi, then the new vectors are generalized eigenvectors. If the xi+1's are not linearly independent of previously found vectors, continue on by solving:

    (A − λiI) xi+2 = xi+1

and checking for linear independence. Continue this process until a complete set of n vectors is available.
Returning to the example:

                [ 0   2  0 ]                [    a    ]
    (A − I)x =  [ 0   0  0 ] x = 0  →  x1 = [    0    ]
                [ 3  −3  4 ]                [ −(3/4)a ]

Taking a = 4, so that x1 = (4, 0, −3)ᵀ, solve:

    [ 0   2  0 ]       [  4 ]              [  5 ]
    [ 0   0  0 ] x2 =  [  0 ]   →   x2 =   [  2 ]
    [ 3  −3  4 ]       [ −3 ]              [ −3 ]

This vector is linearly independent of the previous eigenvectors, so it is a generalized eigenvector.
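The bottom-up step can be sketched numerically. A caution on this block (not part of the original notes): the scan drops minus signs, so the (3,2) entry of A is assumed to be −3, which makes the numbers in the example consistent. Since A − I is singular, `np.linalg.solve` would fail; least squares picks one particular solution, and any particular solution works.

```python
import numpy as np

# Sign-reconstructed example matrix (assumption: the (3,2) entry is -3)
A = np.array([[1.0,  2.0, 0.0],
              [0.0,  1.0, 0.0],
              [3.0, -3.0, 5.0]])
I = np.eye(3)

x1 = np.array([4.0, 0.0, -3.0])           # regular eigenvector for lambda = 1 (a = 4)
assert np.allclose((A - I) @ x1, 0.0)

# Bottom-up step: solve the singular system (A - I) x2 = x1 by least squares
x2, *_ = np.linalg.lstsq(A - I, x1, rcond=None)
assert np.allclose((A - I) @ x2, x1)      # x2 is a generalized eigenvector

# Modal matrix: regular eigenvector for lambda = 5, then the chain for lambda = 1
M = np.column_stack([np.array([0.0, 0.0, 1.0]), x1, x2])
J = np.linalg.inv(M) @ A @ M
assert np.allclose(J, [[5, 0, 0], [0, 1, 1], [0, 0, 1]])   # Jordan form
```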
Now form the modal matrix using the two regular eigenvectors and the generalized eigenvector (put them in the order you found them):

        [ 0   4   5 ]
    M = [ 0   0   2 ]
        [ 1  −3  −3 ]

    (columns: regular, regular, generalized)
Compute similarity transformation:

    Ā = M⁻¹AM = [ 3/4  −3/8  1 ] [ 1   2  0 ] [ 0   4   5 ]   [ 5  0  0 ]
                [ 1/4  −5/8  0 ] [ 0   1  0 ] [ 0   0   2 ] = [ 0  1  1 ]
                [  0    1/2  0 ] [ 3  −3  5 ] [ 1  −3  −3 ]   [ 0  0  1 ]

        [ 5  0  0 ]
    Ā = [ 0  1  1 ]
        [ 0  0  1 ]

This is called a Jordan (canonical) form. There are two "Jordan blocks." In "block form," this is a "block-diagonal" matrix. A Jordan block has the general form:

    [ λ  1  0  ⋯  0 ]
    [ 0  λ  1  ⋱  ⋮ ]
    [ ⋮  ⋱  ⋱  ⋱  0 ]
    [ ⋮     ⋱  ⋱  1 ]
    [ 0  ⋯  ⋯  0  λ ]

for the repeated eigenvalue λ.
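For checking work by machine, sympy can compute a modal matrix and Jordan form symbolically. This sketch (not in the original notes) uses the 3x3 example; the minus sign in A is a reconstruction, since the scan drops signs.

```python
import sympy as sp

# Sign-reconstructed example matrix (assumption: the (3,2) entry is -3)
A = sp.Matrix([[1,  2, 0],
               [0,  1, 0],
               [3, -3, 5]])

# sympy returns a modal matrix P and the Jordan form J, with A = P J P^-1
P, J = A.jordan_form()
print(J)
assert A == P * J * P.inv()
assert sorted(J[i, i] for i in range(3)) == [1, 1, 5]
```

Note that sympy may order the Jordan blocks differently from the hand computation; the block sizes and eigenvalues are what matter.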


NOTE that just because an eigenvalue is repeated doesn't
mean we will need generalized eigenvectors.
Example:
        [ 1  0  1 ]
    A = [ 0  1  0 ]     (For triangular matrices, the eigenvalues will lie on the diagonal.)
        [ 0  0  2 ]

    |A − λI| = | 1−λ   0    1  |
               |  0   1−λ   0  |  = (1 − λ)²(2 − λ) = 0
               |  0    0   2−λ |

Find eigenvectors: For λ = 2:

                 [ −1   0  1 ]                [ 1 ]
    (A − 2I)x =  [  0  −1  0 ] x = 0  →  x1 = [ 0 ]
                 [  0   0  0 ]                [ 1 ]

For λ = 1:

                [ 0  0  1 ]                [ 1 ]        [ 0 ]
    (A − I)x =  [ 0  0  0 ] x = 0  →  x2 = [ 0 ],  x3 = [ 1 ]
                [ 0  0  1 ]                [ 0 ]        [ 0 ]

The rank deficiency of this matrix is TWO, so the dimension of its null space is TWO, so there are TWO linearly independent vectors such that the equality holds.

Altogether, we have three vectors, giving a modal matrix of:

        [ 1  0  1 ]
    M = [ 0  1  0 ]
        [ 0  0  1 ]

And a diagonalized form:

                   [ 1  0  0 ]
    Ā = M⁻¹AM =    [ 0  1  0 ]
                   [ 0  0  2 ]
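The "no generalized eigenvectors needed" situation can be detected by comparing multiplicities. A small numpy check (not in the original notes): the rank deficiency of A − λI (geometric multiplicity) equals the algebraic multiplicity here, so a full set of regular eigenvectors exists.

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])
n = A.shape[0]

# For the repeated eigenvalue lambda = 1:
lam = 1.0
gm = n - np.linalg.matrix_rank(A - lam * np.eye(n))  # rank deficiency of A - I
am = 2                                               # order of the root (1 - lam)^2
print(gm)   # 2 -> full set of regular eigenvectors, no generalized ones needed
assert gm == am
```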
Another way to compute the generalized eigenvectors is somewhat more algorithmic:

II. "Top-down" method: First we will need the definition of a special integer known as the index ηi of the eigenvalue λi:

    ηi = smallest integer k such that rank (A − λiI)^k = n − mi

    (n − mi = matrix size − algebraic multiplicity)

ηi is also the size of the largest Jordan block.

Now for the algorithm itself: search for all linearly independent solutions of the equations:

    (A − λiI)^ηi x = 0
    (A − λiI)^(ηi−1) x ≠ 0

Denote these solutions v1¹, …, vmi¹. There will be no more than mi of them (why?). Because

    rank (A − λiI)^ηi = n − mi

    # solutions = n − (n − mi) = mi
Now compute a different chain of generalized eigenvectors for each j = 1, …, mi:

    (A − λiI) vj¹ = vj²
    (A − λiI) vj² = vj³               chain of generalized eigenvectors
        ⋮
    (A − λiI) vj^(ηi−1) = vj^(ηi)
    (A − λiI) vj^(ηi) = 0             ← REGULAR eigenvector ending each chain

These will be chains of length ηi. If chains of shorter length are needed, start with

    (A − λiI)^(ηi−1) x = 0
    (A − λiI)^(ηi−2) x ≠ 0     etc.
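The index definition above translates directly into a rank computation on matrix powers. This helper (not from the notes; the matrix signs are reconstructed as before) finds the smallest k with rank((A − λI)^k) = n − m:

```python
import numpy as np

def eigenvalue_index(A, lam, m, tol=1e-9):
    """Smallest k such that rank((A - lam*I)^k) == n - m,
    where m is the algebraic multiplicity of lam."""
    n = A.shape[0]
    B = A - lam * np.eye(n)
    P = np.eye(n)
    for k in range(1, n + 1):
        P = P @ B                     # P = B^k
        if np.linalg.matrix_rank(P, tol) == n - m:
            return k
    raise ValueError("index not found; check lam and m")

# 3x3 example (assumption: the (3,2) entry of A is -3)
A = np.array([[1.0,  2.0, 0.0],
              [0.0,  1.0, 0.0],
              [3.0, -3.0, 5.0]])
print(eigenvalue_index(A, 1.0, 2))   # 2: largest Jordan block for lambda = 1 is 2x2
```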
EXAMPLE:

        [ 1   2  0 ]
    A = [ 0   1  0 ]        λ1 = λ2 = 1,  λ3 = 5
        [ 3  −3  5 ]

    n = 3,   m1 = 2,   n − m1 = 1

Consider λ = 1:

    (A − I) = [ 0   2  0 ]        rank (A − I) = 2
              [ 0   0  0 ]
              [ 3  −3  4 ]

    (A − I)² = [  0   0   0 ]     rank (A − I)² = 1
               [  0   0   0 ]
               [ 12  −6  16 ]

Index of λ = 1 is 2.
So solve

    (A − I)² x1 = 0
    (A − I) x1 ≠ 0

to get

         [ 1 ]
    x1 = [ 2 ]     (generalized)
         [ 0 ]

Now generating the chain:

                        [  4 ]
    x2 = (A − I) x1 =   [  0 ]
                        [ −3 ]

The chain stops here, and x2 is a regular eigenvector, as can be verified by

    (A − I) x2 = 0

Note that if there are other linearly independent solutions to

    (A − I)² x = 0

we can initiate different chains.

Finally, for this example, we note that x = [0 0 1]ᵀ is the regular eigenvector corresponding to λ3 = 5, so we get (chain entered in reverse order):

        [ 0   4  1 ]                    [ 5  0  0 ]
    M = [ 0   0  2 ]     Ā = M⁻¹AM =    [ 0  1  1 ]
        [ 1  −3  0 ]                    [ 0  0  1 ]

    (columns: regular, regular, generalized)
ANOTHER EXAMPLE:

        [ 0  0  1  0 ]
    A = [ 0  0  0  1 ]        λ1 = λ2 = λ3 = λ4 = 0,   m1 = 4,   n − m1 = 0
        [ 0  0  0  0 ]
        [ 0  0  0  0 ]

                          [ 0  0  1  0 ]
    rank (A − 0I) = rank  [ 0  0  0  1 ]  = 2
                          [ 0  0  0  0 ]
                          [ 0  0  0  0 ]
                           [ 0  0  0  0 ]
    rank (A − 0I)² = rank  [ 0  0  0  0 ]  = 0
                           [ 0  0  0  0 ]
                           [ 0  0  0  0 ]

so the index is η = 2 = length of the longest chain of eigenvectors.
Find 2 linearly independent solutions to

    (A − λI)² x1 = (A)² x1 = 0
    (A − λI) x1 = (A) x1 ≠ 0

Try x1 = [1 0 0 0]ᵀ:   (A − 0I) x1 = 0    ✗ No

Same for x1 = [0 1 0 0]ᵀ:    ✗ No

Try

         [ 0 ]                      [ 1 ]
    x1 = [ 0 ] :    (A − 0I) x1 =   [ 0 ]
         [ 1 ]                      [ 0 ]
         [ 0 ]                      [ 0 ]
     (generalized)               (regular)

and also:

         [ 0 ]                      [ 0 ]
    x1 = [ 0 ] :    (A − 0I) x1 =   [ 1 ]
         [ 0 ]                      [ 0 ]
         [ 1 ]                      [ 0 ]
     (generalized)               (regular)
so now,

        [ 1  0  0  0 ]                      [ 0  1  0  0 ]
    M = [ 0  0  1  0 ],     Ā = M⁻¹AM =     [ 0  0  0  0 ]
        [ 0  1  0  0 ]                      [ 0  0  0  1 ]
        [ 0  0  0  1 ]                      [ 0  0  0  0 ]

    (columns of M: regular, generalized, regular, generalized)
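The two-chain structure of this 4x4 example is easy to verify by machine. A numpy check (not in the original notes): applying the similarity transform with the modal matrix built from the two chains should produce two 2x2 Jordan blocks for λ = 0.

```python
import numpy as np

# The nilpotent 4x4 example
A = np.zeros((4, 4))
A[0, 2] = 1.0
A[1, 3] = 1.0

# Modal matrix: columns are (regular, generalized, regular, generalized)
M = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])

A_bar = np.linalg.inv(M) @ A @ M
expected = np.array([[0, 1, 0, 0],     # two 2x2 Jordan blocks for lambda = 0
                     [0, 0, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 0, 0]], dtype=float)
assert np.allclose(A_bar, expected)
print(A_bar)
```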
III. "Adjoint" method. This method requires computation
of the adjoint of A and must be done "by hand." It is
relatively tedious to do, but there is an example in
Brogan (page 257).

More discussion of eigenvalues, eigenvectors, generalized eigenvectors, and Jordan forms. Some facts:

The eigenvectors of a matrix that correspond to distinct eigenvalues are linearly independent.

When an eigenvalue is repeated (algebraic multiplicity > 1), we don't always require generalized eigenvectors. For example, an eigenvalue with algebraic multiplicity m may have p (≤ m) linearly independent regular eigenvectors. We then would have to find only m − p generalized eigenvectors, for that eigenvalue, to get the modal matrix.

The geometric multiplicity of eigenvalue λ is defined to be equal to the rank deficiency (degeneracy) of the matrix A − λI. It is the number of linearly independent (regular) eigenvectors we can find associated with the eigenvalue.

Recall from the example:

                    [ 0   2  0 ]        n = 3
    (A − λI)|λ=1 x = [ 0   0  0 ] x      rank = 2
                    [ 3  −3  4 ]        rank deficiency = 1

λ = 1 has algebraic multiplicity 2 and geometric multiplicity 1 (gm = n − rank(A − λI)), so we can find one regular eigenvector.

When we use the modal matrix to find the Jordan form of a matrix, we will get one Jordan block for each regular eigenvector we can find. Similarly, the number of Jordan blocks associated with one repeated eigenvalue will be equal to the geometric multiplicity of that eigenvalue. The algebraic multiplicity of λ will therefore be the sum of the sizes of all the Jordan blocks associated with λ. (The algebraic multiplicity is the order of the root in the characteristic polynomial; the second statement is obvious.)
Example: The matrix

        [ 3  −1   1   1   0   0 ]
        [ 1   1  −1  −1   0   0 ]
    A = [ 0   0   2   0   1   1 ]
        [ 0   0   0   2  −1  −1 ]
        [ 0   0   0   0   1   1 ]
        [ 0   0   0   0   1   1 ]

has |A − λI| = λ(λ − 2)⁵.

So it has eigenvalues:  λ1 = 2, algebraic multiplicity 5
                        λ2 = 0, algebraic multiplicity 1

If we compute the rank of A − λ1I = A − 2I, we get 4, for a rank deficiency of 6 − 4 = 2. So gm = 2. (Why?)

NOTE: The A matrix itself has rank deficiency equal to the geometric multiplicity of any zero eigenvalues.

Therefore, in the Jordan canonical form for this matrix, we will have 2 Jordan blocks for the eigenvalue λ = 2 (and of course one trivial 1x1 block corresponding to λ = 0, because the corresponding column of the modal matrix is a regular eigenvector). We can calculate 2 regular eigenvectors and will need 3 generalized eigenvectors.
When we go through the exercise of finding the Jordan form, we get:

    [ 2  1  0  0  0  0 ]
    [ 0  2  1  0  0  0 ]
    [ 0  0  2  0  0  0 ]
    [ 0  0  0  2  1  0 ]
    [ 0  0  0  0  2  0 ]
    [ 0  0  0  0  0  0 ]

How could you tell there is a 3x3 block and a 2x2 block, rather than a 4x4 and a 1x1? (The number of regular eigenvectors.) Recall that the index of the eigenvalue is the smallest integer ηi such that rank (A − λiI)^ηi = n − mi. For this matrix and eigenvalue λ1 = 2, η1 = 3, and this will be the size of the largest Jordan block associated with that eigenvalue.
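The block structure can be read off from the ranks of powers of A − 2I. A numpy sketch (not in the original notes; the off-diagonal signs of A are reconstructed, since the scan loses minus signs, chosen so that |A − λI| = λ(λ − 2)⁵ as stated):

```python
import numpy as np

# Sign-reconstructed 6x6 example (signs are an assumption consistent
# with the stated characteristic polynomial and multiplicities)
A = np.array([[3, -1,  1,  1,  0,  0],
              [1,  1, -1, -1,  0,  0],
              [0,  0,  2,  0,  1,  1],
              [0,  0,  0,  2, -1, -1],
              [0,  0,  0,  0,  1,  1],
              [0,  0,  0,  0,  1,  1]], dtype=float)

B = A - 2.0 * np.eye(6)
ranks = [np.linalg.matrix_rank(np.linalg.matrix_power(B, k)) for k in (1, 2, 3)]
print(ranks)   # [4, 2, 1]: gm = 6 - 4 = 2 blocks for lambda = 2; index 3 = largest block
assert ranks == [4, 2, 1]
```

The rank first hits n − m1 = 6 − 5 = 1 at the third power, so the index is 3 and the largest block is 3x3; with gm = 2 blocks summing to 5, the other block must be 2x2.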
A Geometric Interpretation of All This Stuff:

Definition: Let X1 be a subspace of linear vector space X. This subspace is said to be A-invariant if for every vector x ∈ X1, Ax ∈ X1.

Definition: The set of all (regular) eigenvectors corresponding to an eigenvalue λi forms a basis of a subspace of X, called the eigenspace of λi. This also happens to be the null space of a transformation defined as A − λiI.

Theorem: The eigenspace of λi is A-invariant and has dimension equal to the degeneracy of A − λiI.

Proof: Denote the eigenspace of λi as Ni. We have already seen that the number of eigenvectors we will find is equal to qi, the rank deficiency of A − λiI. So we will have a basis of Ni consisting of qi vectors, so the dimension of Ni is qi.

Now if we take a vector x ∈ Ni, we can expand it in the basis of eigenvectors e1, …, e_qi as

    x = Σ_{k=1}^{qi} a_k e_k ,

where the a_k's are coefficients. Then applying operator A:

    Ax = A Σ_{k=1}^{qi} a_k e_k = Σ_{k=1}^{qi} a_k (A e_k) = Σ_{k=1}^{qi} a_k (λi e_k) = Σ_{k=1}^{qi} (a_k λi) e_k ∈ Ni

So Ax is in Ni by virtue of it being a linear combination of the basis vectors. ∎
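The invariance argument can be demonstrated numerically with the earlier triangular example, where the eigenspace of λ = 1 is the x1-x2 plane (a sketch, not in the original notes):

```python
import numpy as np

# Triangular example from earlier; eigenspace of lambda = 1 is span{e1, e2}
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])

rng = np.random.default_rng(0)
for _ in range(5):
    a = rng.standard_normal(2)
    x = np.array([a[0], a[1], 0.0])   # arbitrary x in the eigenspace (the plane)
    Ax = A @ x
    assert np.isclose(Ax[2], 0.0)     # Ax stays in the plane: third component stays 0

# A vector NOT in the subspace need not be mapped into it:
y = np.array([0.0, 0.0, 1.0])
print(A @ y)   # (1, 0, 2): still outside the plane
```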
Picture: a plane (subspace) formed by eigenvectors e1 and e2 of A.

[Figure: If a vector x starts out in the subspace, it stays in the subspace when A acts on it (Ax lies in the plane). If a vector y is NOT in the subspace, the action of A doesn't necessarily put it there (Ay need not lie in the plane).]
Chapter 5: Functions of Vectors and Matrices

⟨y, Ax⟩ (= yᵀAx): "Bilinear Form"

⟨x, Ax⟩ (= xᵀAx): "Quadratic Form"

Note that because xᵀAx is a scalar,

    xᵀAx = (xᵀAx)ᵀ = xᵀAᵀx ,

we can write

    xᵀAx = ½ (xᵀAx + xᵀAᵀx) = xᵀ [ (A + Aᵀ) / 2 ] x ,

so any quadratic form can be written as a quadratic form with a symmetric A-matrix. We therefore treat all quadratic forms as if they contained symmetric matrices.
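The symmetrization identity above is easy to spot-check numerically (a sketch, not in the original notes): the quadratic form only "sees" the symmetric part of A.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))          # an arbitrary (non-symmetric) matrix
A_sym = (A + A.T) / 2                    # its symmetric part

# x^T A x == x^T [(A + A^T)/2] x for every x
for _ in range(10):
    x = rng.standard_normal(4)
    assert np.isclose(x @ A @ x, x @ A_sym @ x)
```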
DEFINITIONS: Let Q = xᵀAx.

1. Q (or A) is positive definite iff: ⟨x, Ax⟩ > 0 for all x ≠ 0.

2. Q (or A) is positive semidefinite if: ⟨x, Ax⟩ ≥ 0 for all x ≠ 0.

3. Q (or A) is negative definite iff: ⟨x, Ax⟩ < 0 for all x ≠ 0.

4. Q (or A) is negative semidefinite if: ⟨x, Ax⟩ ≤ 0 for all x ≠ 0.

5. Q (or A) is indefinite if: ⟨x, Ax⟩ > 0 for some x ≠ 0, and ⟨x, Ax⟩ < 0 for other x ≠ 0.

Tests for definiteness of matrix A in terms of its eigenvalues λi:

    Matrix A is . . .            If the real parts of eigenvalues λi of A are:

    1. Positive definite         All > 0
    2. Positive semidefinite     All ≥ 0
    3. Negative definite         All < 0
    4. Negative semidefinite     All ≤ 0
    5. Indefinite                Some Re(λi) > 0, some Re(λi) < 0

See book for tests involving leading principal minors.
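The eigenvalue tests above can be packaged as a small classifier (a sketch, not in the original notes; the function name and tolerance are illustrative). Since quadratic forms only see the symmetric part of A, the code works with the eigenvalues of (A + Aᵀ)/2, which are real.

```python
import numpy as np

def definiteness(A, tol=1e-9):
    """Classify a quadratic form via the eigenvalues of the symmetric part of A."""
    lam = np.linalg.eigvalsh((A + A.T) / 2)   # symmetric part -> real eigenvalues
    if np.all(lam > tol):
        return "positive definite"
    if np.all(lam > -tol):
        return "positive semidefinite"
    if np.all(lam < -tol):
        return "negative definite"
    if np.all(lam < tol):
        return "negative semidefinite"
    return "indefinite"

print(definiteness(np.array([[2.0, 0.0], [0.0, 3.0]])))    # positive definite
print(definiteness(np.array([[1.0, 0.0], [0.0, -1.0]])))   # indefinite
```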
