From our understanding of eigenvalues and eigenvectors we have discovered several things about our operator matrix, $A$. We know that if the eigenvectors of $A$ span $\mathbb{C}^n$ and we know how to express any vector $x$ in terms of $\{v_1, v_2, \dots, v_n\}$, then we have the operator $A$ all figured out. If we have $A$ acting on $x$, then this is equal to $A$ acting on a combination of eigenvectors, which we know proves to be fairly easy!
We are still left with two questions that need to be addressed:

1. When do the eigenvectors $\{v_1, v_2, \dots, v_n\}$ of $A$ span $\mathbb{C}^n$ (assuming $\{v_1, v_2, \dots, v_n\}$ are linearly independent)?
2. How do we express a given vector $x$ in terms of $\{v_1, v_2, \dots, v_n\}$?
Answer to question #1
When do the eigenvectors $\{v_1, v_2, \dots, v_n\}$ of $A$ span $\mathbb{C}^n$?

If $A$ has $n$ distinct eigenvalues ($\lambda_i \neq \lambda_j$ for $i \neq j$, where $i$ and $j$ are integers), then $A$ has $n$ linearly independent eigenvectors $\{v_1, v_2, \dots, v_n\}$, which then span $\mathbb{C}^n$.
The proof of this statement is not very hard, but is not really interesting enough to include here. If you wish to research this idea further, read Strang, G., "Linear Algebra and Its Applications" for the proof.
Furthermore, $n$ distinct eigenvalues means that the characteristic polynomial $\det(\lambda I - A)$ has $n$ distinct roots.
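As a concrete check of this claim (the text specifies no programming language, so the sketch below uses NumPy, and the matrix is a hypothetical example), we can compute the eigenvalues of a small matrix, confirm they are distinct, and verify that the eigenvector matrix has full rank:

```python
import numpy as np

# Hypothetical example: an upper-triangular matrix, so its
# eigenvalues are the diagonal entries 2, 3, 5 (all distinct).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

# lam holds the eigenvalues; the columns of V are the eigenvectors.
lam, V = np.linalg.eig(A)

# The eigenvalues are distinct...
assert len(set(np.round(lam.real, 6))) == len(lam)

# ...so the n eigenvectors are linearly independent: V has full rank,
# and its columns span C^n.
assert np.linalg.matrix_rank(V) == A.shape[0]
```

With distinct eigenvalues the rank check always passes; for a matrix with repeated eigenvalues (e.g. a Jordan block) it can fail, which is exactly why the distinctness hypothesis matters.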
Answer to question #2
How do we express a given vector $x$ in terms of $\{v_1, v_2, \dots, v_n\}$?

We want to find $\{\alpha_1, \alpha_2, \dots, \alpha_n\} \in \mathbb{C}$ such that

$$x = \alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_n v_n$$

In order to find this set of variables, we will begin by collecting the vectors $\{v_1, v_2, \dots, v_n\}$ as columns in an $n \times n$ matrix $V$:

$$V = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix}$$

Now the expansion above becomes

$$x = V \begin{pmatrix} \alpha_1 \\ \vdots \\ \alpha_n \end{pmatrix}$$

or

$$x = V\alpha$$

which gives us an easy form to solve for our variables in question, $\alpha$:

$$\alpha = V^{-1} x$$
Note that $V$ is invertible since it has $n$ linearly independent columns.
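A minimal NumPy sketch of this step (the matrix and vector are hypothetical examples): rather than forming $V^{-1}$ explicitly, `np.linalg.solve` finds $\alpha$ from $V\alpha = x$ directly.

```python
import numpy as np

# Hypothetical example matrix with distinct eigenvalues 2 and 5.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, V = np.linalg.eig(A)   # columns of V are the eigenvectors

x = np.array([5.0, -1.0])

# Solve V alpha = x for the coordinates of x in the eigenvector basis
# (equivalent to alpha = V^{-1} x, but without inverting V).
alpha = np.linalg.solve(V, x)

# Check the expansion x = alpha_1 v_1 + alpha_2 v_2.
assert np.allclose(V @ alpha, x)
```

Solving the linear system is the standard numerical practice here; it gives the same $\alpha$ as multiplying by $V^{-1}$ but avoids computing the inverse.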
Aside
Let us recall our knowledge of functions and their basis and examine the role of $V$ in

$$x = V\alpha$$

where $\alpha$ is just $x$ expressed in a different basis:

$$x = x_1 \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix} + x_2 \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{pmatrix} + \dots + x_n \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{pmatrix}$$

$$x = \alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_n v_n$$

$V^{-1}$ transforms $x$ from the standard basis to the basis $\{v_1, v_2, \dots, v_n\}$.
Matrix diagonalization and output
We can also use the vectors $\{v_1, v_2, \dots, v_n\}$ to represent the output, $b$, of a system:

$$b = Ax = A(\alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_n v_n)$$

$$Ax = \alpha_1 \lambda_1 v_1 + \alpha_2 \lambda_2 v_2 + \dots + \alpha_n \lambda_n v_n = V \Lambda \alpha$$

$$Ax = V \Lambda V^{-1} x$$

where $\Lambda$ is the matrix with the eigenvalues down the diagonal:

$$\Lambda = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix}$$

Finally, we can cancel out the $x$ and are left with a final equation for $A$:

$$A = V \Lambda V^{-1}$$
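The diagonalization can be verified numerically; the NumPy sketch below (the matrix and vector are hypothetical examples) checks $A = V\Lambda V^{-1}$ and that operating with $A$ is the same as changing to the eigenbasis, scaling each coordinate by its eigenvalue, and changing back.

```python
import numpy as np

# Hypothetical example with distinct eigenvalues 2 and 5.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, V = np.linalg.eig(A)
Lambda = np.diag(lam)  # eigenvalues down the diagonal

# Diagonalization: A = V Lambda V^{-1}.
assert np.allclose(V @ Lambda @ np.linalg.inv(V), A)

# Operating with A, step by step:
x = np.array([1.0, -2.0])
alpha = np.linalg.solve(V, x)   # 1. express x in the eigenbasis
scaled = Lambda @ alpha         # 2. scale each coordinate by lambda_i
b = V @ scaled                  # 3. return to the standard basis

assert np.allclose(b, A @ x)    # same result as applying A directly
```

The three numbered steps in the code are exactly the factors $V^{-1}$, $\Lambda$, and $V$ in the decomposition, applied right to left.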
Interpretation
For our interpretation, recall our key formulas:

$$\alpha = V^{-1} x$$
$$b = V \Lambda \alpha$$

We can interpret operating on $x$ with $A$ as:

$$x \;\xrightarrow{\;V^{-1}\;}\; \alpha \;\xrightarrow{\;\Lambda\;}\; \Lambda \alpha \;\xrightarrow{\;V\;}\; V \Lambda \alpha = b$$

where the three steps (arrows) in the above illustration represent the following three operations: