In this section, our linear systems will be n×n matrices of complex numbers. For background on the concepts this module builds on, refer to the basics of linear algebra.
Let A be an n×n matrix; then A is a linear operator on vectors in ℂⁿ, mapping a vector x to the vector Ax.
Through [link] and [link], let us look at the difference between the general case, where A can both rotate and scale a vector, and the eigenvector case.
If v is an eigenvector of A, then only its length changes when A is applied. See [link] and notice how the vector's length is simply scaled by the scalar λ, called the eigenvalue:

Av = λv
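The defining property Av = λv can be checked numerically: apply A to a candidate vector and test whether the result is a scalar multiple of it. The 2×2 matrix and vector below are hypothetical illustration values, since the module's own example values were not preserved in this text.

```python
# A minimal check of the eigenvector definition A v = lambda v.
# The matrix and candidate vector are hypothetical examples.

def matvec(A, v):
    """Multiply a 2x2 matrix (given as a list of rows) by a 2-vector."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[3.0, -1.0],
     [-1.0, 3.0]]
v = [1.0, 1.0]        # candidate eigenvector

Av = matvec(A, v)
lam = Av[0] / v[0]    # if v is an eigenvector, this ratio is lambda

# Eigenvector test: every component of A v must equal lambda * v.
is_eigen = all(abs(Av[i] - lam * v[i]) < 1e-12 for i in range(2))
print(is_eigen, lam)  # True 2.0
```

If the ratio of components were not the same for every entry, the `all(...)` test would fail and v would not be an eigenvector.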
From inspection and your understanding of eigenvectors, find the two eigenvectors, v₁ and v₂, of A. Also, what are the corresponding eigenvalues, λ₁ and λ₂? Do not worry if you are having trouble seeing these values from the information given so far; we will look at more rigorous ways to find them soon.
The eigenvectors you found should be: And the corresponding eigenvalues are
Show that these two vectors, v₁ and v₂, are eigenvectors of the matrix A given above. Also, find the corresponding eigenvalues.
In order to prove that these two vectors are eigenvectors, we will show that they satisfy the requirements stated in the definition. These results show us that A only scales the two vectors (i.e., it changes only their length).
In the above examples, we relied on your understanding of the definition and on some basic observations to find and prove the values of the eigenvectors and eigenvalues. However, as you can probably tell, finding these values will not always be that easy. Below, we walk through a rigorous, mathematical approach to calculating the eigenvalues and eigenvectors of a matrix.
Find λ such that Av = λv for some nonzero v, where 0 is the "zero vector." We will start with the equation Av = λv and work our way down until we find a way to explicitly calculate λ. Rewriting, Av − λv = 0, and so (A − λI)v = 0. In the previous step, we used the fact that λv = λIv, where I is the identity matrix. So, A − λI is just a new matrix.
Given the following matrix A, we can find our new matrix, A − λI.
If (A − λI)v = 0 for some v ≠ 0, then A − λI is not invertible. This means: det(A − λI) = 0. This determinant turns out to be a polynomial expression in λ (of order n). Look at the examples below to see what this means.
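For a 2×2 matrix the determinant condition can be written out by hand: det(A − λI) = λ² − (a + d)λ + (ad − bc), a polynomial of order n = 2 whose roots are the eigenvalues. The sketch below solves it with the quadratic formula; the matrix values are a hypothetical stand-in for the module's example.

```python
import cmath  # complex square root, since eigenvalues may be complex

# For a 2x2 matrix [[a, b], [c, d]], det(A - lambda I) expands to
#   lambda^2 - (a + d) lambda + (a d - b c) = 0,
# so the eigenvalues are the roots of this quadratic.

def eigenvalues_2x2(A):
    (a, b), (c, d) = A
    trace = a + d
    det = a * d - b * c
    disc = cmath.sqrt(trace ** 2 - 4 * det)  # may be complex
    return (trace + disc) / 2, (trace - disc) / 2

A = [[3.0, -1.0],          # hypothetical example matrix
     [-1.0, 3.0]]
print(eigenvalues_2x2(A))  # ((4+0j), (2+0j))
```

For n > 2 the same idea holds, but the polynomial is of higher order and is normally solved numerically rather than by a closed-form formula.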
Starting with matrix A (shown below), we will find the polynomial expression in λ whose roots are our eigenvalues.
We repeat the process for a second matrix (shown below), again finding the polynomial expression in λ whose roots are our eigenvalues.
If you have not already noticed it, calculating the eigenvalues is equivalent to calculating the roots of the characteristic polynomial, det(A − λI) = 0.
Given an eigenvalue λᵢ, the associated eigenvectors are found from (A − λᵢI)v = 0, a set of n equations with n unknowns. Simply solve this system of equations to find the eigenvectors.
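In the 2×2 case, the system (A − λᵢI)v = 0 has one free parameter (eigenvectors are only determined up to scale), so a single row of the system pins down the direction. A sketch, again using a hypothetical example matrix:

```python
# Recover an eigenvector from a known eigenvalue by solving
# (A - lam I) v = 0 for a 2x2 matrix.  Example values are hypothetical.

def eigenvector_2x2(A, lam, tol=1e-12):
    (a, b), (c, d) = A
    # Row 1 of (A - lam I) v = 0 reads: (a - lam) v1 + b v2 = 0.
    if abs(b) > tol:
        return [b, lam - a]   # satisfies row 1 (and row 2, since det = 0)
    # Row 2 reads: c v1 + (d - lam) v2 = 0.
    if abs(c) > tol:
        return [lam - d, c]
    # Off-diagonals are zero, so A is diagonal: pick the basis vector
    # whose diagonal entry matches lam.
    return [1.0, 0.0] if abs(a - lam) < tol else [0.0, 1.0]

A = [[3.0, -1.0],
     [-1.0, 3.0]]
print(eigenvector_2x2(A, 2.0))  # [-1.0, -1.0]  (a multiple of [1, 1])
print(eigenvector_2x2(A, 4.0))  # [-1.0, 1.0]
```

Because any nonzero scalar multiple of an eigenvector is also an eigenvector, the returned vectors are representatives of a direction, not unique answers.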
Say the eigenvectors of A, {v₁, v₂, …, vₙ}, span ℂⁿ, meaning that they are linearly independent and we can write any x ∈ ℂⁿ as

x = α₁v₁ + α₂v₂ + ⋯ + αₙvₙ

where the αᵢ are scalar coefficients.
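Finding the coefficients αᵢ amounts to solving the linear system whose columns are the eigenvectors. For two vectors in the plane this can be done directly with Cramer's rule; the eigenvectors and x below are hypothetical illustration values.

```python
# Express an arbitrary x as a weighted sum of two eigenvectors:
# solve [v1 | v2] [a1, a2]^T = x via Cramer's rule (2x2 case).
# The vectors are hypothetical examples.

def expansion_coeffs(v1, v2, x):
    det = v1[0] * v2[1] - v2[0] * v1[1]  # nonzero iff v1, v2 independent
    a1 = (x[0] * v2[1] - v2[0] * x[1]) / det
    a2 = (v1[0] * x[1] - x[0] * v1[1]) / det
    return a1, a2

v1, v2 = [1.0, 1.0], [1.0, -1.0]
x = [5.0, 1.0]
a1, a2 = expansion_coeffs(v1, v2, x)
print(a1, a2)  # 3.0 2.0  ->  x = 3 v1 + 2 v2
```

Checking: 3·[1, 1] + 2·[1, −1] = [5, 1], which recovers x.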
For the following matrix A and vector x, solve for their product Ax. Try solving it using two different methods: directly and using eigenvectors.
Direct method (use basic matrix multiplication).

Eigenvector method (use the eigenvectors and eigenvalues we found earlier for this same matrix). As shown in [link], we want to represent x as a sum of its scaled eigenvectors. For this case, we have x = α₁v₁ + α₂v₂. Therefore,

Ax = A(α₁v₁ + α₂v₂) = α₁λ₁v₁ + α₂λ₂v₂

Notice that this method using eigenvectors required no matrix multiplication. It may have seemed more complicated here, but just imagine A being really big, or even just a few dimensions larger!
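The two methods can be compared side by side. The matrix, eigen-pairs, and x below are hypothetical values chosen so the decomposition x = 3v₁ + 2v₂ holds exactly; both routes must agree.

```python
# Compare direct matrix multiplication with the eigenvector method
# on a hypothetical matrix whose eigen-decomposition is known:
#   A v1 = 2 v1  and  A v2 = 4 v2.

A = [[3.0, -1.0],
     [-1.0, 3.0]]
v1, lam1 = [1.0, 1.0], 2.0
v2, lam2 = [1.0, -1.0], 4.0
x = [5.0, 1.0]            # x = 3 v1 + 2 v2

# Direct method: ordinary matrix-vector multiplication.
direct = [A[0][0] * x[0] + A[0][1] * x[1],
          A[1][0] * x[0] + A[1][1] * x[1]]

# Eigenvector method: A x = a1 lam1 v1 + a2 lam2 v2 --
# only scaling and adding vectors, no matrix product at all.
a1, a2 = 3.0, 2.0
via_eigen = [a1 * lam1 * v1[i] + a2 * lam2 * v2[i] for i in range(2)]

print(direct)     # [14.0, -2.0]
print(via_eigen)  # [14.0, -2.0]
```

For an n×n matrix, the direct product costs on the order of n² multiplications per application, while the eigenvector route replaces each application of A with n scalar multiplications once the expansion coefficients are known, which is what makes the decomposition so useful for repeated applications of A.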