These two views of the operation, as a decomposition of a signal or a recomposition of the signal to or from a different basis system, are extremely valuable in signal analysis. The ideas from linear algebra of subspaces, inner product, span, orthogonality, rank, etc. are all important here. The dimensions of the domain and range of the operators may or may not be the same. The matrices may or may not be square and may or may not be of full rank [link], [link].
A set of linearly independent vectors $\{f_k\}$ forms a basis for a vector space if every vector $x$ in the space can be uniquely written

$$x = \sum_k a(k)\, f_k,$$

and the dual basis is defined as the set of vectors $\{\tilde{f}_k\}$ in that space that allows a simple inner product (denoted by parentheses: $(x,y)$) to calculate the expansion coefficients as

$$a(k) = (x, \tilde{f}_k).$$
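As a concrete illustration (a worked example added here with an arbitrarily chosen basis, not one from the original text), take the non-orthogonal basis of $\mathbb{R}^2$ given by $f_1 = (1,0)^T$ and $f_2 = (1,1)^T$. Its dual basis is $\tilde{f}_1 = (1,-1)^T$ and $\tilde{f}_2 = (0,1)^T$, since $(f_j, \tilde{f}_k) = \delta_{jk}$. For $x = (3,2)^T$,

$$a(1) = (x, \tilde{f}_1) = 3 \cdot 1 + 2 \cdot (-1) = 1, \qquad a(2) = (x, \tilde{f}_2) = 2,$$

and indeed $1\, f_1 + 2\, f_2 = (3,2)^T = x$.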
A basis expansion has enough vectors but none extra. It is efficient in that no fewer expansion vectors will represent all the vectors in the space, but it is fragile in that losing one coefficient or one basis vector destroys the ability to exactly represent the signal by [link]. The expansion [link] can be written as a matrix operation

$$x = F\, a,$$
where the columns of $F$ are the basis vectors and the vector $a$ has the expansion coefficients as entries. Equation [link] can also be written as a matrix operation

$$a = \tilde{F}\, x,$$
which has the dual basis vectors as rows of $\tilde{F}$. From [link] and [link], we have

$$x = F\, \tilde{F}\, x.$$
Since this is true for all $x$,

$$F\, \tilde{F} = I,$$

or

$$\tilde{F} = F^{-1},$$

which states that the dual basis vectors are the rows of the inverse of the matrix whose columns are the basis vectors (and vice versa). When the vector set is a basis, $F$ is necessarily square, and from [link] and [link], one can show

$$F\, \tilde{F} = \tilde{F}\, F = I.$$
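A minimal numerical sketch of this relationship (added for illustration; the matrix and signal below are arbitrary examples, not from the original text):

import numpy as np

# Columns of F are the (non-orthogonal) basis vectors f_1 = (1,0), f_2 = (1,1).
F = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Rows of F_tilde are the dual basis vectors.
F_tilde = np.linalg.inv(F)

x = np.array([3.0, 2.0])
a = F_tilde @ x                              # analysis: a = [1, 2]

assert np.allclose(F @ a, x)                 # synthesis recovers x
assert np.allclose(F @ F_tilde, np.eye(2))   # F F_tilde = I
assert np.allclose(F_tilde @ F, np.eye(2))   # F_tilde F = I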
Because this system requires two basis sets, the expansion basis and the dual basis, it is called biorthogonal.
If the basis vectors are not only independent but orthonormal, the basis set is its own dual and the inverse of $F$ is simply its transpose (its conjugate transpose if the vectors are complex).
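For instance (a sketch added here, using a rotation of $\mathbb{R}^2$ as an arbitrary real orthonormal basis):

import numpy as np

theta = 0.3
# Columns of F form an orthonormal basis of R^2 (a rotation).
F = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The basis is its own dual: F^{-1} = F^T.
assert np.allclose(np.linalg.inv(F), F.T)

x = np.array([3.0, 2.0])
a = F.T @ x                    # analysis with the transpose
assert np.allclose(F @ a, x)   # synthesis recovers x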
When done in Hilbert spaces, this decomposition is sometimes called an abstract Fourier expansion [link], [link], [link].
Because many signals are digital representations of voltage, current, force, velocity, pressure, flow, etc., the inner product of the signal with itself (the norm squared) is a measure of the signal energy.

Parseval's theorem states that if the basis system is orthogonal, then the norm squared (or “energy”) is invariant across a change of basis. If a change of basis is made with

$$x = F\, a,$$

then

$$(x, x) = c\, (a, a)$$

for some constant $c$ which can be made unity by normalization if desired.
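The reasoning is one line (a sketch added here, assuming the orthogonal basis vectors all have the same norm, so that $F^{*}F = cI$):

$$(x, x) = (F a)^{*}(F a) = a^{*} F^{*} F\, a = c\, a^{*} a = c\, (a, a),$$

and the cross terms between different basis vectors vanish precisely because of orthogonality.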
For the discrete Fourier transform (DFT) of $x(n)$, which is

$$X(k) = \sum_{n=0}^{N-1} x(n)\, e^{-j 2\pi n k / N},$$

the energy calculated in the time domain, $\sum_{n} |x(n)|^2$, is equal to the norm squared of the frequency coefficients, $\sum_{k} |X(k)|^2$, within a multiplicative constant of $1/N$. This is because the basis functions of the Fourier transform are orthogonal: “the sum of the squares is the square of the sum,” meaning the energy calculated in the time domain is the same as that calculated in the frequency domain. The energy of the signal (the square of the sum) is the sum of the energies at each frequency (the sum of the squares). Because of the orthogonal basis, the cross terms are zero. Although one seldom directly uses Parseval's theorem, its truth is what makes it meaningful to talk about frequency-domain filtering of a time-domain signal. A more general form is known as the Plancherel theorem [link].
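A quick numerical check of this constant (a sketch added here; NumPy's np.fft.fft uses the same sign and scaling convention as the DFT above):

import numpy as np

rng = np.random.default_rng(0)
N = 64
x = rng.standard_normal(N)                 # an arbitrary test signal

X = np.fft.fft(x)                          # DFT as defined above

time_energy = np.sum(np.abs(x) ** 2)
freq_energy = np.sum(np.abs(X) ** 2) / N   # note the 1/N constant

assert np.isclose(time_energy, freq_energy)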