Let $V$ be a vector space with basis $B$. The dimension of $V$, denoted $\dim(V)$, is the cardinality of $B$.
Every vector space has a basis.
Every basis for a vector space has the same cardinality.
Thus $\dim(V)$ is well-defined.
If $\dim(V) < \infty$, we say $V$ is finite dimensional.
vector space | field of scalars | dimension |
---|---|---|
$\mathbb{R}^N$ | $\mathbb{R}$ | $N$ |
$\mathbb{C}^N$ | $\mathbb{C}$ | $N$ |
Every subspace is a vector space, and therefore has its own dimension.
Suppose $S \subseteq V$ is a linearly independent set. Then $|S| \le \dim(V)$.
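As a quick numerical illustration (assuming NumPy; the vectors chosen are arbitrary), linear independence of a finite set of vectors in $\mathbb{R}^N$ can be tested by stacking them as columns and comparing the matrix rank to the number of vectors:

```python
import numpy as np

# Stack candidate vectors in R^3 as the columns of a matrix.
S = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])   # two vectors in R^3

# The set is linearly independent iff the rank equals the number of vectors.
num_vectors = S.shape[1]
rank = np.linalg.matrix_rank(S)
print(rank == num_vectors)   # True: the two columns are independent
print(rank <= S.shape[0])    # True: the rank can never exceed dim(R^3) = 3
```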
Let $V$ be a vector space, and let $U$ and $W$ be subspaces.
We say $V$ is the direct sum of $U$ and $W$, written $V = U \oplus W$, if and only if for every $v \in V$, there exist unique $u \in U$ and $w \in W$ such that $v = u + w$.
If $V = U \oplus W$, then $W$ is called a complement of $U$.
For example, let $V$ be the space of functions on $\mathbb{R}$, $U$ the subspace of even functions, and $W$ the subspace of odd functions, so that $V = U \oplus W$. Uniqueness: if $v = u + w$ and $v = u' + w'$ with $u, u' \in U$ and $w, w' \in W$, then $u - u' = w' - w$ is both odd and even, which implies $u = u'$ and $w = w'$.
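A sketch of this even/odd decomposition for a finite-length signal (an illustrative NumPy example; the signal and the symmetric index convention $n = -4, \dots, 4$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(9)        # signal samples at n = -4, ..., 4

x_rev = x[::-1]                   # x[-n] for indices centered at n = 0
u = 0.5 * (x + x_rev)             # even part: u[n] = u[-n]
w = 0.5 * (x - x_rev)             # odd part:  w[n] = -w[-n]

print(np.allclose(x, u + w))      # the two pieces add back to x
print(np.allclose(u, u[::-1]))    # u is even
print(np.allclose(w, -w[::-1]))   # w is odd
```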
To show that a complement always exists, invoke a basis.
Let $V$ be a vector space over $F$. A norm is a mapping $\|\cdot\| : V \to \mathbb{R}$, denoted by $\|v\|$, such that for all $u, v \in V$ and $\alpha \in F$:

1. $\|v\| > 0$ whenever $v \neq 0$, and $\|0\| = 0$
2. $\|\alpha v\| = |\alpha| \, \|v\|$
3. $\|u + v\| \le \|u\| + \|v\|$ (the triangle inequality)
Euclidean norms:

On $\mathbb{R}^N$: $\|x\| = \left( \sum_{i=1}^{N} x_i^2 \right)^{1/2}$

On $\mathbb{C}^N$: $\|x\| = \left( \sum_{i=1}^{N} |x_i|^2 \right)^{1/2}$
Every norm induces a metric on $V$, namely $d(u, v) = \|u - v\|$, which leads to a notion of "distance" between vectors.
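A small NumPy illustration of the Euclidean norm, its defining properties, and the induced distance (the particular vectors are arbitrary):

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([1.0, -2.0])

print(np.linalg.norm(u))                                   # 5.0, the Euclidean norm
print(np.linalg.norm(2 * u) == 2 * np.linalg.norm(u))      # homogeneity
print(np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v))  # triangle inequality

d = np.linalg.norm(u - v)   # induced metric: the distance between u and v
print(d)
```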
Let $V$ be a vector space over $F$, with $F = \mathbb{R}$ or $\mathbb{C}$. An inner product is a mapping $\langle \cdot\,, \cdot \rangle : V \times V \to F$, denoted $\langle u, v \rangle$, such that

1. $\langle u, v \rangle = \overline{\langle v, u \rangle}$
2. $\langle \alpha u + \beta v, w \rangle = \alpha \langle u, w \rangle + \beta \langle v, w \rangle$
3. $\langle v, v \rangle \ge 0$, with equality if and only if $v = 0$
Standard inner products:

Over $\mathbb{R}^N$: $\langle x, y \rangle = y^T x = \sum_{i=1}^{N} x_i y_i$

Over $\mathbb{C}^N$: $\langle x, y \rangle = y^H x = \sum_{i=1}^{N} x_i \overline{y_i}$
If $y \in \mathbb{C}^N$, then $y^H = \overline{y}^T$ is called the "Hermitian," or "conjugate transpose," of $y$.
If we define $\|v\| = \sqrt{\langle v, v \rangle}$, then $\|\cdot\|$ satisfies the properties of a norm. Hence, every inner product induces a norm.
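The standard inner products and the induced norm can be evaluated numerically; the sketch below assumes NumPy and uses the $\langle x, y \rangle = y^H x$ convention above (np.vdot conjugates its first argument, which matches this convention):

```python
import numpy as np

# Real case: <x, y> = y^T x
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
print(np.dot(y, x))                     # 32.0

# Complex case: <x, y> = y^H x (conjugate the second argument)
xc = np.array([1 + 1j, 2 - 1j])
yc = np.array([3 + 0j, 1j])
print(np.vdot(yc, xc))                  # vdot conjugates its first argument

# The induced norm: ||v|| = sqrt(<v, v>)
print(np.sqrt(np.vdot(xc, xc).real), np.linalg.norm(xc))   # both give the same value
```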
For all $u, v \in V$, $|\langle u, v \rangle| \le \|u\| \, \|v\|$ (the Cauchy-Schwarz inequality). In inner product spaces, we have a notion of the angle between two vectors: $\theta = \arccos\!\left( \dfrac{\langle u, v \rangle}{\|u\| \, \|v\|} \right)$ (for real scalars).
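For instance (the specific vectors are chosen only for illustration), taking $u = (1, 0)^T$ and $v = (1, 1)^T$ in $\mathbb{R}^2$:

$$\langle u, v \rangle = 1 \cdot 1 + 0 \cdot 1 = 1, \qquad \|u\| = 1, \qquad \|v\| = \sqrt{2}, \qquad \theta = \arccos\!\left(\frac{1}{\sqrt{2}}\right) = \frac{\pi}{4}.$$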
$u$ and $v$ are orthogonal if $\langle u, v \rangle = 0$. Notation: $u \perp v$.
If in addition $\|u\| = \|v\| = 1$, we say $u$ and $v$ are orthonormal.
In an orthogonal (orthonormal) set $\{v_1, \dots, v_k\}$, each pair of vectors is orthogonal (orthonormal).
An orthonormal basis is a basis $\{v_1, \dots, v_N\}$ such that $\langle v_i, v_j \rangle = \delta_{ij} = \begin{cases} 1, & i = j \\ 0, & i \neq j \end{cases}$
The standard basis $\{e_1, \dots, e_N\}$ for $\mathbb{R}^N$ or $\mathbb{C}^N$, where $e_i$ has a $1$ in the $i$th entry and zeros elsewhere, is an orthonormal basis.
The normalized DFT basis $u_k = \frac{1}{\sqrt{N}} \left( 1, e^{-j 2\pi k / N}, \dots, e^{-j 2\pi k (N-1)/N} \right)^T$, $k = 0, \dots, N-1$, is an orthonormal basis for $\mathbb{C}^N$.
If the representation of $v$ with respect to an orthonormal basis $\{v_1, \dots, v_N\}$ is $v = \sum_{i=1}^{N} \alpha_i v_i$, then the coefficients are given by $\alpha_i = \langle v, v_i \rangle$.
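A NumPy sketch tying the last two facts together: it builds the normalized DFT basis (using the $e^{-j 2\pi k n / N}$ sign convention assumed above), verifies orthonormality, and recovers a vector from its coefficients $\alpha_k = \langle v, u_k \rangle$. The matrix name U and the test vector are illustrative choices:

```python
import numpy as np

N = 4
n = np.arange(N)
# Normalized DFT basis vectors u_k[n] = exp(-2j*pi*k*n/N) / sqrt(N), stored as columns.
U = np.exp(-2j * np.pi * np.outer(n, np.arange(N)) / N) / np.sqrt(N)

# Orthonormality: U^H U = I
print(np.allclose(U.conj().T @ U, np.eye(N)))

# Expansion coefficients of v in this basis: alpha_k = <v, u_k> = u_k^H v
v = np.array([1.0, 2.0, 3.0, 4.0])
alpha = U.conj().T @ v
print(np.allclose(U @ alpha, v))    # v is recovered from its coefficients
```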
Every inner product space has an orthonormal basis. Any (countable) basis can be made orthonormal by the Gram-Schmidt orthogonalization process.
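A minimal sketch of classical Gram-Schmidt (the helper gram_schmidt below is written here only for illustration, is not a library routine, and assumes the input vectors are linearly independent):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors (classical Gram-Schmidt)."""
    ortho = []
    for v in vectors:
        w = v.astype(complex)
        # Subtract the projection of v onto each previously accepted basis vector.
        for q in ortho:
            w = w - np.vdot(q, w) * q
        ortho.append(w / np.linalg.norm(w))
    return ortho

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
Q = np.column_stack(gram_schmidt(vs))
print(np.allclose(Q.conj().T @ Q, np.eye(3)))   # True: the columns are orthonormal
```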
Let $W \subseteq V$ be a subspace. The orthogonal complement of $W$ is $W^{\perp} = \{ v \in V : \langle v, w \rangle = 0 \ \text{for all} \ w \in W \}$. $W^{\perp}$ is easily seen to be a subspace.
If $\dim(V) < \infty$, then $V = W \oplus W^{\perp}$.
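An illustrative projection-based sketch (assuming NumPy; the subspace $W$ is spanned by the columns of an arbitrary matrix B): every $v$ splits into a piece in $W$ and a piece in $W^{\perp}$:

```python
import numpy as np

# W = span of the columns of B, a subspace of R^3
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Orthogonal projector onto W (least-squares projection matrix)
P = B @ np.linalg.inv(B.T @ B) @ B.T

v = np.array([1.0, 2.0, 3.0])
w = P @ v            # component in W
w_perp = v - w       # component in the orthogonal complement of W

print(np.allclose(v, w + w_perp))            # the two pieces add back to v
print(np.isclose(np.dot(w, w_perp), 0.0))    # the two components are orthogonal
```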
Loosely speaking, a linear transformation is a mapping from one vector space to another that preserves vector space operations.
More precisely, let $V$, $W$ be vector spaces over the same field $F$. A linear transformation is a mapping $T : V \to W$ such that for all $u, v \in V$ and $\alpha, \beta \in F$, $T(\alpha u + \beta v) = \alpha T(u) + \beta T(v)$.
In this class we will be concerned with linear transformations between (real or complex) Euclidean spaces $\mathbb{R}^N$ and $\mathbb{C}^N$, or subspaces thereof.
The image of $T$ is $\mathcal{I}(T) = \{ w \in W : w = T(v) \ \text{for some} \ v \in V \}$.

The nullspace of $T$, also known as the kernel, is $\mathcal{N}(T) = \{ v \in V : T(v) = 0 \}$.
Both the image and the nullspace are easily seen to be subspaces.
Every linear transformation has a matrix representation. If $T : \mathbb{R}^N \to \mathbb{R}^M$, or $T : \mathbb{C}^N \to \mathbb{C}^M$, then $T$ is represented by an $M \times N$ matrix $A = [a_1 \ \cdots \ a_N]$, where $a_i = T(e_i)$ and $e_i$ is the $i$th standard basis vector.
If $x = (x_1, \dots, x_N)^T$, then $T(x) = Ax = \sum_{i=1}^{N} x_i a_i$.

If $A$ represents $T$, then $\mathcal{I}(T)$ is the span of the columns of $A$, and $\mathcal{N}(T) = \{ x : Ax = 0 \}$.
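A sketch of building the matrix representation column by column from $a_i = T(e_i)$ (the transformation T chosen here, a rotation of $\mathbb{R}^2$ by 90 degrees, is only an example):

```python
import numpy as np

# A linear transformation on R^2: rotation by 90 degrees.
def T(v):
    return np.array([-v[1], v[0]])

# Build the matrix representation column by column: a_i = T(e_i).
N = 2
A = np.column_stack([T(e) for e in np.eye(N)])
print(A)                                   # rotation matrix: [[0, -1], [1, 0]]

v = np.array([2.0, 3.0])
print(np.allclose(T(v), A @ v))            # applying T agrees with multiplying by A
```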
The linear transformation/matrix $A$ is invertible if and only if there exists a matrix $B$ such that $AB = BA = I$ (identity).
Only square matrices can be invertible.
Let $A : \mathbb{R}^N \to \mathbb{R}^N$ (or $\mathbb{C}^N \to \mathbb{C}^N$) be linear. The following are equivalent:

1. $A$ is invertible
2. $\mathcal{N}(A) = \{0\}$
3. $\det A \neq 0$
4. the columns of $A$ are linearly independent, i.e., they form a basis
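These conditions can be checked numerically for a concrete matrix (an arbitrary $2 \times 2$ example, assuming NumPy):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

print(np.isclose(np.linalg.det(A), 0.0))        # False: nonzero determinant
print(np.linalg.matrix_rank(A) == A.shape[0])   # True: the columns form a basis

B = np.linalg.inv(A)                            # the inverse exists
print(np.allclose(B @ A, np.eye(2)), np.allclose(A @ B, np.eye(2)))
```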
If $A^T A = I$ (or $A^H A = I$ in the complex case), we say $A$ is orthogonal (or unitary).
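A quick check of the defining property for a real orthogonal matrix (a rotation) and a complex unitary matrix (the normalized DFT matrix used earlier); the particular examples are illustrative:

```python
import numpy as np

theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation matrix is orthogonal

print(np.allclose(Q.T @ Q, np.eye(2)))            # Q^T Q = I

v = np.array([1.0, 2.0])
print(np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v)))   # norms are preserved

# Complex (unitary) example: the normalized DFT matrix
N = 4
n = np.arange(N)
U = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)
print(np.allclose(U.conj().T @ U, np.eye(N)))     # U^H U = I, so U is unitary
```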