Recall that if $P$ is an orthogonal projection onto a subspace $T$, we can write any $x \in \mathcal{H}$ as $x = \hat{x} + e$, where $\hat{x} \in T$ and $e \perp T$. We now turn to how to actually find $\hat{x}$.
We begin with the finite-dimensional case, assuming that $\{v_1, \ldots, v_N\}$ is a basis for $T$. If $\hat{x} \in T$, then since $e = x - \hat{x} \perp T$ we have that for any $j = 1, \ldots, N$,
\[
\langle x - \hat{x}, v_j \rangle = 0 .
\]
We also note that since $\hat{x} \in T$, we can write $\hat{x} = \sum_{k=1}^{N} c_k v_k$. Thus we obtain
\[
\Big\langle x - \sum_{k=1}^{N} c_k v_k, \, v_j \Big\rangle = 0,
\]
from which we obtain
\[
\langle x, v_j \rangle = \sum_{k=1}^{N} c_k \langle v_k, v_j \rangle .
\]
We know $x$ and $v_1, \ldots, v_N$. Our goal is to find $\hat{x}$. Note that a procedure for calculating the coefficients $c_1, \ldots, c_N$ for any given $x$ is equivalent to one that computes $\hat{x}$.
To find the coefficients $c_1, \ldots, c_N$, observe that the above represents a set of $N$ equations in $N$ unknowns. More compactly, we want to find a vector $c \in \mathbb{C}^N$ such that
\[
G c = b,
\]
where $b_j = \langle x, v_j \rangle$ and $G$ is the $N \times N$ matrix with entries $G_{jk} = \langle v_k, v_j \rangle$, called the "Grammian" or "Gram matrix" of $\{v_1, \ldots, v_N\}$.
One can show, since $v_1, \ldots, v_N$ are linearly independent, that $G$ is positive definite, and hence invertible. Also note that, by construction, $G$ is conjugate symmetric, or "Hermitian", i.e., $G = G^H$, where $G^H$ denotes the conjugate transpose of $G$.
Thus, since $G^{-1}$ exists, we can write
\[
c = G^{-1} b
\]
to calculate $c$, and then $\hat{x} = \sum_{k=1}^{N} c_k v_k$.
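The Gram-matrix procedure above can be sketched numerically. This is a minimal illustration, not from the original text; the names `V`, `x`, and `project_onto_span` are illustrative choices, and vectors in $\mathbb{C}^n$ with the standard inner product stand in for a general Hilbert space.

```python
import numpy as np

def project_onto_span(V, x):
    """Orthogonally project x onto the span of the columns of V.

    V : (n, N) array whose columns v_1, ..., v_N are linearly independent.
    x : (n,) array.
    """
    G = V.conj().T @ V          # Gram matrix: G[j, k] = <v_k, v_j>
    b = V.conj().T @ x          # b[j] = <x, v_j>
    c = np.linalg.solve(G, b)   # solve G c = b for the coefficients
    return V @ c                # x_hat = sum_k c_k v_k

# The residual e = x - x_hat should be orthogonal to every basis vector.
rng = np.random.default_rng(0)
V = rng.standard_normal((5, 2))
x = rng.standard_normal(5)
x_hat = project_onto_span(V, x)
print(np.allclose(V.conj().T @ (x - x_hat), 0))  # True
```

Note that the sketch solves $Gc = b$ with `np.linalg.solve` rather than forming $G^{-1}$ explicitly, which is the standard numerically preferable way to apply $c = G^{-1}b$.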
As a special case, suppose now that $\{v_1, \ldots, v_N\}$ is an orthobasis for $T$. What is $G$? It is just the $N \times N$ identity matrix $I$! Computing $\hat{x}$ just got much easier, since now $c = b$, i.e., $c_k = \langle x, v_k \rangle$. Plugging this back into our formula for $\hat{x}$, we obtain
\[
\hat{x} = P x = \sum_{k=1}^{N} \langle x, v_k \rangle v_k .
\]
Just to verify, note that $P$ is indeed a projection ($P^2 = P$): since $\langle v_k, v_j \rangle = \delta_{jk}$,
\[
P^2 x = \sum_{j=1}^{N} \Big\langle \sum_{k=1}^{N} \langle x, v_k \rangle v_k, \, v_j \Big\rangle v_j
= \sum_{j=1}^{N} \sum_{k=1}^{N} \langle x, v_k \rangle \langle v_k, v_j \rangle v_j
= \sum_{j=1}^{N} \langle x, v_j \rangle v_j = P x .
\]
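The orthobasis special case can be checked numerically as well. In this sketch (names `Q` and `P` are illustrative, not from the source), `np.linalg.qr` supplies an orthonormal basis, the Gram matrix comes out as the identity, and the resulting projection matrix is idempotent:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))
Q, _ = np.linalg.qr(A)            # columns of Q: an orthonormal basis

G = Q.conj().T @ Q                # Gram matrix of an orthobasis
print(np.allclose(G, np.eye(3)))  # True: G is the identity

P = Q @ Q.conj().T                # projection matrix onto span(Q)
print(np.allclose(P @ P, P))      # True: P is idempotent
print(np.allclose(P.conj().T, P)) # True: P is Hermitian
```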
Example: Suppose the space $\mathcal{H}$ and the subspace $T$ are given, with $\{v_1, v_2, v_3, v_4\}$ an orthobasis for $T$, and let $x \in \mathcal{H}$ be given. Our goal is to find the closest (in $L_2$) function in $T$ to $x$. Using $\hat{x} = \sum_{k=1}^{4} \langle x, v_k \rangle v_k$ from before, we can calculate $c_1 = \langle x, v_1 \rangle$, $c_2 = \langle x, v_2 \rangle$, $c_3 = \langle x, v_3 \rangle$, and $c_4 = \langle x, v_4 \rangle$. Thus, we have that $\hat{x} = c_1 v_1 + c_2 v_2 + c_3 v_3 + c_4 v_4$.
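A hypothetical numerical stand-in for such an example (the specific functions are not reproduced here): project a vector $x$ onto the span of four orthonormal vectors by computing $c_k = \langle x, v_k \rangle$ and summing $c_k v_k$, then confirm that $\hat{x}$ really is the closest point of the subspace.

```python
import numpy as np

rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((8, 4)))  # orthonormal v_1..v_4
x = rng.standard_normal(8)

c = Q.conj().T @ x              # c_k = <x, v_k>, k = 1, ..., 4
x_hat = Q @ c                   # x_hat = c_1 v_1 + ... + c_4 v_4

# x_hat is the closest point in span{v_1, ..., v_4} to x:
# an arbitrary other element of the span is at least as far away.
y = Q @ rng.standard_normal(4)
print(np.linalg.norm(x - x_hat) <= np.linalg.norm(x - y))  # True
```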