While the spark, null space property (NSP), and restricted isometry property (RIP) all provide guarantees for the recovery of sparse signals, verifying that a general matrix satisfies any of these properties has a combinatorial computational complexity, since in each case one must essentially consider $\binom{N}{K}$ submatrices. In many settings it is preferable to use properties of $\Phi$ that are easily computable to provide more concrete recovery guarantees. The coherence of a matrix is one such property [link] , [link] .
The coherence of a matrix $\Phi$, $\mu(\Phi)$, is the largest absolute inner product between any two columns $\phi_i$, $\phi_j$ of $\Phi$:
$$\mu(\Phi) = \max_{1 \le i < j \le N} \frac{|\langle \phi_i, \phi_j \rangle|}{\|\phi_i\|_2 \|\phi_j\|_2}.$$
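This definition translates directly into code. Below is a minimal NumPy sketch (the function name `coherence` is our own, not from the text):

```python
import numpy as np

def coherence(Phi):
    # Normalize each column to unit l2 norm so that the Gram entries
    # equal the normalized inner products in the definition of mu(Phi).
    Phi = Phi / np.linalg.norm(Phi, axis=0)
    # Off-diagonal entries of |Phi^T Phi| are the absolute inner
    # products between distinct columns.
    G = np.abs(Phi.T @ Phi)
    np.fill_diagonal(G, 0.0)  # exclude the i = j terms
    return G.max()
```

For an orthonormal basis such as the identity the coherence is 0, while any matrix with more columns than rows must have strictly positive coherence.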
It is possible to show that the coherence of a matrix is always in the range $\mu(\Phi) \in \left[ \sqrt{\tfrac{N-M}{M(N-1)}}, \, 1 \right]$; the lower bound is known as the Welch bound [link] , [link] , [link] . Note that when $N \gg M$, the lower bound is approximately $\mu(\Phi) \ge 1/\sqrt{M}$.
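As a numerical sanity check, one can verify that the coherence of a random unit-norm matrix sits above the Welch bound (a sketch; the helper name `welch_bound` and the Gaussian test matrix are our own choices):

```python
import numpy as np

def welch_bound(M, N):
    # Lower bound sqrt((N - M) / (M * (N - 1))) on the coherence
    # of any M x N matrix.
    return np.sqrt((N - M) / (M * (N - 1)))

rng = np.random.default_rng(0)
M, N = 8, 64
Phi = rng.standard_normal((M, N))
Phi /= np.linalg.norm(Phi, axis=0)   # unit-norm columns
G = np.abs(Phi.T @ Phi)
np.fill_diagonal(G, 0.0)
mu = G.max()
assert mu >= welch_bound(M, N)       # Welch bound holds
# For N >> M the bound approaches 1/sqrt(M):
# welch_bound(8, 64) = 1/3, while 1/sqrt(8) is about 0.354.
```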
One can sometimes relate coherence to the spark, NSP, and RIP. For example, the coherence and spark properties of a matrix can be related by employing the Gershgorin circle theorem [link] , [link] .
The eigenvalues of an $N \times N$ matrix $M$ with entries $m_{ij}$, $1 \le i, j \le N$, lie in the union of $N$ discs $d_i = d_i(c_i, r_i)$, $1 \le i \le N$, centered at $c_i = m_{ii}$ and with radius $r_i = \sum_{j \ne i} |m_{ij}|$.
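The theorem is easy to check numerically; in this sketch every eigenvalue of a random matrix is verified to fall in at least one Gershgorin disc:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6))
centers = np.diag(A)                             # c_i = m_ii
radii = np.abs(A).sum(axis=1) - np.abs(centers)  # r_i = sum_{j != i} |m_ij|
# Every eigenvalue must lie in at least one disc |z - c_i| <= r_i.
for lam in np.linalg.eigvals(A):
    assert np.any(np.abs(lam - centers) <= radii + 1e-12)
```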
Applying this theorem to the Gram matrix $G = \Phi_\Lambda^T \Phi_\Lambda$ leads to the following straightforward result.
For any matrix $\Phi$,
$$\mathrm{spark}(\Phi) \ge 1 + \frac{1}{\mu(\Phi)}.$$
Since $\mathrm{spark}(\Phi)$ does not depend on the scaling of the columns, we can assume without loss of generality that $\Phi$ has unit-norm columns. Let $\Lambda \subseteq \{1, \ldots, N\}$ with $|\Lambda| = p$ determine a set of indices. We consider the restricted Gram matrix $G = \Phi_\Lambda^T \Phi_\Lambda$, which satisfies $g_{ii} = 1$ for $1 \le i \le p$ and $|g_{ij}| \le \mu(\Phi)$ for $1 \le i, j \le p$, $i \ne j$.
From [link] , if $(p-1)\mu(\Phi) < 1$ then the matrix $G$ is positive definite, so that the columns of $\Phi_\Lambda$ are linearly independent. Thus, the spark condition implies $(p-1)\mu(\Phi) \ge 1$ or, equivalently, $p \ge 1 + 1/\mu(\Phi)$ for all $p \ge \mathrm{spark}(\Phi)$, yielding $\mathrm{spark}(\Phi) \ge 1 + 1/\mu(\Phi)$.
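The lemma can be checked by brute force on a small example (a sketch; the `spark` function below enumerates all column subsets and is only feasible for tiny $N$):

```python
import numpy as np
from itertools import combinations

def spark(Phi):
    # Smallest number of linearly dependent columns (combinatorial search).
    M, N = Phi.shape
    for p in range(1, N + 1):
        for cols in combinations(range(N), p):
            if np.linalg.matrix_rank(Phi[:, list(cols)]) < p:
                return p
    return N + 1  # no dependent subset: all columns independent

# Columns e1, e2, (1,1)/sqrt(2), (-1,1)/sqrt(2) in R^2:
Phi = np.array([[1., 0., 1., -1.],
                [0., 1., 1.,  1.]])
Phi /= np.linalg.norm(Phi, axis=0)
G = np.abs(Phi.T @ Phi)
np.fill_diagonal(G, 0.0)
mu = G.max()                   # = 1/sqrt(2)
assert spark(Phi) >= 1 + 1/mu  # 3 >= 1 + sqrt(2)
```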
By merging Theorem 1 from "Null space conditions" with [link] , we can pose the following condition on $\mu(\Phi)$ that guarantees uniqueness.
If
$$K < \frac{1}{2}\left(1 + \frac{1}{\mu(\Phi)}\right),$$
then for each measurement vector $y \in \mathbb{R}^M$ there exists at most one signal $x \in \Sigma_K$ such that $y = \Phi x$.
[link] , together with the Welch bound, provides an upper bound on the level of sparsity $K$ that guarantees uniqueness using coherence: $K = O(\sqrt{M})$. Another straightforward application of the Gershgorin circle theorem ( [link] ) connects the RIP to the coherence property.
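The resulting sparsity level is easy to tabulate (a sketch; the helper name `max_unique_sparsity` is our own):

```python
import numpy as np

def max_unique_sparsity(mu):
    # Largest integer K satisfying K < (1 + 1/mu) / 2.
    bound = 0.5 * (1.0 + 1.0 / mu)
    return int(np.ceil(bound)) - 1  # largest integer strictly below bound

# With the approximate Welch-bound coherence mu = 1/sqrt(M),
# the guaranteed sparsity level grows like sqrt(M)/2:
for M in (16, 64, 256):
    print(M, max_unique_sparsity(1.0 / np.sqrt(M)))
```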
If $\Phi$ has unit-norm columns and coherence $\mu = \mu(\Phi)$, then $\Phi$ satisfies the RIP of order $K$ with $\delta_K = (K-1)\mu$ for all $K < 1/\mu$.
The proof of this lemma is similar to that of [link] .
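For a concrete check of this lemma (a sketch; the identity-plus-Hadamard test matrix is our own choice), concatenating the $8 \times 8$ identity with a normalized Hadamard basis gives $\mu = 1/\sqrt{8}$, and every 2-column submatrix then satisfies the RIP bound with $\delta_2 = \mu$:

```python
import numpy as np
from itertools import combinations

H2 = np.array([[1., 1.], [1., -1.]])
H8 = np.kron(np.kron(H2, H2), H2) / np.sqrt(8)  # orthonormal Hadamard basis
Phi = np.hstack([np.eye(8), H8])                # 8 x 16, unit-norm columns

G = np.abs(Phi.T @ Phi)
np.fill_diagonal(G, 0.0)
mu = G.max()                  # = 1/sqrt(8)
K = 2
assert K < 1.0 / mu           # the lemma's range of validity
delta = (K - 1) * mu          # RIP constant from the lemma

# Squared singular values of every K-column submatrix lie in
# [1 - delta, 1 + delta], i.e. Phi satisfies the RIP of order K.
for cols in combinations(range(Phi.shape[1]), K):
    s2 = np.linalg.svd(Phi[:, list(cols)], compute_uv=False) ** 2
    assert 1 - delta - 1e-9 <= s2.min() and s2.max() <= 1 + delta + 1e-9
```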
The results given here emphasize the need for small coherence $\mu(\Phi)$ for the matrices used in CS. Coherence bounds have been studied both for deterministic and randomized matrices. For example, there are known matrices $\Phi$ of size $M \times M^2$ that achieve the coherence lower bound $\mu(\Phi) = 1/\sqrt{M}$, such as the Gabor frame generated from the Alltop sequence [link] and more general equiangular tight frames [link] . These constructions restrict the number of measurements needed to recover a $K$-sparse signal to be $M = O(K^2 \log N)$. Furthermore, it can be shown that when the entries of $\Phi$ are drawn from a distribution with zero mean and finite variance, then in the asymptotic regime (as $M$ and $N$ grow) the coherence converges to $\mu(\Phi) = \sqrt{(2 \log N)/M}$ [link] , [link] , [link] . Such constructions would allow $M$ to grow asymptotically as $M = O(K^2 \log N)$, matching the known finite-dimensional bounds.
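The Alltop construction can be verified numerically. In this sketch (for prime $M \ge 5$), the Gabor frame of all $M^2$ time-frequency shifts of the Alltop sequence $a_n = e^{2\pi i n^3 / M}/\sqrt{M}$ attains coherence $1/\sqrt{M}$:

```python
import numpy as np

M = 7                                            # prime M >= 5
n = np.arange(M)
a = np.exp(2j * np.pi * n**3 / M) / np.sqrt(M)   # unit-norm Alltop sequence

# Gabor frame: all M^2 translations (np.roll) and modulations of a.
cols = [np.exp(2j * np.pi * l * n / M) * np.roll(a, k)
        for k in range(M) for l in range(M)]
Phi = np.stack(cols, axis=1)                     # M x M^2 frame

G = np.abs(Phi.conj().T @ Phi)
np.fill_diagonal(G, 0.0)
mu = G.max()
assert abs(mu - 1 / np.sqrt(M)) < 1e-9           # meets the bound 1/sqrt(M)
```

Each pair of distinct columns has inner-product magnitude either $0$ (same translation, different modulation) or exactly $1/\sqrt{M}$, which is what makes the frame optimal for this matrix size.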
The measurement bounds dependent on coherence are handicapped by the squared dependence on the sparsity $K$, but it is possible to overcome this bottleneck by shifting the types of guarantees from worst-case/deterministic behavior to average-case/probabilistic behavior [link] , [link] . In this setting, we pose a probabilistic prior on the set of $K$-sparse signals $x \in \Sigma_K$. It is then possible to show that if $\Phi$ has low coherence $\mu(\Phi)$ and spectral norm $\|\Phi\|_2$, and if $K = O(\mu(\Phi)^{-2}/\log N)$, then the signal $x$ can be recovered from its CS measurements $y = \Phi x$ with high probability. Note that if we substitute the Welch-bound coherence $\mu(\Phi) \approx 1/\sqrt{M}$, then we obtain $K = O(M/\log N)$, which returns to the linear dependence of the measurement bound on the signal sparsity that appears in RIP-based results.