We now consider the worst-case bound for this error. Using standard properties of the singular value decomposition, it is straightforward to show that if $\Phi$ satisfies the RIP of order $2K$ (with constant $\delta_{2K}$), then the largest singular value of $\Phi_\Lambda^\dagger$ lies in the range $\left[ 1/\sqrt{1+\delta_{2K}},\, 1/\sqrt{1-\delta_{2K}} \right]$. Thus, if we consider the worst-case recovery error over all $e$ such that $\|e\|_2 \le \epsilon$, then the recovery error can be bounded by
$$\frac{\epsilon}{\sqrt{1+\delta_{2K}}} \le \|\widehat{x} - x\|_2 \le \frac{\epsilon}{\sqrt{1-\delta_{2K}}}.$$
Therefore, if $x$ is exactly $K$-sparse, then the guarantee for the pseudoinverse recovery method, which is given perfect knowledge of the true support of $x$, cannot improve upon the bound in [link] by more than a constant value.
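The oracle-assisted estimate can be checked numerically: given the true support $\Lambda$, apply the pseudoinverse of the column submatrix $\Phi_\Lambda$ to $y$, and compare the resulting error to $\|e\|_2$ divided by the smallest singular value of $\Phi_\Lambda$. The sketch below uses arbitrary illustrative dimensions and noise level; it is not part of the original text.

```python
import numpy as np

# Oracle-assisted recovery: with the true support known, estimate x by
# least squares on the corresponding columns, x_hat = pinv(Phi_Lambda) y.
# The error equals pinv(Phi_Lambda) e, so it is at most ||e||_2 divided
# by the smallest singular value of Phi_Lambda.
rng = np.random.default_rng(3)
M, N, K = 64, 128, 5                       # arbitrary illustrative sizes
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
support = rng.choice(N, K, replace=False)  # the oracle's knowledge
x = np.zeros(N)
x[support] = rng.standard_normal(K)
e = 0.1 * rng.standard_normal(M)
y = Phi @ x + e

Phi_S = Phi[:, support]
x_hat = np.zeros(N)
x_hat[support] = np.linalg.pinv(Phi_S) @ y  # oracle least-squares estimate

err = np.linalg.norm(x_hat - x)
s = np.linalg.svd(Phi_S, compute_uv=False)  # singular values of Phi_Lambda
upper = np.linalg.norm(e) / s.min()         # worst-case bound for this e
```

The inequality `err <= upper` holds deterministically for any noise vector, which is exactly the mechanism behind the worst-case bound above.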
We now examine a slightly different noise model. Whereas [link] assumed that the noise norm $\|e\|_2$ was small, the theorem below analyzes a different recovery algorithm known as the Dantzig selector in the case where $\|\Phi^T e\|_\infty$ is small [link]. We will see below that this leads to a simple analysis of the performance of this algorithm in Gaussian noise.
Suppose that $\Phi$ satisfies the RIP of order $2K$ with $\delta_{2K} < \sqrt{2} - 1$ and we obtain measurements of the form $y = \Phi x + e$ where $\|\Phi^T e\|_\infty \le \lambda$. Then when $\mathcal{B}(y) = \{ z : \|\Phi^T (y - \Phi z)\|_\infty \le \lambda \}$, the solution $\widehat{x}$ to [link] obeys
$$\|\widehat{x} - x\|_2 \le C_0 \frac{\sigma_K(x)_1}{\sqrt{K}} + C_3 \sqrt{K} \lambda,$$
where
$$C_0 = 2 \frac{1 - (1-\sqrt{2})\delta_{2K}}{1 - (1+\sqrt{2})\delta_{2K}}, \qquad C_3 = \frac{4\sqrt{2}}{1 - (1+\sqrt{2})\delta_{2K}}.$$
The proof mirrors that of [link]. Since $\|\Phi^T e\|_\infty \le \lambda$, we again have that $x \in \mathcal{B}(y)$, so $\|\widehat{x}\|_1 \le \|x\|_1$ and thus [link] applies. We follow a similar approach as in [link] to bound $|\langle \Phi h, \Phi h_\Lambda \rangle|$, where $h = \widehat{x} - x$ and $\Lambda = \Lambda_0 \cup \Lambda_1$. We first note that
$$\|\Phi^T \Phi h\|_\infty \le \|\Phi^T(y - \Phi \widehat{x})\|_\infty + \|\Phi^T(y - \Phi x)\|_\infty \le 2\lambda,$$
where the last inequality again follows since $x, \widehat{x} \in \mathcal{B}(y)$. Next, note that $\Phi h_\Lambda = \Phi_\Lambda h_\Lambda$. Using this we can apply the Cauchy-Schwarz inequality to obtain
$$|\langle \Phi h, \Phi h_\Lambda \rangle| = |\langle \Phi_\Lambda^T \Phi h, h_\Lambda \rangle| \le \|\Phi_\Lambda^T \Phi h\|_2 \|h_\Lambda\|_2.$$
Finally, since $\|\Phi^T \Phi h\|_\infty \le 2\lambda$, we have that every coefficient of $\Phi_\Lambda^T \Phi h$ is at most $2\lambda$ in magnitude, and since $|\Lambda| \le 2K$ we have $\|\Phi_\Lambda^T \Phi h\|_2 \le \sqrt{2K}(2\lambda)$. Thus,
$$|\langle \Phi h, \Phi h_\Lambda \rangle| \le 2\lambda\sqrt{2K}\,\|h_\Lambda\|_2,$$
as desired.
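In practice, the Dantzig selector is a convex program that can be recast as a linear program by splitting the variable into its positive and negative parts, $z = u - v$ with $u, v \ge 0$. The sketch below (the helper name `dantzig_selector`, the dimensions, and the noise level are our own illustrative choices) solves it with `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(Phi, y, lam):
    """Solve min ||z||_1 subject to ||Phi^T (y - Phi z)||_inf <= lam
    as a linear program, splitting z = u - v with u, v >= 0."""
    N = Phi.shape[1]
    G = Phi.T @ Phi                          # Gram matrix
    b = Phi.T @ y
    # Two one-sided versions of |G z - b| <= lam, in the variables [u; v]:
    A_ub = np.vstack([np.hstack([G, -G]),    #  G(u - v) - b <= lam
                      np.hstack([-G, G])])   # -G(u - v) + b <= lam
    b_ub = np.concatenate([lam + b, lam - b])
    c = np.ones(2 * N)                       # sum(u) + sum(v) = ||z||_1 at optimum
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    return res.x[:N] - res.x[N:]

# Small synthetic instance: K-sparse x, Gaussian Phi, light Gaussian noise.
rng = np.random.default_rng(0)
M, N, K, sigma = 32, 64, 4, 0.01
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # columns have norm ~1
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = Phi @ x + sigma * rng.standard_normal(M)
lam = 3 * np.sqrt(np.log(N)) * sigma     # generous lambda so the true x is feasible
x_hat = dantzig_selector(Phi, y, lam)
```

The returned `x_hat` satisfies the residual-correlation constraint by construction, and for this lightly-noised instance its error is on the order of $\sqrt{K}\lambda$, in line with the theorem.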
Finally, we also examine the performance of these approaches in the presence of Gaussian noise. The case of Gaussian noise was first considered in [link], which examined the performance of $\ell_0$ minimization with noisy measurements. We now see that [link] and [link] can be leveraged to provide similar guarantees for $\ell_1$ minimization. To simplify our discussion we will restrict our attention to the case where $x \in \Sigma_K$, so that $\sigma_K(x)_1 = 0$ and the error bounds in [link] and [link] depend only on the noise $e$.
To begin, suppose that the coefficients of $e \in \mathbb{R}^M$ are i.i.d. according to a Gaussian distribution with mean zero and variance $\sigma^2$. Since the Gaussian distribution is itself sub-Gaussian, we can apply results such as Corollary 1 from "Concentration of measure for sub-Gaussian random variables" to show that there exists a constant $c_0 > 0$ such that for any $\epsilon > 0$,
$$\mathbb{P}\left( \|e\|_2 \ge (1+\epsilon)\sqrt{M}\sigma \right) \le \exp\left(-c_0 \epsilon^2 M\right).$$
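This concentration is easy to observe empirically: $\|e\|_2$ clusters tightly around $\sqrt{M}\sigma$, and large deviations essentially never occur. The following sketch uses arbitrary illustrative choices of $M$, $\sigma$, and the number of trials.

```python
import numpy as np

# Empirical check that ||e||_2 concentrates around sqrt(M)*sigma for
# i.i.d. Gaussian noise. With epsilon = 1, draws exceeding 2*sqrt(M)*sigma
# should be vanishingly rare.
rng = np.random.default_rng(1)
M, sigma, trials = 256, 0.5, 2000          # arbitrary illustrative choices
e = sigma * rng.standard_normal((trials, M))
norms = np.linalg.norm(e, axis=1)          # ||e||_2 for each trial
typical = np.sqrt(M) * sigma
mean_ratio = norms.mean() / typical        # should be very close to 1
tail_frac = np.mean(norms >= 2 * typical)  # should be essentially 0
```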
Applying this result to [link] with $\epsilon = 2\sqrt{M}\sigma$, we obtain the following result for the special case of Gaussian noise.
Suppose that $\Phi$ satisfies the RIP of order $2K$ with $\delta_{2K} < \sqrt{2} - 1$. Furthermore, suppose that $x \in \Sigma_K$ and that we obtain measurements of the form $y = \Phi x + e$ where the entries of $e$ are i.i.d. $\mathcal{N}(0, \sigma^2)$. Then when $\mathcal{B}(y) = \{ z : \|\Phi z - y\|_2 \le 2\sqrt{M}\sigma \}$, the solution $\widehat{x}$ to [link] obeys
$$\|\widehat{x} - x\|_2 \le 8 \frac{\sqrt{1+\delta_{2K}}}{1 - (1+\sqrt{2})\delta_{2K}} \sqrt{M}\sigma$$
with probability at least $1 - \exp(-c_0 M)$.
We can similarly consider [link] in the context of Gaussian noise. If we assume that the columns of $\Phi$ have unit norm, then each coefficient of $\Phi^T e$ is a Gaussian random variable with mean zero and variance $\sigma^2$. Using standard tail bounds for the Gaussian distribution (see Theorem 1 from "Sub-Gaussian random variables"), we have that
$$\mathbb{P}\left( |(\Phi^T e)_i| \ge t\sigma \right) \le \exp\left(-t^2/2\right)$$
for $i = 1, 2, \ldots, N$. Thus, using the union bound over the bounds for different $i$, we obtain
$$\mathbb{P}\left( \|\Phi^T e\|_\infty \ge 2\sqrt{\log N}\,\sigma \right) \le N \exp(-2 \log N) = \frac{1}{N}.$$
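The union bound can also be checked by simulation: for a column-normalized $\Phi$, the fraction of noise draws for which $\|\Phi^T e\|_\infty$ exceeds $2\sqrt{\log N}\,\sigma$ should be at most about $1/N$. All sizes below are arbitrary illustrative choices.

```python
import numpy as np

# Simulate ||Phi^T e||_inf for unit-norm columns and Gaussian noise:
# each entry of Phi^T e is N(0, sigma^2), so the maximum over N columns
# rarely exceeds 2*sqrt(log N)*sigma.
rng = np.random.default_rng(2)
M, N, sigma, trials = 128, 512, 1.0, 500       # arbitrary illustrative sizes
Phi = rng.standard_normal((M, N))
Phi /= np.linalg.norm(Phi, axis=0)             # normalize columns to unit norm
E = sigma * rng.standard_normal((trials, M))   # one noise vector per row
corr_max = np.max(np.abs(E @ Phi), axis=1)     # ||Phi^T e||_inf per trial
threshold = 2 * np.sqrt(np.log(N)) * sigma
exceed_frac = np.mean(corr_max >= threshold)   # theory: at most ~1/N
```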
Applying this to [link] , we obtain the following result, which is a simplified version of Theorem 1.1 of [link] .
Suppose that $\Phi$ has unit-norm columns and satisfies the RIP of order $2K$ with $\delta_{2K} < \sqrt{2} - 1$. Furthermore, suppose that $x \in \Sigma_K$ and that we obtain measurements of the form $y = \Phi x + e$ where the entries of $e$ are i.i.d. $\mathcal{N}(0, \sigma^2)$. Then when $\lambda = 2\sqrt{\log N}\,\sigma$, the solution $\widehat{x}$ to [link] obeys
$$\|\widehat{x} - x\|_2 \le \frac{8\sqrt{2 K \log N}}{1 - (1+\sqrt{2})\delta_{2K}}\,\sigma$$
with probability at least $1 - \frac{1}{N}$.
Ignoring the precise constants and the probabilities with which the bounds hold (which we have made no effort to optimize), we observe that if $M = K \log N$ then these results appear to be essentially the same. However, there is a subtle difference. Specifically, if $M$ and $N$ are fixed and we consider the effect of varying $K$, we can see that [link] yields a bound that is adaptive to this change, providing a stronger guarantee when $K$ is small, whereas the bound in [link] does not improve as $K$ is reduced. Thus, while they provide very similar guarantees, there are certain circumstances where the Dantzig selector is preferable. See [link] for further discussion of the comparative advantages of these approaches.