Fix $\delta \in (0,1)$. Let $\Phi$ be an $M \times N$ random matrix whose entries $\phi_{ij}$ are i.i.d. with $\phi_{ij}$ drawn according to a strictly sub-Gaussian distribution with $c^2 = 1/M$. If

$$M \geq \kappa_1 K \log\left(\frac{N}{K}\right),$$

then $\Phi$ satisfies the RIP of order $K$ with the prescribed $\delta$ with probability exceeding $1 - 2e^{-\kappa_2 M}$, where $\kappa_1$ is arbitrary and $\kappa_2 = \delta^2/2\kappa^* - \log(42 e/\delta)/\kappa_1$.
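To make the scaling of $M$ concrete, here is a minimal sketch in Python (assuming NumPy). Gaussian entries are one example of a strictly sub-Gaussian distribution, and the constant $\kappa_1 = 2$ is an arbitrary illustrative choice, not a value prescribed by the theorem.

    import numpy as np

    # Minimal sketch of the construction in the theorem above. The entry
    # variance 1/M matches the normalization c^2 = 1/M, so that
    # E[||Phi x||_2^2] = ||x||_2^2 for any fixed x.
    N, K, kappa_1 = 1000, 10, 2                    # kappa_1 = 2 is arbitrary
    M = int(np.ceil(kappa_1 * K * np.log(N / K)))  # M >= kappa_1 K log(N/K)

    rng = np.random.default_rng(0)
    Phi = rng.normal(scale=np.sqrt(1.0 / M), size=(M, N))
    print(M)  # 93 measurements for N = 1000, K = 10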
Even within the class of probabilistic results, there are two distinct flavors. The typical approach is to combine a probabilistic construction of a matrix that will satisfy the RIP with high probability with the previous results in this chapter. This yields a procedure that, with high probability, will satisfy a deterministic guarantee applying to all possible signals $x$. A weaker kind of result is one that states that given a signal $x$, we can draw a random matrix $\Phi$ and with high probability expect certain performance for that signal $x$. This type of guarantee is sometimes called instance-optimal in probability. The distinction is essentially whether or not we need to draw a new random $\Phi$ for each signal $x$. This may be an important distinction in practice, but if we assume for the moment that it is permissible to draw a new matrix $\Phi$ for each $x$, then we can see that [link] may be somewhat pessimistic.

In order to establish our main result we will rely on the fact, previously used in "Matrices that satisfy the RIP", that sub-Gaussian matrices preserve the $\ell_2$ norm of an arbitrary vector with high probability. Specifically, a slight modification of Corollary 1 from "Matrices that satisfy the RIP" shows that for any $x \in \mathbb{R}^N$, if we choose $\Phi$ according to the procedure in [link], then we also have that

$$P\left(\|\Phi x\|_2^2 \geq 2\|x\|_2^2\right) \leq 2e^{-\kappa_3 M}$$

with $\kappa_3 = 1/\kappa^*$, which follows by taking $\epsilon = 1$ in Corollary 1 and noting that the one-sided event above is contained in the corresponding two-sided one; here $\kappa^* = 2/(1 - \log 2) \approx 6.52$ is the constant from that corollary. Using this we obtain the following result.
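This tail bound is easy to probe empirically. The following Monte Carlo sketch (again assuming NumPy; the trial count and dimensions are arbitrary) estimates the probability on the left-hand side for one fixed $x$:

    import numpy as np

    # Empirical estimate of P(||Phi x||_2^2 >= 2 ||x||_2^2) for one fixed x,
    # over many independent draws of a Gaussian Phi with entries N(0, 1/M).
    rng = np.random.default_rng(1)
    M, N, trials = 93, 1000, 500
    x = rng.normal(size=N)  # an arbitrary fixed vector
    norm_x2 = np.sum(x ** 2)
    exceed = 0
    for _ in range(trials):
        Phi = rng.normal(scale=np.sqrt(1.0 / M), size=(M, N))
        exceed += np.sum((Phi @ x) ** 2) >= 2 * norm_x2
    print(exceed / trials)  # empirical tail probability; effectively 0 here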
Let $x \in \mathbb{R}^N$ be fixed. Set $\delta_{2K} < \sqrt{2} - 1$. Suppose that $\Phi$ is an $M \times N$ sub-Gaussian random matrix with $M \geq \kappa_1 K \log(N/K)$. Suppose we obtain measurements of the form $y = \Phi x$. Set $\epsilon = 2\sigma_K(x)_2$. Then with probability exceeding $1 - 2e^{-\kappa_2 M} - 2e^{-\kappa_3 M}$, when $\mathcal{B}(y) = \{z : \|\Phi z - y\|_2 \leq \epsilon\}$, the solution $\hat{x}$ to [link] obeys

$$\|\hat{x} - x\|_2 \leq (2C_2 + 1)\,\sigma_K(x)_2,$$

where $C_2$ is the constant from Theorem 1 of "Signal recovery in noise".
First we recall that, as noted above, from [link] we have that $\Phi$ will satisfy the RIP of order $2K$ with probability at least $1 - 2e^{-\kappa_2 M}$. Next, let $\Lambda$ denote the index set corresponding to the $K$ entries of $x$ with largest magnitude and write $x = x_\Lambda + x_{\Lambda^c}$. Since $x_\Lambda \in \Sigma_K$, we can write $\Phi x = \Phi x_\Lambda + \Phi x_{\Lambda^c} = \Phi x_\Lambda + e$. If $\Phi$ is sub-Gaussian then from Lemma 2 from "Sub-Gaussian random variables" we have that $\Phi x_{\Lambda^c}$ is also sub-Gaussian, and one can apply [link] to obtain that with probability at least $1 - 2e^{-\kappa_3 M}$, $\|e\|_2 = \|\Phi x_{\Lambda^c}\|_2 \leq 2\|x_{\Lambda^c}\|_2 = 2\sigma_K(x)_2 = \epsilon$. Thus, applying the union bound we have that with probability exceeding $1 - 2e^{-\kappa_2 M} - 2e^{-\kappa_3 M}$, we satisfy the necessary conditions to apply Theorem 1 from "Signal recovery in noise" to $x_\Lambda$ (viewing $y = \Phi x_\Lambda + e$ as noisy measurements of the $K$-sparse signal $x_\Lambda$ with noise of norm at most $\epsilon$), in which case $\sigma_K(x_\Lambda)_1 = 0$ and hence

$$\|\hat{x} - x_\Lambda\|_2 \leq C_2 \epsilon = 2C_2 \sigma_K(x)_2.$$
From the triangle inequality we thus obtain

$$\|\hat{x} - x\|_2 \leq \|\hat{x} - x_\Lambda\|_2 + \|x_\Lambda - x\|_2 \leq (2C_2 + 1)\,\sigma_K(x)_2,$$
which establishes the theorem.
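As a numerical sanity check of this result, the following sketch draws a fresh Gaussian $\Phi$ for one fixed, nearly sparse $x$, sets $\epsilon = 2\sigma_K(x)_2$ as in the theorem, and solves the constrained $\ell_1$ program. It assumes NumPy together with the cvxpy package for the optimization; the dimensions and the choice of $M$ are illustrative, not values prescribed by the theorem.

    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(2)
    N, K, M = 256, 5, 80  # illustrative sizes; M on the order of K log(N/K)

    # A nearly K-sparse signal: K large entries on a random support plus a
    # small dense tail, so that sigma_K(x)_2 > 0.
    x = 0.01 * rng.normal(size=N)
    support = rng.choice(N, size=K, replace=False)
    x[support] += rng.normal(size=K)

    # sigma_K(x)_2: the l2 norm of everything outside the K largest entries.
    sigma_K = np.linalg.norm(np.sort(np.abs(x))[:-K])

    # Draw a fresh sub-Gaussian (here Gaussian) matrix for this particular x.
    Phi = rng.normal(scale=np.sqrt(1.0 / M), size=(M, N))
    y = Phi @ x
    eps = 2 * sigma_K  # the "oracle" choice of epsilon from the theorem

    # l1 minimization over B(y) = {z : ||Phi z - y||_2 <= eps}.
    z = cp.Variable(N)
    cp.Problem(cp.Minimize(cp.norm(z, 1)),
               [cp.norm(Phi @ z - y, 2) <= eps]).solve()
    print(np.linalg.norm(z.value - x), sigma_K)

On a typical draw the reported recovery error is a small multiple of $\sigma_K(x)_2$, consistent with the bound above.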
Thus, although it is not possible to achieve a deterministic guarantee of the form in [link] without taking a prohibitively large number of measurements, it is possible to show that such performance guarantees can hold with high probability while simultaneously taking far fewer measurements than would be suggested by [link]. Note that the above result applies only to the case where the parameter $\epsilon$ is selected correctly, which requires some limited knowledge of $x$, namely $\sigma_K(x)_2$. In practice this limitation can easily be overcome through a parameter selection technique such as cross-validation [link], but there also exist more intricate analyses of $\ell_1$ minimization that show it is possible to obtain similar performance without requiring an oracle for parameter selection [link]. Note that [link] can also be generalized to handle other measurement matrices and to the case where $x$ is compressible rather than sparse. Moreover, this proof technique is applicable to a variety of the greedy algorithms described later in this course that do not require knowledge of the noise level to establish similar results [link], [link].
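To illustrate the cross-validation idea mentioned above, here is a rough sketch (not the exact procedure of the cited reference): reserve a few measurements, solve the $\ell_1$ program on the rest for each candidate $\epsilon$, and keep the value whose solution best predicts the held-out measurements. The helper recover is a hypothetical wrapper around the same cvxpy program used in the previous sketch.

    import numpy as np
    import cvxpy as cp

    def recover(Phi, y, eps):
        """l1 minimization over {z : ||Phi z - y||_2 <= eps} (hypothetical helper)."""
        z = cp.Variable(Phi.shape[1])
        cp.Problem(cp.Minimize(cp.norm(z, 1)),
                   [cp.norm(Phi @ z - y, 2) <= eps]).solve()
        return z.value

    def cross_validate(Phi, y, eps_grid, n_holdout=10):
        """Select eps by how well each recovery predicts held-out measurements."""
        cv, est = slice(0, n_holdout), slice(n_holdout, None)
        errs = [np.linalg.norm(Phi[cv] @ recover(Phi[est], y[est], e) - y[cv])
                for e in eps_grid]
        best = eps_grid[int(np.argmin(errs))]
        return recover(Phi, y, best), best

No knowledge of $\sigma_K(x)_2$ is required here; the held-out measurements play the role of the oracle.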