where $c_1$ and $\sigma^2$ are constants and $\|\cdot\|$ denotes the Euclidean norm. Plugging into [link] and taking log likelihoods, we obtain

$$\widehat{x}_{\mathrm{MAP}} = \arg\min_{w \in \mathbb{R}^N} \Psi^{\mathrm{MAP}}(w),$$

where $\Psi^{\mathrm{MAP}}(w) \triangleq -\ln f_X(w) + \frac{1}{2\sigma^2}\|y - \Phi w\|^2$ denotes the objective function (risk); our ideal risk would be $\Psi^{\mathrm{MAP}}(\widehat{x}_{\mathrm{MAP}})$.
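To make the objective concrete, the following is a minimal numerical sketch of the risk above, assuming the notation reconstructed here ($y$, $\Phi$, noise variance $\sigma^2$, and a user-supplied log prior standing in for $\ln f_X$); the name map_risk and its arguments are illustrative and not part of the original module.

    import numpy as np

    def map_risk(w, y, Phi, sigma2, log_prior):
        # Psi(w) = -ln f_X(w) + ||y - Phi w||^2 / (2 sigma^2), as sketched above.
        residual = y - Phi @ w
        return -log_prior(w) + residual @ residual / (2.0 * sigma2)

    # Illustrative use with an i.i.d. standard-normal log prior:
    # risk = map_risk(w, y, Phi, sigma2=0.1,
    #                 log_prior=lambda v: -0.5 * v @ v - 0.5 * len(v) * np.log(2 * np.pi))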
Instead of performing continuous-valued MAP estimation, we optimize for the MAP in the discretized domain $\mathcal{Q}^N$, where $\mathcal{Q} \subset \mathbb{R}$ is a quantized grid of reproduction levels. We begin with a technical condition on the input.
Condition 1 [link] We require that the probability density $f_X$ has bounded derivatives,

$$\left| \frac{\partial}{\partial w_i} f_X(w) \right| \le C \quad \text{for all } w \in \mathbb{R}^N,$$

where $\frac{\partial}{\partial w_i} f_X(w)$ is the derivative w.r.t. entry $w_i$ of $w$, $1 \le i \le N$, and $C < \infty$ is a constant.
Let $\widehat{x}^{\mathcal{Q}}_{\mathrm{MAP}}$ be the quantization bin in $\mathcal{Q}^N$ nearest to $\widehat{x}_{\mathrm{MAP}}$. Condition 1 ensures that a small perturbation from $\widehat{x}_{\mathrm{MAP}}$ to $\widehat{x}^{\mathcal{Q}}_{\mathrm{MAP}}$ does not change $f_X$ by much. We use this fact to prove that $\Psi^{\mathrm{MAP}}(\widehat{x}^{\mathcal{Q}}_{\mathrm{MAP}})$ is sufficiently close to $\Psi^{\mathrm{MAP}}(\widehat{x}_{\mathrm{MAP}})$ asymptotically.
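A small sketch of this nearest-bin map, assuming the grid $\mathcal{Q}$ is given as a finite list of real levels; quantize_to_grid is a hypothetical helper name introduced only for illustration.

    import numpy as np

    def quantize_to_grid(w, grid):
        # Snap each entry of w to the nearest reproduction level in the grid Q.
        w = np.asarray(w, dtype=float)
        grid = np.asarray(grid, dtype=float)
        idx = np.argmin(np.abs(w[:, None] - grid[None, :]), axis=1)
        return grid[idx]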
Theorem 9 [link] Let $\Phi \in \mathbb{R}^{M \times N}$ be an i.i.d. Gaussian measurement matrix whose entries have mean zero and equal variance. Suppose that Condition 1 holds, the aspect ratio $M/N$ is fixed, and the noise $z$ is i.i.d. zero-mean Gaussian. Then for every $\epsilon > 0$, the quantized estimator $\widehat{x}^{\mathcal{Q}}_{\mathrm{MAP}}$ satisfies

$$\Psi^{\mathrm{MAP}}\!\left(\widehat{x}^{\mathcal{Q}}_{\mathrm{MAP}}\right) < \Psi^{\mathrm{MAP}}\!\left(\widehat{x}_{\mathrm{MAP}}\right) + \epsilon$$

almost surely as $N \to \infty$.
Given that Theorem 9 shows that the risk penalty due to quantization vanishes asymptotically in $N$, we now describe a Kolmogorov-inspired estimator for CS over a quantized grid. We define the energy $\Psi$:

$$\Psi(w) \triangleq H_{\mathcal{Q}}(w) + \frac{1}{2\sigma^2}\|y - \Phi w\|^2,$$

where $H_{\mathcal{Q}}(w)$ is a universal coding length (complexity) for $w$ over the quantized grid, playing the role of the negative log prior $-\ln f_X(w)$. Ideally, our goal is to compute the globally minimum energy solution

$$\widehat{x} \triangleq \arg\min_{w \in \mathcal{Q}^N} \Psi(w).$$
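The global minimum above ranges over $|\mathcal{Q}|^N$ candidates, which is only enumerable for toy sizes; the sketch below makes that cost explicit (energy is any callable implementing $\Psi$, and the function name is illustrative), motivating the Markov chain machinery that follows.

    import numpy as np
    from itertools import product

    def brute_force_min_energy(energy, grid, N):
        # Exhaustively scan all |grid|**N quantized candidates; exponential in N,
        # hence feasible only for very small problems.
        best_w, best_e = None, np.inf
        for cand in product(grid, repeat=N):
            w = np.array(cand, dtype=float)
            e = energy(w)
            if e < best_e:
                best_w, best_e = w, e
        return best_w, best_e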
Theorem 10 [link] Let $X$ be a bounded stationary ergodic source. Then the outcome $w_r$ of Algorithm 1 after $r$ super-iterations obeys

$$\lim_{r \to \infty} \Psi(w_r) = \min_{w \in \mathcal{Q}^N} \Psi(w).$$
We define a stochastic transition matrix $P_t$ from $\mathcal{Q}^N$ to itself, given by the Boltzmann distribution for super-iteration $t$. Similarly, $\pi_t$ denotes the stable-state distribution on $\mathcal{Q}^N$ for $P_t$, satisfying $\pi_t P_t = \pi_t$.
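As a rough illustration of a Boltzmann-distributed transition, here is a generic Gibbs-style sweep at a fixed temperature; the temperature schedule and the exact update of Algorithm 1 are not shown in this excerpt, so this is an assumption-laden sketch rather than the algorithm itself.

    import numpy as np

    def boltzmann_gibbs_sweep(w, energy, grid, temperature, rng):
        # Resample each coordinate from the Boltzmann distribution
        # proportional to exp(-energy / temperature), restricted to the grid Q.
        w = np.array(w, dtype=float)
        grid = np.asarray(grid, dtype=float)
        for i in range(len(w)):
            energies = []
            for level in grid:
                w[i] = level
                energies.append(energy(w))
            energies = np.array(energies)
            weights = np.exp(-(energies - energies.min()) / temperature)  # stabilized
            w[i] = rng.choice(grid, p=weights / weights.sum())
        return w

    # Example: w_next = boltzmann_gibbs_sweep(w, Psi, grid, temperature=1.0,
    #                                         rng=np.random.default_rng(0))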
Definition 4 [link] The Dobrushin ergodic coefficient of a Markov chain transition matrix $P$ is denoted by $\delta(P)$ and defined as

$$\delta(P) \triangleq \max_{i,j} \frac{1}{2} \left\| P_i - P_j \right\|_1,$$

where $P_i$ denotes the $i$-th row of $P$ and $\|\cdot\|_1$ is the $\ell_1$ norm. From the definition, $0 \le \delta(P) \le 1$. Moreover, the ergodic coefficient can be rewritten as

$$\delta(P) = 1 - \min_{i,j} \sum_{k} \min\left(P_{ik}, P_{jk}\right),$$

where $P_{ik}$ denotes the entry of $P$ at the $i$-th row and $k$-th column.
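Both forms of the ergodic coefficient are easy to evaluate for a small transition matrix; the sketch below computes each (the function names are illustrative), and the two values should agree up to floating-point error.

    import numpy as np

    def dobrushin(P):
        # delta(P) = max over row pairs of half the l1 distance between rows.
        P = np.asarray(P, dtype=float)
        r = P.shape[0]
        return max(0.5 * np.abs(P[i] - P[j]).sum() for i in range(r) for j in range(r))

    def dobrushin_overlap(P):
        # Equivalent form: 1 - min over row pairs of sum_k min(P_ik, P_jk).
        P = np.asarray(P, dtype=float)
        r = P.shape[0]
        return 1.0 - min(np.minimum(P[i], P[j]).sum() for i in range(r) for j in range(r))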
We group the product of transition matrices across super-iterations as

$$P^{(t_0, t)} \triangleq P_{t_0 + 1} P_{t_0 + 2} \cdots P_t.$$

There are two common characterizations of the stable-state behavior of a non-homogeneous MC.
Definition 5 [link] A non-homogeneous MC is called weakly ergodic if for any distributions $\mu_1$ and $\mu_2$ over the state space $\mathcal{Q}^N$, and any $t_0 \ge 0$,

$$\lim_{t \to \infty} \left\| \mu_1 P^{(t_0, t)} - \mu_2 P^{(t_0, t)} \right\|_1 = 0.$$

Similarly, a non-homogeneous MC is called strongly ergodic if there exists a distribution $\pi$ over the state space $\mathcal{Q}^N$ such that for any distribution $\mu$ over $\mathcal{Q}^N$, and any $t_0 \ge 0$,

$$\lim_{t \to \infty} \left\| \mu P^{(t_0, t)} - \pi \right\|_1 = 0.$$
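Weak ergodicity can be probed numerically: push two arbitrary initial distributions through the same product of transition matrices and watch their $\ell_1$ gap, which tends to zero under weak ergodicity. The helper below is an illustrative sketch, not part of the original development.

    import numpy as np

    def l1_gap(mu1, mu2, transition_mats):
        # Apply the product P^(t0, t) to both initial distributions (as row vectors)
        # and return the remaining l1 distance between them.
        v1 = np.asarray(mu1, dtype=float)
        v2 = np.asarray(mu2, dtype=float)
        for P in transition_mats:
            v1 = v1 @ P
            v2 = v2 @ P
        return np.abs(v1 - v2).sum()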
We will use the following two theorems from [link] in our proof.
Theorem 11 [link] A Markov chain is weakly ergodic if and only if there exists a sequence of integers $t_0 < t_1 < t_2 < \cdots$ such that

$$\sum_{i=0}^{\infty} \left( 1 - \delta\!\left( P^{(t_i, t_{i+1})} \right) \right) = \infty.$$
Theorem 12 [link] Let a Markov chain be weakly ergodic. Assume that there exists a sequence of probability distributions $\{\pi_t\}_{t \ge 1}$ on the state space $\mathcal{Q}^N$ such that $\pi_t P_t = \pi_t$. Then the Markov chain is strongly ergodic if

$$\sum_{t=1}^{\infty} \left\| \pi_t - \pi_{t+1} \right\|_1 < \infty.$$
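Theorem 12's condition can likewise be checked numerically for a given sequence of transition matrices: compute a stationary distribution $\pi_t$ of each $P_t$ and accumulate $\|\pi_t - \pi_{t+1}\|_1$. The sketch below assumes each $P_t$ is small enough for an eigendecomposition; the helper names are illustrative.

    import numpy as np

    def stationary_distribution(P):
        # Left eigenvector of P for eigenvalue 1, normalized to a probability vector.
        vals, vecs = np.linalg.eig(np.asarray(P, dtype=float).T)
        k = np.argmin(np.abs(vals - 1.0))
        pi = np.abs(np.real(vecs[:, k]))
        return pi / pi.sum()

    def theorem12_partial_sum(transition_mats):
        # Partial sum of ||pi_t - pi_{t+1}||_1; boundedness (together with weak
        # ergodicity) is the sufficient condition for strong ergodicity in Theorem 12.
        pis = [stationary_distribution(P) for P in transition_mats]
        return sum(np.abs(pis[t] - pis[t + 1]).sum() for t in range(len(pis) - 1))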