
We map the input $x^n$ to a sequence $z^n$ over a finite alphabet, where the actual numerical values are obtained by a scalar function, $g(z_i)$. Ideally, $g(\cdot)$ should minimize the distortion. Because we focus on squared error distortion, the optimal $g(\cdot)$ is the conditional expectation,

$$g(\alpha) = E[x_i \mid z_i = \alpha] = \frac{\sum_{i:\, z_i = \alpha} x_i}{\sum_{i:\, z_i = \alpha} 1}.$$
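To make the formula concrete, here is a minimal Python sketch of this empirical conditional expectation; the sequences x and z below are hypothetical stand-ins rather than data from the text.

import numpy as np

# Hypothetical input sequence x^n and its symbol sequence z^n over {0, 1, 2}.
x = np.array([0.1, 0.9, 1.1, 0.2, 2.1, 1.9, 0.05, 1.05])
z = np.array([0,   1,   1,   0,   2,   2,   0,    1])

def reproduction_levels(x, z):
    # g(alpha) = empirical mean of x_i over {i : z_i = alpha}, i.e., the
    # empirical conditional expectation E[x_i | z_i = alpha].
    return {int(a): float(x[z == a].mean()) for a in np.unique(z)}

print(reproduction_levels(x, z))   # {0: 0.1166..., 1: 1.0166..., 2: 2.0}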

Keep in mind that in an actual compression system, the encoder knows $x^n$, but the decoder does not. Therefore, the encoder must describe (a quantized version of) $g(\cdot)$ to the decoder. Putting aside these details, the main point is that the adaptive reproduction alphabet algorithm also achieves the RD function for stationary and ergodic sources while allowing a reduction in the size of the alphabet, thus accelerating the algorithm.
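As a hedged sketch of that step, the encoder might uniformly quantize each reproduction level and convey only the integer indices; the bit width, value range, and level values below are illustrative assumptions, not details from the text.

def quantize_levels(g, bits=4, lo=0.0, hi=2.5):
    # Uniformly quantize each level g(alpha) to `bits` bits on [lo, hi];
    # the decoder rebuilds approximate levels from the integer indices.
    step = (hi - lo) / (2 ** bits - 1)
    indices = {a: int(round((v - lo) / step)) for a, v in g.items()}
    decoded = {a: lo + i * step for a, i in indices.items()}
    return indices, decoded

g = {0: 0.117, 1: 1.017, 2: 2.0}   # hypothetical reproduction levels
indices, g_hat = quantize_levels(g)
print(indices)   # what the encoder sends
print(g_hat)     # what the decoder reconstructs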

Universal signal reconstruction

Scalar channel: Consider a scalar channel, $y = x + z$, where $x, y, z \in \mathbb{R}^n$, $x$ is generated by a stationary and ergodic source $X$, and $z_i \sim N(0,1)$ is zero-mean unit-variance Gaussian noise. To avoid taking the analysis toward continuous-valued $x$, we can simplify the setting and temporarily consider discrete-valued $x$, $y$, and $z$; for example, we can consider a quantized version of the continuous problem.
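The following minimal simulation sketch sets up this channel; the two-state Markov chain is only a hypothetical stand-in for a stationary ergodic source, not a model specified in the text.

import numpy as np

rng = np.random.default_rng(0)

def markov_source(n, states=(0.0, 1.0), stay=0.9):
    # Toy stationary ergodic source: a two-state Markov chain that keeps
    # its current state with probability `stay` at each step.
    x = np.empty(n)
    s = int(rng.integers(len(states)))
    for i in range(n):
        if rng.random() > stay:
            s = (s + 1) % len(states)
        x[i] = states[s]
    return x

n = 1000
x = markov_source(n)            # discrete-valued x, as in the simplified setting
y = x + rng.standard_normal(n)  # y = x + z,  z_i ~ N(0, 1)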

Donoho  [link] proposed the following reconstruction,

$$\hat{x} = \arg\min_{w \,:\, \|y - w\|_2^2 \,\le\, n} K(w),$$

where $w$ is a possible reconstruction sequence, and $K(w)$ is the Kolmogorov complexity of $w$. Donoho calls this reconstruction algorithm the Kolmogorov sampler.

What is the Kolmogorov complexity [link], [link], [link] of the sequence $w$? It is the length of the shortest computer program that outputs $w$ and then halts. One may ask what computer language we can use for the program; after all, it is easy to imagine some programming language being well tuned to a specific sequence $w$. However, concepts such as universal Turing machines [link] can be used to put forward an abstract programming language that will run on any hypothetical computer. The key point is that the computer is limited to a state machine that processes an infinite-length tape, and one state machine can emulate another by programming a compiler onto the tape.

Is it practical to talk about the shortest program that outputs $w$ and then halts? Not really, because we would need to run the Turing machine [link] (the computer) on an exponential number of programs. Moreover, it is well known that for some programs the Turing machine never halts (there are endless loops), and technical conditions such as time-out mechanisms would need to be added. Therefore, computing the Kolmogorov complexity $K(w)$ is impractical [link], [link], [link]. However, for $w$ that was generated by a stationary ergodic source, it is possible to approximate $K(w)$ by $H_k(w)$, the empirical entropy [link], [link].
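The sketch below ties these pieces together: a k-th order empirical entropy $H_k(w)$ serves as a computable surrogate for $K(w)$, and an exhaustive search over a small alphabet finds a $w$ minimizing it subject to $\|y - w\|_2^2 \le n$. The brute-force search is exponential in n and is meant only to illustrate the definition under these assumptions, not to reproduce Donoho's analysis.

import itertools
from collections import Counter

import numpy as np

def empirical_entropy(w, k=1):
    # k-th order empirical entropy H_k(w) in bits per symbol: the entropy
    # of a symbol conditioned on its k predecessors, estimated from w itself.
    w = tuple(w)
    n = len(w) - k
    ctx = Counter(w[i:i + k] for i in range(n))        # contexts
    ext = Counter(w[i:i + k + 1] for i in range(n))    # context + next symbol
    return -sum((c / n) * np.log2(c / ctx[pat[:-1]]) for pat, c in ext.items())

def kolmogorov_sampler_toy(y, alphabet=(0.0, 1.0), k=1):
    # Among all |alphabet|^n sequences w with ||y - w||_2^2 <= n, return one
    # minimizing H_k(w) -- an illustration, not a practical algorithm.
    n = len(y)
    best, best_h = None, np.inf
    for w in itertools.product(alphabet, repeat=n):
        if np.sum((y - np.array(w)) ** 2) <= n:
            h = empirical_entropy(w, k)
            if h < best_h:
                best, best_h = np.array(w), h
    return best

rng = np.random.default_rng(1)
x = np.array([0., 0., 0., 0., 1., 1., 1., 1.])
y = x + rng.standard_normal(len(x))
print(kolmogorov_sampler_toy(y))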

What is the performance of Donoho's Kolmogorov sampler? Donoho proved that $\hat{x}$ has the following distribution, $\hat{X} \sim p(X \mid Y)$ [link]. That is, $\hat{x}$ is sampled from the posterior. Therefore, given $y$, we have that $\hat{x}$ and $x$ are both generated by the same distribution.

Consider now that $\hat{x}_{MMSE}$, the minimum mean squared error estimator, is the center of mass of the posterior. Seeing that $\hat{x} - \hat{x}_{MMSE}$ and $x - \hat{x}_{MMSE}$ are orthogonal,

$$E\left[\|x - \hat{x}\|_2^2\right] = E\left[\|x - \hat{x}_{MMSE}\|_2^2\right] + E\left[\|\hat{x}_{MMSE} - \hat{x}\|_2^2\right].$$

Because $\hat{x}$ and $x$ are identically distributed given $y$, the two terms on the right-hand side are equal in expectation, and so the Kolmogorov sampler incurs exactly twice the minimum mean squared error.
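A quick Monte Carlo check of this conclusion, in a hypothetical scalar setting where x is 0 or 1 with probability 1/2 and y = x + N(0, 1):

import numpy as np

rng = np.random.default_rng(2)

def posterior_p1(y):
    # p(x = 1 | y) for the equiprobable two-point prior under N(0, 1) noise.
    l0 = np.exp(-0.5 * y ** 2)
    l1 = np.exp(-0.5 * (y - 1.0) ** 2)
    return l1 / (l0 + l1)

trials = 200_000
x = rng.integers(0, 2, trials).astype(float)
y = x + rng.standard_normal(trials)

p1 = posterior_p1(y)
x_mmse = p1                                      # center of mass of the posterior
x_ks = (rng.random(trials) < p1).astype(float)   # a sample from the posterior

mse_mmse = np.mean((x - x_mmse) ** 2)
mse_ks = np.mean((x - x_ks) ** 2)
print(mse_ks / mse_mmse)   # ratio close to 2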

Source:  OpenStax, Universal algorithms in signal processing and communications. OpenStax CNX. May 16, 2013 Download for free at http://cnx.org/content/col11524/1.1