PRIOR WORK


Given two sets of highly correlated data, the original X at the encoder and a noisy version Y at the decoder, as shown in Figure-1, the question is: how can X be transmitted most efficiently? Note that the encoder has no access to Y.

Figure-1) Encoder-Decoder System

According to the Slepian-Wolf Theorem, established in 1973, given the scheme shown in Figure-2, X and Y can be encoded separately and still be recovered without any loss at the joint decoder if all of the following hold true:

- R1 > H(X|Y),

- R2 > H(Y|X), and

- R1 + R2 > H(X,Y).


Figure-2) Slepian-Wolf Encoding-Decoding

Graphically, this rate region is shown below:

Figure-3) The Slepian-Wolf Bound Plot
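The shape of this bound follows from the chain rule for entropy (a standard identity, included here for reference):

    \[
      H(X,Y) \;=\; H(Y) + H(X \mid Y) \;=\; H(X) + H(Y \mid X).
    \]

Consequently, the sum-rate line R1 + R2 = H(X,Y) meets the bound R1 = H(X|Y) exactly at R2 = H(Y), and meets R2 = H(Y|X) exactly at R1 = H(X); these two points are the corner points of the boundary of the Slepian-Wolf region, as typically drawn in plots such as Figure-3.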

It should be noted that the scheme shown in Figure-1 assumes that NO INFORMATION about Y is available at the encoder. This means that our problem, as shown in Figure-1, is a special case of the Slepian-Wolf Theorem in which Y has already been transmitted and is present at the decoder. Hence, we can show this case as in Figure-4.

Figure-4) The Slepian-Wolf Bound for Our Problem

Hence, for our problem as shown in Figure-1, according to the Slepian-Wolf Theorem, X can be transmitted without loss at any rate at or above H(X|Y) bits. This can also be understood intuitively: H(X|Y) is the uncertainty still present in X given that Y is known, and it is this uncertainty that needs to be transmitted.
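As a concrete illustration (the numbers below are hypothetical, using the binary symmetric correlation model discussed in the next paragraph), the following minimal sketch computes the rate required for X with and without the side information Y:

    import math

    def h_b(p):
        """Binary entropy function in bits."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    # Assumed correlation model: X is equally likely 0 or 1, and Y differs
    # from X with (hypothetical) probability p = 0.1.
    p = 0.1
    H_X = 1.0                    # entropy of an equiprobable binary source
    H_X_given_Y = h_b(p)         # uncertainty left in X once Y is known
    H_XY = H_X + h_b(p)          # joint entropy, via H(X,Y) = H(X) + H(Y|X)

    print(f"H(X)   = {H_X:.3f} bits/symbol  (rate without side information)")
    print(f"H(X|Y) = {H_X_given_Y:.3f} bits/symbol  (Slepian-Wolf rate for X)")
    print(f"H(X,Y) = {H_XY:.3f} bits/symbol")

For p = 0.1 this gives H(X|Y) of roughly 0.47 bits per symbol, i.e., less than half the rate needed when Y is ignored.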

In addition to the theoretical framework established by Slepian and Wolf, practical work has also been done in recent years to put the Slepian-Wolf Theorem into practice. The most notable of these is the work by Ramchandran and Pradhan. The authors model the correlation between X and Y as a binary symmetric channel with EQUALLY PROBABLE INPUTS (i.e., the 0's and 1's of X occur equally likely). The channel is illustrated below in Figure-5.

Figure-5) The Binary Symmetric Channel, with p being the probability of error
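A minimal simulation of this channel model (the crossover probability and sample size below are illustrative assumptions) can be written as follows:

    import random

    # Simulate the correlation of Figure-5: X is an equiprobable binary
    # source and Y is X passed through a binary symmetric channel with
    # (hypothetical) crossover probability p.
    random.seed(0)
    p, n = 0.1, 100_000
    X = [random.randint(0, 1) for _ in range(n)]
    Y = [x ^ (1 if random.random() < p else 0) for x in X]

    empirical_p = sum(x != y for x, y in zip(X, Y)) / n
    print(f"empirical P(Y != X) = {empirical_p:.4f}  (target p = {p})")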

The basic scheme used by the authors in encoding X is based on partitioning the possible codewords into bins such that the codewords within each bin are maximally separated in Hamming distance. (This requires the use of an error-correcting code for long codeword sequences.) A simple example with codewords of length 3 is shown below.

Figure-6) Illustration of Bin Formation
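One natural way to form such bins for the length-3 example (this particular construction is an assumption about how Figure-6 is drawn, but it is consistent with the description above) is to partition the 8 binary triples into the four cosets of the repetition code {000, 111}; each coset is indexed by a 2-bit syndrome, and the two words inside a bin differ in all three positions. A sketch:

    from itertools import product

    # Partition all length-3 binary words into cosets of the (3,1)
    # repetition code {000, 111}.  H is a parity-check matrix of that code;
    # words with the same syndrome H*w fall into the same bin.
    H = [(1, 1, 0),
         (0, 1, 1)]

    def syndrome(word):
        return tuple(sum(h * b for h, b in zip(row, word)) % 2 for row in H)

    bins = {}
    for word in product((0, 1), repeat=3):
        bins.setdefault(syndrome(word), []).append(word)

    for s, members in sorted(bins.items()):
        print(f"bin {s}: {members}")

With this partition, the encoder needs only 2 bits to identify the bin of a 3-bit X, and the two members of any bin are at Hamming distance 3 from each other.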

After the bins are formed, only the bin number of X is transmitted over the channel, and the decoder decodes X as the member of the received bin that is closest to the codeword Y present at the decoder.
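Continuing the length-3 sketch above (repeated here so the example is self-contained; the particular X and Y values are illustrative), the decoding step can be written as:

    from itertools import product

    H = [(1, 1, 0), (0, 1, 1)]          # parity checks of {000, 111}

    def syndrome(w):
        return tuple(sum(h * b for h, b in zip(row, w)) % 2 for row in H)

    def hamming(a, b):
        return sum(u != v for u, v in zip(a, b))

    bins = {}
    for w in product((0, 1), repeat=3):
        bins.setdefault(syndrome(w), []).append(w)

    def decode(bin_index, y):
        # Pick the member of the received bin closest to the side information Y.
        return min(bins[bin_index], key=lambda c: hamming(c, y))

    x = (1, 0, 1)                       # codeword at the encoder
    y = (1, 0, 0)                       # decoder's side information, 1 bit off
    assert decode(syndrome(x), y) == x  # X recovered from 2 bits plus Y

Decoding succeeds here as long as Y differs from X in at most one position, since the two members of each bin are at Hamming distance 3.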
