Adaptive Redundancy Removal in Data Transmission
01 April 1968
In the design, analysis, and testing of data transmission systems it is invariably assumed that the input digits are identically distributed, independent random variables. However, in many actual systems the input digits may arise from a physical source which imposes significant correlations on the data train. In these cases the entropy of the source is lower than it would be if independent digits were presented. Accordingly, we should be able to use the redundancy in the input message to provide, in some sense, more efficient transmission. For example, we could imagine the redundancy being used to decrease bandwidth, to increase speed, to lower the probability of error, or to lower the average signal power.

Redundancy removal in analog transmission systems was investigated in the early 1950's by Oliver, Kretzmer, Harrison, and Elias [1-4]. Each of these papers relied on the theory of linear prediction as developed by Wiener in the early 1940's [6].

Figure 1 shows the basic idea. It is assumed that the input samples are taken from a stationary time series {x_n}. These samples are passed through a linear filter whose output x̂_n at time t_n is a linear prediction of the sample x_n based on all preceding samples. The prediction x̂_n is subtracted from the actual sample x_n, and only the error e_n = x_n − x̂_n is passed on for further processing.
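In symbols, the predictor and error relations of Figure 1 take the standard Wiener form. A finite predictor order p is assumed here for illustration, since the text speaks of prediction from all preceding samples:

```latex
\hat{x}_n = \sum_{k=1}^{p} a_k\, x_{n-k}, \qquad e_n = x_n - \hat{x}_n .
```

Choosing the weights a_k to minimize the mean-square error E[e_n^2] leads to the normal (Wiener-Hopf) equations

```latex
\sum_{k=1}^{p} a_k\, R(m-k) = R(m), \qquad m = 1, \dots, p,
```

where R(m) = E[x_n x_{n+m}] is the autocorrelation function of the stationary series.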
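As a concrete (modern) illustration of the same scheme, the sketch below estimates the predictor weights from sample autocorrelations and then passes only the error sequence onward. Everything here, including the function names, the finite predictor order, and the use of NumPy, is an assumption made for illustration; the systems discussed above were of course analog.

```python
import numpy as np

def predictor_coefficients(x, p):
    # Biased sample autocorrelations R(0), ..., R(p) of the training data.
    n = len(x)
    R = np.array([np.dot(x[:n - m], x[m:]) / n for m in range(p + 1)])
    # Normal (Wiener-Hopf) equations: sum_k a_k R(m - k) = R(m), m = 1..p.
    T = np.array([[R[abs(m - k)] for k in range(1, p + 1)]
                  for m in range(1, p + 1)])
    return np.linalg.solve(T, R[1:])

def prediction_error(x, a):
    # Error filter of Figure 1: e_n = x_n - sum_k a_k x_{n-k}.
    # The first p samples are passed through unpredicted.
    p = len(a)
    e = x.astype(float)
    for n in range(p, len(x)):
        e[n] = x[n] - np.dot(a, x[n - p:n][::-1])
    return e

# A strongly correlated source: first-order autoregressive samples.
rng = np.random.default_rng(0)
x = np.zeros(10_000)
for n in range(1, len(x)):
    x[n] = 0.9 * x[n - 1] + rng.standard_normal()

a = predictor_coefficients(x, p=2)
e = prediction_error(x, a)
print("input power:", x.var(), "  error power:", e.var())
```

For this source the error power comes out at roughly a fifth of the input power, which is the sense in which the redundancy can be traded for lower average signal power, as suggested above.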