Information Theory and Approximation of Bandlimited Functions

01 October 1970


It is the purpose of this paper to discuss both the best approximation of sets of bandlimited functions under Sobolev norms and the concomitant information-theoretic estimates. The Sobolev norms are useful when it is desired to approximate simultaneously the function and some of its derivatives. This requires an amount of information beyond that for approximating only the function.

Section II gives the necessary background definitions of width, entropy, and capacity; theorems providing representations of bandlimited functions, as well as a form of Mitjagin's inequality relating approximability to entropy, are proved. The distinction between capacity and entropy is comparable to that between communication and storage, since capacity refers to the number of distinguishable functions transmitted from a signal source, while entropy measures a bit requirement for the reproduction of a function to within a specified accuracy.

A constructive approach to communication requirements implies an explicit means of representing any function of the signal source by numbers with a uniformly bounded number of digits. The procedure or algorithm used is usually obtained from an infinite series representation with subsequent truncation and quantization. Pulse code modulation (PCM) systems provide examples of this procedure. Section II gives a precise definition, while Section III presents an explicit construction of a feasible algorithm. This algorithm has been applied to the design of a class of PCM systems.[1]
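The truncation-and-quantization procedure described above can be sketched in code. The following is a minimal Python illustration, not the paper's algorithm: it truncates the cardinal (sinc) sampling series of a bandlimited function and quantizes each sample to a fixed number of bits, as a PCM system would. The sampling rate, quantizer range, and test signal are all assumptions made for the example.

```python
import math

def quantize(x, bits, lo=-1.0, hi=1.0):
    """Uniform quantizer: map x to the nearest of 2**bits levels in [lo, hi].

    This gives each sample a uniformly bounded number of digits, as the
    constructive approach requires.
    """
    levels = 2 ** bits
    step = (hi - lo) / (levels - 1)
    k = round((x - lo) / step)
    k = max(0, min(levels - 1, k))          # clip to the representable range
    return lo + k * step

def sinc(t):
    """Normalized sinc: sin(pi t) / (pi t), with sinc(0) = 1."""
    return 1.0 if t == 0 else math.sin(math.pi * t) / (math.pi * t)

def pcm_approximate(f, n_terms, bits, rate=1.0):
    """Truncate the cardinal series of f at samples n/rate, |n| <= n_terms,
    quantize each sample to `bits` bits, and return the reconstruction."""
    samples = {n: quantize(f(n / rate), bits)
               for n in range(-n_terms, n_terms + 1)}
    def f_hat(t):
        return sum(c * sinc(rate * t - n) for n, c in samples.items())
    return f_hat

# Example signal: a sum of sinusoids whose frequencies lie below the
# Nyquist frequency 1/2 for unit-rate sampling (an assumed test case).
f = lambda t: 0.5 * math.sin(2 * math.pi * 0.2 * t) \
            + 0.3 * math.cos(2 * math.pi * 0.1 * t)

f_hat = pcm_approximate(f, n_terms=50, bits=8)

# Worst-case reconstruction error on [-10, 10], combining truncation
# and quantization error.
err = max(abs(f(t / 10) - f_hat(t / 10)) for t in range(-100, 101))
```

The two error sources visible here are exactly the two discussed in the paper: truncating the series (finitely many terms) and quantizing the coefficients (finitely many digits per term); refining either reduces the error at the cost of more bits.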