Design of Bandlimited Signals for Binary Communication Using Simple Correlation Detection

01 February 1965


In the reception of serial binary data transmitted over a noisy bandlimited channel, errors result from the combined effects of intersymbol interference and noise. Minimization of the error rate involves appropriate design of both the transmitted signal and the method of detection, taking into account the effects of both causes of degradation. Nyquist has shown how bandlimited signals may be designed so as to eliminate intersymbol interference when detection is accomplished by periodic instantaneous sampling.[1] Sunde has shown that optimum performance over a channel with white Gaussian noise is achieved when the shaping is divided equally between the transmitter and receiver.[2] Tufts has developed a technique of long-memory detection which, for an arbitrary transmitted signal, eliminates intersymbol interference and optimizes noise performance subject to that constraint.[3] Kurz and Trabka have studied the design of signals for transmission in the presence of nonwhite noise, without the problem of intersymbol interference.[4], [5]

* This paper is based on parts of a thesis accepted by the faculty of the Graduate Division of the School of Engineering and Science of New York University in partial fulfillment of the requirements for the degree of Doctor of Engineering Science.
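Nyquist's condition, cited above, requires that the overall pulse shape p(t) equal one at the sampling instant t = 0 and zero at every other symbol instant t = kT, so that periodic instantaneous sampling sees no contribution from neighboring symbols. As a hedged illustration (not the signal design developed in this paper), the following sketch evaluates the standard raised-cosine pulse, a well-known Nyquist pulse, at the symbol instants; the function name, symbol period T, and rolloff beta are illustrative choices, not quantities from the text.

```python
import math

def raised_cosine(t: float, T: float = 1.0, beta: float = 0.5) -> float:
    """Raised-cosine Nyquist pulse with symbol period T and rolloff beta.

    Equals 1 at t = 0 and 0 at every other sampling instant t = k*T,
    so instantaneous sampling incurs no intersymbol interference.
    """
    if abs(t) < 1e-12:
        return 1.0
    # The closed form has a removable singularity at |t| = T / (2*beta);
    # substitute the limiting value there.
    if beta > 0.0 and abs(abs(t) - T / (2.0 * beta)) < 1e-12:
        return (beta / 2.0) * math.sin(math.pi / (2.0 * beta))
    x = t / T
    return (math.sin(math.pi * x) / (math.pi * x)
            * math.cos(math.pi * beta * x)
            / (1.0 - (2.0 * beta * x) ** 2))

# Sampling at the symbol instants k*T: unity at k = 0, (near) zero elsewhere.
samples = [raised_cosine(k * 1.0) for k in range(-3, 4)]
```

Sampling at any non-symbol instant would, in general, pick up interference from adjacent pulses; the zero crossings at t = kT are what make simple periodic sampling sufficient.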