Pulse Transmission by AM, FM, and PM in the Presence of Phase Distortion

01 March 1961


Binary pulse transmission by various methods of carrier modulation has been dealt with elsewhere on the premise of ideal amplitude and phase characteristics of the carrier channels.1 An important consideration in many applications is the performance in the presence of phase distortion or equivalent envelope delay distortion, which may be appreciable in certain transmission facilities.

An ideal amplitude spectrum of received pulses can be approached with the aid of appropriate terminal filters with gradual cutoffs, such that the associated phase characteristic is virtually linear. Nevertheless, pronounced phase distortion may be encountered in pulse transmission over channels with sharp cutoffs outside the pulse spectrum band, as in frequency division carrier system channels designed primarily for voice transmission.

The principal purpose of the present analysis is a theoretical evaluation of transmission impairments resulting from certain representative types of delay distortion in pulse transmission by various methods of carrier modulation and signal detection. These transmission impairments are reflected in the need for increased signal-to-noise ratio at the detector input to compensate for the effect of delay distortion. The performance in pulse transmission by various carrier modulation and detection methods can be related to a basic function known as the carrier pulse transmission characteristic. This basic function gives the shape of a single carrier pulse at the channel output, i.e., the detector input, under ideal conditions or in the presence of the particular kind of transmission distortion under consideration.
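The idea of a carrier pulse transmission characteristic can be illustrated numerically. The sketch below, which is not from the paper, takes an assumed raised-cosine amplitude spectrum for the received pulse (a gradual cutoff, so the ideal phase is linear), imposes a hypothetical quadratic phase term (equivalent to an envelope delay that varies linearly across the band), and recovers the received pulse shape by inverse Fourier transform. All parameter values are illustrative assumptions.

```python
import numpy as np

def received_pulse(delay_coeff=0.0, n=4096, fs=16.0):
    """Baseband shape of a single received pulse for a raised-cosine
    amplitude spectrum, with optional quadratic phase distortion.
    delay_coeff, n, fs are illustrative parameters, not from the paper."""
    f = np.fft.fftfreq(n, d=1.0 / fs)      # frequency grid (Hz)
    B = 1.0                                # nominal band edge (assumed)
    # Raised-cosine amplitude spectrum with gradual cutoff at 2B
    A = np.where(np.abs(f) < 2 * B,
                 0.5 * (1.0 + np.cos(np.pi * np.abs(f) / (2 * B))),
                 0.0)
    # Quadratic phase term: envelope delay -(1/2pi) d(phase)/df
    # then varies linearly with frequency across the band
    phase = -np.pi * delay_coeff * f**2
    pulse = np.fft.ifft(A * np.exp(1j * phase)).real
    return np.fft.fftshift(pulse)          # center the pulse in time

ideal = received_pulse(0.0)       # linear (here zero) phase: no distortion
distorted = received_pulse(2.0)   # pronounced delay distortion

print(ideal.max(), distorted.max())
```

With zero phase distortion the spectral components add in phase at the pulse center, giving the maximum possible peak; the quadratic phase term misaligns them, lowering and spreading the peak. This peak reduction is one way the increased signal-to-noise requirement discussed in the text arises.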