Phase Dispersion Characteristics During Fade in a Microwave Line-of-Sight Radio Channel
01 December 1973
Fading in microwave communication channels has been investigated by many workers for a considerable length of time. However, the emphasis of these studies has been on the amplitude characteristics of the signal rather than on its phase characteristics. The purpose of this study was to investigate the phase characteristics of a microwave signal transmitted over a typical tropospheric, line-of-sight link.* Specifically, the experiment addressed the following topics: (i) measurement of phase variations over a microwave radio channel, as a function of both frequency and time, and derivation from the experimental results of statistics on the phase nonlinearity of the channel; (ii) correlation, if any, between the amplitude and phase distortions.

Measurements were made in late 1970 on a TH-3 radio channel operating between Atlanta and Palmetto, Georgia. The experiment was conducted as an adjunct to an ongoing study by other members of Bell Laboratories. The transmitter, located in Atlanta, and the front end of the receiver, situated in Palmetto, were common to both experiments; however, a different set of apparatus was employed for the measurement of amplitude and phase in the present study. G. M. Babler¹ has reported on the experimental layout of the microwave link.

This paper addresses three major areas: the experimental technique and arrangement, the measured data, and the statistical analysis. The experimental technique is novel in that it directly measures the phase difference between pairs of transmitted tones separated in frequency.
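To illustrate the underlying idea, the following is a minimal digital sketch of a two-tone phase-dispersion measurement: estimate the phase of each received tone and take their difference across the tone spacing. It is not the analog apparatus actually used in the 1970 experiment; the sample rate, tone frequencies, and simulated channel phases are all hypothetical values chosen only to make the example self-contained.

    import numpy as np

    fs = 1.0e6            # sample rate, Hz (illustrative only)
    f1, f2 = 70e3, 90e3   # a pair of "transmitted tones" separated in frequency
    t = np.arange(0, 0.01, 1 / fs)

    # Simulated received signal: each tone acquires its own phase shift,
    # as in a dispersive channel that delays the tones unequally during a fade.
    phi1, phi2 = 0.30, 0.75  # assumed channel phases at f1 and f2, radians
    rx = np.cos(2 * np.pi * f1 * t + phi1) + np.cos(2 * np.pi * f2 * t + phi2)

    def tone_phase(x, f, fs):
        """Estimate the phase of one tone by correlating the signal
        with a complex exponential at that tone's frequency."""
        n = np.arange(len(x))
        return np.angle(np.sum(x * np.exp(-2j * np.pi * f * n / fs)))

    # Phase difference between the tone pair: a direct measure of the
    # channel's phase dispersion across the f2 - f1 band.
    dispersion = tone_phase(rx, f2, fs) - tone_phase(rx, f1, fs)
    print(f"measured phase difference: {dispersion:.3f} rad "
          f"(expected {phi2 - phi1:.3f})")

Repeating such a measurement over time, and over several tone pairs, yields phase dispersion as a function of both frequency separation and time, which is the quantity the experiment set out to characterize.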