Digital Radio Outage Due to Selective Fading - Observation vs Prediction from Laboratory Simulation

01 May 1979


Present interest in using high-speed common carrier digital radio [1-5] has precipitated a need for estimating the performance of such systems during periods of selective (multipath) fading. This paper describes a method of characterizing a digital radio system in the laboratory which allows the outage to be accurately estimated. For a digital radio system, outage requirements are stated in terms of the number of seconds in a time period (usually a heavy fading month) during which the bit error rate (BER) may exceed a specified level; typically, 10^-3 or 10^-4 is appropriate to voice circuit application. The method is based upon a statistical channel model [6] developed from measurements on an unprotected 26.4-mile hop in the 6-GHz band in Palmetto, Georgia in 1977 using a general trade 8-PSK digital radio system as a channel measuring probe. The modeled fading occurrences were scaled to the basis of a heavy fading month using the occurrence of time faded below a level at a single frequency as the means of calibration. The bit error rate performance of the digital radio system was measured during the time period used for channel modeling and for an extended period corresponding to a heavy fading month. This same radio system was later subjected to a measurement program in the laboratory using a multipath simulator which provides a circuit realization of the fading model. The measured results are used with the channel model to determine the occurrence of channel conditions which will cause the BER to exceed a given threshold.
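The outage definition above can be illustrated with a minimal sketch, not taken from the paper: given per-second BER estimates for a link, outage is simply the count of seconds whose BER exceeds the chosen threshold (e.g., 10^-3). The function name, the sample record, and the brief fading event are all hypothetical.

```python
def outage_seconds(ber_samples, threshold=1e-3):
    """Count seconds whose bit error rate exceeds the threshold.

    ber_samples: iterable of per-second BER estimates (hypothetical data).
    """
    return sum(1 for ber in ber_samples if ber > threshold)

# Illustrative one-minute record: mostly error-free seconds, then a
# brief selective-fading event pushing BER above 10^-3 for 6 seconds.
record = [1e-7] * 50 + [5e-3] * 6 + [2e-4] * 4

print(outage_seconds(record, threshold=1e-3))  # prints 6
```

A month-long measurement campaign, as described in the paper, would accumulate such counts over a heavy fading month rather than a one-minute record.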