Rain Attenuation and Radio Path Design



Heavy rainfall on a radio path absorbs and scatters power transmitted at frequencies above 10 GHz and causes large fading of received signals. At 20 GHz, for example, the attenuation due to a uniform rain rate of 100 mm/hr is about 10 dB/km. Rain attenuation is so severe at these frequencies that for some applications transmission paths must be restricted to a few kilometers or less, rather than the tens of kilometers common at lower frequencies. Since the cost of a radio system increases with the number of repeaters, it is important to use the longest path allowed by the transmission objectives. This path length can be determined accurately only if the fading outage due to rain attenuation can be predicted.

Bussey estimated fading statistics on a microwave path from point rain rate data [1]. He used the rain attenuation theory of Ryde and Ryde [2-4] to convert rain statistics to fading statistics, and since 1950 his results have been used in the design of radio systems [5]. However, as operating frequencies increase and path lengths get shorter, increased precision of fading estimates is required for optimum radio system design. Over the years a number of experiments have been performed in which attenuation was measured on a path at specific times and compared with values computed from rain rates measured by rain gauges spaced along the path near ground level. Here too, the theory of Ryde
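The dependence of attenuation on rain rate described above is commonly summarized by a power law, gamma = k * R^alpha (dB/km), where R is the rain rate in mm/hr. The sketch below is illustrative only: the coefficients `k` and `alpha` are placeholder values chosen so that 100 mm/hr yields roughly the 10 dB/km figure quoted for 20 GHz; real coefficients depend on frequency, polarization, and the assumed drop-size distribution, and are not given in this text.

```python
def specific_attenuation(rain_rate_mm_hr, k=0.075, alpha=1.06):
    """Specific attenuation gamma (dB/km) from a power-law rain model.

    k and alpha are illustrative placeholders, not authoritative values;
    with these defaults, 100 mm/hr gives about 10 dB/km.
    """
    return k * rain_rate_mm_hr ** alpha


def path_attenuation(rain_rate_mm_hr, path_km, k=0.075, alpha=1.06):
    """Total attenuation (dB) assuming uniform rain over the whole path.

    Real rain cells are rarely uniform, which is why gauge measurements
    along the path (as discussed in the text) are needed to test theory.
    """
    return specific_attenuation(rain_rate_mm_hr, k, alpha) * path_km
```

Under this simple model, halving the path length halves the total fade in dB, which is the quantitative motivation for the shorter hop lengths mentioned above.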