The Simulation of Rain Fade on Arbitrary Microwave Link Networks

Conference: EuCAP 2009 - 3rd European Conference on Antennas and Propagation
03/23/2009 - 03/27/2009 at Berlin, Germany

Proceedings: EuCAP 2009

Pages: 5
Language: English
Type: PDF


Paulson, Kevin S.; Zhang, Xiaobei (Dept. of Engineering, University of Hull, Hull, HU6 7RX, UK)

Predicting the Quality of Service (QoS) at a node in heterogeneous networks of line-of-sight, terrestrial, microwave links requires knowledge of the spatial and temporal statistics of rain over scales from a few metres to tens or hundreds of kilometres, and over temporal periods as short as one second. Meteorological radar databases provide rainrate maps over wide areas with a spatial resolution as fine as a few hundred metres and a sampling period of 2 to 15 minutes. Such two-dimensional rainrate map time-series would have wide application in the simulation of rain scatter and attenuation on arbitrary millimetre-wave radio networks if the sampling period were considerably shorter, i.e. of the order of 10 seconds or less, and the integration volumes smaller. This paper investigates a stochastic-numerical method to interpolate and downscale rainrate field time-series to shorter sampling periods and smaller spatial integration areas, while conserving the measured and expected statistics. A series of radar-derived rain maps with a 10-minute sample period is interpolated to a 10-second period. The statistics of the interpolated-downscaled data are compared to fine-scale rain data, i.e. 10-second rain gauge data and radar data with 300-metre resolution. The interpolated rain map time-series is used to predict the fade duration statistics of a microwave link, and these are compared to a published model and an ITU-R model.
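The simulation pipeline the abstract describes can be illustrated with a minimal sketch: specific rain attenuation along a link is commonly modelled with the ITU-R power law gamma = k * R^alpha (dB/km), summed over the rainrate-map cells the link traverses, and fade durations are then the lengths of contiguous runs in which the attenuation time series exceeds a threshold. This is not the paper's code; the function names, parameters, and the power-law coefficients are assumptions chosen for illustration.

```python
import numpy as np

# Hypothetical sketch, not the authors' implementation. Coefficients k and
# alpha stand in for frequency/polarisation-dependent ITU-R power-law values.

def path_attenuation(rain_map, link_cells, cell_km, k=0.01, alpha=1.26):
    """Total fade (dB) for one rainrate map: sum k * R**alpha over the
    map cells the link crosses, each of path length cell_km (km)."""
    rates = np.array([rain_map[r, c] for r, c in link_cells], dtype=float)
    return float(np.sum(k * rates ** alpha * cell_km))

def fade_durations(atten_db, threshold_db, sample_s=10.0):
    """Durations (s) of contiguous runs where attenuation exceeds the
    threshold, given a series sampled every sample_s seconds."""
    above = np.asarray(atten_db) > threshold_db
    edges = np.diff(above.astype(int))
    starts = np.flatnonzero(edges == 1) + 1   # rising edges
    ends = np.flatnonzero(edges == -1) + 1    # falling edges
    if above[0]:                              # fade in progress at start
        starts = np.r_[0, starts]
    if above[-1]:                             # fade still in progress at end
        ends = np.r_[ends, above.size]
    return [(e - s) * sample_s for s, e in zip(starts, ends)]
```

Evaluating `path_attenuation` on each map of the interpolated 10-second time-series produces the attenuation series from which `fade_durations` tabulates the statistics compared against the models.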