Data selection to assess bias in rainfall radar estimates: An entropy-based method
Ridolfi, Elena
2013
Abstract
Radar miscalibration introduces a systematic error (i.e., bias) into radar estimates of rainfall. Whereas a rain gauge provides a pointwise rainfall measurement, a weather radar covers an extended area. To compare the two measurements, the weather radar estimates must be identified at the same location as the rain gauge. Bias is measured as the ratio between cumulative rain gauge measurements and the corresponding radar estimates. Rainfall is usually accumulated over all rainfall events registered in the target area. The contribution of this work is the determination of the optimal number of rainfall events necessary to calibrate the rainfall radar. The proposed methodology is based on the entropy concept: the optimal number of events must fulfil two conditions, namely, maximisation of information content and minimisation of redundant information. To verify the methodology, bias values are estimated with (1) a reduced number of events and (2) all available data. The proposed approach is tested on the Polar 55C weather radar located in the urban area of Rome, Italy. The radar is calibrated against rainfall measurements from two rain gauges placed in the Roman city centre. By analysing the information content of all data, it is found that the number of rainfall events can be reduced without losing information in evaluating the bias.
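As a minimal sketch (not the paper's implementation), the two quantities the abstract refers to could look as follows: bias computed as the ratio of cumulative rain-gauge totals to the corresponding cumulative radar estimates, and a Shannon entropy of a set of per-event rainfall values, which the method would maximise while minimising redundancy. The function names, binning scheme, and the per-event totals are hypothetical.

```python
import math

def bias(gauge_mm, radar_mm):
    """Bias as the ratio of cumulative gauge rainfall to cumulative radar estimates."""
    return sum(gauge_mm) / sum(radar_mm)

def shannon_entropy(values, bins=4):
    """Shannon entropy (bits) of rainfall values discretised into equal-width bins."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0  # guard against all-equal values
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

# Hypothetical per-event rainfall totals (mm) at one gauge location
gauge = [12.0, 5.5, 30.2, 8.1]
radar = [10.0, 4.8, 25.0, 7.0]
print(round(bias(gauge, radar), 3))      # a ratio > 1 means the radar underestimates
print(round(shannon_entropy(gauge), 3))  # information content of the event set
```

A subset of events would then be judged acceptable for calibration when its entropy is close to that of the full record, i.e. adding further events contributes mostly redundant information.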