To date, the use of matched-field processing (MFP) for leak detection in pipes has been limited to cases where the mismatch between data and model is assumed to be random and Gaussian distributed. This paper extends MFP to the more realistic case where the mismatch involves both random and modeling errors. Experimental results show that the modeling error for cases with and without a leak remains similar in shape and magnitude. As a result, modeling errors can potentially be estimated from a baseline signal, which ideally can be obtained before major defects emerge. This attribute is exploited to formulate a novel MFP technique that uses both past baseline and current signals to detect leaks. The proposed MFP remains optimal in the sense of achieving the maximum signal-to-noise ratio. The performance gain of the proposed leak detection method is assessed via three experimental scenarios in which the modeling errors range from simple to complex.
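The baseline idea described above can be illustrated with a minimal sketch. The example below is hypothetical and not the paper's algorithm: it uses a standard Bartlett matched-field processor, synthetic complex replica fields for candidate leak positions, and an idealized baseline that captures only the modeling error, which is subtracted from the current signal before processing.

```python
import numpy as np

rng = np.random.default_rng(0)

def bartlett_ambiguity(measured, replicas):
    """Bartlett matched-field processor: normalized squared correlation of the
    measured field with modeled replica fields for candidate leak positions."""
    out = np.empty(len(replicas))
    for i, r in enumerate(replicas):
        w = r / np.linalg.norm(r)  # unit-norm replica (steering) vector
        out[i] = np.abs(np.vdot(w, measured)) ** 2 / np.vdot(measured, measured).real
    return out

# Hypothetical setup: 8 sensors, 50 candidate leak positions along the pipe.
n_sensors, n_pos = 8, 50
replicas = [rng.standard_normal(n_sensors) + 1j * rng.standard_normal(n_sensors)
            for _ in range(n_pos)]
true_idx = 17

# A fixed modeling error shared by baseline and current measurements,
# mirroring the observation that it is similar with and without a leak.
modeling_error = 0.2 * (rng.standard_normal(n_sensors)
                        + 1j * rng.standard_normal(n_sensors))

# Current signal: leak field plus modeling error plus small random noise.
current = replicas[true_idx] + modeling_error + 0.05 * rng.standard_normal(n_sensors)
baseline = modeling_error  # idealized: the no-leak baseline isolates the modeling error

# Subtracting the baseline-estimated modeling error before MFP sharpens the peak.
corrected = current - baseline
ambiguity = bartlett_ambiguity(corrected, replicas)
print(int(np.argmax(ambiguity)))  # peak lands at the true leak position
```

With the modeling error removed via the baseline, the ambiguity surface peaks at the true candidate; leaving it in biases the correlation against every replica, which is the mismatch problem the paper addresses.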