The conventional ordinary least squares (OLS) method of fitting a line to a set of data points is notoriously unreliable when the amount of random noise in the input (such as an image) is significant compared with the amount of data that is correlated with the line itself. Although points that lie far away from the line (i.e., outliers) are usually due only to noise, they contribute the most to the squared distances, thereby skewing the line estimate away from its correct position. In this paper we present an analytic method of separating the data of interest from the outliers. We assume that the overall data (i.e., the line data plus the noise) can be modeled as a mixture of two statistical distributions. Applying a variant of the method of moments (MoM) to the assumed model yields an analytic estimate of the desired line.

Key words: Line fitting, outliers, noise removal, mixture models, method of moments.
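The sensitivity of OLS described in the abstract is easy to demonstrate numerically. The sketch below is not the paper's moment-based estimator; it is a minimal synthetic experiment, assuming a two-component mixture as the abstract describes (points on a line with small Gaussian perturbations, plus uniform background noise over the window), showing how the outlier component skews the OLS slope:

```python
import numpy as np

rng = np.random.default_rng(0)

# Inlier component: points on the line y = 2x + 1 with small Gaussian noise.
x_line = rng.uniform(0.0, 10.0, 100)
y_line = 2.0 * x_line + 1.0 + rng.normal(0.0, 0.2, 100)

# Outlier component: uniform background noise, independent of the line.
x_noise = rng.uniform(0.0, 10.0, 60)
y_noise = rng.uniform(-10.0, 40.0, 60)

# Mixture of the two components, as in the assumed model.
x = np.concatenate([x_line, x_noise])
y = np.concatenate([y_line, y_noise])

# OLS fit on the clean line data vs. the contaminated mixture.
slope_clean, intercept_clean = np.polyfit(x_line, y_line, 1)
slope_mixed, intercept_mixed = np.polyfit(x, y, 1)

print("slope from line data only:", slope_clean)
print("slope from contaminated data:", slope_mixed)
```

Because the outliers' squared residuals dominate the loss, the slope fitted on the contaminated data is pulled well away from the true value of 2, while the clean fit recovers it almost exactly.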
Title of host publication: Proceedings of the 12th IAPR International Conference on Pattern Recognition - Conference B
Subtitle of host publication: Pattern Recognition and Neural Networks, ICPR 1994
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 3
State: Published - 1994
Event: 12th IAPR International Conference on Pattern Recognition - Conference B: Pattern Recognition and Neural Networks, ICPR 1994 - Jerusalem, Israel
Duration: 9 Oct 1994 → 13 Oct 1994
Publication series: Proceedings - International Conference on Pattern Recognition
Bibliographical note: Publisher Copyright © 1994 Institute of Electrical and Electronics Engineers Inc. All rights reserved.