Abstract
The conventional ordinary least squares (OLS) method of fitting a line to a set of data points becomes highly unreliable when the amount of random noise in the input (such as an image) is significant compared with the amount of data correlated with the line itself. In this paper we present an analytic method for separating the data of interest from the outliers. We assume that the overall data (i.e., the line data plus the noise) can be modeled as a mixture of two statistical distributions. Applying a variant of the method of moments (MoM) to the assumed model yields an analytic estimate of the desired line.
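The abstract does not spell out the specific mixture or moment equations, so the following NumPy sketch is only an illustration of the general idea, not the paper's estimator. It assumes a two-component mixture of on-line points (with small Gaussian noise) and clutter distributed uniformly over a known image window, and it treats the inlier fraction `w` as known; under those assumptions the first two mixture moments give a closed-form correction to the OLS slope and intercept.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic data under the assumed two-component mixture -------------
# Inliers: points on the line y = a*x + b with small Gaussian noise,
#          x drawn uniformly over the image width [0, W].
# Clutter: points drawn uniformly over the whole W x H window.
# These distributional choices (uniform clutter over a known window,
# equal x-ranges for both components, known inlier fraction w) are
# assumptions for this sketch, not taken from the paper.
W, H = 100.0, 100.0
a_true, b_true = 0.6, 20.0
w = 0.4                      # inlier fraction (assumed known here)
n = 2000

n_in = int(w * n)
x_in = rng.uniform(0, W, n_in)
y_in = a_true * x_in + b_true + rng.normal(0, 1.0, n_in)
x_out = rng.uniform(0, W, n - n_in)
y_out = rng.uniform(0, H, n - n_in)
x = np.concatenate([x_in, x_out])
y = np.concatenate([y_in, y_out])

# --- Plain OLS on the contaminated data ---------------------------------
# With heavy uniform clutter the OLS slope is shrunk roughly by the
# inlier fraction w, so the fit is badly biased.
a_ols = np.cov(x, y, bias=True)[0, 1] / np.var(x)
b_ols = y.mean() - a_ols * x.mean()

# --- Moment-based correction ---------------------------------------------
# Under the assumptions above the clutter contributes no x-y covariance
# and both components share the same x mean, so
#   Cov(x, y) = w * a * Var(x)
#   E[y]      = w * (a*E[x] + b) + (1 - w) * H/2.
# Solving these two moment equations gives analytic estimates of a and b.
a_mom = np.cov(x, y, bias=True)[0, 1] / (w * np.var(x))
b_mom = (y.mean() - (1 - w) * H / 2) / w - a_mom * x.mean()

print(f"true  slope={a_true:.3f}  intercept={b_true:.3f}")
print(f"OLS   slope={a_ols:.3f}  intercept={b_ols:.3f}")
print(f"MoM   slope={a_mom:.3f}  intercept={b_mom:.3f}")
```

Running the sketch shows the OLS slope pulled toward zero by the clutter, while the moment-corrected estimate recovers the true line parameters up to sampling error.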
| Original language | American English |
| --- | --- |
| Title of host publication | Pattern Recognition, 1994. Vol. 2 - Conference B: Computer Vision & Image Processing, Proceedings of the 12th IAPR International Conference on |
| Publisher | IEEE |
| State | Published - 1994 |