AN OUTLIER DETECTION METHOD IN GEODETIC NETWORKS BASED ON THE ORIGINAL OBSERVATIONS
Abstract
The observations in geodetic networks are measured repeatedly, and in the network
adjustment step the mean values of these original observations are used. The mean
operator is a form of Least Squares Estimation (LSE). LSE provides optimal results
when random errors are normally distributed. If one of the original repeated
observations contains an outlier, the magnitude of this outlier is diluted because
the mean value of the original observations is used in the network adjustment and
outlier detection. In this case, the reliability of the outlier detection methods
decreases as well. Since the original repeated observations are independent, they
can be used in the adjustment model directly instead of their estimated mean values.
In this study, to show the effects of estimating the mean value of the original
repeated observations, a leveling network containing both outward-run and return-run
observations was simulated. The test for outliers, the Huber method, and the Danish
method were applied in two different cases. In the first case, the mean values of the
original observations (outward and return runs) were used; in the second, all
original observations were considered in the outlier detection. The reliability of
the methods was measured by the Mean Success Rate. According to the results obtained,
the second case yields more reliable results than the first.
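
The dilution effect the abstract describes can be made concrete with a minimal Monte Carlo sketch for a single height difference measured in an outward and a return run. This is an illustration, not the paper's simulated network: the noise level sigma, the outlier magnitude nabla, and the critical value 1.96 are assumed values chosen for demonstration. Averaging the two runs halves the outlier (to nabla/2) but shrinks the standard deviation only by a factor of sqrt(2), so the standardized effect falls from nabla/sigma to nabla/(sqrt(2)*sigma), and the detection (success) rate falls with it.

import numpy as np

rng = np.random.default_rng(42)

sigma = 1.0        # std. dev. of a single run [mm] (assumed value)
nabla = 5.0        # gross error injected into every outward run [mm] (assumed)
k = 1.96           # two-sided normal critical value at alpha = 0.05 (assumed)
n = 100_000        # Monte Carlo repetitions

outward = rng.normal(0.0, sigma, n) + nabla   # contaminated outward runs
ret = rng.normal(0.0, sigma, n)               # clean return runs

# Case 1: test the mean of the two runs. The outlier enters as nabla/2
# while the standard deviation shrinks only to sigma/sqrt(2), so the
# standardized effect drops from nabla/sigma to nabla/(sqrt(2)*sigma).
w_mean = 0.5 * (outward + ret) / (sigma / np.sqrt(2))
rate_mean = np.mean(np.abs(w_mean) > k)

# Case 2: test each original run separately with its full standardized effect.
w_out = outward / sigma
w_ret = ret / sigma
rate_orig = np.mean(np.maximum(np.abs(w_out), np.abs(w_ret)) > k)

print(f"success rate, mean of runs  : {rate_mean:.3f}")
print(f"success rate, original runs : {rate_orig:.3f}")

With these assumed values the success rate for the original runs is visibly higher than for the averaged observations, mirroring the abstract's conclusion that using all original observations improves the reliability of outlier detection.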