, and remove them, usually checking the source of the data to see whether the outliers were erroneously recorded. Indeed, in the speed-of-light example above, it is easy to see and remove the two outliers prior to proceeding with any further analysis. However, modern data sets often consist of large numbers of variables measured on large numbers of experimental units, so manual screening for outliers is often impractical.
. If the dataset is, e.g., the values {2, 3, 5, 6, 9}, then adding another data point with value −1000 or +1000, or replacing one of the existing values with it, yields a mean very different from the mean of the original data.
. Taking the same dataset {2, 3, 5, 6, 9}, adding or replacing a data point with value −1000 or +1000 changes the median only slightly: it remains close to the median of the original data.
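These two examples can be checked directly; a minimal sketch using Python's statistics module (the replacement value 1000 is chosen arbitrarily):

```python
import statistics

data = [2, 3, 5, 6, 9]
polluted = [2, 3, 5, 6, 1000]   # one value replaced by an outlier

# The mean is dragged far from the bulk of the data ...
print(statistics.mean(data), statistics.mean(polluted))      # 5 vs 203.2
# ... while the median does not move at all in this case.
print(statistics.median(data), statistics.median(polluted))  # 5 vs 5
```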
– robustness to breaking of the assumptions about the underlying distribution of the data. Classical statistical procedures are typically sensitive to "longtailedness" (e.g., when the distribution of the data has longer tails than the assumed normal distribution). This implies that they will be strongly affected by the presence of
can sometimes be accommodated in the data through the use of trimmed means, scale estimators other than the standard deviation (e.g., the MAD), and Winsorization. To compute a trimmed mean, a fixed percentage of observations is dropped from each end of the ordered data, thus eliminating the outliers. The
Since M-estimators are normally distributed only asymptotically, for small sample sizes it might be appropriate to use an alternative approach to inference, such as the bootstrap. However, M-estimates are not necessarily unique (i.e., there might be more than one solution that satisfies the equations). Also, it is
The distribution of the mean is clearly much wider than that of the 10% trimmed mean (the plots are on the same scale). Also, whereas the distribution of the trimmed mean appears to be close to normal, the distribution of the raw mean is quite skewed to the left. So, in this sample of 66 observations,
If we replace the lowest observation, −44, by −1000, the mean becomes 11.73, whereas the 10% trimmed mean is still 27.43. In many areas of applied statistics, it is common for data to be log-transformed to make them nearly symmetrical. Very small values become large and negative when log-transformed, and
The empirical influence function is a measure of the dependence of the estimator on the value of any one of the points in the sample. It is a model-free measure in the sense that it simply relies on calculating the estimator again with a different sample. On the right is Tukey's biweight function,
The higher the breakdown point of an estimator, the more robust it is. Intuitively, we can understand that a breakdown point cannot exceed 50% because if more than half of the observations are contaminated, it is not possible to distinguish between the underlying distribution and the contaminating
is the proportion of incorrect observations (e.g. arbitrarily large observations) an estimator can handle before giving an incorrect (e.g., arbitrarily large) result. Usually, the asymptotic (infinite sample) limit is quoted as the breakdown point, although the finite-sample breakdown point may be
Although the bulk of the data looks to be more or less normally distributed, there are two obvious outliers. These outliers have a large effect on the mean, dragging it towards them, and away from the center of the bulk of the data. Thus, if the mean is intended as a measure of the location of the
Instead of relying solely on the data, we could use the distribution of the random variables. The approach is quite different from that of the previous paragraph. What we are now trying to do is to see what happens to an estimator when we change the distribution of the data slightly: it assumes a
The outliers in the speed-of-light data have more than just an adverse effect on the mean; the usual estimate of scale is the standard deviation, and this quantity is even more badly affected by outliers because the squares of the deviations from the mean go into the calculation, so the outliers'
5110:
However, using these types of models to predict missing values or outliers in a long time series is difficult and often unreliable, particularly if the number of values to be in-filled is relatively high in comparison with total record length. The accuracy of the estimate depends on how good and
Of course, as we saw with the speed-of-light example, the mean is only normally distributed asymptotically and when outliers are present the approximation can be very poor even for quite large samples. However, classical statistical tests, including those based on the mean, are typically bounded
Outliers can often interact in such a way that they mask each other. As a simple example, consider a small univariate data set containing one modest and one large outlier. The estimated standard deviation will be grossly inflated by the large outlier. The result is that the modest outlier looks
The distribution of the standard deviation is erratic and wide, a result of the outliers. The MAD is better behaved, and Qn is a little more efficient than the MAD. This simple example demonstrates that, when outliers are present, the standard deviation cannot be recommended as an estimate of scale.
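The MAD itself is straightforward to compute; a minimal sketch in Python (the scale factor 1.4826, which makes the MAD consistent for the standard deviation under normality, is conventional; the function name and toy data are our own):

```python
import statistics

def mad(data, scale=1.4826):
    """Median absolute deviation from the median; the factor 1.4826
    makes it estimate the standard deviation under normality."""
    m = statistics.median(data)
    return scale * statistics.median(abs(x - m) for x in data)

data = [2, 3, 5, 6, 9]
print(mad(data, scale=1))               # raw MAD of the clean data
print(mad(data + [1000], scale=1))      # barely moves with a gross outlier
print(statistics.stdev(data + [1000]))  # the standard deviation explodes
```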
non-robust initial fit can suffer from the effect of masking, that is, a group of outliers can mask each other and escape detection. Second, if a high breakdown initial fit is used for outlier detection, the follow-up analysis might inherit some of the inefficiencies of the initial estimator.
In the speed-of-light example, removing the two lowest observations causes the mean to change from 26.2 to 27.75, a change of 1.55. The estimate of scale produced by the Qn method is 6.3. We can divide this by the square root of the sample size to get a robust standard error, and we find this
One common approach to handle outliers in data analysis is to perform outlier detection first, followed by an efficient estimation method (e.g., the least squares). While this approach is often useful, one must keep in mind two challenges. First, an outlier detection method that relies on a
is such a function that is also a statistic, meaning that it is computed in terms of the data alone. Such functions are robust to parameters in the sense that they are independent of the values of the parameters, but not robust to the model in the sense that they assume an underlying model
for distributional robustness, and reserve 'robustness' for non-distributional robustness, e.g., robustness to violation of assumptions about the probability model or estimator, but this is a minority usage. Plain 'robustness' to mean 'distributional robustness' is common.
, where one mixes in a small amount (1–5% is often sufficient) of contamination. For instance, one may use a mixture of 95% of a normal distribution and 5% of a normal distribution with the same mean but significantly higher standard deviation (representing outliers).
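Such a contaminated normal (mixture) model is easy to simulate; a sketch in Python, with the 95%/5% proportions and the N(0, 1) vs N(0, 10) components chosen here purely for illustration:

```python
import random
import statistics

random.seed(42)

# 95% of draws from N(0, 1), 5% from N(0, 10): same mean, heavier tails.
sample = [random.gauss(0, 10) if random.random() < 0.05 else random.gauss(0, 1)
          for _ in range(10_000)]

# Both location estimates stay near 0, but the classical standard
# deviation is inflated well above 1 by the wide component.
print(statistics.mean(sample), statistics.median(sample))
print(statistics.stdev(sample))
```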
M-estimators do not necessarily relate to a density function and so are not fully parametric. Fully parametric approaches to robust modeling and inference, both
Bayesian and likelihood approaches, usually deal with heavy-tailed distributions such as Student's
Also, the distribution of the mean is known to be asymptotically normal due to the central limit theorem. However, outliers can make the distribution of the mean non-normal, even for fairly large data sets. Besides this non-normality, the mean is also
, the median has a breakdown point of 50%, meaning that half the points must be outliers before the median can be moved outside the range of the non-outliers, while the mean has a breakdown point of 0, as a single large observation can throw it off.
. Therefore, the maximum breakdown point is 0.5 and there are estimators which achieve such a breakdown point. For example, the median has a breakdown point of 0.5. The X% trimmed mean has a breakdown point of X%, for the chosen level of X.
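A quick numerical check of this breakdown behaviour (the data values here are arbitrary illustrations):

```python
import statistics

# n = 7 clean observations: [2, 3, 5, 6, 9, 11, 12]

# Corrupting 3 of 7 points (just under half): the median stays inside
# the range of the uncontaminated values.
print(statistics.median([1e9, 1e9, 1e9, 6, 9, 11, 12]))    # 12

# Corrupting 4 of 7 (more than half): the median is carried away.
print(statistics.median([1e9, 1e9, 1e9, 1e9, 9, 11, 12]))  # 1e9

# The mean breaks down with a single corrupted point.
print(statistics.mean([1e9, 3, 5, 6, 9, 11, 12]))
```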
(KSOM) offers a simple and robust multivariate model for data analysis, thus providing good possibilities to estimate missing values, taking into account their relationship or correlation with other pertinent variables in the data record.
Robust statistical methods, of which the trimmed mean is a simple example, seek to outperform classical statistical methods in the presence of outliers, or, more generally, when underlying parametric assumptions are not quite correct.
(rather than the univariate approach of most traditional methods of estimating missing values and outliers). In such cases, a multivariate model will be more representative than a univariate one for predicting missing values. The
. In statistics, classical estimation methods rely heavily on assumptions that are often not met in practice. In particular, it is often assumed that the data errors are normally distributed, at least approximately, or that the
of the contamination (the asymptotic bias caused by contamination in the observations). For a robust estimator, we want a bounded influence function, that is, one which does not go to infinity as x becomes arbitrarily large.
function is not critical to gaining a good robust estimate, and many choices will give similar results that offer great improvements, in terms of efficiency and bias, over classical estimates in the presence of outliers.
Although this article deals with general principles for univariate statistical methods, robust methods also exist for regression problems, generalized linear models, and parameter estimation of various distributions.
The 10% trimmed mean for the speed-of-light data is 27.43. Removing the two lowest observations and recomputing gives 27.67. The trimmed mean is less affected by the outliers and has a higher breakdown point.
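As a sketch of the computation (pure Python, with an arbitrary toy dataset rather than the speed-of-light data):

```python
def trimmed_mean(data, proportion=0.1):
    """Drop `proportion` of the observations from each end of the
    sorted data, then average what remains."""
    xs = sorted(data)
    k = int(len(xs) * proportion)       # count trimmed from each tail
    trimmed = xs[k:len(xs) - k] if k else xs
    return sum(trimmed) / len(trimmed)

data = [2, 3, 5, 6, 9, 1000]            # one gross outlier
print(trimmed_mean(data, 0.2))          # 5.75: outlier trimmed away
print(sum(data) / len(data))            # ordinary mean, dragged upward
```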
By contrast, more robust estimators that are not so sensitive to distributional distortions such as longtailedness are also resistant to the presence of outliers. Thus, in the context of robust statistics,
-th value in the sample by an arbitrary value and looking at the output of the estimator. Alternatively, the EIF is defined as the effect, scaled by n+1 instead of n, on the estimator of adding the point
. If there are relatively few missing points, there are some models which can be used to estimate values to complete the series, such as replacing missing values with the mean or median of the data.
Robust methods provide automatic ways of detecting, downweighting (or removing), and flagging outliers, largely removing the need for manual screening. Care must be taken; initial data showing the
problems, diagnostic plots are used to identify outliers. However, it is common that once a few outliers have been removed, others become visible. The problem is even worse in higher dimensions.
by replacing estimators that are optimal under the assumption of a normal distribution with estimators that are optimal for, or at least derived for, other distributions; for example, using the
increases at an accelerating rate, whilst for absolute errors, it increases at a constant rate. When
Winsorizing is used, a mixture of these two effects is introduced: for small values of x,
in the data, and the estimates they produce may be heavily distorted if there are extreme outliers in the data, compared to what they would be if the outliers were not included in the data.
M-estimators do not necessarily relate to a probability density function. Therefore, off-the-shelf approaches to inference that arise from likelihood theory cannot, in general, be used.
It can be shown that M-estimators are asymptotically normally distributed so that as long as their standard errors can be computed, an approximate approach to inference is available.
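For location, an M-estimate can be computed by iteratively reweighted averaging; a sketch using Huber's weight function (the tuning constant 1.345 and the MAD-based scale are conventional choices, not taken from this article's analysis):

```python
import statistics

def huber_location(data, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted means.
    Points within k*s of the current estimate get full weight; more
    distant points are downweighted in proportion to 1/|residual|."""
    mu = statistics.median(data)
    s = 1.4826 * statistics.median(abs(x - mu) for x in data) or 1.0
    for _ in range(max_iter):
        w = [1.0 if abs(x - mu) <= k * s else k * s / abs(x - mu)
             for x in data]
        new_mu = sum(wi * xi for wi, xi in zip(w, data)) / sum(w)
        if abs(new_mu - mu) < tol:
            break
        mu = new_mu
    return mu

print(huber_location([2, 3, 5, 6, 9, 1000]))  # stays near the bulk
```

Because the ψ function is bounded, the gross outlier receives a tiny weight and the estimate stays near the bulk of the data instead of being dragged towards 1000.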
Whilst the trimmed mean performs well relative to the mean in this example, better robust estimates are available. In fact, the mean, median and trimmed mean are all special cases of
increases at the squared rate, but once the chosen threshold is reached (1.5 in this example), the rate of increase becomes constant. This
Winsorised estimator is also known as the
representative the model is and how long the period of missing values extends. When dynamic evolution is assumed in a series, the missing data point problem becomes an exercise in
possible that any particular bootstrap sample can contain more outliers than the estimator's breakdown point. Therefore, some care is needed when designing bootstrap schemes.
These considerations do not "invalidate" M-estimation in any way. They merely make clear that some care is needed in their use, as is true of any other method of estimation.
In mathematical terms, an influence function is defined as a vector in the space of the estimator, which is in turn defined for a sample which is a subset of the population:
. However, M-estimators now appear to dominate the field as a result of their generality, their potential for high breakdown points and comparatively high efficiency. See
is resistant to errors in the results, produced by deviations from assumptions (e.g., of normality). This means that if the assumptions are only approximately met, the
EIF_{i}:x\in {\mathcal {X}}\mapsto n\cdot (T_{n}(x_{1},\dots ,x_{i-1},x,x_{i+1},\dots ,x_{n})-T_{n}(x_{1},\dots ,x_{i-1},x_{i},x_{i+1},\dots ,x_{n}))
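This definition translates directly into code; a sketch (the helper name and the toy sample are our own):

```python
import statistics

def empirical_influence(estimator, sample, i, x):
    """EIF_i(x): n * (T(sample with i-th value replaced by x) - T(sample))."""
    perturbed = list(sample)
    perturbed[i] = x
    return len(sample) * (estimator(perturbed) - estimator(sample))

sample = [2, 3, 5, 6, 9]
# The mean's empirical influence grows without bound in x ...
print(empirical_influence(statistics.mean, sample, 0, 1000))
# ... while the median's stays bounded.
print(empirical_influence(statistics.median, sample, 0, 1000))
```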
Tukey's biweight (also known as bisquare) function behaves in a similar way to the squared error function at first, but for larger errors, the function tapers off.
, which means we can derive the properties of such an estimator (such as its rejection point, gross-error sensitivity or local-shift sensitivity) when we know its
M-estimators are not inherently robust. However, they can be designed to achieve favourable properties, including robustness. M-estimators are a generalization of
is a function of data, whose underlying population distribution is a member of a parametric family, that is not dependent on the values of the parameters. An
Rustum, Rabee; Adeloye, Adebayo J. (September 2007), "Replacing outliers and missing values from activated sludge data using
Kohonen self-organizing map",
Robust statistics seek to provide methods that emulate popular statistical methods, but are not unduly affected by outliers or other small departures from
above by the nominal size of the test. The same is not true of M-estimators and the type I error rate can be substantially above the nominal level.
relatively normal. As soon as the large outlier is removed, the estimated standard deviation shrinks, and the modest outlier now looks unusual.
, frequently constructed in terms of these so as not to be sensitive to assumptions about parameters, are still very sensitive to model assumptions.
can be estimated from the data in the same way as any other parameter. In practice, it is common for there to be multiple local maxima when
, Wiley Series in Probability and Mathematical Statistics: Probability and Mathematical Statistics, New York: John Wiley & Sons, Inc.,
which, as we will later see, is an example of what a "good" (in a sense defined later on) empirical influence function should look like.
, Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics, New York: John Wiley & Sons, Inc.,
When considering how robust an estimator is to the presence of outliers, it is useful to test what happens when an extreme outlier is
\lambda ^{*}(T;F):=\sup _{(x,y)\in {\mathcal {X}}^{2},\,x\neq y}\left\|{\frac {IF(y;T;F)-IF(x;T;F)}{y-x}}\right\|
of scale. The plots are based on 10,000 bootstrap samples for each estimator, with some
Gaussian noise added to the resampled data (
are a general class of robust statistics, and are now the preferred solution, though they can be quite involved to calculate.
(d). The trimmed mean is a simple, robust estimator of location that deletes a certain percentage of observations (10% here)
can be relied on to produce normally distributed estimates. Unfortunately, when there are outliers in the data, classical
7398:
7940:
7889:
7874:
7864:
7733:
7605:
7572:
7353:
7183:
6145:
M=-\int _{\mathcal {X}}\left({\frac {\partial \psi (x,\theta )}{\partial \theta }}\right)_{T(F)}\,dF(x).
(parametric family), and in fact, such functions are often very sensitive to violations of the model assumptions. Thus
Rosen, C.; Lennox, J.A. (October 2001), "Multivariate and multiscale monitoring of wastewater treatment operation",
, Statistical Modeling and Decision Science (3rd ed.), Amsterdam: Elsevier/Academic Press, pp. 1–22,
involves accommodating an outlier by replacing it with the next highest or next smallest value as appropriate.
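A sketch of this replacement rule in Python (the two-sided 20% proportion in the demo is arbitrary):

```python
def winsorize(data, proportion=0.1):
    """Clamp the most extreme `proportion` of values in each tail to
    the nearest retained value instead of deleting them."""
    xs = sorted(data)
    k = int(len(xs) * proportion)
    if k:
        low, high = xs[k], xs[-k - 1]
        xs = [min(max(x, low), high) for x in xs]
    return xs

print(winsorize([2, 3, 5, 6, 9, 1000], 0.2))  # [3, 3, 5, 6, 9, 9]
```

Unlike trimming, Winsorization keeps the sample size fixed, which matters when the mean of the adjusted data is compared across samples.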
Ting, Jo-anne; Theodorou, Evangelos; Schaal, Stefan (2007), "A Kalman filter for robust outlier detection",
The practical effect of problems seen in the influence function can be studied empirically by examining the
For the speed-of-light data, allowing the kurtosis parameter to vary and maximizing the likelihood, we get
functions are to be preferred, and Tukey's biweight (also known as bisquare) function is a popular choice.
5557:; Portnoy, Stephen (1992), "Reweighted LS estimators converge at the same rate as the initial estimator",
Harvey, A. C.; Fernandes, C. (October 1989), "Time Series Models for Count or
Qualitative Observations",
-distribution is equivalent to the Cauchy distribution. The degrees of freedom is sometimes known as the
and measures sensitivity to change in this distribution. By contrast, the empirical influence assumes a
to estimate the mean. Such an estimator has a breakdown point of 0 (or finite-sample breakdown point of
one of the existing data points, and then to consider the effect of multiple additions or replacements.
. Another motivation is to provide methods with good performance when there are small departures from a
that maintain their properties even if the underlying distributional assumptions are incorrect. Robust
, Wiley Series in Probability and Statistics (2nd ed.), Chichester: John Wiley & Sons, Ltd.,
(The mathematical context of this paragraph is given in the section on empirical influence functions.)
are effectively synonymous. For one perspective on research in robust statistics up to 2000, see
Historically, several approaches to robust estimation were proposed, including R-estimators and
X_{1},\dots ,X_{n}:(\Omega ,{\mathcal {A}})\rightarrow ({\mathcal {X}},\Sigma )
IF(x;T;F):=\lim _{t\rightarrow 0^{+}}{\frac {T(t\Delta _{x}+(1-t)F)-T(F)}{t}}.
by designing estimators so that a pre-selected behaviour of the influence function is achieved
Maronna, Ricardo A.; Martin, R. Douglas; Yohai, Victor J.; Salibián-Barrera, Matías (2019),
493:). Panel (a) shows the distribution of the standard deviation, (b) of the MAD and (c) of Qn.
contain more details. The level and the power breakdown points of tests are investigated in
This problem of masking gets worse as the complexity of the data increases. For example, in
Panels (c) and (d) of the plot show the bootstrap distribution of the mean (c) and the 10%
{\hat {\mu }}=27.40,\quad {\hat {\sigma }}=3.81,\quad {\hat {\nu }}=2.13.
, Kendall's Library of Statistics, vol. 5, New York: John Wiley & Sons, Inc.,
is some function. MLEs are therefore a special case of M-estimators (hence the name: "
zeroes become negatively infinite. Therefore, this example is of practical interest.
, meaning having a bias tending towards 0 as the sample size tends towards infinity.
; Simpson, Douglas G.; Portnoy, Stephen L. (1990), "Breakdown robustness of tests",
dT_{G-F}(F)=\lim _{t\rightarrow 0^{+}}{\frac {T(tG+(1-t)F)-T(F)}{t}}
of the data, then computes the mean in the usual way. The analysis was performed in
Statistical procedures for analysis of environmental monitoring data and assessment
Properties of an influence function that bestow it with desirable performance are:
6053:(1947), "On the asymptotic distribution of differentiable statistical functions",
in the presence of outliers and less variable measures of location are available.
The plots below show the bootstrap distributions of the standard deviation, the
The plot below shows a density plot of the speed-of-light data, together with a
quantity to be 0.78. Thus, the change in the mean resulting from removing two
; Croux, Christophe (1993), "Alternatives to the median absolute deviation",
T_{n}:({\mathcal {X}}^{n},\Sigma ^{n})\rightarrow (\Gamma ,S)
4765:. It is the parameter that controls how heavy the tails are. In principle,
recommend the biweight function with efficiency at the normal set to 85%.
center of the data, it is, in a sense, biased when outliers are present.
methods have been developed for many common problems, such as estimating
\gamma ^{*}(T;F):=\sup _{x\in {\mathcal {X}}}|IF(x;T;F)|
et al. in
Bayesian Data Analysis (2004) consider a data set relating to
It describes the effect of an infinitesimal contamination at the point
2406:, the estimator sequence asymptotically measures the correct quantity.
({\mathcal {X}},\Sigma )=(\mathbb {R} ,{\mathcal {B}})
{\overline {X_{n}}}:={\frac {X_{1}+\cdots +X_{n}}{n}}
Nick
Fieller's course notes on Statistical Modelling and Computation
3773:. In 1964, Huber proposed to generalize this to the minimization of
only 2 outliers cause the central limit theorem to be inapplicable.
page, and the book's website contains more information on the data.
\rho ^{*}:=\inf _{r>0}\{r:IF(x;T;F)=0,|x|>r\}
. Wiley Series in Probability and Statistics (1 ed.). Wiley.
International Conference on Intelligent Robots and Systems – IROS
3536:, represents the effect of shifting an observation slightly from
2379:{\displaystyle \forall \theta \in \Theta ,T(F_{\theta })=\theta }
to the dataset, and to test what happens when an extreme outlier
{\hat {\mu }}=27.49,\quad {\hat {\sigma }}=4.51.
The basic tools used to describe and measure robustness are the
be a convex subset of the set of all finite signed measures on
Robust estimates have been studied for the following problems:
contains course notes on robust statistics and some data sets.
Hidden Markov and Other Models for Discrete-Valued Time Series
It can be shown that the influence function of an M-estimator
are a general class of simple statistics, often robust, while
Traditionally, statisticians would manually screen data for
65:. For example, robust methods work well for mixtures of two
\Theta =\mathbb {R} \times \mathbb {R} ^{+}
Statistics with high breakdown points are sometimes called
can also be used to estimate missing values. In addition,
on the estimate we are seeking, standardized by the mass
exactly but another, slightly different, "going towards"
(\Gamma ,S)=(\mathbb {R} ,{\mathcal {B}})
samples were used for each of the raw and trimmed means.
Introduction to robust estimation and hypothesis testing
at a value around 4 or 6. The figure below displays the
The empirical influence function is defined as follows.
often have very poor performance, when judged using the
5685:. Republished in paperback, 2004. 2nd ed., Wiley, 2009.
5178:"On some connections between statistics and cryptology"
form, for which the standard method is equivalent to a
2449:. What happens when the data doesn't follow the model
422:(panel (b)). The outliers are visible in these plots.
with low degrees of freedom (high kurtosis) or with a
5794:; Vetterling, William T.; Flannery, Brian P. (2007),
Huber, Peter J.; Ronchetti, Elvezio M. (2009-01-29).
have been proposed. The two figures below show four
are general methods to make statistics more robust.
(2011), "Robust statistics for outlier detection",
2277:be the asymptotic value of some estimator sequence
2136:and measures sensitivity to change in the samples.
390:. The data sets for that book can be found via the
Numerical Recipes: The Art of Scientific Computing
(2000), "A robust journey in the new millennium",
mean is then calculated using the remaining data.
is approximately twice the robust standard error.
were rejected as outliers by non-human screening.
estimation of model-states in models expressed in
is allowed to vary. As such, it is common to fix
\psi (x)={\frac {x}{x^{2}+\nu }}.
is the probability measure which gives mass 1 to
, Boca Raton, FL: Chapman & Hall/CRC Press,
In many practical situations, the choice of the
5862:Journal of the American Statistical Association
5751:Journal of the American Statistical Association
5596:Journal of the American Statistical Association
5402:
6001:(2010), "The changing history of robustness",
5709:Robust statistics: Theory and methods (with R)
5453:
5301:
879:
73:; under this model, non-robust methods like a
6166:
5689:MacDonald, Iain L.; Zucchini, Walter (1997),
5632:Hettmansperger, T. P.; McKean, J. W. (1998),
5536:Journal of Business & Economic Statistics
5429:
5182:Journal of Statistical Planning and Inference
4407:{\displaystyle IF(x;T,F)=M^{-1}\psi (x,T(F))}
4051:{\textstyle \psi (x)={\frac {d\rho (x)}{dx}}}
2822:. The influence function is then defined by:
2095:What this means is that we are replacing the
8:
5804:(3rd ed.), Cambridge University Press,
5414:
3217:
3153:
2776:
2770:
1759:
1741:
5441:
5130:have recently shown that a modification of
2318:{\displaystyle (T_{n})_{n\in \mathbb {N} }}
223:There are various definitions of a "robust
3990:{\textstyle \sum _{i=1}^{n}\psi (x_{i})=0}
3766:{\textstyle \sum _{i=1}^{n}-\log f(x_{i})}
2325:. We will suppose that this functional is
1288:{\displaystyle (\Omega ,{\mathcal {A}},P)}
1106:{\displaystyle (\Omega ,{\mathcal {A}},P)}
828:arbitrarily large just by changing any of
6066:
5734:McBean, Edward A.; Rovers, Frank (1998),
5666:, New York: John Wiley & Sons, Inc.,
5572:
5465:
5362:
4665:degrees of freedom, it can be shown that
3650:(MLEs) which is determined by maximizing
5634:Robust nonparametric statistical methods
5126:are not robust to outliers. To this end
3907:{\textstyle \sum _{i=1}^{n}\rho (x_{i})}
3823:{\textstyle \sum _{i=1}^{n}\rho (x_{i})}
2123:Influence function and sensitivity curve
1144:{\displaystyle ({\mathcal {X}},\Sigma )}
463:. Details appear in the sections below.
6146:Online experiments using R and JSXGraph
5991:10.1061/(asce)0733-9372(2007)133:9(909)
5904:Robust Regression and Outlier Detection
575:Intuitively, the breakdown point of an
6136:contain material on robust regression.
2270:{\displaystyle T:A\rightarrow \Gamma }
5076:Replacing outliers and missing values
3914:can often be done by differentiating
3854:("maximum likelihood type" estimators).
3700:{\textstyle \prod _{i=1}^{n}f(x_{i})}
3640:
3532:This value, which looks a lot like a
1494:{\displaystyle n\in \mathbb {N} ^{*}}
1205:{\displaystyle p\in \mathbb {N} ^{*}}
5979:Journal of Environmental Engineering
5542:(4), Taylor & Francis: 407–417,
5316:"When was the ozone hole discovered"
4991:and maximizing the likelihood gives
4845:-function for 4 different values of
4260:Influence function of an M-estimator
2179:. We want to estimate the parameter
1640:{\displaystyle (x_{1},\dots ,x_{n})}
1151:is a measurable space (state space),
645:{\displaystyle (X_{1},\dots ,X_{n})}
418:(panel (a)). Also shown is a normal
327:
253:Usually, the most important case is
99:
5128:Ting, Theodorou & Schaal (2007)
2729:{\displaystyle x\in {\mathcal {X}}}
1772:. The empirical influence function
1765:{\displaystyle i\in \{1,\dots ,n\}}
652:and the corresponding realizations
6094:10.1016/B978-0-12-386983-8.00001-9
4505:
4482:
4141:functions and their corresponding
3388:
2898:
2803:
2744:
2345:
2336:
2264:
2198:{\displaystyle \theta \in \Theta }
2192:
2166:
1706:
1688:
1647:is a sample from these variables.
1575:
1543:
1413:
1366:
1317:
1263:
1224:
1160:
1135:
1081:
867:{\displaystyle x_{1},\dots ,x_{n}}
691:{\displaystyle x_{1},\dots ,x_{n}}
25:
6055:Annals of Mathematical Statistics
5934:. Republished in paperback, 2003.
5796:"Section 15.7. Robust Estimation"
5530:. Republished in paperback, 2005.
5483:Robust methods for data reduction
5481:Farcomeni, A.; Greco, L. (2013),
368:
892:He, Simpson & Portnoy (1990)
580:more useful. For example, given
57:that are not unduly affected by
6129:robust statistics course notes.
5399:MacDonald & Zucchini (1997)
2386:. This means that at the model
821:{\displaystyle {\overline {x}}}
116:of proposed estimators under a
53:. One motivation is to produce
3576:, i.e., add an observation at
3081:Small local-shift sensitivity
3051:Small gross-error sensitivity
485:(MAD) and the Rousseeuw–Croux
1:
5841:10.1016/s0043-1354(01)00069-0
5403:Harvey & Fernandes (1989)
5176:Sarkar, Palash (2014-05-01).
3707:or, equivalently, minimizing
3648:maximum likelihood estimators
2815:{\displaystyle G=\Delta _{x}}
600:independent random variables
508:Manual screening for outliers
280:Some experts prefer the term
238:will still have a reasonable
145:of two or more distributions.
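The mixture-model view of contamination (most observations from a nominal distribution, a small fraction from another component) can be illustrated numerically; the 5% contamination rate and the shifted N(50, 1) component below are illustrative assumptions, not from the article:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
bad = rng.random(n) < 0.05                        # 5% contaminating component
x = np.where(bad, rng.normal(50, 1, n), rng.normal(0, 1, n))

print(np.mean(x))    # pulled toward ~2.5 by the contaminating component
print(np.median(x))  # stays near 0, the centre of the nominal component
```

This is the "longtailedness" situation the article describes: the mean tracks the full mixture, while the median essentially ignores a small contaminating fraction.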
5507:; Stahel, Werner A. (1986),
5454:Rousseeuw & Leroy (1987)
5302:Rousseeuw & Croux (1993)
4876:Example: speed-of-light data
4629:Robust parametric approaches
3101:{\displaystyle \lambda ^{*}}
925:Empirical influence function
904:Example: speed-of-light data
880:Rousseeuw & Leroy (1987)
813:
717:
5901:; Leroy, Annick M. (1987),
5655:. 2nd ed., CRC Press, 2011.
5430:Rustum & Adeloye (2007)
5148:Robust confidence intervals
5117:Kohonen self organising map
3071:{\displaystyle \gamma ^{*}}
2756:{\displaystyle \Delta _{x}}
1239:{\displaystyle (\Gamma ,S)}
308:is not a robust measure of
5415:McBean & Rovers (1998)
5194:10.1016/j.jspi.2013.05.008
5079:
4238:Properties of M-estimators
3623:
928:
470:
6004:The American Statistician
5503:; Ronchetti, Elvezio M.;
5442:Rosen & Lennox (2001)
4436:{\displaystyle p\times p}
3041:{\displaystyle \rho ^{*}}
1295:is any probability space,
1060:Tukey's biweight function
483:median absolute deviation
478:effects are exacerbated.
335:median absolute deviation
255:distributional robustness
127:can proceed in two ways:
5693:, Taylor & Francis,
5158:Unit-weighted regression
5134:can deal with outliers.
5096:Simple linear regression
4193:{\displaystyle \rho (x)}
3556:to a neighbouring point
2429:be some distribution in
473:Robust measures of scale
227:". Strictly speaking, a
6068:10.1214/aoms/1177730385
6017:10.1198/tast.2010.10159
5466:He & Portnoy (1992)
5082:Imputation (statistics)
3344:Local-shift sensitivity
3234:Gross-error sensitivity
3021:Finite rejection point
2172:{\displaystyle \Sigma }
1798:{\displaystyle EIF_{i}}
1166:{\displaystyle \Theta }
1113:is a probability space,
542:Variety of applications
341:are robust measures of
319:is a robust measure of
275:Portnoy & He (2000)
267:distributionally robust
248:asymptotically unbiased
242:, and reasonably small
167:regression coefficients
63:parametric distribution
5574:10.1214/aos/1176348910
5045:
4985:
4984:{\displaystyle \nu =4}
4956:
4859:
4839:
4819:
4799:
4779:
4751:
4750:{\displaystyle \nu =1}
4722:
4659:
4615:
4591:
4556:
4437:
4408:
4318:
4298:
4278:
4214:
4194:
4155:
4135:
4115:
4095:
4072:
4052:
3991:
3961:
3928:
3908:
3884:
3844:
3824:
3800:
3767:
3734:
3701:
3677:
3610:
3590:
3570:
3550:
3523:
3334:
3224:
3102:
3072:
3042:
3002:
2982:
2959:
2816:
2783:
2757:
2730:
2697:
2677:, in the direction of
2671:
2651:
2620:
2483:
2463:
2443:
2423:
2400:
2380:
2319:
2271:
2239:
2219:
2199:
2173:
2153:
2113:
2086:
1819:
1799:
1766:
1722:
1641:
1585:
1495:
1453:
1396:
1351:
1289:
1246:is a measurable space,
1240:
1206:
1167:
1145:
1107:
1061:
868:
822:
801:) because we can make
795:
767:
692:
646:
594:
551:Measures of robustness
410:Estimation of location
343:statistical dispersion
326:Described in terms of
6084:Wilcox, Rand (2012),
5717:10.1002/9781119214656
5337:Maronna et al. (2019)
5262:10.1002/9780470434697
5227:10.1002/9780470434697
5113:multivariate analysis
5046:
4986:
4957:
4860:
4840:
4838:{\displaystyle \psi }
4820:
4800:
4780:
4752:
4723:
4660:
4623:Maronna et al. (2019)
4616:
4614:{\displaystyle \psi }
4592:
4590:{\displaystyle \psi }
4557:
4438:
4409:
4319:
4317:{\displaystyle \psi }
4299:
4297:{\displaystyle \psi }
4279:
4215:
4213:{\displaystyle \rho }
4195:
4156:
4154:{\displaystyle \psi }
4136:
4134:{\displaystyle \rho }
4116:
4114:{\displaystyle \psi }
4096:
4094:{\displaystyle \rho }
4073:
4071:{\displaystyle \rho }
4053:
3992:
3941:
3929:
3927:{\displaystyle \rho }
3909:
3864:
3845:
3843:{\displaystyle \rho }
3825:
3780:
3768:
3714:
3702:
3657:
3611:
3591:
3571:
3551:
3524:
3335:
3225:
3103:
3073:
3043:
3003:
2983:
2960:
2817:
2784:
2782:{\displaystyle \{x\}}
2758:
2731:
2698:
2672:
2652:
2621:
2484:
2464:
2444:
2424:
2401:
2381:
2320:
2272:
2245:. Let the functional
2240:
2220:
2200:
2174:
2154:
2114:
2087:
1820:
1800:
1767:
1728:is an estimator. Let
1723:
1642:
1586:
1496:
1454:
1397:
1352:
1290:
1241:
1207:
1168:
1146:
1108:
1059:
899:resistant statistics.
888:Maronna et al. (2019)
869:
823:
796:
768:
693:
647:
595:
534:first appearing over
386:measurements made by
361:Winsorised estimators
125:parametric statistics
114:sampling distribution
91:central limit theorem
51:regression parameters
6044:, pp. 1514–1519
5560:Annals of Statistics
5348:Resistant statistics
4998:
4969:
4887:
4858:{\displaystyle \nu }
4849:
4829:
4818:{\displaystyle \nu }
4809:
4798:{\displaystyle \nu }
4789:
4778:{\displaystyle \nu }
4769:
4735:
4672:
4658:{\displaystyle \nu }
4649:
4605:
4581:
4450:
4421:
4331:
4308:
4288:
4268:
4204:
4175:
4171:For squared errors,
4145:
4125:
4105:
4085:
4062:
4001:
3938:
3918:
3861:
3834:
3777:
3711:
3654:
3600:
3580:
3560:
3540:
3352:
3242:
3121:
3085:
3055:
3025:
3013:Desirable properties
2992:
2972:
2829:
2793:
2767:
2740:
2710:
2681:
2661:
2641:
2499:
2473:
2453:
2433:
2413:
2390:
2333:
2281:
2249:
2229:
2209:
2183:
2163:
2143:
2103:
1832:
1809:
1776:
1732:
1651:
1599:
1505:
1470:
1407:
1363:
1301:
1257:
1218:
1181:
1157:
1119:
1075:
832:
805:
777:
702:
656:
604:
584:
282:resistant statistics
67:normal distributions
5999:Stigler, Stephen M.
5939:Rousseeuw, Peter J.
5899:Rousseeuw, Peter J.
5857:Rousseeuw, Peter J.
5505:Rousseeuw, Peter J.
5352:David B. Stephenson
5320:Weather Underground
5132:Masreliez's theorem
5065:ancillary statistic
4645:-distribution with
4284:is proportional to
4222:Huber loss function
4081:Several choices of
4078:has a derivative).
2696:{\displaystyle G-F}
794:{\displaystyle 1/n}
467:Estimation of scale
375:Speed-of-light data
339:interquartile range
246:, as well as being
155:location parameters
71:standard deviations
55:statistical methods
6140:David Olive's site
5913:10.1002/0471725382
5869:(424): 1273–1283,
5792:Teukolsky, Saul A.
5758:(452): 1331–1335,
5744:Portnoy, Stephen;
5314:Masters, Jeffrey.
5041:
4981:
4952:
4855:
4835:
4815:
4795:
4775:
4763:kurtosis parameter
4747:
4718:
4655:
4611:
4587:
4552:
4433:
4404:
4314:
4294:
4274:
4210:
4190:
4151:
4131:
4111:
4091:
4068:
4048:
3987:
3924:
3904:
3840:
3820:
3763:
3697:
3606:
3596:and remove one at
3586:
3566:
3546:
3534:Lipschitz constant
3519:
3436:
3330:
3292:
3220:
3152:
3098:
3068:
3038:
2998:
2978:
2955:
2884:
2812:
2779:
2753:
2726:
2693:
2667:
2647:
2635:Gateaux derivative
2616:
2555:
2492:We are looking at:
2479:
2459:
2439:
2419:
2396:
2376:
2315:
2267:
2235:
2215:
2205:of a distribution
2195:
2169:
2149:
2109:
2082:
1815:
1795:
1762:
1718:
1637:
1581:
1491:
1449:
1392:
1347:
1285:
1236:
1202:
1163:
1141:
1103:
1062:
864:
818:
791:
763:
688:
642:
590:
561:influence function
491:smoothed bootstrap
357:Trimmed estimators
347:standard deviation
106:influence function
27:Type of statistics
6103:978-0-12-386983-8
5835:(14): 3402–3410,
5811:978-0-521-88068-8
5788:Press, William H.
5726:978-1-119-21468-7
5664:Robust statistics
5509:Robust statistics
5492:978-1-4665-9062-5
5271:978-0-470-12990-6
5253:Robust Statistics
5236:978-0-470-12990-6
5218:Robust Statistics
5153:Robust regression
5032:
5010:
4943:
4921:
4899:
4713:
4512:
4277:{\displaystyle T}
4046:
3609:{\displaystyle x}
3589:{\displaystyle y}
3569:{\displaystyle y}
3549:{\displaystyle x}
3513:
3434:
3383:
3273:
3137:
3001:{\displaystyle t}
2981:{\displaystyle x}
2950:
2862:
2670:{\displaystyle F}
2650:{\displaystyle T}
2614:
2533:
2482:{\displaystyle G}
2462:{\displaystyle F}
2442:{\displaystyle A}
2422:{\displaystyle G}
2399:{\displaystyle F}
2327:Fisher consistent
2238:{\displaystyle A}
2218:{\displaystyle F}
2152:{\displaystyle A}
2112:{\displaystyle x}
1818:{\displaystyle i}
1054:
1053:
1046:
988:
987:
980:
816:
761:
720:
593:{\displaystyle n}
565:sensitivity curve
392:Classic data sets
271:outlier-resistant
221:
220:
109:described below.
87:model assumptions
31:Robust statistics
6114:
6079:
6070:
6045:
6035:
5993:
5969:
5933:
5893:
5851:
5822:
5782:
5739:
5729:
5703:
5684:
5654:
5627:
5603:(410): 446–452,
5585:
5576:
5567:(4): 2161–2167,
5550:
5529:
5501:Hampel, Frank R.
5495:
5469:
5463:
5457:
5451:
5445:
5439:
5433:
5427:
5418:
5412:
5406:
5396:
5390:
5384:
5378:
5372:
5366:
5363:von Mises (1947)
5360:
5354:
5345:
5339:
5334:
5328:
5327:
5322:. Archived from
5311:
5305:
5299:
5293:
5287:
5276:
5275:
5247:
5241:
5240:
5212:
5206:
5205:
5173:
5061:pivotal quantity
5055:Related concepts
328:breakdown points
321:central tendency
310:central tendency
236:robust estimator
216:
213:
195:
188:
161:scale parameters
6122:
6104:
6083:
6049:
6039:
5997:
5976:
5937:
5923:
5897:
5875:10.2307/2291267
5855:
5826:
5812:
5786:
5764:10.2307/2669782
5743:
5738:, Prentice-Hall
5733:
5727:
5706:
5701:
5688:
5674:
5660:Huber, Peter J.
5658:
5644:
5631:
5609:10.2307/2289782
5589:
5553:
5533:
5519:
5499:
5493:
5480:
5477:
5472:
5464:
5460:
5452:
5448:
5440:
5436:
5428:
5421:
5413:
5409:
5397:
5393:
5385:
5381:
5373:
5369:
5361:
5357:
5346:
5342:
5335:
5331:
5313:
5312:
5308:
5300:
5296:
5288:
5279:
5272:
5249:
5248:
5244:
5237:
5214:
5213:
5209:
5175:
5174:
5170:
5166:
5144:
5084:
5078:
5070:test statistics
5057:
4996:
4995:
4967:
4966:
4885:
4884:
4878:
4847:
4846:
4827:
4826:
4807:
4806:
4787:
4786:
4767:
4766:
4733:
4732:
4696:
4695:
4670:
4669:
4647:
4646:
4638:-distribution.
4631:
4603:
4602:
4601:Theoretically,
4579:
4578:
4575:
4504:
4481:
4475:
4474:
4462:
4448:
4447:
4419:
4418:
4364:
4329:
4328:
4306:
4305:
4286:
4285:
4266:
4265:
4262:
4240:
4202:
4201:
4173:
4172:
4143:
4142:
4123:
4122:
4103:
4102:
4083:
4082:
4060:
4059:
4038:
4021:
3999:
3998:
3968:
3936:
3935:
3916:
3915:
3891:
3859:
3858:
3832:
3831:
3807:
3775:
3774:
3750:
3709:
3708:
3684:
3652:
3651:
3628:
3622:
3598:
3597:
3578:
3577:
3558:
3557:
3538:
3537:
3502:
3443:
3437:
3423:
3408:
3389:
3355:
3350:
3349:
3346:
3245:
3240:
3239:
3236:
3124:
3119:
3118:
3115:
3113:Rejection point
3088:
3083:
3082:
3058:
3053:
3052:
3028:
3023:
3022:
3015:
2990:
2989:
2970:
2969:
2897:
2887:
2873:
2827:
2826:
2802:
2791:
2790:
2765:
2764:
2743:
2738:
2737:
2708:
2707:
2679:
2678:
2659:
2658:
2639:
2638:
2558:
2544:
2505:
2497:
2496:
2471:
2470:
2451:
2450:
2431:
2430:
2411:
2410:
2388:
2387:
2357:
2331:
2330:
2297:
2287:
2279:
2278:
2247:
2246:
2227:
2226:
2207:
2206:
2181:
2180:
2161:
2160:
2141:
2140:
2125:
2119:to the sample.
2101:
2100:
2066:
2041:
2028:
2009:
1990:
1977:
1961:
1936:
1911:
1892:
1879:
1841:
1830:
1829:
1825:is defined by:
1807:
1806:
1805:at observation
1785:
1774:
1773:
1730:
1729:
1687:
1670:
1654:
1649:
1648:
1624:
1605:
1597:
1596:
1527:
1508:
1503:
1502:
1479:
1468:
1467:
1405:
1404:
1380:
1361:
1360:
1299:
1298:
1255:
1254:
1216:
1215:
1190:
1179:
1178:
1175:parameter space
1155:
1154:
1117:
1116:
1073:
1072:
1050:
1039:
1033:
1030:
1015:
999:
984:
973:
967:
964:
953:
944:
940:
933:
931:Cook's distance
927:
906:
854:
835:
830:
829:
803:
802:
775:
774:
747:
728:
727:
707:
700:
699:
678:
659:
654:
653:
629:
610:
602:
601:
582:
581:
573:
571:Breakdown point
557:breakdown point
553:
544:
510:
475:
469:
412:
377:
302:
217:
211:
208:
186:
100:breakdown point
83:
69:with different
28:
23:
22:
15:
12:
11:
5:
6126:Brian Ripley's
6121:
6120:External links
6118:
6117:
6116:
6102:
6081:
6061:(3): 309–348,
6047:
6037:
6011:(4): 277–281,
5995:
5985:(9): 909–916,
5974:
5959:10.1002/widm.2
5935:
5921:
5895:
5853:
5829:Water Research
5824:
5810:
5784:
5741:
5731:
5725:
5704:
5699:
5686:
5672:
5656:
5642:
5629:
5587:
5551:
5531:
5517:
5497:
5491:
5476:
5473:
5471:
5470:
5458:
5446:
5434:
5419:
5407:
5391:
5379:
5367:
5355:
5340:
5329:
5326:on 2016-09-15.
5306:
5294:
5277:
5270:
5242:
5235:
5207:
5167:
5165:
5162:
5161:
5160:
5155:
5150:
5143:
5140:
5124:Kalman filters
5080:Main article:
3624:Main article:
3621:
3618:
3605:
3585:
3565:
3545:
3530:
3529:
3517:
3511:
3508:
3505:
3500:
3497:
3494:
3491:
3488:
3485:
3482:
3479:
3476:
3473:
3470:
3467:
3464:
3461:
3458:
3455:
3452:
3449:
3446:
3440:
3432:
3429:
3426:
3419:
3413:
3407:
3404:
3401:
3398:
3395:
3392:
3386:
3382:
3379:
3376:
3373:
3370:
3367:
3362:
3358:
3345:
3342:
3341:
3340:
3328:
3324:
3321:
3318:
3315:
3312:
3309:
3306:
3303:
3300:
3296:
3288:
3283:
3280:
3276:
3272:
3269:
3266:
3263:
3260:
3257:
3252:
3248:
3235:
3232:
3231:
3230:
3219:
3216:
3213:
3209:
3205:
3201:
3197:
3194:
3191:
3188:
3185:
3182:
3179:
3176:
3173:
3170:
3167:
3164:
3161:
3158:
3155:
3150:
3147:
3144:
3140:
3136:
3131:
3127:
3114:
3111:
3110:
3109:
3095:
3091:
3079:
3065:
3061:
3049:
3035:
3031:
3014:
3011:
2997:
2977:
2966:
2965:
2954:
2949:
2945:
2942:
2939:
2936:
2933:
2930:
2927:
2924:
2921:
2918:
2915:
2912:
2909:
2904:
2900:
2896:
2893:
2890:
2880:
2876:
2872:
2869:
2865:
2861:
2858:
2855:
2852:
2849:
2846:
2843:
2840:
2837:
2834:
2809:
2805:
2801:
2798:
2778:
2775:
2772:
2750:
2746:
2723:
2718:
2715:
2692:
2689:
2686:
2666:
2646:
2628:
2627:
2613:
2609:
2606:
2603:
2600:
2597:
2594:
2591:
2588:
2585:
2582:
2579:
2576:
2573:
2570:
2567:
2564:
2561:
2551:
2547:
2543:
2540:
2536:
2532:
2529:
2526:
2523:
2518:
2515:
2512:
2508:
2504:
2478:
2458:
2438:
2418:
2395:
2375:
2372:
2369:
2364:
2360:
2356:
2353:
2350:
2347:
2344:
2341:
2338:
2311:
2307:
2304:
2300:
2294:
2290:
2286:
2266:
2263:
2260:
2257:
2254:
2234:
2214:
2194:
2191:
2188:
2168:
2148:
2124:
2121:
2108:
2093:
2092:
2081:
2078:
2073:
2069:
2065:
2062:
2059:
2054:
2051:
2048:
2044:
2040:
2035:
2031:
2027:
2022:
2019:
2016:
2012:
2008:
2005:
2002:
1997:
1993:
1989:
1984:
1980:
1976:
1973:
1968:
1964:
1960:
1957:
1954:
1949:
1946:
1943:
1939:
1935:
1932:
1929:
1924:
1921:
1918:
1914:
1910:
1907:
1904:
1899:
1895:
1891:
1886:
1882:
1878:
1875:
1872:
1869:
1864:
1859:
1856:
1853:
1848:
1844:
1840:
1837:
1814:
1792:
1788:
1784:
1781:
1761:
1758:
1755:
1752:
1749:
1746:
1743:
1740:
1737:
1717:
1714:
1711:
1708:
1705:
1702:
1699:
1694:
1690:
1686:
1681:
1675:
1669:
1666:
1661:
1657:
1636:
1631:
1627:
1623:
1620:
1617:
1612:
1608:
1604:
1580:
1577:
1574:
1569:
1564:
1561:
1558:
1553:
1548:
1545:
1542:
1539:
1534:
1530:
1526:
1523:
1520:
1515:
1511:
1488:
1483:
1478:
1475:
1461:
1460:
1448:
1443:
1438:
1434:
1430:
1427:
1424:
1421:
1418:
1415:
1412:
1402:
1389:
1384:
1379:
1375:
1371:
1368:
1358:
1346:
1341:
1336:
1332:
1328:
1325:
1322:
1319:
1316:
1311:
1306:
1296:
1284:
1281:
1278:
1273:
1268:
1265:
1262:
1248:
1247:
1235:
1232:
1229:
1226:
1223:
1213:
1199:
1194:
1189:
1186:
1162:
1152:
1140:
1137:
1134:
1129:
1124:
1114:
1102:
1099:
1096:
1091:
1086:
1083:
1080:
1052:
1051:
1002:
1000:
993:
986:
985:
947:
945:
938:
926:
923:
905:
902:
861:
857:
853:
850:
847:
842:
838:
815:
812:
790:
786:
782:
760:
754:
750:
746:
743:
740:
735:
731:
724:
719:
714:
710:
685:
681:
677:
674:
671:
666:
662:
641:
636:
632:
628:
625:
622:
617:
613:
609:
589:
572:
569:
552:
549:
543:
540:
509:
506:
487:(Qn) estimator
471:Main article:
468:
465:
411:
408:
384:speed-of-light
376:
373:
301:
298:
219:
218:
198:
196:
185:
182:
181:
180:
169:
163:
157:
147:
146:
132:
82:
79:
26:
24:
14:
13:
10:
9:
6:
4:
3:
2:
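The sensitivity of the mean and the resistance of the median described for the dataset {2,3,5,6,9} can be verified directly. This is a minimal plain-Python check, not code from the article:

```python
from statistics import mean, median

data = [2, 3, 5, 6, 9]
contaminated = data + [1000]  # one gross outlier appended

# The mean is dragged far from the bulk of the data by a single outlier,
# while the median barely moves.
print(mean(data), median(data))                  # mean 5, median 5
print(mean(contaminated), median(contaminated))  # mean ~170.8, median 5.5
```

The same contrast holds if an existing value is replaced by the outlier rather than appended, as the examples above note.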
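An estimator's reaction to moving a single observation — the idea behind the empirical influence function — can be probed numerically. This sketch reuses the {2,3,5,6,9} dataset from the earlier examples; the helper function name is ours, not the article's:

```python
from statistics import mean, median

def empirical_influence(estimator, sample, i, x):
    """n * (T(sample with x_i replaced by x) - T(sample))."""
    perturbed = list(sample)
    perturbed[i] = x
    return len(sample) * (estimator(perturbed) - estimator(sample))

sample = [2, 3, 5, 6, 9]
for x in (10, 100, 1000):
    # The mean's influence grows without bound in x; the median's stays bounded.
    print(x,
          empirical_influence(mean, sample, 0, x),
          empirical_influence(median, sample, 0, x))
```

For the mean the influence is x − 2 here (unbounded in x), while for the median it stays at 5 no matter how extreme the replacement is.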
Empirical influence function

The empirical influence function describes how an estimator reacts when a single observation in the sample is changed. Let T_n be an estimator based on a sample (x_1, \dots, x_n) of size n. The empirical influence function at observation i is

EIF_i(x) := n \cdot \left( T_n(x_1, \dots, x_{i-1}, x, x_{i+1}, \dots, x_n) - T_n(x_1, \dots, x_{i-1}, x_i, x_{i+1}, \dots, x_n) \right).

For example, for the sample mean \overline{X_n} := (X_1 + \cdots + X_n)/n the empirical influence function is unbounded in x, reflecting the mean's lack of robustness.

Influence function

Viewing the estimator as a functional T of the underlying distribution F, let \Delta_x denote the probability measure that puts mass 1 at the point x. The influence function of T at F is the one-sided (Gateaux) derivative of T at F in the direction of \Delta_x:

IF(x; T; F) := \lim_{t \to 0^+} \frac{T(t\Delta_x + (1 - t)F) - T(F)}{t}.

It measures the standardized effect of an infinitesimal contamination at the point x on the value of the estimator.

Desirable properties

Properties of an influence function that confer good robustness are:

- a finite rejection point \rho^* := \inf\{ r > 0 : IF(x; T; F) = 0 \text{ for } |x| > r \};
- a small gross-error sensitivity \gamma^*(T; F) := \sup_{x \in \mathcal{X}} |IF(x; T; F)|;
- a small local-shift sensitivity \lambda^*(T; F) := \sup_{(x, y) \in \mathcal{X}^2,\, x \neq y} \frac{\| IF(y; T; F) - IF(x; T; F) \|}{\| y - x \|}.

M-estimators

Maximum-likelihood estimation chooses the parameter value maximizing the likelihood \prod_{i=1}^n f(x_i), or equivalently minimizing \sum_{i=1}^n -\log f(x_i). An M-estimator generalizes this by minimizing

\sum_{i=1}^n \rho(x_i)

for a suitable function \rho that need not correspond to a log-density. If \rho has a derivative \psi(x) = \frac{d\rho(x)}{dx}, the minimization can be carried out by solving

\sum_{i=1}^n \psi(x_i) = 0.

For a smooth parametric model with parameter \theta \in \mathbb{R}^p, the influence function of an M-estimator is proportional to its defining \psi function:

IF(x; T, F) = M^{-1} \psi(x, T(F)), \qquad M = -\int_{\mathcal{X}} \frac{\partial \psi(x, \theta)}{\partial \theta} \, dF(x),

where M is a p \times p matrix. The robustness of an M-estimator can therefore be read off from the boundedness and smoothness of its \psi function.

For the t-distribution with \nu degrees of freedom, \psi(x) = \frac{x}{x^2 + \nu}; for \nu = 1 this is the \psi function of the Cauchy distribution. Fitting a t-distribution to the speed-of-light data by maximum likelihood gives \hat\mu = 27.40, \hat\sigma = 3.81 and \hat\nu = 2.13. Fixing \nu = 4 and re-estimating the remaining parameters gives \hat\mu = 27.49 and \hat\sigma = 4.51.
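A location M-estimate with a Huber-type ψ (linear in the middle, clipped in the tails) can be computed by iteratively reweighted averaging. The tuning constant k = 1.345 and the MAD-based scale below are conventional but illustrative choices, and the stopping rule is ours, not a prescription from the article:

```python
from statistics import median

def huber_psi(r, k=1.345):
    # psi(r) = r for |r| <= k, clipped to +/- k in the tails
    return max(-k, min(k, r))

def huber_location(data, k=1.345, tol=1e-9, max_iter=100):
    """Solve sum(psi((x - mu)/s)) = 0 for mu by iteratively reweighted averaging."""
    mu = median(data)
    s = median(abs(x - mu) for x in data) or 1.0  # MAD as a robust scale
    for _ in range(max_iter):
        # Weight w(r) = psi(r)/r turns the estimating equation into a weighted mean.
        weights = []
        for x in data:
            r = (x - mu) / s
            weights.append(1.0 if r == 0 else huber_psi(r, k) / r)
        new_mu = sum(w * x for w, x in zip(weights, data)) / sum(weights)
        if abs(new_mu - mu) < tol:
            return new_mu
        mu = new_mu
    return mu

data = [2, 3, 5, 6, 9, 1000]
print(huber_location(data))  # stays near the bulk of the data, unlike the raw mean
```

The outlier at 1000 receives a tiny weight, so the estimate settles close to the uncontaminated values, while the ordinary mean of the same sample is about 170.8.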