36:
5329:
3018:
1918:
467:
5315:
5353:
5341:
2441:
1704:. Given an unobservable function that relates the independent variable to the dependent variable â say, a line â the deviations of the dependent variable observations from this function are the unobservable errors. If one runs a regression on some data, then the deviations of the dependent variable observations from the
1708:
function are the residuals. If the linear model is applicable, a scatterplot of residuals plotted against the independent variable should be random about zero with no trend to the residuals. If the data exhibit a trend, the regression model is likely incorrect; for example, the true function may be a
635:
from which the statistical unit was chosen randomly. For example, if the mean height in a population of 21-year-old men is 1.75 meters, and one randomly chosen man is 1.80 meters tall, then the "error" is 0.05 meters; if the randomly chosen man is 1.70 meters tall, then the "error" is â0.05 meters.
1787:
is the number of parameters estimated in the model (one for each variable in the regression equation, not including the intercept)). One can then also calculate the mean square of the model by dividing the sum of squares of the model minus the degrees of freedom, which is just the number of
1644:
1326:
2296:
1215:
969:
1844:
The use of the term "error" as discussed in the sections above is in the sense of a deviation of a value from a hypothetical unobserved value. At least two other uses also occur in statistics, both referring to observable
1450:
1788:
parameters. Then the F value can be calculated by dividing the mean square of the model by the mean square of the error, and we can then determine significance (which is why you want the mean squares to begin with.).
817:
1830:, where the case in question is somehow different from the others in a dataset. For example, a large residual may be expected in the middle of the domain, but considered an outlier at the end of the domain.
894:
2473:
1775:(they are the same because ANOVA is a type of regression), the sum of squares of the residuals (aka sum of squares of the error) is divided by the degrees of freedom (where the degrees of freedom equal
1768:
minus the number of parameters (excluding the intercept) p being estimated - 1). This forms an unbiased estimate of the variance of the unobserved errors, and is called the mean squared error.
1860:(MSE) refers to the amount by which the values predicted by an estimator differ from the quantities being estimated (typically outside the sample from which the model was estimated). The
1500:
1099:
1581:
1029:
1574:
2466:
616:). In this case, the errors are the deviations of the observations from the population mean, while the residuals are the deviations of the observations from the sample mean.
2366:
692:
Note that, because of the definition of the sample mean, the sum of the residuals within a random sample is necessarily zero, and thus the residuals are necessarily
1527:
1878:(SSR) is the sum of the squares of the deviations of the actual values from the predicted values, within the sample used for estimation. This is the basis for the
4450:
2535:
2459:
5384:
4955:
5105:
4729:
1234:
1129:
3370:
2544:
905:
4503:
2549:
1807:
than the variability of residuals at the ends of the domain: linear regressions fit endpoints better than the middle. This is also reflected in the
497:
4942:
2982:
1369:
407:
2224:
2191:
2120:
2087:
2054:
743:
3365:
3065:
2877:
2517:
397:
1771:
Another method to calculate the mean square of error when analyzing the variance of linear regression using a technique like that used in
3969:
3117:
1660:
appears in both the numerator and the denominator and cancels. That is fortunate because it means that even though we do not know
2857:
2507:
4752:
4644:
2777:
2398:
2280:
2155:
79:
57:
831:
5357:
4930:
4804:
2583:
361:
1363:. That fact, and the normal and chi-squared distributions given above form the basis of calculations involving the t-statistic:
4988:
4649:
4394:
3765:
3355:
1808:
1761:
1120:
651:
of the unobservable statistical error. Consider the previous example with men's heights and suppose we have a random sample of
412:
350:
170:
145:
3979:
5039:
4251:
4058:
3947:
3905:
2819:
2445:
2340:
1952:
1942:
1709:
quadratic or higher order polynomial. If they are random, or have no trend, but "fan out" - they exhibit a phenomenon called
272:
3144:
5379:
5282:
4241:
2425:
1992:
231:
4291:
1803:
where the errors are identically distributed, the variability of residuals of inputs in the middle of the domain will be
5389:
4833:
4782:
4767:
4757:
4626:
4498:
4465:
4246:
4076:
3005:
2905:
2895:
2814:
2759:
1987:
490:
4902:
4203:
1740:
estimate of the variance of the unobserved errors, the bias is removed by dividing the sum of the squared residuals by
5177:
4978:
3957:
3626:
3090:
3032:
2847:
2420:
1874:
1652:
of the numerator and the denominator separately depend on the value of the unobservable population standard deviation
613:
433:
5062:
5029:
1665:
1347:
of a population with unknown mean and unknown variance. No correction is necessary if the population mean is known.
5034:
4777:
4536:
4442:
4422:
4330:
4041:
3859:
3342:
3214:
2527:
2002:
1957:
1882:
estimate, where the regression coefficients are chosen such that the SSR is minimal (i.e. its derivative is zero).
1862:
402:
371:
298:
4208:
3974:
3832:
1458:
1048:
4794:
4562:
4283:
4137:
4066:
3986:
3844:
3825:
3533:
3254:
2872:
2699:
2663:
2632:
2022:
1892:
515:
392:
381:
345:
252:
4907:
2852:
1818:
Thus to compare residuals at different inputs, one needs to adjust the residuals by the expected variability of
640:
of the entire population, is typically unobservable, and hence the statistical error cannot be observed either.
453:
5277:
5044:
4592:
4557:
4521:
4306:
3748:
3657:
3616:
3528:
3219:
3058:
2730:
2694:
2622:
2512:
2494:
1947:
1649:
1113:
695:
324:
247:
140:
119:
50:
44:
4314:
4298:
2388:
5186:
4799:
4739:
4676:
4036:
3898:
3888:
3738:
3652:
2593:
605:
483:
376:
4947:
4884:
1724:(MSE). The mean squared error of a regression is a number computed from the sum of squares of the computed
1224:, on the other hand, is observable. The quotient of that sum by Ï has a chi-squared distribution with only
5224:
5154:
4639:
4526:
3523:
3420:
3327:
3206:
3105:
2946:
2772:
2637:
2627:
2578:
2112:
Econometrics in Theory and
Practice: Analysis of Cross Section, Time Series and Panel Data with Stata 15.1
1887:
1812:
1340:
632:
527:
340:
335:
277:
61:
5345:
4223:
984:
5249:
5191:
5134:
4960:
4853:
4762:
4488:
4372:
4231:
4113:
4105:
3920:
3816:
3794:
3753:
3718:
3685:
3631:
3606:
3561:
3500:
3460:
3262:
3085:
3027:
2987:
2951:
2936:
2887:
2831:
2658:
1680:. This t-statistic can be interpreted as "the number of standard errors away from the regression line."
722:
699:. The statistical errors, on the other hand, are independent, and their sum within the random sample is
428:
124:
5328:
4218:
3017:
2415:
1917:
1540:
466:
2182:
Frederik Michel
Dekking; Cornelis Kraaikamp; Hendrik Paul LopuhaÀ; Ludolf Erwin Meester (2005-06-15).
1220:
However, this quantity is not observable as the population mean is unknown. The sum of squares of the
5172:
4747:
4696:
4672:
4634:
4552:
4531:
4362:
4340:
4309:
4095:
4046:
3964:
3937:
3893:
3849:
3611:
3387:
3267:
2992:
2931:
2918:
2867:
2767:
2689:
2668:
2642:
2017:
1977:
1701:
585:
554:
is the deviation of the observed value from the true value of a quantity of interest (for example, a
448:
319:
287:
242:
221:
129:
5319:
5244:
5167:
4848:
4612:
4605:
4567:
4475:
4455:
4427:
4160:
4026:
4021:
4011:
4003:
3821:
3782:
3672:
3662:
3571:
3350:
3306:
3224:
3149:
3051:
3010:
2941:
2826:
2793:
2745:
2735:
2714:
2709:
2588:
2570:
2555:
2486:
1997:
1972:
1967:
1937:
1689:
1673:
707:
573:
539:
366:
267:
262:
216:
165:
155:
100:
4894:
5333:
5144:
4998:
4843:
4719:
4616:
4600:
4577:
4354:
4088:
4071:
4031:
3942:
3837:
3799:
3770:
3730:
3690:
3636:
3553:
3239:
3234:
3022:
2926:
2915:
2740:
2375:
1932:
1923:
1721:
1710:
1356:
734:
471:
200:
185:
1736:, the number of observations, the result is the mean of the squared residuals. Since this is a
5239:
5209:
5201:
5021:
5012:
4937:
4868:
4724:
4709:
4684:
4572:
4513:
4379:
4367:
3993:
3910:
3854:
3777:
3621:
3543:
3322:
3196:
2977:
2704:
2614:
2605:
2394:
2336:
2276:
2230:
2220:
2197:
2187:
2161:
2151:
2116:
2083:
2050:
2044:
1854:
1839:
1800:
1737:
1360:
257:
160:
114:
2257:
2077:
1639:{\displaystyle \operatorname {Var} \left({\overline {X}}_{n}\right)={\frac {\sigma ^{2}}{n}}}
5264:
5219:
4983:
4970:
4863:
4838:
4772:
4704:
4582:
4190:
4083:
4016:
3929:
3876:
3695:
3566:
3360:
3159:
3126:
2673:
2540:
2332:
2110:
1846:
1714:
535:
282:
211:
1505:
5181:
4925:
4787:
4714:
4389:
4263:
4236:
4213:
4182:
3809:
3804:
3758:
3488:
3139:
2956:
2862:
2798:
1962:
1344:
823:
637:
555:
443:
150:
2803:
2451:
5130:
5125:
3588:
3518:
3164:
2900:
2144:
2012:
2007:
1982:
1035:
628:
531:
195:
2252:
Principles and
Procedures of Statistics, with Special Reference to Biological Sciences
5373:
5287:
5254:
5117:
5078:
4889:
4858:
4322:
4276:
3881:
3583:
3410:
3174:
3169:
2972:
2502:
2482:
1879:
1321:{\displaystyle {\frac {1}{\sigma ^{2}}}\sum _{i=1}^{n}r_{i}^{2}\sim \chi _{n-1}^{2}.}
700:
314:
190:
3440:
2250:
2184:
A modern introduction to probability and statistics : understanding why and how
1891:(SAE) is the sum of the absolute values of the residuals, which is minimized in the
5229:
5162:
5139:
5054:
4384:
3680:
3578:
3513:
3455:
3377:
3332:
2357:
1823:
1210:{\displaystyle {\frac {1}{\sigma ^{2}}}\sum _{i=1}^{n}e_{i}^{2}\sim \chi _{n}^{2}.}
589:
180:
669:
The difference between the height of each man in the sample and the unobservable
5272:
5234:
4917:
4818:
4680:
4493:
4460:
3952:
3869:
3864:
3508:
3465:
3445:
3425:
3415:
3184:
2560:
2361:
1672: â 1 degrees of freedom. We can therefore use this quotient to find a
964:{\displaystyle {\overline {X}}\sim N\left(\mu ,{\frac {\sigma ^{2}}{n}}\right).}
715:
657:
569:
551:
226:
175:
680:
The difference between the height of each man in the sample and the observable
4118:
3598:
3298:
3229:
3179:
3154:
3074:
1913:
1359:
and the sample mean can be shown to be independent of each other, using, e.g.
564:
543:
511:
2234:
2201:
4271:
4123:
3743:
3538:
3450:
3435:
3430:
3395:
1445:{\displaystyle T={\frac {{\overline {X}}_{n}-\mu _{0}}{S_{n}/{\sqrt {n}}}},}
2165:
2440:
17:
3787:
3405:
3282:
3277:
3272:
3244:
1827:
5292:
4993:
2379:
1795:
of residuals at different data points (of the input variable) may vary
812:{\displaystyle X_{1},\dots ,X_{n}\sim N\left(\mu ,\sigma ^{2}\right)\,}
711:
5214:
4195:
4169:
4149:
3400:
3191:
2217:
Practical statistics for data scientists : 50 essential concepts
1713:. If all of the residuals are equal, or do not fan out, they exhibit
1799:
the errors themselves are identically distributed. Concretely, in a
1791:
However, because of the behavior of the process of regression, the
1772:
1664:, we know the probability distribution of this quotient: it has a
1576:
accounts for the standard deviation of the errors according to:
3134:
2275:(Online-Ausg. ed.). Cambridge: Cambridge University Press.
609:
5103:
4670:
4417:
3716:
3486:
3103:
3047:
2455:
2326:
733:
If we assume a normally distributed population with mean Ό and
1720:
However, a terminological difference arises in the expression
1529:
represents the sample standard deviation for a sample of size
29:
3043:
889:{\displaystyle {\overline {X}}={X_{1}+\cdots +X_{n} \over n}}
647:(or fitting deviation), on the other hand, is an observable
526:
are two closely related and easily confused measures of the
1826:. This is particularly important in the case of detecting
627:) is the amount by which an observation differs from its
706:
One can standardize statistical errors (especially of a
2219:(First ed.). Sebastopol, CA: O'Reilly Media Inc.
737:Ï, and choose individuals independently, then we have
714:(or "standard score"), and standardize residuals in a
1700:
is subtle and important, and leads to the concept of
1584:
1543:
1508:
1461:
1372:
1237:
1132:
1051:
987:
908:
834:
746:
562:
is the difference between the observed value and the
4956:
Autoregressive conditional heteroskedasticity (ARCH)
5263:
5200:
5153:
5116:
5071:
5053:
5020:
5011:
4969:
4916:
4877:
4826:
4817:
4738:
4695:
4625:
4591:
4545:
4512:
4474:
4441:
4353:
4262:
4181:
4136:
4104:
4057:
4002:
3928:
3919:
3729:
3671:
3645:
3597:
3552:
3499:
3386:
3341:
3315:
3297:
3253:
3205:
3125:
3116:
2965:
2914:
2886:
2840:
2786:
2758:
2723:
2682:
2651:
2613:
2602:
2569:
2526:
2493:
2367:Journal of the Royal Statistical Society, Series B
2249:
2143:
1906:(MR) is always zero for least-squares estimators.
1638:
1568:
1521:
1494:
1444:
1320:
1209:
1093:
1023:
963:
888:
811:
568:value of the quantity of interest (for example, a
604:Suppose there is a series of observations from a
1870:(SSE) is the MSE multiplied by the sample size.
4504:Multivariate adaptive regression splines (MARS)
2248:Steel, Robert G. D.; Torrie, James H. (1960).
576:, where the concepts are sometimes called the
3059:
2467:
2364:(1968). "A general definition of residuals".
2297:"7.3: Types of Outliers in Linear Regression"
2177:
2175:
1339: â 1 degrees of freedom results in
491:
8:
2079:Introductory Econometrics: A Modern Approach
1834:Other uses of the word "error" in statistics
1495:{\displaystyle {\overline {X}}_{n}-\mu _{0}}
1094:{\displaystyle r_{i}=X_{i}-{\overline {X}}.}
899:is a random variable distributed such that:
2325:Cook, R. Dennis; Weisberg, Sanford (1982).
5113:
5100:
5017:
4823:
4692:
4667:
4438:
4414:
4142:
3925:
3726:
3713:
3496:
3483:
3122:
3113:
3100:
3066:
3052:
3044:
2610:
2474:
2460:
2452:
498:
484:
91:
1625:
1619:
1606:
1596:
1583:
1559:
1554:
1548:
1542:
1513:
1507:
1486:
1473:
1463:
1460:
1429:
1424:
1418:
1406:
1393:
1383:
1379:
1371:
1309:
1298:
1285:
1280:
1270:
1259:
1247:
1238:
1236:
1198:
1193:
1180:
1175:
1165:
1154:
1142:
1133:
1131:
1078:
1069:
1056:
1050:
1020:
1005:
992:
986:
942:
936:
909:
907:
874:
855:
848:
835:
833:
808:
797:
770:
751:
745:
80:Learn how and when to remove this message
2215:Peter Bruce; Andrew Bruce (2017-05-10).
572:). The distinction is most important in
43:This article includes a list of general
2983:Numerical smoothing and differentiation
2035:
1732:. If that sum of squares is divided by
661:could serve as a good estimator of the
420:
306:
106:
99:
5030:KaplanâMeier estimator (product limit)
1866:(RMSE) is the square-root of MSE. The
631:, the latter being based on the whole
584:and where they lead to the concept of
2328:Residuals and Influence in Regression
7:
5385:Statistical deviation and dispersion
5340:
5040:Accelerated failure time (AFT) model
2518:Iteratively reweighted least squares
612:of that distribution (the so-called
546:" (not necessarily observable). The
5352:
4635:Analysis of variance (ANOVA, anova)
1228: â 1 degrees of freedom:
1024:{\displaystyle e_{i}=X_{i}-\mu ,\,}
4730:CochranâMantelâHaenszel statistics
3356:Pearson product-moment correlation
2536:Pearson product-moment correlation
49:it lacks sufficient corresponding
25:
2393:(2nd ed.). New York: Wiley.
2115:. Springer Singapore. p. 7.
1815:: endpoints have more influence.
1569:{\displaystyle S_{n}/{\sqrt {n}}}
5351:
5339:
5327:
5314:
5313:
3016:
2439:
2146:Intermediate statistical methods
2082:. Cengage Learning. p. 57.
1916:
465:
34:
4989:Least-squares spectral analysis
1357:sum of squares of the residuals
413:Least-squares spectral analysis
351:Generalized estimating equation
171:Multinomial logistic regression
146:Vector generalized linear model
3970:Mean-unbiased minimum-variance
2273:Applied linear models with SAS
2142:Wetherill, G. Barrie. (1981).
1953:Innovation (signal processing)
1943:Error detection and correction
1811:of various data points on the
1728:, and not of the unobservable
636:The expected value, being the
1:
5283:Geographic information system
4499:Simultaneous equations models
1993:Reduced chi-squared statistic
232:Nonlinear mixed-effects model
4466:Coefficient of determination
4077:Uniformly most powerful test
3006:Regression analysis category
2896:Response surface methodology
2331:(Repr. ed.). New York:
2150:. London: Chapman and Hall.
1988:Random and systematic errors
1601:
1468:
1388:
1083:
1038:values of zero, whereas the
914:
840:
608:and we want to estimate the
5035:Proportional hazards models
4979:Spectral density estimation
4961:Vector autoregression (VAR)
4395:Maximum posterior estimator
3627:Randomized controlled trial
2878:FrischâWaughâLovell theorem
2848:Mean and predicted response
2421:Encyclopedia of Mathematics
2186:. London: Springer London.
1875:Sum of squares of residuals
1752: â 1, instead of
1537:, and the denominator term
729:In univariate distributions
592:, "errors" are also called
434:Mean and predicted response
5406:
4795:Multivariate distributions
3215:Average absolute deviation
2528:Correlation and dependence
2387:Weisberg, Sanford (1985).
2271:Zelterman, Daniel (2010).
2003:Root mean square deviation
1958:Lack-of-fit sum of squares
1837:
1692:, the distinction between
1355:It is remarkable that the
1104:The sum of squares of the
227:Linear mixed-effects model
5309:
5112:
5099:
4783:Structural equation model
4691:
4666:
4437:
4413:
4145:
4119:Score/Lagrange multiplier
3725:
3712:
3534:Sample size determination
3495:
3482:
3112:
3099:
3081:
3001:
2873:Minimum mean-square error
2760:Decomposition of variance
2664:Growth curve (statistics)
2633:Generalized least squares
2390:Applied Linear Regression
2076:Wooldridge, J.M. (2019).
2023:Type I and type II errors
1893:least absolute deviations
1650:probability distributions
393:Least absolute deviations
5278:Environmental statistics
4800:Elliptical distributions
4593:Generalized linear model
4522:Simple linear regression
4292:HodgesâLehmann estimator
3749:Probability distribution
3658:Stochastic approximation
3220:Coefficient of variation
2731:Generalized linear model
2623:Simple linear regression
2513:Non-linear least squares
2495:Computational statistics
1948:Explained sum of squares
1895:approach to regression.
1868:sum of squares of errors
1666:Student's t-distribution
1331:This difference between
1114:chi-squared distribution
141:Generalized linear model
4938:Cross-correlation (XCF)
4546:Non-standard predictors
3980:LehmannâScheffĂ© theorem
3653:Adaptive clinical trial
2256:. McGraw-Hill. p.
2046:A Guide to Econometrics
1813:regression coefficients
1502:represents the errors,
606:univariate distribution
64:more precise citations.
5334:Mathematics portal
5155:Engineering statistics
5063:NelsonâAalen estimator
4640:Analysis of covariance
4527:Ordinary least squares
4451:Pearson product-moment
3855:Statistical functional
3766:Empirical distribution
3599:Controlled experiments
3328:Frequency distribution
3106:Descriptive statistics
3023:Mathematics portal
2947:Orthogonal polynomials
2773:Analysis of covariance
2638:Weighted least squares
2628:Ordinary least squares
2579:Ordinary least squares
2049:. Wiley. p. 576.
1902:(ME) is the bias. The
1888:sum of absolute errors
1863:root mean square error
1783: â 1, where
1640:
1570:
1523:
1496:
1446:
1343:for the estimation of
1322:
1275:
1211:
1170:
1095:
1025:
965:
890:
813:
472:Mathematics portal
398:Iteratively reweighted
5250:Population statistics
5192:System identification
4926:Autocorrelation (ACF)
4854:Exponential smoothing
4768:Discriminant analysis
4763:Canonical correlation
4627:Partition of variance
4489:Regression validation
4333:(JonckheereâTerpstra)
4232:Likelihood-ratio test
3921:Frequentist inference
3833:Locationâscale family
3754:Sampling distribution
3719:Statistical inference
3686:Cross-sectional study
3673:Observational studies
3632:Randomized experiment
3461:Stem-and-leaf display
3263:Central limit theorem
2988:System identification
2952:Chebyshev polynomials
2937:Numerical integration
2888:Design of experiments
2832:Regression validation
2659:Polynomial regression
2584:Partial least squares
2301:Statistics LibreTexts
1702:studentized residuals
1641:
1571:
1524:
1522:{\displaystyle S_{n}}
1497:
1447:
1323:
1255:
1212:
1150:
1096:
1026:
966:
891:
814:
723:studentized residuals
586:studentized residuals
429:Regression validation
408:Bayesian multivariate
125:Polynomial regression
5380:Errors and residuals
5173:Probabilistic design
4758:Principal components
4601:Exponential families
4553:Nonlinear regression
4532:General linear model
4494:Mixed effects models
4484:Errors and residuals
4461:Confounding variable
4363:Bayesian probability
4341:Van der Waerden test
4331:Ordered alternative
4096:Multiple comparisons
3975:RaoâBlackwellization
3938:Estimating equations
3894:Statistical distance
3612:Factorial experiment
3145:Arithmetic-Geometric
2993:Moving least squares
2932:Approximation theory
2868:Studentized residual
2858:Errors and residuals
2853:GaussâMarkov theorem
2768:Analysis of variance
2690:Nonlinear regression
2669:Segmented regression
2643:General linear model
2561:Confounding variable
2508:Linear least squares
2448:at Wikimedia Commons
2446:Errors and residuals
2043:Kennedy, P. (2008).
2018:Studentized residual
1978:Propagation of error
1582:
1541:
1506:
1459:
1370:
1235:
1130:
1049:
985:
906:
832:
744:
721:, or more generally
665:mean. Then we have:
582:regression residuals
454:GaussâMarkov theorem
449:Studentized residual
439:Errors and residuals
273:Principal components
243:Nonlinear regression
130:General linear model
5390:Regression analysis
5245:Official statistics
5168:Methods engineering
4849:Seasonal adjustment
4617:Poisson regressions
4537:Bayesian regression
4476:Regression analysis
4456:Partial correlation
4428:Regression analysis
4027:Prediction interval
4022:Likelihood interval
4012:Confidence interval
4004:Interval estimation
3965:Unbiased estimators
3783:Model specification
3663:Up-and-down designs
3351:Partial correlation
3307:Index of dispersion
3225:Interquartile range
3011:Statistics category
2942:Gaussian quadrature
2827:Model specification
2794:Stepwise regression
2652:Predictor structure
2589:Total least squares
2571:Regression analysis
2556:Partial correlation
2487:regression analysis
2416:"Errors, theory of"
1998:Regression dilution
1973:Observational error
1968:Mean absolute error
1938:Consensus forecasts
1809:influence functions
1690:regression analysis
1674:confidence interval
1341:Bessel's correction
1314:
1290:
1203:
1185:
708:normal distribution
574:regression analysis
299:Errors-in-variables
166:Logistic regression
156:Binomial regression
101:Regression analysis
95:Part of a series on
5265:Spatial statistics
5145:Medical statistics
5045:First hitting time
4999:Whittle likelihood
4650:Degrees of freedom
4645:Multivariate ANOVA
4578:Heteroscedasticity
4390:Bayesian estimator
4355:Bayesian inference
4204:KolmogorovâSmirnov
4089:Randomization test
4059:Testing hypotheses
4032:Tolerance interval
3943:Maximum likelihood
3838:Exponential family
3771:Density estimation
3731:Statistical theory
3691:Natural experiment
3637:Scientific control
3554:Survey methodology
3240:Standard deviation
3028:Statistics outline
2927:Numerical analysis
1933:Absolute deviation
1924:Mathematics portal
1856:mean squared error
1762:degrees of freedom
1722:mean squared error
1711:heteroscedasticity
1636:
1566:
1519:
1492:
1442:
1318:
1294:
1276:
1207:
1189:
1171:
1121:degrees of freedom
1106:statistical errors
1091:
1021:
976:statistical errors
961:
886:
809:
735:standard deviation
540:statistical sample
186:Multinomial probit
27:Statistics concept
5367:
5366:
5305:
5304:
5301:
5300:
5240:National accounts
5210:Actuarial science
5202:Social statistics
5095:
5094:
5091:
5090:
5087:
5086:
5022:Survival function
5007:
5006:
4869:Granger causality
4710:Contingency table
4685:Survival analysis
4662:
4661:
4658:
4657:
4514:Linear regression
4409:
4408:
4405:
4404:
4380:Credible interval
4349:
4348:
4132:
4131:
3948:Method of moments
3817:Parametric family
3778:Statistical model
3708:
3707:
3704:
3703:
3622:Random assignment
3544:Statistical power
3478:
3477:
3474:
3473:
3323:Contingency table
3293:
3292:
3160:Generalized/power
3041:
3040:
3033:Statistics topics
2978:Calibration curve
2787:Model exploration
2754:
2753:
2724:Non-normal errors
2615:Linear regression
2606:statistical model
2444:Media related to
2226:978-1-4919-5296-2
2193:978-1-85233-896-1
2122:978-981-329-019-8
2089:978-1-337-67133-0
2056:978-1-4051-8257-7
1847:prediction errors
1840:Bias (statistics)
1801:linear regression
1760:is the number of
1634:
1604:
1564:
1471:
1437:
1434:
1391:
1253:
1148:
1086:
951:
917:
884:
843:
675:statistical error
621:statistical error
578:regression errors
508:
507:
161:Binary regression
120:Simple regression
115:Linear regression
90:
89:
82:
16:(Redirected from
5397:
5355:
5354:
5343:
5342:
5332:
5331:
5317:
5316:
5220:Crime statistics
5114:
5101:
5018:
4984:Fourier analysis
4971:Frequency domain
4951:
4898:
4864:Structural break
4824:
4773:Cluster analysis
4720:Log-linear model
4693:
4668:
4609:
4583:Homoscedasticity
4439:
4415:
4334:
4326:
4318:
4317:(KruskalâWallis)
4302:
4287:
4242:Cross validation
4227:
4209:AndersonâDarling
4156:
4143:
4114:Likelihood-ratio
4106:Parametric tests
4084:Permutation test
4067:1- & 2-tails
3958:Minimum distance
3930:Point estimation
3926:
3877:Optimal decision
3828:
3727:
3714:
3696:Quasi-experiment
3646:Adaptive designs
3497:
3484:
3361:Rank correlation
3123:
3114:
3101:
3068:
3061:
3054:
3045:
3021:
3020:
2778:Multivariate AOV
2674:Local regression
2611:
2603:Regression as a
2594:Ridge regression
2541:Rank correlation
2476:
2469:
2462:
2453:
2443:
2429:
2411:
2409:
2407:
2383:
2353:
2351:
2349:
2333:Chapman and Hall
2312:
2311:
2309:
2308:
2293:
2287:
2286:
2268:
2262:
2261:
2255:
2245:
2239:
2238:
2212:
2206:
2205:
2179:
2170:
2169:
2149:
2139:
2133:
2132:
2130:
2129:
2109:Das, P. (2019).
2106:
2100:
2099:
2097:
2096:
2073:
2067:
2066:
2064:
2063:
2040:
1926:
1921:
1920:
1822:which is called
1715:homoscedasticity
1645:
1643:
1642:
1637:
1635:
1630:
1629:
1620:
1615:
1611:
1610:
1605:
1597:
1575:
1573:
1572:
1567:
1565:
1560:
1558:
1553:
1552:
1528:
1526:
1525:
1520:
1518:
1517:
1501:
1499:
1498:
1493:
1491:
1490:
1478:
1477:
1472:
1464:
1451:
1449:
1448:
1443:
1438:
1436:
1435:
1430:
1428:
1423:
1422:
1412:
1411:
1410:
1398:
1397:
1392:
1384:
1380:
1327:
1325:
1324:
1319:
1313:
1308:
1289:
1284:
1274:
1269:
1254:
1252:
1251:
1239:
1216:
1214:
1213:
1208:
1202:
1197:
1184:
1179:
1169:
1164:
1149:
1147:
1146:
1134:
1100:
1098:
1097:
1092:
1087:
1079:
1074:
1073:
1061:
1060:
1030:
1028:
1027:
1022:
1010:
1009:
997:
996:
970:
968:
967:
962:
957:
953:
952:
947:
946:
937:
918:
910:
895:
893:
892:
887:
885:
880:
879:
878:
860:
859:
849:
844:
836:
818:
816:
815:
810:
807:
803:
802:
801:
775:
774:
756:
755:
500:
493:
486:
470:
469:
377:Ridge regression
212:Multilevel model
92:
85:
78:
74:
71:
65:
60:this article by
51:inline citations
38:
37:
30:
21:
5405:
5404:
5400:
5399:
5398:
5396:
5395:
5394:
5370:
5369:
5368:
5363:
5326:
5297:
5259:
5196:
5182:quality control
5149:
5131:Clinical trials
5108:
5083:
5067:
5055:Hazard function
5049:
5003:
4965:
4949:
4912:
4908:BreuschâGodfrey
4896:
4873:
4813:
4788:Factor analysis
4734:
4715:Graphical model
4687:
4654:
4621:
4607:
4587:
4541:
4508:
4470:
4433:
4432:
4401:
4345:
4332:
4324:
4316:
4300:
4285:
4264:Rank statistics
4258:
4237:Model selection
4225:
4183:Goodness of fit
4177:
4154:
4128:
4100:
4053:
3998:
3987:Median unbiased
3915:
3826:
3759:Order statistic
3721:
3700:
3667:
3641:
3593:
3548:
3491:
3489:Data collection
3470:
3382:
3337:
3311:
3289:
3249:
3201:
3118:Continuous data
3108:
3095:
3077:
3072:
3042:
3037:
3015:
2997:
2961:
2957:Chebyshev nodes
2910:
2906:Bayesian design
2882:
2863:Goodness of fit
2836:
2809:
2799:Model selection
2782:
2750:
2719:
2678:
2647:
2604:
2598:
2565:
2522:
2489:
2480:
2436:
2414:
2405:
2403:
2401:
2386:
2362:Snell, E. Joyce
2356:
2347:
2345:
2343:
2324:
2321:
2319:Further reading
2316:
2315:
2306:
2304:
2295:
2294:
2290:
2283:
2270:
2269:
2265:
2247:
2246:
2242:
2227:
2214:
2213:
2209:
2194:
2181:
2180:
2173:
2158:
2141:
2140:
2136:
2127:
2125:
2123:
2108:
2107:
2103:
2094:
2092:
2090:
2075:
2074:
2070:
2061:
2059:
2057:
2042:
2041:
2037:
2032:
2027:
1963:Margin of error
1922:
1915:
1912:
1842:
1836:
1686:
1621:
1595:
1591:
1580:
1579:
1544:
1539:
1538:
1509:
1504:
1503:
1482:
1462:
1457:
1456:
1414:
1413:
1402:
1382:
1381:
1368:
1367:
1353:
1345:sample variance
1243:
1233:
1232:
1138:
1128:
1127:
1065:
1052:
1047:
1046:
1001:
988:
983:
982:
938:
929:
925:
904:
903:
870:
851:
850:
830:
829:
793:
786:
782:
766:
747:
742:
741:
731:
602:
556:population mean
504:
464:
444:Goodness of fit
151:Discrete choice
86:
75:
69:
66:
56:Please help to
55:
39:
35:
28:
23:
22:
15:
12:
11:
5:
5403:
5401:
5393:
5392:
5387:
5382:
5372:
5371:
5365:
5364:
5362:
5361:
5349:
5337:
5323:
5310:
5307:
5306:
5303:
5302:
5299:
5298:
5296:
5295:
5290:
5285:
5280:
5275:
5269:
5267:
5261:
5260:
5258:
5257:
5252:
5247:
5242:
5237:
5232:
5227:
5222:
5217:
5212:
5206:
5204:
5198:
5197:
5195:
5194:
5189:
5184:
5175:
5170:
5165:
5159:
5157:
5151:
5150:
5148:
5147:
5142:
5137:
5128:
5126:Bioinformatics
5122:
5120:
5110:
5109:
5104:
5097:
5096:
5093:
5092:
5089:
5088:
5085:
5084:
5082:
5081:
5075:
5073:
5069:
5068:
5066:
5065:
5059:
5057:
5051:
5050:
5048:
5047:
5042:
5037:
5032:
5026:
5024:
5015:
5009:
5008:
5005:
5004:
5002:
5001:
4996:
4991:
4986:
4981:
4975:
4973:
4967:
4966:
4964:
4963:
4958:
4953:
4945:
4940:
4935:
4934:
4933:
4931:partial (PACF)
4922:
4920:
4914:
4913:
4911:
4910:
4905:
4900:
4892:
4887:
4881:
4879:
4878:Specific tests
4875:
4874:
4872:
4871:
4866:
4861:
4856:
4851:
4846:
4841:
4836:
4830:
4828:
4821:
4815:
4814:
4812:
4811:
4810:
4809:
4808:
4807:
4792:
4791:
4790:
4780:
4778:Classification
4775:
4770:
4765:
4760:
4755:
4750:
4744:
4742:
4736:
4735:
4733:
4732:
4727:
4725:McNemar's test
4722:
4717:
4712:
4707:
4701:
4699:
4689:
4688:
4671:
4664:
4663:
4660:
4659:
4656:
4655:
4653:
4652:
4647:
4642:
4637:
4631:
4629:
4623:
4622:
4620:
4619:
4603:
4597:
4595:
4589:
4588:
4586:
4585:
4580:
4575:
4570:
4565:
4563:Semiparametric
4560:
4555:
4549:
4547:
4543:
4542:
4540:
4539:
4534:
4529:
4524:
4518:
4516:
4510:
4509:
4507:
4506:
4501:
4496:
4491:
4486:
4480:
4478:
4472:
4471:
4469:
4468:
4463:
4458:
4453:
4447:
4445:
4435:
4434:
4431:
4430:
4425:
4419:
4418:
4411:
4410:
4407:
4406:
4403:
4402:
4400:
4399:
4398:
4397:
4387:
4382:
4377:
4376:
4375:
4370:
4359:
4357:
4351:
4350:
4347:
4346:
4344:
4343:
4338:
4337:
4336:
4328:
4320:
4304:
4301:(MannâWhitney)
4296:
4295:
4294:
4281:
4280:
4279:
4268:
4266:
4260:
4259:
4257:
4256:
4255:
4254:
4249:
4244:
4234:
4229:
4226:(ShapiroâWilk)
4221:
4216:
4211:
4206:
4201:
4193:
4187:
4185:
4179:
4178:
4176:
4175:
4167:
4158:
4146:
4140:
4138:Specific tests
4134:
4133:
4130:
4129:
4127:
4126:
4121:
4116:
4110:
4108:
4102:
4101:
4099:
4098:
4093:
4092:
4091:
4081:
4080:
4079:
4069:
4063:
4061:
4055:
4054:
4052:
4051:
4050:
4049:
4044:
4034:
4029:
4024:
4019:
4014:
4008:
4006:
4000:
3999:
3997:
3996:
3991:
3990:
3989:
3984:
3983:
3982:
3977:
3962:
3961:
3960:
3955:
3950:
3945:
3934:
3932:
3923:
3917:
3916:
3914:
3913:
3908:
3903:
3902:
3901:
3891:
3886:
3885:
3884:
3874:
3873:
3872:
3867:
3862:
3852:
3847:
3842:
3841:
3840:
3835:
3830:
3814:
3813:
3812:
3807:
3802:
3792:
3791:
3790:
3785:
3775:
3774:
3773:
3763:
3762:
3761:
3751:
3746:
3741:
3735:
3733:
3723:
3722:
3717:
3710:
3709:
3706:
3705:
3702:
3701:
3699:
3698:
3693:
3688:
3683:
3677:
3675:
3669:
3668:
3666:
3665:
3660:
3655:
3649:
3647:
3643:
3642:
3640:
3639:
3634:
3629:
3624:
3619:
3614:
3609:
3603:
3601:
3595:
3594:
3592:
3591:
3589:Standard error
3586:
3581:
3576:
3575:
3574:
3569:
3558:
3556:
3550:
3549:
3547:
3546:
3541:
3536:
3531:
3526:
3521:
3519:Optimal design
3516:
3511:
3505:
3503:
3493:
3492:
3487:
3480:
3479:
3476:
3475:
3472:
3471:
3469:
3468:
3463:
3458:
3453:
3448:
3443:
3438:
3433:
3428:
3423:
3418:
3413:
3408:
3403:
3398:
3392:
3390:
3384:
3383:
3381:
3380:
3375:
3374:
3373:
3368:
3358:
3353:
3347:
3345:
3339:
3338:
3336:
3335:
3330:
3325:
3319:
3317:
3316:Summary tables
3313:
3312:
3310:
3309:
3303:
3301:
3295:
3294:
3291:
3290:
3288:
3287:
3286:
3285:
3280:
3275:
3265:
3259:
3257:
3251:
3250:
3248:
3247:
3242:
3237:
3232:
3227:
3222:
3217:
3211:
3209:
3203:
3202:
3200:
3199:
3194:
3189:
3188:
3187:
3182:
3177:
3172:
3167:
3162:
3157:
3152:
3150:Contraharmonic
3147:
3142:
3131:
3129:
3120:
3110:
3109:
3104:
3097:
3096:
3094:
3093:
3088:
3082:
3079:
3078:
3073:
3071:
3070:
3063:
3056:
3048:
3039:
3038:
3036:
3035:
3030:
3025:
3013:
3008:
3002:
2999:
2998:
2996:
2995:
2990:
2985:
2980:
2975:
2969:
2967:
2963:
2962:
2960:
2959:
2954:
2949:
2944:
2939:
2934:
2929:
2923:
2921:
2912:
2911:
2909:
2908:
2903:
2901:Optimal design
2898:
2892:
2890:
2884:
2883:
2881:
2880:
2875:
2870:
2865:
2860:
2855:
2850:
2844:
2842:
2838:
2837:
2835:
2834:
2829:
2824:
2823:
2822:
2817:
2812:
2807:
2796:
2790:
2788:
2784:
2783:
2781:
2780:
2775:
2770:
2764:
2762:
2756:
2755:
2752:
2751:
2749:
2748:
2743:
2738:
2733:
2727:
2725:
2721:
2720:
2718:
2717:
2712:
2707:
2702:
2700:Semiparametric
2697:
2692:
2686:
2684:
2680:
2679:
2677:
2676:
2671:
2666:
2661:
2655:
2653:
2649:
2648:
2646:
2645:
2640:
2635:
2630:
2625:
2619:
2617:
2608:
2600:
2599:
2597:
2596:
2591:
2586:
2581:
2575:
2573:
2567:
2566:
2564:
2563:
2558:
2553:
2547:
2545:Spearman's rho
2538:
2532:
2530:
2524:
2523:
2521:
2520:
2515:
2510:
2505:
2499:
2497:
2491:
2490:
2481:
2479:
2478:
2471:
2464:
2456:
2450:
2449:
2435:
2434:External links
2432:
2431:
2430:
2412:
2399:
2384:
2374:(2): 248â275.
2354:
2341:
2320:
2317:
2314:
2313:
2288:
2281:
2263:
2240:
2225:
2207:
2192:
2171:
2156:
2134:
2121:
2101:
2088:
2068:
2055:
2034:
2033:
2031:
2028:
2026:
2025:
2020:
2015:
2013:Standard error
2010:
2008:Sampling error
2005:
2000:
1995:
1990:
1985:
1983:Probable error
1980:
1975:
1970:
1965:
1960:
1955:
1950:
1945:
1940:
1935:
1929:
1928:
1927:
1911:
1908:
1885:Likewise, the
1835:
1832:
1685:
1682:
1633:
1628:
1624:
1618:
1614:
1609:
1603:
1600:
1594:
1590:
1587:
1563:
1557:
1551:
1547:
1533:, and unknown
1516:
1512:
1489:
1485:
1481:
1476:
1470:
1467:
1453:
1452:
1441:
1433:
1427:
1421:
1417:
1409:
1405:
1401:
1396:
1390:
1387:
1378:
1375:
1361:Basu's theorem
1352:
1349:
1329:
1328:
1317:
1312:
1307:
1304:
1301:
1297:
1293:
1288:
1283:
1279:
1273:
1268:
1265:
1262:
1258:
1250:
1246:
1242:
1218:
1217:
1206:
1201:
1196:
1192:
1188:
1183:
1178:
1174:
1168:
1163:
1160:
1157:
1153:
1145:
1141:
1137:
1102:
1101:
1090:
1085:
1082:
1077:
1072:
1068:
1064:
1059:
1055:
1032:
1031:
1019:
1016:
1013:
1008:
1004:
1000:
995:
991:
972:
971:
960:
956:
950:
945:
941:
935:
932:
928:
924:
921:
916:
913:
897:
896:
883:
877:
873:
869:
866:
863:
858:
854:
847:
842:
839:
820:
819:
806:
800:
796:
792:
789:
785:
781:
778:
773:
769:
765:
762:
759:
754:
750:
730:
727:
690:
689:
678:
629:expected value
614:location model
601:
598:
532:observed value
506:
505:
503:
502:
495:
488:
480:
477:
476:
475:
474:
459:
458:
457:
456:
451:
446:
441:
436:
431:
423:
422:
418:
417:
416:
415:
410:
405:
400:
395:
387:
386:
385:
384:
379:
374:
369:
364:
356:
355:
354:
353:
348:
343:
338:
330:
329:
328:
327:
322:
317:
309:
308:
304:
303:
302:
301:
293:
292:
291:
290:
285:
280:
275:
270:
265:
260:
255:
253:Semiparametric
250:
245:
237:
236:
235:
234:
229:
224:
222:Random effects
219:
214:
206:
205:
204:
203:
198:
196:Ordered probit
193:
188:
183:
178:
173:
168:
163:
158:
153:
148:
143:
135:
134:
133:
132:
127:
122:
117:
109:
108:
104:
103:
97:
96:
88:
87:
70:September 2016
42:
40:
33:
26:
24:
14:
13:
10:
9:
6:
4:
3:
2:
5402:
5391:
5388:
5386:
5383:
5381:
5378:
5377:
5375:
5360:
5359:
5350:
5348:
5347:
5338:
5336:
5335:
5330:
5324:
5322:
5321:
5312:
5311:
5308:
5294:
5291:
5289:
5288:Geostatistics
5286:
5284:
5281:
5279:
5276:
5274:
5271:
5270:
5268:
5266:
5262:
5256:
5255:Psychometrics
5253:
5251:
5248:
5246:
5243:
5241:
5238:
5236:
5233:
5231:
5228:
5226:
5223:
5221:
5218:
5216:
5213:
5211:
5208:
5207:
5205:
5203:
5199:
5193:
5190:
5188:
5185:
5183:
5179:
5176:
5174:
5171:
5169:
5166:
5164:
5161:
5160:
5158:
5156:
5152:
5146:
5143:
5141:
5138:
5136:
5132:
5129:
5127:
5124:
5123:
5121:
5119:
5118:Biostatistics
5115:
5111:
5107:
5102:
5098:
5080:
5079:Log-rank test
5077:
5076:
5074:
5070:
5064:
5061:
5060:
5058:
5056:
5052:
5046:
5043:
5041:
5038:
5036:
5033:
5031:
5028:
5027:
5025:
5023:
5019:
5016:
5014:
5010:
5000:
4997:
4995:
4992:
4990:
4987:
4985:
4982:
4980:
4977:
4976:
4974:
4972:
4968:
4962:
4959:
4957:
4954:
4952:
4950:(BoxâJenkins)
4946:
4944:
4941:
4939:
4936:
4932:
4929:
4928:
4927:
4924:
4923:
4921:
4919:
4915:
4909:
4906:
4904:
4903:DurbinâWatson
4901:
4899:
4893:
4891:
4888:
4886:
4885:DickeyâFuller
4883:
4882:
4880:
4876:
4870:
4867:
4865:
4862:
4860:
4859:Cointegration
4857:
4855:
4852:
4850:
4847:
4845:
4842:
4840:
4837:
4835:
4834:Decomposition
4832:
4831:
4829:
4825:
4822:
4820:
4816:
4806:
4803:
4802:
4801:
4798:
4797:
4796:
4793:
4789:
4786:
4785:
4784:
4781:
4779:
4776:
4774:
4771:
4769:
4766:
4764:
4761:
4759:
4756:
4754:
4751:
4749:
4746:
4745:
4743:
4741:
4737:
4731:
4728:
4726:
4723:
4721:
4718:
4716:
4713:
4711:
4708:
4706:
4705:Cohen's kappa
4703:
4702:
4700:
4698:
4694:
4690:
4686:
4682:
4678:
4674:
4669:
4665:
4651:
4648:
4646:
4643:
4641:
4638:
4636:
4633:
4632:
4630:
4628:
4624:
4618:
4614:
4610:
4604:
4602:
4599:
4598:
4596:
4594:
4590:
4584:
4581:
4579:
4576:
4574:
4571:
4569:
4566:
4564:
4561:
4559:
4558:Nonparametric
4556:
4554:
4551:
4550:
4548:
4544:
4538:
4535:
4533:
4530:
4528:
4525:
4523:
4520:
4519:
4517:
4515:
4511:
4505:
4502:
4500:
4497:
4495:
4492:
4490:
4487:
4485:
4482:
4481:
4479:
4477:
4473:
4467:
4464:
4462:
4459:
4457:
4454:
4452:
4449:
4448:
4446:
4444:
4440:
4436:
4429:
4426:
4424:
4421:
4420:
4416:
4412:
4396:
4393:
4392:
4391:
4388:
4386:
4383:
4381:
4378:
4374:
4371:
4369:
4366:
4365:
4364:
4361:
4360:
4358:
4356:
4352:
4342:
4339:
4335:
4329:
4327:
4321:
4319:
4313:
4312:
4311:
4308:
4307:Nonparametric
4305:
4303:
4297:
4293:
4290:
4289:
4288:
4282:
4278:
4277:Sample median
4275:
4274:
4273:
4270:
4269:
4267:
4265:
4261:
4253:
4250:
4248:
4245:
4243:
4240:
4239:
4238:
4235:
4233:
4230:
4228:
4222:
4220:
4217:
4215:
4212:
4210:
4207:
4205:
4202:
4200:
4198:
4194:
4192:
4189:
4188:
4186:
4184:
4180:
4174:
4172:
4168:
4166:
4164:
4159:
4157:
4152:
4148:
4147:
4144:
4141:
4139:
4135:
4125:
4122:
4120:
4117:
4115:
4112:
4111:
4109:
4107:
4103:
4097:
4094:
4090:
4087:
4086:
4085:
4082:
4078:
4075:
4074:
4073:
4070:
4068:
4065:
4064:
4062:
4060:
4056:
4048:
4045:
4043:
4040:
4039:
4038:
4035:
4033:
4030:
4028:
4025:
4023:
4020:
4018:
4015:
4013:
4010:
4009:
4007:
4005:
4001:
3995:
3992:
3988:
3985:
3981:
3978:
3976:
3973:
3972:
3971:
3968:
3967:
3966:
3963:
3959:
3956:
3954:
3951:
3949:
3946:
3944:
3941:
3940:
3939:
3936:
3935:
3933:
3931:
3927:
3924:
3922:
3918:
3912:
3909:
3907:
3904:
3900:
3897:
3896:
3895:
3892:
3890:
3887:
3883:
3882:loss function
3880:
3879:
3878:
3875:
3871:
3868:
3866:
3863:
3861:
3858:
3857:
3856:
3853:
3851:
3848:
3846:
3843:
3839:
3836:
3834:
3831:
3829:
3823:
3820:
3819:
3818:
3815:
3811:
3808:
3806:
3803:
3801:
3798:
3797:
3796:
3793:
3789:
3786:
3784:
3781:
3780:
3779:
3776:
3772:
3769:
3768:
3767:
3764:
3760:
3757:
3756:
3755:
3752:
3750:
3747:
3745:
3742:
3740:
3737:
3736:
3734:
3732:
3728:
3724:
3720:
3715:
3711:
3697:
3694:
3692:
3689:
3687:
3684:
3682:
3679:
3678:
3676:
3674:
3670:
3664:
3661:
3659:
3656:
3654:
3651:
3650:
3648:
3644:
3638:
3635:
3633:
3630:
3628:
3625:
3623:
3620:
3618:
3615:
3613:
3610:
3608:
3605:
3604:
3602:
3600:
3596:
3590:
3587:
3585:
3584:Questionnaire
3582:
3580:
3577:
3573:
3570:
3568:
3565:
3564:
3563:
3560:
3559:
3557:
3555:
3551:
3545:
3542:
3540:
3537:
3535:
3532:
3530:
3527:
3525:
3522:
3520:
3517:
3515:
3512:
3510:
3507:
3506:
3504:
3502:
3498:
3494:
3490:
3485:
3481:
3467:
3464:
3462:
3459:
3457:
3454:
3452:
3449:
3447:
3444:
3442:
3439:
3437:
3434:
3432:
3429:
3427:
3424:
3422:
3419:
3417:
3414:
3412:
3411:Control chart
3409:
3407:
3404:
3402:
3399:
3397:
3394:
3393:
3391:
3389:
3385:
3379:
3376:
3372:
3369:
3367:
3364:
3363:
3362:
3359:
3357:
3354:
3352:
3349:
3348:
3346:
3344:
3340:
3334:
3331:
3329:
3326:
3324:
3321:
3320:
3318:
3314:
3308:
3305:
3304:
3302:
3300:
3296:
3284:
3281:
3279:
3276:
3274:
3271:
3270:
3269:
3266:
3264:
3261:
3260:
3258:
3256:
3252:
3246:
3243:
3241:
3238:
3236:
3233:
3231:
3228:
3226:
3223:
3221:
3218:
3216:
3213:
3212:
3210:
3208:
3204:
3198:
3195:
3193:
3190:
3186:
3183:
3181:
3178:
3176:
3173:
3171:
3168:
3166:
3163:
3161:
3158:
3156:
3153:
3151:
3148:
3146:
3143:
3141:
3138:
3137:
3136:
3133:
3132:
3130:
3128:
3124:
3121:
3119:
3115:
3111:
3107:
3102:
3098:
3092:
3089:
3087:
3084:
3083:
3080:
3076:
3069:
3064:
3062:
3057:
3055:
3050:
3049:
3046:
3034:
3031:
3029:
3026:
3024:
3019:
3014:
3012:
3009:
3007:
3004:
3003:
3000:
2994:
2991:
2989:
2986:
2984:
2981:
2979:
2976:
2974:
2973:Curve fitting
2971:
2970:
2968:
2964:
2958:
2955:
2953:
2950:
2948:
2945:
2943:
2940:
2938:
2935:
2933:
2930:
2928:
2925:
2924:
2922:
2920:
2919:approximation
2917:
2913:
2907:
2904:
2902:
2899:
2897:
2894:
2893:
2891:
2889:
2885:
2879:
2876:
2874:
2871:
2869:
2866:
2864:
2861:
2859:
2856:
2854:
2851:
2849:
2846:
2845:
2843:
2839:
2833:
2830:
2828:
2825:
2821:
2818:
2816:
2813:
2811:
2810:
2802:
2801:
2800:
2797:
2795:
2792:
2791:
2789:
2785:
2779:
2776:
2774:
2771:
2769:
2766:
2765:
2763:
2761:
2757:
2747:
2744:
2742:
2739:
2737:
2734:
2732:
2729:
2728:
2726:
2722:
2716:
2713:
2711:
2708:
2706:
2703:
2701:
2698:
2696:
2695:Nonparametric
2693:
2691:
2688:
2687:
2685:
2681:
2675:
2672:
2670:
2667:
2665:
2662:
2660:
2657:
2656:
2654:
2650:
2644:
2641:
2639:
2636:
2634:
2631:
2629:
2626:
2624:
2621:
2620:
2618:
2616:
2612:
2609:
2607:
2601:
2595:
2592:
2590:
2587:
2585:
2582:
2580:
2577:
2576:
2574:
2572:
2568:
2562:
2559:
2557:
2554:
2551:
2550:Kendall's tau
2548:
2546:
2542:
2539:
2537:
2534:
2533:
2531:
2529:
2525:
2519:
2516:
2514:
2511:
2509:
2506:
2504:
2503:Least squares
2501:
2500:
2498:
2496:
2492:
2488:
2484:
2483:Least squares
2477:
2472:
2470:
2465:
2463:
2458:
2457:
2454:
2447:
2442:
2438:
2437:
2433:
2427:
2423:
2422:
2417:
2413:
2402:
2400:9780471879572
2396:
2392:
2391:
2385:
2381:
2377:
2373:
2369:
2368:
2363:
2359:
2358:Cox, David R.
2355:
2344:
2338:
2334:
2330:
2329:
2323:
2322:
2318:
2302:
2298:
2292:
2289:
2284:
2282:9780521761598
2278:
2274:
2267:
2264:
2259:
2254:
2253:
2244:
2241:
2236:
2232:
2228:
2222:
2218:
2211:
2208:
2203:
2199:
2195:
2189:
2185:
2178:
2176:
2172:
2167:
2163:
2159:
2157:0-412-16440-X
2153:
2148:
2147:
2138:
2135:
2124:
2118:
2114:
2113:
2105:
2102:
2091:
2085:
2081:
2080:
2072:
2069:
2058:
2052:
2048:
2047:
2039:
2036:
2029:
2024:
2021:
2019:
2016:
2014:
2011:
2009:
2006:
2004:
2001:
1999:
1996:
1994:
1991:
1989:
1986:
1984:
1981:
1979:
1976:
1974:
1971:
1969:
1966:
1964:
1961:
1959:
1956:
1954:
1951:
1949:
1946:
1944:
1941:
1939:
1936:
1934:
1931:
1930:
1925:
1919:
1914:
1909:
1907:
1905:
1904:mean residual
1901:
1896:
1894:
1890:
1889:
1883:
1881:
1880:least squares
1877:
1876:
1871:
1869:
1865:
1864:
1859:
1858:
1857:
1850:
1848:
1841:
1833:
1831:
1829:
1825:
1821:
1816:
1814:
1810:
1806:
1802:
1798:
1794:
1793:distributions
1789:
1786:
1782:
1779: â
1778:
1774:
1769:
1767:
1763:
1759:
1755:
1751:
1748: â
1747:
1743:
1739:
1735:
1731:
1727:
1723:
1718:
1716:
1712:
1707:
1703:
1699:
1695:
1691:
1683:
1681:
1679:
1675:
1671:
1667:
1663:
1659:
1655:
1651:
1646:
1631:
1626:
1622:
1616:
1612:
1607:
1598:
1592:
1588:
1585:
1577:
1561:
1555:
1549:
1545:
1536:
1532:
1514:
1510:
1487:
1483:
1479:
1474:
1465:
1439:
1431:
1425:
1419:
1415:
1407:
1403:
1399:
1394:
1385:
1376:
1373:
1366:
1365:
1364:
1362:
1358:
1350:
1348:
1346:
1342:
1338:
1334:
1315:
1310:
1305:
1302:
1299:
1295:
1291:
1286:
1281:
1277:
1271:
1266:
1263:
1260:
1256:
1248:
1244:
1240:
1231:
1230:
1229:
1227:
1223:
1204:
1199:
1194:
1190:
1186:
1181:
1176:
1172:
1166:
1161:
1158:
1155:
1151:
1143:
1139:
1135:
1126:
1125:
1124:
1122:
1119:
1115:
1111:
1108:, divided by
1107:
1088:
1080:
1075:
1070:
1066:
1062:
1057:
1053:
1045:
1044:
1043:
1041:
1037:
1017:
1014:
1011:
1006:
1002:
998:
993:
989:
981:
980:
979:
977:
958:
954:
948:
943:
939:
933:
930:
926:
922:
919:
911:
902:
901:
900:
881:
875:
871:
867:
864:
861:
856:
852:
845:
837:
828:
827:
826:
825:
804:
798:
794:
790:
787:
783:
779:
776:
771:
767:
763:
760:
757:
752:
748:
740:
739:
738:
736:
728:
726:
724:
720:
718:
713:
709:
704:
702:
701:almost surely
698:
697:
687:
683:
679:
676:
672:
668:
667:
666:
664:
660:
659:
654:
650:
646:
641:
639:
634:
630:
626:
622:
617:
615:
611:
607:
599:
597:
595:
591:
587:
583:
579:
575:
571:
567:
566:
561:
557:
553:
549:
545:
541:
537:
533:
529:
525:
521:
517:
513:
501:
496:
494:
489:
487:
482:
481:
479:
478:
473:
468:
463:
462:
461:
460:
455:
452:
450:
447:
445:
442:
440:
437:
435:
432:
430:
427:
426:
425:
424:
419:
414:
411:
409:
406:
404:
401:
399:
396:
394:
391:
390:
389:
388:
383:
380:
378:
375:
373:
370:
368:
365:
363:
360:
359:
358:
357:
352:
349:
347:
344:
342:
339:
337:
334:
333:
332:
331:
326:
323:
321:
318:
316:
315:Least squares
313:
312:
311:
310:
305:
300:
297:
296:
295:
294:
289:
286:
284:
281:
279:
276:
274:
271:
269:
266:
264:
261:
259:
256:
254:
251:
249:
248:Nonparametric
246:
244:
241:
240:
239:
238:
233:
230:
228:
225:
223:
220:
218:
217:Fixed effects
215:
213:
210:
209:
208:
207:
202:
199:
197:
194:
192:
191:Ordered logit
189:
187:
184:
182:
179:
177:
174:
172:
169:
167:
164:
162:
159:
157:
154:
152:
149:
147:
144:
142:
139:
138:
137:
136:
131:
128:
126:
123:
121:
118:
116:
113:
112:
111:
110:
105:
102:
98:
94:
93:
84:
81:
73:
63:
59:
53:
52:
46:
41:
32:
31:
19:
5356:
5344:
5325:
5318:
5230:Econometrics
5180: /
5163:Chemometrics
5140:Epidemiology
5133: /
5106:Applications
4948:ARIMA model
4895:Q-statistic
4844:Stationarity
4740:Multivariate
4683: /
4679: /
4677:Multivariate
4675: /
4615: /
4611: /
4483:
4385:Bayes factor
4284:Signed rank
4196:
4170:
4162:
4150:
3845:Completeness
3681:Cohort study
3579:Opinion poll
3514:Missing data
3501:Study design
3456:Scatter plot
3378:Scatter plot
3371:Spearman's Ï
3333:Grouped data
2966:Applications
2805:
2683:Non-standard
2419:
2404:. Retrieved
2389:
2371:
2365:
2346:. Retrieved
2327:
2305:. Retrieved
2303:. 2013-11-21
2300:
2291:
2272:
2266:
2251:
2243:
2216:
2210:
2183:
2145:
2137:
2126:. Retrieved
2111:
2104:
2093:. Retrieved
2078:
2071:
2060:. Retrieved
2045:
2038:
1903:
1899:
1897:
1886:
1884:
1873:
1872:
1867:
1861:
1855:
1853:
1851:
1843:
1824:studentizing
1819:
1817:
1804:
1796:
1792:
1790:
1784:
1780:
1776:
1770:
1765:
1757:
1753:
1749:
1745:
1741:
1733:
1729:
1725:
1719:
1705:
1697:
1693:
1687:
1677:
1669:
1661:
1657:
1653:
1647:
1578:
1534:
1530:
1454:
1354:
1336:
1332:
1330:
1225:
1221:
1219:
1117:
1109:
1105:
1103:
1039:
1033:
975:
973:
898:
821:
732:
716:
705:
693:
691:
685:
681:
674:
670:
662:
656:
655:people. The
652:
648:
644:
642:
624:
620:
618:
603:
600:Introduction
594:disturbances
593:
590:econometrics
581:
577:
563:
559:
547:
523:
519:
516:optimization
509:
438:
372:Non-negative
76:
67:
48:
5358:WikiProject
5273:Cartography
5235:Jurimetrics
5187:Reliability
4918:Time domain
4897:(LjungâBox)
4819:Time-series
4697:Categorical
4681:Time-series
4673:Categorical
4608:(Bernoulli)
4443:Correlation
4423:Correlation
4219:JarqueâBera
4191:Chi-squared
3953:M-estimator
3906:Asymptotics
3850:Sufficiency
3617:Interaction
3529:Replication
3509:Effect size
3466:Violin plot
3446:Radar chart
3426:Forest plot
3416:Correlogram
3366:Kendall's Ï
2406:23 February
2348:23 February
1684:Regressions
824:sample mean
696:independent
658:sample mean
625:disturbance
570:sample mean
552:observation
382:Regularized
346:Generalized
278:Least angle
176:Mixed logit
62:introducing
5374:Categories
5225:Demography
4943:ARMA model
4748:Regression
4325:(Friedman)
4286:(Wilcoxon)
4224:Normality
4214:Lilliefors
4161:Student's
4037:Resampling
3911:Robustness
3899:divergence
3889:Efficiency
3827:(monotone)
3822:Likelihood
3739:Population
3572:Stratified
3524:Population
3343:Dependence
3299:Count data
3230:Percentile
3207:Dispersion
3140:Arithmetic
3075:Statistics
2841:Background
2804:Mallows's
2342:041224280X
2307:2019-11-22
2128:2022-05-13
2095:2022-05-13
2062:2022-05-13
2030:References
1900:mean error
1838:See also:
1820:residuals,
719:-statistic
703:not zero.
684:mean is a
673:mean is a
671:population
663:population
633:population
544:true value
542:from its "
512:statistics
421:Background
325:Non-linear
307:Estimation
45:references
18:Mean error
4606:Logistic
4373:posterior
4299:Rank sum
4047:Jackknife
4042:Bootstrap
3860:Bootstrap
3795:Parameter
3744:Statistic
3539:Statistic
3451:Run chart
3436:Pie chart
3431:Histogram
3421:Fan chart
3396:Bar chart
3278:L-moments
3165:Geometric
2916:Numerical
2426:EMS Press
2235:987251007
2202:262680588
1726:residuals
1698:residuals
1676:for
1623:σ
1602:¯
1589:
1484:μ
1480:−
1469:¯
1404:μ
1400:−
1389:¯
1303:−
1296:χ
1292:∼
1257:∑
1245:σ
1222:residuals
1191:χ
1187:∼
1152:∑
1140:σ
1084:¯
1076:−
1040:residuals
1015:μ
1012:−
978:are then
940:σ
931:μ
920:∼
915:¯
865:⋯
841:¯
795:σ
788:μ
777:∼
761:…
677:, whereas
565:estimated
528:deviation
524:residuals
288:Segmented
5320:Category
5013:Survival
4890:Johansen
4613:Binomial
4568:Isotonic
4155:(normal)
3800:location
3607:Blocking
3562:Sampling
3441:QâQ plot
3406:Box plot
3388:Graphics
3283:Skewness
3273:Kurtosis
3245:Variance
3175:Heronian
3170:Harmonic
2746:Logistic
2736:Binomial
2715:Isotonic
2710:Quantile
1910:See also
1828:outliers
1756:, where
1112:, has a
1036:expected
822:and the
686:residual
649:estimate
645:residual
560:residual
403:Bayesian
341:Weighted
336:Ordinary
268:Isotonic
263:Quantile
5346:Commons
5293:Kriging
5178:Process
5135:studies
4994:Wavelet
4827:General
3994:Plug-in
3788:L space
3567:Cluster
3268:Moments
3086:Outline
2741:Poisson
2428:, 2001
2380:2984505
2166:7779780
1797:even if
712:z-score
710:) in a
558:). The
536:element
362:Partial
201:Poisson
58:improve
5215:Census
4805:Normal
4753:Manova
4573:Robust
4323:2-way
4315:1-way
4153:-test
3824:
3401:Biplot
3192:Median
3185:Lehmer
3127:Center
2705:Robust
2397:
2378:
2339:
2279:
2233:
2223:
2200:
2190:
2164:
2154:
2119:
2086:
2053:
1805:higher
1738:biased
1730:errors
1706:fitted
1694:errors
1656:, but
1455:where
1351:Remark
682:sample
550:of an
534:of an
530:of an
520:errors
320:Linear
258:Robust
181:Probit
107:Models
47:, but
4839:Trend
4368:prior
4310:anova
4199:-test
4173:-test
4165:-test
4072:Power
4017:Pivot
3810:shape
3805:scale
3255:Shape
3235:Range
3180:Heinz
3155:Cubic
3091:Index
2376:JSTOR
1773:ANOVA
1668:with
1116:with
1034:with
588:. In
548:error
538:of a
367:Total
283:Local
5072:Test
4272:Sign
4124:Wald
3197:Mode
3135:Mean
2485:and
2408:2013
2395:ISBN
2350:2013
2337:ISBN
2277:ISBN
2231:OCLC
2221:ISBN
2198:OCLC
2188:ISBN
2162:OCLC
2152:ISBN
2117:ISBN
2084:ISBN
2051:ISBN
1898:The
1852:The
1696:and
1648:The
1335:and
1042:are
974:The
694:not
638:mean
623:(or
610:mean
580:and
522:and
514:and
4252:BIC
4247:AIC
2820:BIC
2815:AIC
2258:288
1688:In
1586:Var
510:In
5376::
2424:,
2418:,
2372:30
2370:.
2360:;
2335:.
2299:.
2229:.
2196:.
2174:^
2160:.
1849::
1758:df
1744:=
1742:df
1717:.
1123::
725:.
643:A
619:A
596:.
518:,
4197:G
4171:F
4163:t
4151:Z
3870:V
3865:U
3067:e
3060:t
3053:v
2808:p
2806:C
2552:)
2543:(
2475:e
2468:t
2461:v
2410:.
2382:.
2352:.
2310:.
2285:.
2260:.
2237:.
2204:.
2168:.
2131:.
2098:.
2065:.
1785:p
1781:p
1777:n
1766:n
1764:(
1754:n
1750:p
1746:n
1734:n
1678:Ό
1670:n
1662:Ï
1658:Ï
1654:Ï
1632:n
1627:2
1617:=
1613:)
1608:n
1599:X
1593:(
1562:n
1556:/
1550:n
1546:S
1535:Ï
1531:n
1515:n
1511:S
1488:0
1475:n
1466:X
1440:,
1432:n
1426:/
1420:n
1416:S
1408:0
1395:n
1386:X
1377:=
1374:T
1337:n
1333:n
1316:.
1311:2
1306:1
1300:n
1287:2
1282:i
1278:r
1272:n
1267:1
1264:=
1261:i
1249:2
1241:1
1226:n
1205:.
1200:2
1195:n
1182:2
1177:i
1173:e
1167:n
1162:1
1159:=
1156:i
1144:2
1136:1
1118:n
1110:Ï
1089:.
1081:X
1071:i
1067:X
1063:=
1058:i
1054:r
1018:,
1007:i
1003:X
999:=
994:i
990:e
959:.
955:)
949:n
944:2
934:,
927:(
923:N
912:X
882:n
876:n
872:X
868:+
862:+
857:1
853:X
846:=
838:X
805:)
799:2
791:,
784:(
780:N
772:n
768:X
764:,
758:,
753:1
749:X
717:t
688:.
653:n
499:e
492:t
485:v
83:)
77:(
72:)
68:(
54:.
20:)
Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.