(MVUE). This is because an efficient estimator attains equality in the Cramér–Rao inequality for all parameter values, which means it achieves the minimum variance for every parameter (the definition of the MVUE). The MVUE, even if it exists, is not necessarily efficient, because "minimum" does not imply that equality holds in the Cramér–Rao inequality.
{\displaystyle {\begin{aligned}\operatorname {MSE} (T)&=\operatorname {E} [(T-\theta )^{2}]=\operatorname {E} [(T-\operatorname {E} [T]+\operatorname {E} [T]-\theta )^{2}]\\&=\operatorname {E} [(T-\operatorname {E} [T])^{2}]+2\operatorname {E} [T-\operatorname {E} [T]](\operatorname {E} [T]-\theta )+(\operatorname {E} [T]-\theta )^{2}\\&=\operatorname {var} (T)+(\operatorname {E} [T]-\theta )^{2}\end{aligned}}}
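The variance-plus-squared-bias decomposition can be checked numerically. A minimal Monte Carlo sketch, assuming a hypothetical biased estimator (the sample mean shrunk by a factor of 0.9) and an N(2, 1) model, neither of which comes from the article:

```python
import numpy as np

# Numerical check of MSE(T) = var(T) + (E[T] - theta)^2 for a deliberately
# biased estimator: the sample mean shrunk by a hypothetical factor of 0.9.
rng = np.random.default_rng(0)
theta = 2.0                                # true parameter value
samples = rng.normal(theta, 1.0, size=(100_000, 20))
T = 0.9 * samples.mean(axis=1)             # biased estimator of theta

mse = np.mean((T - theta) ** 2)
var = T.var()                              # population variance (ddof=0)
bias_sq = (T.mean() - theta) ** 2
# The decomposition is an algebraic identity for the empirical moments,
# so it holds up to floating-point rounding.
assert abs(mse - (var + bias_sq)) < 1e-9
```

Because the identity holds for the empirical moments themselves, the check passes exactly (up to rounding), not merely approximately.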
3602:
For experimental designs, efficiency relates to the ability of a design to achieve the objective of the study with minimal expenditure of resources such as time and money. In simple cases, the relative efficiency of designs can be expressed as the ratio of the sample sizes required to achieve a given
3553:
Efficiency in statistics is important because it allows one to compare the performance of various estimators. Although an unbiased estimator is usually favored over a biased one, a more efficient biased estimator can sometimes be more valuable than a less efficient unbiased estimator. For example,
69:
of two procedures is the ratio of their efficiencies, although often this concept is used where the comparison is made between a given procedure and a notional "best possible" procedure. The efficiencies and the relative efficiency of two procedures theoretically depend on the sample size available
3524:
While efficiency is a desirable quality of an estimator, it must be weighed against other considerations, and an estimator that is efficient for certain distributions may well be inefficient for other distributions. Most significantly, estimators that are efficient for clean data from a simple
are a general class of estimators motivated by these concerns. They can be designed to yield both robustness and high relative efficiency, though possibly lower efficiency than traditional estimators for some cases. They can be very computationally complicated, however.
1940:
. Generally, the variance measures the degree of dispersion of a random variable around its mean. Thus estimators with small variances are more concentrated: they estimate the parameter more precisely. We say that the estimator is a
this can occur when the values of the biased estimator gather around a number closer to the true value. Thus, the performance of estimators can be compared by examining their mean squared errors or variances.
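A classic instance of this trade-off can be sketched by simulation: for normal data, a biased variance estimator can beat the unbiased one on mean squared error. The sample size and true variance below are hypothetical choices, not taken from the article:

```python
import numpy as np

# For normal data, the variance estimator that divides by n + 1 is biased
# but has a smaller MSE than the unbiased divide-by-(n - 1) estimator.
rng = np.random.default_rng(1)
sigma2, n = 4.0, 10
x = rng.normal(0.0, np.sqrt(sigma2), size=(200_000, n))

s2_unbiased = x.var(axis=1, ddof=1)        # divides by n - 1, unbiased
ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)
s2_shrunk = ss / (n + 1)                   # divides by n + 1, biased

mse_unbiased = np.mean((s2_unbiased - sigma2) ** 2)
mse_shrunk = np.mean((s2_shrunk - sigma2) ** 2)
assert mse_shrunk < mse_unbiased           # the biased estimator wins on MSE
```

The biased estimator's values gather around a point slightly below the true variance, but its much smaller spread more than compensates in the MSE.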
363:
3525:
distribution, such as the normal distribution (which is symmetric, unimodal, and has thin tails) may not be robust to contamination by outliers, and may be inefficient for more complicated distributions. In
In general, the spread of an estimator around the parameter Ξ is a measure of estimator efficiency and performance. This performance can be calculated by finding the mean squared error. More formally, let
The sample mean is thus more efficient than the sample median in this example. However, there may be measures by which the median performs better. For example, the median is far more robust to
3498:
is less efficient for a normal distribution, but is more robust (i.e., less affected) by changes in the distribution, and thus may be more efficient for a mixture distribution. Similarly, the
923:. This relationship can be determined by simplifying the more general case above for mean squared error; since the expected value of an unbiased estimator is equal to the parameter value,
3490:), the presence of extreme values from the latter distribution (often "contaminating outliers") significantly reduces the efficiency of the sample mean as an estimator of
estimators. (Often it is not.) Since there are no good theoretical reasons to require that estimators are unbiased, this restriction is inconvenient. In fact, if we use
– an estimator such as the sample mean is an efficient estimator of the population mean of a normal distribution, for example, but can be an inefficient estimator of a
– the function which quantifies the relative degree of undesirability of estimation errors of different magnitudes. The most common choice of the loss function is
3129:. This replaces the comparison of mean-squared-errors with comparing how often one estimator produces estimates closer to the true value than another estimator.
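The closeness comparison can be sketched by simulation. Assuming N(0, 1) data and a hypothetical sample size of 11 (these choices are illustrative, not from the article), we estimate how often the sample mean lands closer to the true value than the sample median:

```python
import numpy as np

# Pitman-closeness-style comparison: how often is the sample mean closer
# to the true value than the sample median for N(0, 1) data?
rng = np.random.default_rng(2)
theta, reps, n = 0.0, 100_000, 11
x = rng.normal(theta, 1.0, size=(reps, n))
mean_err = np.abs(x.mean(axis=1) - theta)
median_err = np.abs(np.median(x, axis=1) - theta)
closeness = np.mean(mean_err < median_err)
# The mean wins more often than not, though by a modest margin, because
# the two estimators are highly correlated on the same sample.
assert 0.5 < closeness < 0.75
```

Note the margin is much smaller than the variance ratio would suggest, which is exactly why the closeness criterion can rank estimators differently from mean squared error.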
Relative efficiency of two such estimators can thus be interpreted as the relative sample size of one required to achieve the certainty of the other. Proof:
2112:
3541:, which are very simple statistics that are easy to compute and interpret, in many cases robust, and often sufficiently efficient for initial estimates. See
{\displaystyle e(T_{1},T_{2})={\frac {\operatorname {E} [(T_{2}-\theta )^{2}]}{\operatorname {E} [(T_{1}-\theta )^{2}]}}={\frac {\operatorname {var} (T_{2})}{\operatorname {var} (T_{1})}}}
Finite-sample efficiency uses the variance as the criterion by which estimators are judged. A more general approach is to use
3529:, more importance is placed on robustness and applicability to a wide variety of distributions, rather than efficiency on a single distribution.
1356:. However the converse is false: There exist point-estimation problems for which the minimum-variance mean-unbiased estimator is inefficient.
that estimates the quantity of interest in some "best possible" manner. The notion of "best possible" relies upon the choice of a particular
procedure. Essentially, a more efficient estimator needs fewer input data or observations than a less efficient one to achieve the
3459:
Efficiency of an estimator may change significantly if the distribution changes, often dropping. This is one of the motivations of
3451:, so the relative efficiency expresses the relative sample size of the first estimator needed to match the variance of the second.
3386:
{\displaystyle e\left({\widetilde {X}}\right)=\left({\frac {1}{N}}\right)\left({\frac {\pi }{2N}}\right)^{-1}=2/\pi \approx 0.64.}
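The 2/π figure can be reproduced by simulation. A minimal sketch, assuming unit-variance normal data and a hypothetical odd sample size of 199 (large enough for the asymptotic value to dominate):

```python
import numpy as np

# Monte Carlo sketch: for N(mu, 1) samples, var(median) is roughly pi/2
# times var(mean), so the median's relative efficiency is close to
# 2/pi ~ 0.637 for a reasonably large sample size N.
rng = np.random.default_rng(3)
reps, N = 40_000, 199
x = rng.normal(0.0, 1.0, size=(reps, N))
eff = x.mean(axis=1).var() / np.median(x, axis=1).var()
assert abs(eff - 2 / np.pi) < 0.05
```

For small N the simulated value sits somewhat above 2/π, consistent with the finite-sample efficiencies quoted in the text.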
are two unbiased estimators for the same parameter Ξ, then the variance can be compared to determine performance. In this case,
of two normal distributions with the same mean and different variances. For example, if a distribution is a combination of 98%
Finite-sample efficient estimators are extremely rare. In fact, it was proved that efficient estimation is possible only in an
5145:
1626:, asymptotic normally distribution of estimator, and asymptotic variance-covariance matrix no worse than any other estimator.
, or 57% greater than the variance of the mean; the standard error of the median will be 25% greater than that of the mean.
3566:, a meaningful measure of efficiency can be defined based on the sample size required for the test to achieve a given task
from the sample. Thus, the sample mean is a finite-sample efficient estimator for the mean of the normal distribution.
1359:
Historically, finite-sample efficiency was an early optimality criterion. However this criterion has some limitations:
1935:{\displaystyle {\overline {X}}={\frac {1}{N}}\sum _{n=1}^{N}X_{n}\sim {\mathcal {N}}\left(\mu ,{\frac {1}{N}}\right).}
as a selection criterion, many biased estimators will slightly outperform the "best" unbiased ones. For example, in
74:(defined as the limit of the relative efficiencies as the sample size grows) as the principal comparison measure.
In estimating the mean of uncorrelated, identically distributed variables we can take advantage of the fact that
is a lower bound of the variance of an unbiased estimator, representing the "best" an unbiased estimator can be.
6073:
1345:(in the class of unbiased estimators) if it reaches the lower bound in the CramĂ©râRao inequality above, for all
, so that if the Gaussian model is questionable or approximate, there may be advantages to using the median (see
3510:, can significantly reduce the efficiency of estimators that assume a symmetric distribution or thin tails.
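The contamination effect described here can be sketched by simulation, assuming a hypothetical 98% N(0, 1) / 2% N(0, 10²) mixture (the percentages echo the example in the text; the wide component's scale is an illustrative choice):

```python
import numpy as np

# Sketch of the contamination effect: a 98% N(0,1) / 2% N(0,10^2) mixture.
# The wide component inflates the variance of the sample mean far more
# than that of the sample median.
rng = np.random.default_rng(4)
reps, n = 50_000, 101
narrow = rng.normal(0.0, 1.0, size=(reps, n))
wide = rng.normal(0.0, 10.0, size=(reps, n))
x = np.where(rng.random((reps, n)) < 0.02, wide, narrow)

var_mean = x.mean(axis=1).var()
var_median = np.median(x, axis=1).var()
assert var_median < var_mean   # the median is now the more efficient estimator
```

The mean absorbs the full mixture variance (0.98·1 + 0.02·100 ≈ 3), while the median depends mainly on the density near the center, so the efficiency ranking of the two estimators flips relative to the clean normal case.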
the efficiency is higher than this (for example, a sample size of 3 gives an efficiency of about 74%).
As an example, among the models encountered in practice, efficient estimators exist for: the mean
1401:
other than quadratic ones, in which case the finite-sample efficiency can no longer be formulated.
230:) is the minimum possible variance for an unbiased estimator divided by its actual variance. The
2175:{\displaystyle {\widetilde {X}}\sim {\mathcal {N}}\left(\mu ,{\frac {\pi }{2N}}\right).}
1961:, the sample mean is efficient in the sense that its efficiency is unity (100%).
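The unit efficiency of the sample mean can be checked against the definition e(T) = (1/I(θ))/var(T). A sketch, assuming a unit-variance normal model and a hypothetical sample size of 25, where the Fisher information in N observations is known in closed form to be I = N:

```python
import numpy as np

# e(T) = (1 / I(theta)) / var(T): for N(mu, 1), the Fisher information in
# N observations is I = N and var of the sample mean is 1/N, so e = 1.
rng = np.random.default_rng(5)
reps, N = 200_000, 25
x = rng.normal(0.0, 1.0, size=(reps, N))
fisher_info = N                      # closed form for unit-variance normal
e = (1 / fisher_info) / x.mean(axis=1).var()
assert abs(e - 1.0) < 0.05
```

The simulated efficiency sits at 1 up to Monte Carlo error, reflecting that the sample mean attains the Cramér–Rao bound here.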
175:{\displaystyle e(T)={\frac {1/{\mathcal {I}}(\theta )}{\operatorname {var} (T)}}}
Fisher, R (1921). "On the
Mathematical Foundations of Theoretical Statistics".
A Modern
Introduction to Probability and Statistics: Understanding Why and How
Thus an efficient estimator need not exist, but if it does, it is the MVUE.
: regardless of the outcome, its performance is worse than, for example, the
{\displaystyle \operatorname {var} (T)\ \geq \ {\mathcal {I}}_{\theta }^{-1},}
912:{\displaystyle \operatorname {var} (T_{1})>\operatorname {var} (T_{2})}
798:{\displaystyle \operatorname {MSE} (T_{1})<\operatorname {MSE} (T_{2})}
for all values of the parameter, then the estimator is called efficient.
Wackerly, Dennis D.; Mendenhall, William; Scheaffer, Richard L. (2008).
3376:{\displaystyle s_{1}^{2}=n_{1}\sigma ^{2},\,s_{2}^{2}=n_{2}\sigma ^{2}}
3125:
An alternative to relative efficiency for comparing estimators, is the
efficiency — that is, the efficiency in the limit as sample size
59:
3276:{\displaystyle {\frac {e_{1}}{e_{2}}}={\frac {s_{1}^{2}}{s_{2}^{2}}}.}
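This ratio can be verified by simulation for the simplest case. A sketch, assuming each estimator is the mean of n_i i.i.d. observations with common variance (the sample sizes 40 and 10 are hypothetical):

```python
import numpy as np

# When each estimator is a mean of n_i i.i.d. observations with common
# variance sigma^2, var_i = sigma^2 / n_i, so the efficiency ratio reduces
# to the sample-size ratio n1/n2.
rng = np.random.default_rng(6)
reps, n1, n2 = 100_000, 40, 10
m1 = rng.normal(0.0, 1.0, size=(reps, n1)).mean(axis=1)
m2 = rng.normal(0.0, 1.0, size=(reps, n2)).mean(axis=1)
ratio = m2.var() / m1.var()   # = (sigma^2/n2) / (sigma^2/n1) = n1/n2
assert abs(ratio - n1 / n2) < 0.2
```

This is the sense in which relative efficiency can be read as a relative sample size: the less efficient estimator needs n1/n2 times as many observations to match the other's variance.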
Statistical
Measures of Accuracy for Riflemen and Missile Engineers
1370:
This notion of efficiency is sometimes restricted to the class of
2812:
The relative efficiency of two unbiased estimators is defined as
4300:
3189:{\displaystyle e\equiv \left({\frac {\sigma }{\mu }}\right)^{2}}
"Hodges–Lehmann Optimality for Testing Moment Condition Models"
1004:{\displaystyle \operatorname {MSE} (T)=\operatorname {var} (T)}
3952:(Seventh ed.). Belmont, CA: Thomson Brooks/Cole. p.
3791:. Kristine L. Bell, Zhi Tian (Second ed.). Hoboken, N.J.
3141:. In this case efficiency can be defined as the square of the
Data
Analysis and Graphics Using R: An Example-Based Approach
3444:{\displaystyle {\frac {e_{1}}{e_{2}}}={\frac {n_{1}}{n_{2}}}}
352:, which can be decomposed as a sum of its variance and bias:
70:
for the given procedure, but it is often possible to use the
2325:
In other words, the relative variance of the median will be
3691:
1586:{\displaystyle T(X)={\frac {1}{n}}\sum _{i=1}^{n}x_{i}\ .}
1382:
for dimension three or more, the mean-unbiased estimator,
3055:, in many cases the dependence drops out; if this is so,
{\displaystyle \operatorname {E} [(T_{1}-\theta )^{2}]\leq \operatorname {E} [(T_{2}-\theta )^{2}]}
between the estimated value and the "true" value in the
1307:
1367:, and only for the natural parameters of that family.
1326:{\displaystyle \scriptstyle {\mathcal {I}}_{\theta }}
1306:
1235:
1114:
Equivalently, the estimator achieves equality in the
3855:(7th ed., international ed.). Boston: Pearson.
the variance of the sum is the sum of the variances
4076:
4037:"Bahadur efficiency - Encyclopedia of Mathematics"
4051:"Bahadur efficiency of the likelihood ratio test"
3584:) relate to the comparison of the performance of
1730:{\displaystyle X_{n}\sim {\mathcal {N}}(\mu ,1).}
50:is characterized by having the smallest possible
{\displaystyle (\operatorname {E} [T]-\theta )^{2}}
3917:Maindonald, John; Braun, W. John (2010-05-06).
3102:is preferable, regardless of the true value of
4225:
4183:; with the assistance of R. Hamböker (1994).
3771:Counterexamples in Probability and Statistics
3768:Romano, Joseph P.; Siegel, Andrew F. (1986).
8:
2804:, with strict inequality holding somewhere.
2569:(MSE) is smaller for at least some value of
3923:. Cambridge University Press. p. 104.
3075:being greater than one would indicate that
4079:The Oxford Dictionary of Statistical Terms
3819:: CS1 maint: location missing publisher (
3788:Detection estimation and modulation theory
3133:Estimators of the mean of u.i.d. variables
1610:, which is equal to the reciprocal of the
1184:are the data sampled from this model. Let
{\displaystyle \operatorname {E} [T]=\theta }
3948:Mathematical statistics with applications
1822:{\displaystyle X_{1},X_{2},\ldots ,X_{N}}
1226:of this estimator is bounded from below:
{\displaystyle \operatorname {MSE} (T)=\operatorname {E} [(T-\theta )^{2}],}
2386:tends to infinity. For finite values of
961:. Therefore, for an unbiased estimator,
3999:
3715:
3664:"Efficiency of a statistical procedure"
3649:
3639:
2185:The efficiency of the median for large
1480:independent and identically distributed
211:{\displaystyle {\mathcal {I}}(\theta )}
27:Quality measure of a statistical method
4111:The Cambridge Dictionary of Statistics
3892:. Cambridge University Press. p.
3812:
1444:with unknown mean but known variance:
1062:term drops out for being equal to 0.
38:, of an experimental design, or of a
7:
6506:
3730:
3728:
3684:
3682:
1953:) is equal to the reciprocal of the
1354:minimum variance unbiased estimators
6518:
3537:A more traditional alternative are
2045:the sample median is approximately
1131:minimum variance unbiased estimator
1129:An efficient estimator is also the
54:, indicating that there is a small
2352:{\displaystyle \pi /2\approx 1.57}
1352:. Efficient estimators are always
1021:
930:
734:performs better than an estimator
278:be an estimator for the parameter
25:
(3rd ed.). pp. 440–441.
2482:are estimators for the parameter
1957:from the sample and thus, by the
1343:finite-sample efficient estimator
3774:. Chapman and Hall. p. 194.
2591:the MSE does not exceed that of
1990:{\displaystyle {\widetilde {X}}}
4133:Elements of Large-Sample Theory
3520:L-estimator § Applications
1763:{\displaystyle {\overline {X}}}
1622:Asymptotic efficiency requires
805:. For a more specific case, if
4113:. Cambridge University Press.
3586:statistical hypothesis testing
3514:Uses of inefficient estimators
1482:observations from this model:
72:asymptotic relative efficiency
34:is a measure of quality of an
1:
4187:. Berlin: Walter de Gruyter.
4185:Parametric Statistical Theory
4135:. New York: Springer Verlag.
3785:Van Trees, Harry L. (2013).
3543:applications of L-estimators
3035:is in general a function of
2100:{\displaystyle {\pi }/{2N},}
1945:The variance of the mean, 1/
1503:. We estimate the parameter
282:. The mean squared error of
4083:. Oxford University Press.
4062:Canay I. A. & Otsu, T.
4019:Encyclopedia of Mathematics
3849:Greene, William H. (2012).
3834:DeGroot; Schervish (2002).
3669:Encyclopedia of Mathematics
234:can be used to prove that
6562:
4165:(2nd ed.). Springer.
4163:Theory of Point Estimation
4109:Everitt, Brian S. (2002).
3836:Probability and Statistics
3127:Pitman closeness criterion
1634:Consider a sample of size
4012:Nikitin, Ya.Yu. (2001) ,
Hodges–Lehmann efficiency
1335:Fisher information matrix
270:criterion of optimality.
4014:"Efficiency, asymptotic"
3549:Efficiency in statistics
3545:for further discussion.
3143:coefficient of variation
1624:Consistency (statistics)
1596:This estimator has mean
1440:Consider the model of a
1435:multinomial distribution
1140:Finite-sample efficiency
3662:Nikulin, M.S. (2001) ,
3500:shape of a distribution
3115:{\displaystyle \theta }
3048:{\displaystyle \theta }
2797:{\displaystyle \theta }
2582:{\displaystyle \theta }
2495:{\displaystyle \theta }
1380:multivariate statistics
1207:. If this estimator is
4161:; Casella, G. (1998).
3985:Grubbs, Frank (1965).
3735:Dekking, F.M. (2007).
2362:Note that this is the
1413:(but not the variance
1337:of the model at point
Cramér–Rao lower bound
1105:
1104:{\displaystyle e(T)=1}
3884:Williams, D. (2001).
3743:. Springer. pp.
3596:Further information:
3518:Further information:
3095:{\displaystyle T_{1}}
3070:
3050:
3030:
3007:
2799:
2776:
2671:
2669:{\displaystyle T_{2}}
2644:
2642:{\displaystyle T_{1}}
2613:
2611:{\displaystyle T_{2}}
2584:
2556:
2554:{\displaystyle T_{2}}
2524:
2522:{\displaystyle T_{1}}
2497:
2477:
2475:{\displaystyle T_{2}}
2450:
2448:{\displaystyle T_{1}}
2404:
2381:
2354:
2317:
2200:
2177:
2102:
2064:
2040:
2020:
1992:
Cramér–Rao inequality
1937:
1862:
1824:
1765:
1732:
1673:
1649:
1618:Asymptotic efficiency
1588:
1546:
1511:of all observations:
1475:The data consists of
James–Stein estimator
1328:
1292:
Cramér–Rao inequality
Cramér–Rao inequality
1106:
1057:
1006:
956:
914:
848:than the variance of
800:
719:
347:
213:
177:
82:The efficiency of an
3852:Econometric analysis
3618:Consistent estimator
3465:mixture distribution
2062:{\displaystyle \mu }
2053:
2047:normally distributed
2029:
2018:{\displaystyle \mu }
2009:
1972:
1836:
1774:
1747:
1686:
1671:{\displaystyle \mu }
1662:
1638:
1518:
1423:Poisson distribution
246:Efficient estimators
222:of the sample. Thus
3628:Optimal instruments
3592:Experimental design
2808:Relative efficiency
2618:for any value of Ξ.
2424:Dominant estimators
1949:(the square of the
1656:normal distribution
1442:normal distribution
1411:normal distribution
1283:
837:if the variance of
266:, resulting in the
252:efficient estimator
67:relative efficiency
48:efficient estimator
18:Relative efficiency
4075:Dodge, Y. (2006).
3578:Bahadur efficiency
3564:significance tests
2567:mean squared error
2402:{\displaystyle N,}
1955:Fisher information
1612:Fisher information
1583:
1425:, the probability
1376:mean squared error
1365:exponential family
1203:for the parameter
919:for all values of
268:mean squared error
220:Fisher information
40:hypothesis testing
4142:978-0-387-98595-4
4129:Lehmann, Erich L.
. pp. 26–27.
3930:978-1-139-48667-5
3888:Weighing the Odds
3862:978-0-273-75356-8
3798:978-1-299-66515-6
3623:Hodges' estimator
3574:Pitman efficiency
3527:robust statistics
3494:By contrast, the
3461:robust statistics
3068:{\displaystyle e}
3028:{\displaystyle e}
2418:Robust statistics
2379:{\displaystyle N}
2198:{\displaystyle N}
2038:{\displaystyle N}
1984:
1964:Now consider the
1647:{\displaystyle N}
4181:Pfanzagl, Johann
3558:Hypothesis tests
1770:, of the sample
1769:
1767:
1766:
1761:
1759:
1751:
1736:
1734:
1733:
1728:
1708:
1707:
1698:
1697:
1677:
1675:
1674:
1669:
1653:
1651:
1650:
1645:
1609:
1600:and variance of
1592:
1590:
1589:
1584:
1577:
1576:
1575:
1565:
1560:
1545:
1537:
1502:
1474:
1351:
1332:
1330:
1329:
1324:
1321:
1320:
1315:
1314:
1296:
1294:
1293:
1288:
1282:
1274:
1269:
1268:
1259:
1253:
1217:
1198:
1183:
1161:parametric model
1158:
1110:
1108:
1107:
1102:
1061:
1059:
1058:
1053:
1051:
1050:
1010:
1008:
1007:
1002:
960:
958:
957:
952:
918:
916:
915:
910:
905:
904:
880:
879:
804:
802:
801:
796:
791:
790:
766:
765:
723:
721:
720:
715:
713:
709:
708:
651:
647:
646:
547:
546:
498:
491:
490:
418:
417:
351:
349:
348:
343:
338:
337:
232:CramĂ©râRao bound
217:
215:
214:
209:
198:
197:
181:
179:
178:
173:
171:
169:
152:
142:
141:
135:
126:
44:CramĂ©râRao bound
21:
Estimators

The efficiency of an unbiased estimator, T, of a parameter θ is defined as

    e(T) = I(θ)^(−1) / var(T)

where I(θ) is the Fisher information of the sample. Thus e(T) is the minimum possible variance for an unbiased estimator divided by its actual variance. The Cramér–Rao bound can be used to show that e(T) ≤ 1.

Efficient estimators

An efficient estimator is an estimator that estimates the quantity of interest in some "best possible" manner. The notion of "best possible" relies upon the choice of a particular loss function, the function which quantifies the relative degree of undesirability of estimation errors of different magnitudes. The most common choice of loss function is quadratic, resulting in the mean squared error criterion of optimality.
In general, the spread of an estimator around the parameter θ is a measure of estimator efficiency and performance. More formally, let T be an estimator for the parameter θ. The mean squared error of T is the value

    MSE(T) = E[(T − θ)²],

which can be decomposed as a sum of its variance and the square of its bias:

    MSE(T) = E[(T − E[T] + E[T] − θ)²]
           = E[(T − E[T])²] + 2 E[T − E[T]] (E[T] − θ) + (E[T] − θ)²
           = var(T) + (E[T] − θ)²,

where the cross term vanishes because E[T − E[T]] = 0. An estimator T1 performs better than an estimator T2 if MSE(T1) < MSE(T2). More specifically, if T1 and T2 have the same bias, then T1 is more efficient than T2 when var(T1) < var(T2). For an unbiased estimator, E[T] = θ, so the bias term vanishes and MSE(T) = var(T).

If an unbiased estimator of a parameter θ attains e(T) = 1 for all values of the parameter, then the estimator is called efficient. Equivalently, the estimator achieves equality in the Cramér–Rao inequality for all θ. The Cramér–Rao lower bound is a lower bound of the variance of an unbiased estimator, representing the "best" an unbiased estimator can be.

An efficient estimator is also the minimum variance unbiased estimator (MVUE). This is because an efficient estimator maintains equality on the Cramér–Rao inequality for all parameter values, which means it attains the minimum variance for all parameters (the definition of the MVUE). The MVUE estimator, even if it exists, is not necessarily efficient, because "minimum" does not mean equality holds on the Cramér–Rao inequality.
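The variance-bias decomposition of the MSE can be checked numerically. The sketch below (standard-library Python; the shrunken-mean estimator 0.9·X̄ is a hypothetical choice used only to create bias, and the seed and sample sizes are arbitrary) compares both sides by Monte Carlo:

```python
import random
import statistics

# Monte Carlo check of the decomposition MSE(T) = var(T) + bias(T)^2,
# using a deliberately biased estimator (0.9 * sample mean) of a normal mean.
random.seed(42)
theta = 2.0          # true mean
n, reps = 30, 20000  # sample size and Monte Carlo replications

estimates = []
for _ in range(reps):
    sample = [random.gauss(theta, 1.0) for _ in range(n)]
    estimates.append(0.9 * sum(sample) / n)   # biased estimator T

mse = statistics.fmean((t - theta) ** 2 for t in estimates)
var = statistics.pvariance(estimates)
bias = statistics.fmean(estimates) - theta

# The decomposition is an algebraic identity over the replications,
# so the two sides agree up to floating-point rounding.
print(mse, var + bias ** 2)
```

Note that the identity holds exactly for the empirical distribution of the replications; Monte Carlo error only affects how close these quantities are to their theoretical counterparts.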
In more precise language, suppose {P_θ | θ ∈ Θ} is a parametric model and X = (X1, …, Xn) are the data sampled from this model. Let T = T(X) be an estimator for the parameter θ. If this estimator is unbiased (that is, E[T] = θ), then the Cramér–Rao inequality states that the variance of this estimator is bounded from below:

    var[T] ≥ I_θ^(−1),

where I_θ is the Fisher information matrix of the model at point θ. Generally, the variance measures the degree of dispersion of a random variable around its mean; estimators with small variance are more concentrated and estimate the parameter more precisely. An efficient estimator is an unbiased estimator that achieves equality in the Cramér–Rao inequality for all θ ∈ Θ, and therefore attains the minimum possible variance among all unbiased estimators for every value of the parameter.

Finite-sample efficient estimators exist only in special cases. For example, the sample mean is an efficient estimator of the mean μ of a normal distribution with known variance, of the parameter λ of the Poisson distribution, and of the success probability p in the binomial model.

Example: sample mean

Consider the model of a normal distribution with unknown mean but known variance: {P_θ = N(θ, σ²) | θ ∈ R}. The data consist of n independent and identically distributed observations X = (x1, …, xn). We estimate θ using the sample mean:

    T(X) = X̄ = (1/n) Σᵢ xᵢ.

This estimator has mean θ and variance σ²/n, which is equal to the reciprocal of the Fisher information of the sample. Thus, the sample mean is a finite-sample efficient estimator of the mean of the normal distribution.
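The bound being attained can be illustrated by simulation. The following sketch (standard-library Python; the parameter values and seed are arbitrary choices) compares the simulated variance of the sample mean with the Cramér–Rao bound 1/I(θ) = σ²/n for normal data with known σ:

```python
import random
import statistics

# For n i.i.d. draws from N(theta, sigma^2) with sigma known, the Fisher
# information of the sample is I(theta) = n / sigma^2, so the Cramér–Rao
# bound for an unbiased estimator is sigma^2 / n. The sample mean attains it.
random.seed(0)
theta, sigma, n, reps = 1.0, 2.0, 25, 40000

means = []
for _ in range(reps):
    sample = [random.gauss(theta, sigma) for _ in range(n)]
    means.append(statistics.fmean(sample))

crlb = sigma ** 2 / n                  # = 1 / I(theta)
var_mean = statistics.pvariance(means)
efficiency = crlb / var_mean           # e(sample mean), close to 1

print(round(efficiency, 2))
```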
Example: Median

Consider a sample of size N drawn from a normal distribution of mean μ and unit variance, i.e., Xₙ ∼ N(μ, 1). The sample mean, X̄, of the sample X1, X2, …, X_N, defined as

    X̄ = (1/N) Σₙ Xₙ ∼ N(μ, 1/N),

has variance 1/N (the square of the standard error), which is equal to the reciprocal of the Fisher information of the sample. Thus, by the Cramér–Rao inequality, the sample mean is efficient in the sense that its efficiency is unity (100%).

Now consider the sample median, X̃. This is an unbiased and consistent estimator for μ. For large N the sample median is approximately normally distributed with mean μ and variance π/(2N):

    X̃ ∼ N(μ, π/(2N)).

The efficiency of the median for large N is thus

    e(X̃) = (1/N) / (π/(2N)) = 2/π ≈ 0.64.

In other words, the variance of the median will be π/2 ≈ 1.57 times the variance of the mean, so asymptotically the median is only about 64% as efficient. Note that this is the asymptotic efficiency, that is, the efficiency in the limit as the sample size N tends to infinity; for finite values of N the efficiency is somewhat higher.

The sample mean is thus more efficient than the sample median in this example. However, there may be measures by which the median performs better: the median is far more robust to outliers, so that if the Gaussian model is questionable or approximate, there may be advantages to using the median (see robust statistics).
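The 2/π figure for normal data can be recovered numerically. The sketch below (standard-library Python; the seed, sample size, and replication count are arbitrary) estimates the variance ratio of sample mean to sample median by simulation:

```python
import random
import statistics

# Estimate the relative efficiency of the sample median versus the sample
# mean for normal data: var(mean) / var(median) should approach 2/pi ~ 0.64.
random.seed(1)
N, reps = 101, 20000   # odd N so the median is a single order statistic

means, medians = [], []
for _ in range(reps):
    sample = [random.gauss(0.0, 1.0) for _ in range(N)]
    means.append(statistics.fmean(sample))
    medians.append(statistics.median(sample))

are = statistics.pvariance(means) / statistics.pvariance(medians)
print(round(are, 2))   # close to 2/pi, and slightly above it at finite N
```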
Dominant estimators

If T1 and T2 are estimators for the parameter θ, then T1 is said to dominate T2 if:

1. its mean squared error (MSE) is smaller for at least some value of θ, and
2. the MSE does not exceed that of T2 for any value of θ.

Formally, T1 dominates T2 if

    E[(T1 − θ)²] ≤ E[(T2 − θ)²]

holds for all θ, with strict inequality holding somewhere.

Relative efficiency

The relative efficiency of two unbiased estimators is defined as

    e(T1, T2) = E[(T2 − θ)²] / E[(T1 − θ)²] = var(T2) / var(T1).

Although e is in general a function of θ, in many cases the dependence drops out; if this is so, e being greater than one would indicate that T1 is preferable, whatever the value of θ.
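The definition can be made concrete with a deliberately crude comparison. In the sketch below (standard-library Python; the choice of "first observation" as the second estimator is illustrative, not from the article), both estimators of a normal mean are unbiased, so e(T1, T2) = var(T2)/var(T1) = n:

```python
import random
import statistics

# Relative efficiency e(T1, T2) = var(T2) / var(T1) for two unbiased
# estimators of a normal mean: T1 = sample mean, T2 = first observation.
# Theory gives var(T1) = sigma^2/n and var(T2) = sigma^2, so e(T1, T2) = n.
random.seed(2)
n, reps = 10, 30000

t1, t2 = [], []
for _ in range(reps):
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]
    t1.append(statistics.fmean(sample))
    t2.append(sample[0])

rel_eff = statistics.pvariance(t2) / statistics.pvariance(t1)
print(round(rel_eff, 1))   # close to n = 10
```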
Estimators of the mean of u.i.d. variables

When estimating the mean of uncorrelated, identically distributed variables we can take advantage of the fact that the variance of the sum is the sum of the variances. In this case efficiency can be defined as the square of the coefficient of variation, i.e.,

    e ≡ (σ/μ)².

Relative efficiency of two such estimators can thus be interpreted as the relative sample size of one required to achieve the certainty of the other. Since the mean of n such observations has variance s² = σ²/n, we have

    e1 / e2 = s1² / s2² = n2 / n1,

so the relative efficiency of two sample means equals the inverse ratio of the sample sizes used.
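The sample-size reading of this ratio is a one-line computation. The sketch below (plain Python; the numbers are arbitrary) checks that the variance ratio of two sample means equals the inverse ratio of their sample sizes:

```python
# Variance of a mean of n u.i.d. draws is sigma^2 / n, so for two means
# based on n1 and n2 observations, s1^2 / s2^2 = n2 / n1.
sigma2 = 4.0
n1, n2 = 20, 50
s1_sq = sigma2 / n1    # variance of the first sample mean
s2_sq = sigma2 / n2    # variance of the second sample mean
ratio = s1_sq / s2_sq
print(round(ratio, 6), n2 / n1)   # both equal 2.5
```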
Robustness

While efficiency is a desirable quality of an estimator, it must be weighed against other considerations, and an estimator that is efficient for certain distributions may well be inefficient for other distributions. Most significantly, estimators that are efficient for clean data from a simple distribution, such as the normal distribution (which is symmetric, unimodal, and has thin tails) may not be robust to contamination by outliers, and may be inefficient for more complicated distributions. In robust statistics, more importance is placed on robustness and applicability to a wide variety of distributions, rather than efficiency on a single distribution.

For data with heavy tails or skewness, or in the presence of outliers, alternative estimators can be more efficient. For example, for data drawn from the contaminated normal mixture of 98% N(μ, σ) and 2% N(μ, 10σ), a trimmed mean is a more efficient estimator of μ than the sample mean.

M-estimators are a general class of estimators motivated by these concerns. They can be designed to yield both robustness and high relative efficiency, though possibly lower efficiency than traditional estimators for some cases. They can be very computationally complicated, however. A more traditional alternative is the class of L-estimators, which are very simple statistics that are easy to compute and interpret, in many cases robust, and often sufficiently efficient for initial estimates.

Efficiency in statistics is important because it allows one to compare the performance of various estimators. Although an unbiased estimator is usually favored over a biased one, a more efficient biased estimator can sometimes be more valuable than a less efficient unbiased estimator; this can occur, for example, when the values of the biased estimator cluster around a number closer to the true value. Estimator performance can thus be assessed by comparing mean squared errors or variances.
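The contaminated-mixture claim can be checked by simulation. The sketch below (standard-library Python; the 10% trimming level, seed, and sample sizes are illustrative choices, not from the article) compares the variance of the sample mean and a trimmed mean on 98% N(μ, σ) + 2% N(μ, 10σ) data:

```python
import random
import statistics

# Under 2% contamination by a 10x-wider normal, trimming the extremes
# discards most outliers, so the trimmed mean's variance is much smaller
# than the sample mean's.
random.seed(3)
mu, sigma, n, reps = 0.0, 1.0, 100, 10000

def trimmed_mean(xs, prop=0.1):
    k = int(len(xs) * prop)
    xs = sorted(xs)[k:len(xs) - k]   # drop the k smallest and k largest
    return statistics.fmean(xs)

plain, trimmed = [], []
for _ in range(reps):
    sample = [random.gauss(mu, 10 * sigma if random.random() < 0.02 else sigma)
              for _ in range(n)]
    plain.append(statistics.fmean(sample))
    trimmed.append(trimmed_mean(sample))

# Relative efficiency of the trimmed mean versus the mean under contamination:
rel_eff = statistics.pvariance(plain) / statistics.pvariance(trimmed)
print(rel_eff > 1.0)   # the trimmed mean is more efficient here
```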
Hypothesis tests

For comparing significance tests, a meaningful measure of efficiency can be defined based on the sample size required for the test to achieve a given power. Pitman efficiency and Bahadur efficiency (or Hodges–Lehmann efficiency) relate to the comparison of the performance of statistical hypothesis testing procedures.

Experimental design

For experimental designs, efficiency relates to the ability of a design to achieve the objective of the study with minimal expenditure of resources such as time and money. In simple cases, the relative efficiency of designs can be expressed as the ratio of the sample sizes required to achieve a given objective.

See also

Bayes estimator
Optimal design
References

Everitt, B.S. (2002). The Cambridge Dictionary of Statistics (2nd ed.). Cambridge University Press. ISBN 0-521-81099-X.
Lehmann, E.L.; Casella, G. (1998). Theory of Point Estimation (2nd ed.). Springer. ISBN 0-387-98502-6.
Pfanzagl, Johann (1994). Parametric Statistical Theory. Walter de Gruyter. ISBN 3-11-013863-8.
Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.