{\displaystyle {\begin{aligned}{\hat {\theta }}_{\mathrm {MAP} }(x)&={\underset {\theta }{\operatorname {arg\,max} }}\ f(\theta \mid x)\\&={\underset {\theta }{\operatorname {arg\,max} }}\ {\frac {f(x\mid \theta )\,g(\theta )}{\displaystyle \int _{\Theta }f(x\mid \vartheta )\,g(\vartheta )\,d\vartheta }}\\&={\underset {\theta }{\operatorname {arg\,max} }}\ f(x\mid \theta )\,g(\theta ).\end{aligned}}\!}
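The final line above drops the evidence integral because it does not depend on θ. As a minimal numerical sketch (the Gaussian likelihood and prior here are illustrative choices, not from the article), the MAP estimate can therefore be found by maximizing the unnormalized posterior directly:

```python
# Hypothetical setup (not from the article): a single observation x
# drawn from N(theta, 1), with a N(0, 1) prior on theta.
def log_likelihood(x, theta):
    return -0.5 * (x - theta) ** 2  # log f(x | theta), up to a constant

def log_prior(theta):
    return -0.5 * theta ** 2  # log g(theta), up to a constant

def map_estimate(x, grid):
    # arg max of f(x | theta) g(theta); the evidence integral in the
    # denominator is constant in theta, so it is ignored.
    return max(grid, key=lambda t: log_likelihood(x, t) + log_prior(t))

grid = [i / 1000 for i in range(-5000, 5001)]
theta_hat = map_estimate(2.0, grid)
print(theta_hat)  # conjugate Gaussian theory gives x/2 = 1.0 here
```
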
{\displaystyle {\hat {\mu }}_{\mathrm {MAP} }={\frac {\sigma _{m}^{2}\,n}{\sigma _{m}^{2}\,n+\sigma _{v}^{2}}}\left({\frac {1}{n}}\sum _{j=1}^{n}x_{j}\right)+{\frac {\sigma _{v}^{2}}{\sigma _{m}^{2}\,n+\sigma _{v}^{2}}}\,\mu _{0}={\frac {\sigma _{m}^{2}\left(\sum _{j=1}^{n}x_{j}\right)+\sigma _{v}^{2}\,\mu _{0}}{\sigma _{m}^{2}\,n+\sigma _{v}^{2}}}.}
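A small sketch evaluating this closed form (the data and variance values below are made up for the demonstration):

```python
# Closed-form Gaussian MAP estimate: x_j ~ N(mu, sigma_v^2) with a
# N(mu_0, sigma_m^2) prior on mu, as in the formula above.
def mu_map(xs, mu0, sigma_v, sigma_m):
    n = len(xs)
    num = sigma_m ** 2 * sum(xs) + sigma_v ** 2 * mu0
    den = sigma_m ** 2 * n + sigma_v ** 2
    return num / den

xs = [1.2, 0.8, 1.5, 1.1]  # sample mean 1.15
est = mu_map(xs, mu0=0.0, sigma_v=1.0, sigma_m=2.0)
print(est)  # lies strictly between the prior mean 0.0 and 1.15
```
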
{\displaystyle g(\mu )f(x\mid \mu )=\pi (\mu )L(\mu )={\frac {1}{{\sqrt {2\pi }}\sigma _{m}}}\exp \left(-{\frac {1}{2}}\left({\frac {\mu -\mu _{0}}{\sigma _{m}}}\right)^{2}\right)\prod _{j=1}^{n}{\frac {1}{{\sqrt {2\pi }}\sigma _{v}}}\exp \left(-{\frac {1}{2}}\left({\frac {x_{j}-\mu }{\sigma _{v}}}\right)^{2}\right),}
(under the 0–1 loss function), it is not very representative of Bayesian methods in general. This is because MAP estimates are point estimates, whereas Bayesian methods are characterized by the use of distributions to summarize data and draw inferences: thus, Bayesian methods tend to report the posterior
, and for a continuous posterior distribution there is no loss function which suggests the MAP is the optimal point estimator. In addition, the posterior distribution may often not have a simple analytic form: in this case, the distribution can be simulated using
As an example of the difference between Bayes estimators mentioned above (mean and median estimators) and using a MAP estimate, consider the case where there is a need to classify inputs
(that quantifies the additional information available through prior knowledge of a related event) over the quantity one wants to estimate. MAP estimation can therefore be seen as a
{\displaystyle \theta \mapsto f(\theta \mid x)={\frac {f(x\mid \theta )\,g(\theta )}{\displaystyle \int _{\Theta }f(x\mid \vartheta )\,g(\vartheta )\,d\vartheta }}\!}
as either positive or negative (for example, loans as risky or safe). Suppose there are just three possible hypotheses about the correct method of classification
Finally, unlike ML estimators, the MAP estimate is not invariant under reparameterization. Switching from one parameterization to another involves introducing a
. This is both because these estimators are optimal under squared-error and linear-error loss respectively, which are more representative of typical
{\displaystyle \sum _{j=1}^{n}\left({\frac {x_{j}-\mu }{\sigma _{v}}}\right)^{2}+\left({\frac {\mu -\mu _{0}}{\sigma _{m}}}\right)^{2}.}
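Setting the derivative of this objective with respect to μ to zero makes the closed-form estimate explicit:

```latex
0=\frac{d}{d\mu}\left[\sum_{j=1}^{n}\left(\frac{x_{j}-\mu}{\sigma_{v}}\right)^{2}
  +\left(\frac{\mu-\mu_{0}}{\sigma_{m}}\right)^{2}\right]
 =-\frac{2}{\sigma_{v}^{2}}\sum_{j=1}^{n}(x_{j}-\mu)
  +\frac{2\,(\mu-\mu_{0})}{\sigma_{m}^{2}},
```

so that \sigma_{m}^{2}\sum_{j}x_{j}+\sigma_{v}^{2}\,\mu_{0}=\mu\,(\sigma_{m}^{2}\,n+\sigma_{v}^{2}), which rearranges to the weighted-average form of the Gaussian MAP estimate.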
classifies it as positive, whereas the other two classify it as negative. Using the MAP estimate for the correct classifier
{\displaystyle L(\theta ,a)={\begin{cases}0,&{\text{if }}|a-\theta |<c,\\1,&{\text{otherwise}},\\\end{cases}}}
{\displaystyle {\hat {\theta }}_{\mathrm {MLE} }(x)={\underset {\theta }{\operatorname {arg\,max} }}\ f(x\mid \theta )\!}
. In such a case, the usual recommendation is that one should choose the highest mode: this is not always feasible (
Bassett, Robert; Deride, Julio (2018-01-30). "Maximum a posteriori estimators as a limit of Bayes estimators".
Hald, Anders (2007). "Gauss's Derivation of the Normal Distribution and the Method of Least Squares, 1809".
. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge: Cambridge University Press.
is classified as positive, whereas the Bayes estimators would average over all hypotheses and classify
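The worked example above can be sketched numerically (the hypothesis names and probabilities mirror the example; the code structure is illustrative):

```python
# Three classification hypotheses with posterior probabilities
# 0.4, 0.3, 0.3; h1 labels the new instance positive, while h2 and
# h3 label it negative.
posteriors = {"h1": 0.4, "h2": 0.3, "h3": 0.3}
labels = {"h1": "positive", "h2": "negative", "h3": "negative"}

# MAP: commit to the single most probable hypothesis, then use its label.
map_hypothesis = max(posteriors, key=posteriors.get)
map_label = labels[map_hypothesis]

# Bayes (model averaging): weight each label by hypothesis probability.
p_positive = sum(p for h, p in posteriors.items() if labels[h] == "positive")
bayes_label = "positive" if p_positive > 0.5 else "negative"

print(map_label, bayes_label)  # prints: positive negative
```

The two rules disagree: MAP follows h1 alone, while averaging puts 0.6 of the posterior mass on "negative".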
issues arise). Furthermore, the highest mode may be uncharacteristic of the majority of the posterior.
of an unobserved quantity on the basis of empirical data. It is closely related to the method of
between the prior mean and the sample mean weighted by their respective covariances.
While only mild conditions are required for MAP estimation to be a limiting case of
techniques, while optimization to find its mode(s) may be difficult or impossible.
and therefore plays no role in the optimization. Observe that the MAP estimate of
A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713–1935
{\displaystyle {\hat {\mu }}_{\mathrm {MAP} }\to {\hat {\mu }}_{\mathrm {MLE} }.}
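This limit can be illustrated numerically (the data and variance values are invented for the demonstration): as the prior variance grows, the prior becomes non-informative and the MAP estimate tends to the ML estimate, the sample mean.

```python
# Gaussian MAP estimate as a function of the prior standard deviation
# sigma_m; large sigma_m should recover the sample mean (the MLE).
def mu_map(xs, mu0, sigma_v, sigma_m):
    n = len(xs)
    return (sigma_m ** 2 * sum(xs) + sigma_v ** 2 * mu0) / (
        sigma_m ** 2 * n + sigma_v ** 2
    )

xs = [1.2, 0.8, 1.5, 1.1]
mle = sum(xs) / len(xs)  # sample mean, 1.15
for sigma_m in (1.0, 10.0, 1000.0):
    print(sigma_m, mu_map(xs, mu0=0.0, sigma_v=1.0, sigma_m=sigma_m))
# the printed estimates approach 1.15 as sigma_m increases
```
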
in which the highest mode is uncharacteristic of the majority of the distribution
with posteriors 0.4, 0.3 and 0.3 respectively. Suppose given a new instance,
Analytically, when the mode(s) of the posterior distribution can be given in
is a difficult problem), nor in some cases even possible (such as when
Assume that we want to estimate an unobserved population parameter
approaches the MAP estimator, provided that the distribution of
. This does not require derivatives of the posterior density.
The method of maximum a posteriori estimation then estimates
which is equivalent to minimizing the following function of
is called a non-informative prior and leads to an improper
Method of estimating the parameters of a statistical model
. Cambridge, Massachusetts: MIT Press. pp. 151–152.
1822:, which have to be evaluated analytically or numerically.
is quasi-concave. But generally a MAP estimator is not a
The denominator of the posterior distribution (so-called
of the posterior distribution of this random variable:
is an estimate of an unknown quantity, that equals the
Machine learning: a probabilistic perspective
{\displaystyle \theta \mapsto f(x\mid \theta )\!}
. Note that the normal distribution is its own
MAP estimates can be computed in several ways:
The function to be maximized is then given by
coincides with the ML estimate when the prior
Parameter Estimation: Principles and Problems
that impacts on the location of the maximum.
) is always positive and does not depend on
when the underlying population parameter is
{\displaystyle N(\mu _{0},\sigma _{m}^{2})}
(ML) estimation, but employs an augmented
. We wish to find the MAP estimate of
{\displaystyle N(\mu ,\sigma _{v}^{2})}
. New York: Springer. pp. 55–61.
{\displaystyle \sigma _{m}\to \infty }
is the maximum likelihood estimate of
Suppose that we are given a sequence
{\displaystyle (x_{1},\dots ,x_{n})}
Young, G. A.; Smith, R. L. (2005). Essentials of Statistical Inference
expectation-maximization algorithm
of maximum likelihood estimation.
. The MAP can be used to obtain a
In many types of models, such as
exists. This allows us to treat
{\displaystyle f(x\mid \theta )}
maximum a posteriori probability
, so we will be able to find a
Optimal Statistical Decisions
An example of a density of a
on the basis of observations
Sorenson, Harold W. (1980).
and a prior distribution of
Murphy, Kevin P. (2012).
doi:10.1007/s10107-018-1241-0
Via a modification of an
conjugate gradient method
. This is the case when
Mathematical Programming
probability distribution
which turns out to be a
Markov chain Monte Carlo
. We can calculate the
, the posterior may be
instead, together with
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \Theta }
{\displaystyle \theta }
is the density function of
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
Thus, we see that the
posterior distribution
. Then the function:
is the probability of
optimization objective
posterior distribution
{\displaystyle h_{1}}
{\displaystyle h_{1}}
{\displaystyle h_{3}}
{\displaystyle h_{2}}
{\displaystyle h_{1}}
1928:
1878:
1771:
1747:
1723:
1700:
1572:
1552:
1532:
1512:
1485:
1157:
1134:
1114:
1094:
1074:
1051:
912:
880:
860:
840:
814:
791:
675:
630:
610:
590:
555:
sampling distribution
which incorporates a
DeGroot, M. (1970).
linear interpolation
{\displaystyle \mu }
closed-form solution
{\displaystyle \mu }
2319:
2265:
2254:{\displaystyle \mu }
2245:
2195:
2140:
2112:
2092:
2065:
2038:
2018:
1991:
1964:
1937:
1917:
bimodal distribution
global optimization
simulated annealing
marginal likelihood
Bayesian statistics
likelihood function
Bayesian statistics
for μ is given by
credible intervals
Monte Carlo method
is uniform (i.e.,
prior distribution
Now assume that a
and the estimate:
prior distribution
maximum likelihood
ISBN 978-0-387-46409-1
. Marcel Dekker.
ISBN 978-0-521-83971-6
ISBN 978-0-262-01802-9
{\displaystyle x}
{\displaystyle x}
{\displaystyle x}
{\displaystyle x}
{\displaystyle c}
constant function
{\displaystyle g}
{\displaystyle g}
{\displaystyle g}
is the domain of
{\displaystyle g}
{\displaystyle g}
{\displaystyle x}
{\displaystyle x}
{\displaystyle f}
{\displaystyle x}
random variables
Bayes estimation
conjugate priors
is known as the
. McGraw-Hill.
; in this case
conjugate prior
identifiability
Newton's method
Bayes estimator
Bayes estimator
goes to 0, the
is of the form
random variable
In Bayesian statistics, a maximum a posteriori probability (MAP) estimate is an estimate of an unknown quantity that equals the mode of the posterior distribution. The MAP can be used to obtain a point estimate of an unobserved quantity on the basis of empirical data. It is closely related to the method of maximum likelihood (ML) estimation, but employs an augmented optimization objective which incorporates a prior distribution (quantifying the additional information available through prior knowledge of a related event) over the quantity one wants to estimate. MAP estimation can therefore be seen as a regularization of maximum likelihood estimation.

Description

Assume that we want to estimate an unobserved population parameter θ on the basis of observations x. Let f be the sampling distribution of x, so that f(x ∣ θ) is the probability of x when the underlying population parameter is θ. Then the function

    θ ↦ f(x ∣ θ)

is known as the likelihood function, and the estimate

    θ̂_MLE(x) = argmax_θ f(x ∣ θ)

is the maximum likelihood estimate of θ.

Now assume that a prior distribution g over θ exists. This allows us to treat θ as a random variable, as in Bayesian statistics. We can calculate the posterior distribution of θ using Bayes' theorem:

    f(θ ∣ x) = f(x ∣ θ) g(θ) / ∫_Θ f(x ∣ ϑ) g(ϑ) dϑ,

where g is the density function of θ and Θ is the domain of g.

The method of maximum a posteriori estimation then estimates θ as the mode of the posterior distribution of this random variable:

    θ̂_MAP(x) = argmax_θ f(θ ∣ x)
              = argmax_θ f(x ∣ θ) g(θ) / ∫_Θ f(x ∣ ϑ) g(ϑ) dϑ
              = argmax_θ f(x ∣ θ) g(θ).

The denominator of the posterior distribution (the marginal likelihood) is always positive and does not depend on θ, so it plays no role in the optimization. Observe that the MAP estimate of θ coincides with the ML estimate when the prior g is uniform (i.e., a constant function).

When the loss function is of the form

    L(θ, a) = 0 if |a − θ| < c, and 1 otherwise,

then, as c goes to 0, the Bayes estimator approaches the MAP estimator, provided that the distribution of θ is quasi-concave. But generally a MAP estimator is not a Bayes estimator unless θ is discrete.
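The identity argmax_θ f(θ ∣ x) = argmax_θ f(x ∣ θ) g(θ) can be tried numerically. Below is a minimal Python sketch (not from the article; the names `map_estimate`, `likelihood`, and `beta_prior` are illustrative) that recovers the MAP estimate of a Bernoulli parameter by grid search, and shows that a uniform prior reproduces the ML estimate while an informative prior shrinks it.

```python
def likelihood(theta, heads, tails):
    # Bernoulli likelihood f(x | theta) for `heads` successes, `tails` failures.
    return theta**heads * (1 - theta)**tails

def beta_prior(theta, a=2.0, b=2.0):
    # Unnormalized Beta(a, b) density; the normalizing constant does not
    # depend on theta, so it plays no role in the argmax.
    return theta**(a - 1) * (1 - theta)**(b - 1)

def map_estimate(heads, tails, prior=beta_prior, grid=10_000):
    # Brute-force argmax of f(x | theta) * g(theta) over a grid in (0, 1).
    thetas = (i / grid for i in range(1, grid))
    return max(thetas, key=lambda t: likelihood(t, heads, tails) * prior(t))

# With a uniform prior g(theta) = 1 the MAP estimate coincides with the MLE.
mle = map_estimate(7, 3, prior=lambda t: 1.0)   # mode of the likelihood: 0.7
shrunk = map_estimate(7, 3)                      # Beta(2,2) prior pulls toward 1/2
print(mle, shrunk)
```

The Beta(2, 2) prior yields a posterior proportional to Beta(9, 5), whose mode 8/12 ≈ 0.667 lies between the MLE 0.7 and the prior mode 0.5.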
Computation

MAP estimates can be computed in several ways:

1. Analytically, when the mode(s) of the posterior distribution can be given in closed form. This is the case when conjugate priors are used.
2. Via numerical optimization such as the conjugate gradient method or Newton's method. This usually requires first or second derivatives, which have to be evaluated analytically or numerically.
3. Via a modification of an expectation-maximization algorithm. This does not require derivatives of the posterior density.
4. Via a Monte Carlo method using simulated annealing.
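Computing a MAP estimate by numerical optimization can be illustrated with Newton's method on a log-posterior whose mode is also known in closed form, which lets the iteration be checked against the analytic answer. This is a sketch, not the article's code; `newton_map` is an illustrative name, and the target is the Bernoulli-with-Beta(a, b)-prior posterior, whose mode is (h + a − 1)/(h + t + a + b − 2).

```python
def newton_map(h, t, a=2.0, b=2.0, x0=0.5, iters=50):
    # Newton iteration on the log-posterior
    #   (h + a - 1) * log(x) + (t + b - 1) * log(1 - x),
    # using its first derivative (grad) and second derivative (hess).
    x = x0
    for _ in range(iters):
        grad = (h + a - 1) / x - (t + b - 1) / (1 - x)
        hess = -(h + a - 1) / x**2 - (t + b - 1) / (1 - x)**2
        x -= grad / hess
    return x

print(newton_map(7, 3))  # converges to the closed-form mode 8/12
```

Because the log-posterior is smooth and unimodal on (0, 1), the iteration settles on the mode; in practice one would also guard against steps leaving the interval.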
Limitations

While only mild conditions are required for MAP estimation to be a limiting case of Bayes estimation (under the 0–1 loss function), it is not very representative of Bayesian methods in general. This is because MAP estimates are point estimates, whereas Bayesian methods are characterized by the use of distributions to summarize data and draw inferences: thus, Bayesian methods tend to report the posterior mean or median instead, together with credible intervals. This is both because these estimators are optimal under squared-error and linear-error loss respectively (losses that are more representative of typical loss functions), and because for a continuous posterior distribution there is no loss function which suggests the MAP is the optimal point estimator. In addition, the posterior distribution may often not have a simple analytic form: in this case, the distribution can be simulated using Markov chain Monte Carlo techniques, while optimization to find its mode(s) may be difficult or impossible.

In many types of models, such as mixture models, the posterior may be multi-modal. In such a case, the usual recommendation is to choose the highest mode, but this is not always feasible (global optimization is a difficult problem), nor in some cases even possible (such as when identifiability issues arise). Furthermore, the highest mode may be uncharacteristic of the majority of the posterior.

Finally, unlike ML estimators, the MAP estimate is not invariant under reparameterization. Switching from one parameterization to another involves introducing a Jacobian that impacts the location of the maximum.

As an example of the difference between the Bayes estimators mentioned above (mean and median estimators) and a MAP estimate, consider the case where there is a need to classify inputs x as either positive or negative (for example, loans as risky or safe). Suppose there are just three possible hypotheses about the correct method of classification, h1, h2 and h3, with posteriors 0.4, 0.3 and 0.3 respectively. Suppose that, given a new instance x, h1 classifies it as positive, whereas the other two classify it as negative. Using the MAP estimate for the correct classifier h1, x is classified as positive, whereas the Bayes estimators would average over all hypotheses and classify x as negative.
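The three-hypothesis classification contrast above can be written out in a few lines of Python (an illustrative sketch; the dictionaries below just encode the hypothetical posteriors and votes from the example).

```python
# Posterior probability of each hypothesis, and how each classifies x.
posteriors = {"h1": 0.4, "h2": 0.3, "h3": 0.3}
votes = {"h1": "positive", "h2": "negative", "h3": "negative"}

# MAP estimate: commit to the single most probable hypothesis.
map_hypothesis = max(posteriors, key=posteriors.get)
map_label = votes[map_hypothesis]  # h1 alone decides

# Bayes approach: average over all hypotheses, weighted by posterior.
p_positive = sum(p for h, p in posteriors.items() if votes[h] == "positive")
bayes_label = "positive" if p_positive > 0.5 else "negative"  # 0.4 vs 0.6

print(map_label, bayes_label)
```

The MAP route answers "positive" because h1 is the single most probable hypothesis, while averaging gives "negative" a total posterior weight of 0.6.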
Example

Suppose that we are given a sequence (x_1, …, x_n) of IID N(μ, σ_v²) random variables and an a priori distribution of μ given by N(μ_0, σ_m²). We wish to find the MAP estimate of μ. Note that the normal distribution is its own conjugate prior, so we will be able to find a closed-form solution analytically.

The function to be maximized is then given by

    g(μ) f(x ∣ μ) = π(μ) L(μ) = (1 / (√(2π) σ_m)) exp(−(1/2) ((μ − μ_0)/σ_m)²) ∏_{j=1}^{n} (1 / (√(2π) σ_v)) exp(−(1/2) ((x_j − μ)/σ_v)²),

which is equivalent to minimizing the following function of μ:

    ∑_{j=1}^{n} ((x_j − μ)/σ_v)² + ((μ − μ_0)/σ_m)².

Thus, we see that the MAP estimator for μ is given by

    μ̂_MAP = (σ_m² n)/(σ_m² n + σ_v²) · ((1/n) ∑_{j=1}^{n} x_j) + (σ_v²)/(σ_m² n + σ_v²) · μ_0
           = (σ_m² (∑_{j=1}^{n} x_j) + σ_v² μ_0)/(σ_m² n + σ_v²),

which turns out to be a linear interpolation between the prior mean and the sample mean weighted by their respective covariances.

The case of σ_m → ∞ is called a non-informative prior and leads to an improper probability distribution; in this case μ̂_MAP → μ̂_MLE.
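The closed-form Gaussian estimator above is a precision-weighted average of the sample sum and the prior mean. A small sketch (the name `gaussian_map` is illustrative) checks both the formula and the non-informative limit σ_m → ∞, where the estimate approaches the sample mean (the MLE).

```python
def gaussian_map(xs, mu0, var_m, var_v):
    # MAP estimate of the mean of N(mu, var_v) data under a N(mu0, var_m)
    # prior: (var_m * sum(x_j) + var_v * mu0) / (var_m * n + var_v).
    n = len(xs)
    return (var_m * sum(xs) + var_v * mu0) / (var_m * n + var_v)

xs = [1.0, 2.0, 3.0]
print(gaussian_map(xs, mu0=0.0, var_m=1.0, var_v=1.0))    # (6 + 0)/(3 + 1) = 1.5
# Non-informative limit: as var_m grows, the prior's pull vanishes and the
# estimate approaches the sample mean 2.0.
print(gaussian_map(xs, mu0=0.0, var_m=1e12, var_v=1.0))
```

With equal unit variances and a prior mean of 0, the estimate 1.5 sits between the prior mean and the sample mean 2.0, as the interpolation formula predicts.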
References

1. Bassett, Robert; Deride, Julio (2018). "Maximum a posteriori estimators as a limit of Bayes estimators". Mathematical Programming: 1–16. arXiv:1611.05917. ISSN 0025-5610.
2. DeGroot, M. (1970). Optimal Statistical Decisions. McGraw-Hill. ISBN 0-07-016242-5.
3. Sorenson, Harold W. (1980). Parameter Estimation: Principles and Problems. Marcel Dekker. ISBN 0-8247-6987-2.
Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.