) that we assume are distributed according to a straight line with i.i.d. Gaussian residuals (with zero mean): this leads to the same statistical model as was used in the example with children's heights. The dimension of the statistical model is 3: the intercept of the line, the slope of the line, and the variance of the distribution of the residuals. (Note that the set of all possible lines has dimension 2, even though geometrically, a line has dimension 1.)
In both those examples, the first model has a higher dimension than the second model (for the first example, the zero-mean model has dimension 1). Such is often, but not always, the case. As an example where they have the same dimension, the set of positive-mean
Gaussian distributions is
constitute a statistical model: because with the assumption alone, we cannot calculate the probability of every event. In the example above, with the first assumption, calculating the probability of an event is easy. With some other examples, though, the calculation can be difficult, or even
if the first model can be transformed into the second model by imposing constraints on the parameters of the first model. As an example, the set of all
Gaussian distributions has, nested within it, the set of zero-mean Gaussian distributions: we constrain the mean in the set of all Gaussian
, p. 75) state: "The majority of the problems in statistical inference can be considered to be problems related to statistical modeling. They are typically formulated as comparisons of several statistical models." Common criteria for comparing models include the following:
. In the above example with children's heights, ε is a stochastic variable; without that stochastic variable, the model would be deterministic. Statistical models are often used even when the data-generating process being modeled is deterministic. For instance,
impractical (e.g. it might require millions of years of computation). For an assumption to constitute a statistical model, such difficulty is acceptable: doing the calculation does not need to be practicable, just theoretically possible.
). Choosing an appropriate statistical model to represent a given data-generating process is sometimes extremely difficult, and may require knowledge of both the process and relevant statistical analyses. Relatedly, the statistician
More generally, we can calculate the probability of any event: e.g. (1 and 2) or (3 and 3) or (5 and 6). The alternative statistical assumption is this: for each of the dice, the probability of the face 5 coming up is
. There are two assumptions: that height can be approximated by a linear function of age; that errors in the approximation are distributed as i.i.d. Gaussian. The assumptions are sufficient to specify
{\displaystyle {\mathcal {P}}=\left\{F_{\mu ,\sigma }(x)\equiv {\frac {1}{{\sqrt {2\pi }}\sigma }}\exp \left(-{\frac {(x-\mu )^{2}}{2\sigma ^{2}}}\right):\mu \in \mathbb {R} ,\sigma >0\right\}}
. Thus, in a statistical model specified via mathematical equations, some of the variables do not have specific values, but instead have probability distributions; i.e. some of the variables are
The first statistical assumption constitutes a statistical model: because with the assumption alone, we can calculate the probability of any event. The alternative statistical assumption does
is formally a single parameter with dimension 2, but it is often regarded as comprising 2 separate parameters—the mean and the standard deviation. A statistical model is
related to the age: e.g. when we know that a child is of age 7, this influences the chance of the child being 1.5 meters tall. We could formalize that relationship in a
) cannot be the equation for a model of the data—unless it exactly fits all the data points, i.e. all the data points lie perfectly on the line. The error term, ε
has said, "These typically involve fewer assumptions of structure and distributional form but usually contain strong assumptions about independencies".
Those three purposes are essentially the same as the three purposes indicated by
Friendly & Meyer: prediction, estimation, description.
has said, "How translation from subject-matter problem to statistical model is done is often the most critical part of an analysis".
We cannot, however, calculate the probability of any other nontrivial event, as the probabilities of the other faces are unknown.
The first statistical assumption is this: for each of the dice, the probability of each face (1, 2, 3, 4, 5, and 6) coming up is
(or set of statistical assumptions) with a certain property: that the assumption allows us to calculate the probability of any
Parametric models are by far the most commonly used statistical models. Regarding semiparametric and nonparametric models,
of the model. If a parameterization is such that distinct parameter values give rise to distinct distributions, i.e.
. What distinguishes a statistical model from other mathematical models is that a statistical model is non-
and other non-random variables. As such, a statistical model is "a formal representation of a theory" (
, i.e. that it does not produce catastrophic errors when its assumptions about the data are incorrect.
are derived via statistical models. More generally, statistical models are part of the foundation of
, must be included in the equation, so that the model is consistent with all the data points. To do
A statistical model can sometimes distinguish two sets of probability distributions. The first set
, and the variance of the Gaussian distribution. We can formally specify the model in the form (
represents all of the models that are considered possible. This set is typically parameterized:
An admissible model must be consistent with all the data points. Thus, a straight line (height
, of our model comprises the set of all possible pairs (age, height). Each possible value of
. A statistical model is usually specified as a mathematical relationship between one or more
distributions to get the zero-mean distributions. As a second example, the quadratic model
There are three purposes for a statistical model, according to
Konishi & Kitagawa.
is, in principle, a deterministic process; yet it is commonly modeled as stochastic (via a
). From that assumption, we can calculate the probability of both dice coming up 5:
. From that assumption, we can calculate the probability of both dice coming up 5:
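The calculation can be sketched in Python. The 1/6 fairness assumption comes from the text; the helper name `prob` is illustrative, not part of the original:

```python
from fractions import Fraction
from itertools import product

# Fairness assumption from the text: each face of each of the two
# independent dice comes up with probability 1/6.
p_face = Fraction(1, 6)

# Probability that both dice come up 5: (1/6) * (1/6) = 1/36.
p_both_fives = p_face * p_face

# More generally, any event (a set of outcome pairs) has a computable
# probability under this assumption; `prob` is an illustrative helper.
def prob(event):
    return sum(p_face * p_face for _ in event)

sample_space = list(product(range(1, 7), repeat=2))
assert prob(sample_space) == 1          # the 36 outcomes sum to 1
print(p_both_fives)                      # 1/36
print(prob([(1, 2), (3, 3), (5, 6)]))    # 1/12
```

Under the alternative assumption, where only the probability of a 5 is specified, the analogous `prob` for other faces cannot be written down, which is exactly why that assumption alone does not constitute a statistical model.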
Suppose that we have a population of children, with the ages of the children distributed
, the model is extended by adding a probability distribution over the parameter space
if it has both finite-dimensional and infinite-dimensional parameters. Formally, if
identifies the child. This implies that height is predicted by age, with some error.
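A numerical sketch of such a model follows; the coefficients, age range, and sample size are invented for illustration, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Model from the text: height_i = b0 + b1 * age_i + eps_i, with the
# eps_i i.i.d. zero-mean Gaussian. The particular numbers below are
# invented for this sketch.
b0, b1, sigma = 0.75, 0.065, 0.04            # metres, metres/year, metres
ages = rng.uniform(2.0, 12.0, size=500)
heights = b0 + b1 * ages + rng.normal(0.0, sigma, size=500)

# Least-squares estimates of the three parameters (intercept, slope,
# and the standard deviation of the errors):
b1_hat, b0_hat = np.polyfit(ages, heights, deg=1)
resid = heights - (b0_hat + b1_hat * ages)
sigma_hat = resid.std(ddof=2)

print(round(b0_hat, 2), round(b1_hat, 3), round(sigma_hat, 3))
```

The three fitted quantities correspond to the model's three parameters: the intercept, the slope, and the error variance (here reported as a standard deviation).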
has finite dimension. As an example, if we assume that data arise from a univariate
is the set of models that could have generated the data, which is much larger than
{\displaystyle F_{\theta _{1}}=F_{\theta _{2}}\Rightarrow \theta _{1}=\theta _{2}}
nested within the set of all
Gaussian distributions; they both have dimension 2.
Gaussian, with zero mean. In this instance, the model would have 3 parameters:
separate parameters. For example, with the univariate
Gaussian distribution,
). A statistical model represents, often in considerably idealized form, the
is the number of samples, both semiparametric and nonparametric models have
is a parameter that age is multiplied by to obtain a prediction of height, ε
, Huizen, The Netherlands: Johannes van Kessel Publishing, pp. 271–304
, then the model is semiparametric; otherwise, the model is nonparametric.
, equals 2. As another example, suppose that the data consists of points (
Another way of comparing two statistical models is through the notion of
, we would first need to assume some probability distributions for the ε
. Such statistical models are key in checking that a given procedure is
. We will study two different statistical assumptions about the dice.
. (The parameterization is identifiable, and this is easy to check.)
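The check can be sketched numerically: distinct (μ, σ) pairs give Gaussian CDFs that differ at some point. The evaluation points below are arbitrary choices for the sketch:

```python
from math import erf, sqrt

# CDF of the Gaussian distribution F_{mu,sigma}; the evaluation
# points used below are arbitrary.
def F(mu, sigma, x):
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# Identifiability: distinct parameter values give distinct
# distributions, i.e. the CDFs differ somewhere.
assert F(0, 1, 0.0) != F(1, 1, 0.0)   # different means differ at x = 0
assert F(0, 1, 1.0) != F(0, 2, 1.0)   # different std. devs differ at x = 1
```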
{\displaystyle {\mathcal {P}}=\{F_{\lambda }:\lambda \in \Lambda \}}
{\displaystyle {\mathcal {P}}=\{F_{\theta }:\theta \in \Theta \}}
{\displaystyle {\mathcal {P}}=\{F_{\theta }:\theta \in \Theta \}}
{\displaystyle {\mathcal {Q}}=\{F_{\theta }:\theta \in \Theta \}}
{\displaystyle {\mathcal {P}}=\{F_{\theta }:\theta \in \Theta \}}
Steps Towards a Unified Basis for Scientific Models and Methods
is the set of models considered for inference. The second set
In this example, the model is determined by (1) specifying
Comparing statistical models is fundamental for much of
. As an example, consider a pair of ordinary six-sided
Informally, a statistical model can be thought of as a
Advising on Research Methods: A consultant's companion
In mathematical terms, a statistical model is a pair (
, in the population. The height of a child will be
Shmueli, G. (2010), "To explain or to predict?",
is infinite dimensional. A statistical model is
{\displaystyle \Theta \subseteq \mathbb {R} ^{k}}
; other sets can be used, in principle). Here,
In some cases, the model can be more complex.
is the set of possible observations, i.e. the
Information Criteria and Statistical Modeling
and (2) making some assumptions relevant to
. For instance, we might assume that the ε
Suppose that we have a statistical model (
A statistical model is a special class of
2334:"Sufficiency and Approximate Sufficiency"
has, nested within it, the linear model
, it is sometimes regarded as comprising
is a single parameter that has dimension
Burnham, K. P.; Anderson, D. R. (2002), Model Selection and Multimodel Inference
y = b₀ + b₁x + ε,  ε ~ 𝒩(0, σ²)
y = b₀ + b₁x + b₂x² + ε,  ε ~ 𝒩(0, σ²)
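Nesting of the linear model inside the quadratic one can be illustrated by fitting both to the same data; the data below are simulated purely for this sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data for the sketch, generated from the *linear* model.
x = np.linspace(-1.0, 1.0, 200)
y = 0.5 + 2.0 * x + rng.normal(0.0, 0.1, size=200)

# Quadratic model y = b0 + b1*x + b2*x^2 + eps ...
b2q, b1q, b0q = np.polyfit(x, y, deg=2)
# ... and the linear model, i.e. the quadratic model constrained to b2 = 0.
b1l, b0l = np.polyfit(x, y, deg=1)

# Because the linear model is nested within the quadratic one, the
# unconstrained (quadratic) fit can only match or lower the residual
# sum of squares.
rss_quad = np.sum((y - (b0q + b1q * x + b2q * x**2)) ** 2)
rss_lin = np.sum((y - (b0l + b1l * x)) ** 2)
assert rss_quad <= rss_lin + 1e-9
```

Since the data were generated with b₂ = 0, the fitted b₂ of the quadratic model is close to zero, which is what imposing the constraint formalizes.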
together with its generalization, the
of the model. The model is said to be
(2008), "Modelling", in Adèr, H. J.;
is the set of all possible values of
Statistical Modeling and Computation
Institute of Mathematical Statistics
{\displaystyle n\rightarrow \infty }
{\displaystyle n\rightarrow \infty }
{\displaystyle k\rightarrow \infty }
Description of stochastic structures
Principles of Statistical Inference
—as they are required to do.
Konishi, S.; Kitagawa, G. (2008),
{\displaystyle \theta \in \Theta }
Drton, M.; Sullivant, S. (2007),
Annals of Mathematical Statistics
) as follows. The sample space,
. When referring specifically to
{\displaystyle k/n\rightarrow 0}
In this example, the dimension,
{\displaystyle S,{\mathcal {P}}}
{\displaystyle S,{\mathcal {P}}}
(in other words, the mapping is
{\displaystyle S,{\mathcal {P}}}
(and similar data from a larger
(2nd ed.), Springer-Verlag
Statistical model specification
) determines a distribution on
"Algebraic statistical models"
"What is a statistical model?"
. In notation, we write that
{\displaystyle {\mathcal {P}}}
{\displaystyle {\mathcal {P}}}
; denote that distribution by
{\displaystyle {\mathcal {Q}}}
{\displaystyle {\mathcal {P}}}
{\displaystyle {\mathcal {P}}}
Discrete Data Analysis with R
Response modeling methodology
concerning the generation of
, Cambridge University Press
Statistical model validation
Akaike information criterion
Konishi & Kitagawa (2008
—we constrain the parameter
, then we are assuming that
{\displaystyle \mathbb {R} }
statistical hypothesis tests
, the corresponding term is
Konishi & Kitagawa 2008
Statistical model selection
Two statistical models are
{\displaystyle F_{\theta }}
Cambridge University Press
Cambridge University Press
Type of mathematical model
; Chan, J. C. C. (2014),
Friendly & Meyer 2016
Extraction of information
model, like this: height
probability distributions
Freedman, D. A. (2009),
Not to be confused with
Helland, I. S. (2010),
Davison, A. C. (2008),
10.1214/aoms/1177700372
Le Cam, Lucien (1964).
{\displaystyle \Theta }
{\displaystyle \Theta }
{\displaystyle \theta }
{\displaystyle \Theta }
is a positive integer (
{\displaystyle \theta }
{\displaystyle \Theta }
{\displaystyle \theta }
is the error term, and
{\displaystyle \Theta }
{\displaystyle \Theta }
data-generating process
statistical assumptions
that embodies a set of
10.1214/aos/1035844977
(because the dice are
statistical assumption
statistical estimators
Statistical inference
Design of experiments
likelihood-ratio test
statistical inference
statistical inference
Annals of Statistics
; Meyer, D. (2016),
All models are wrong
Dimension of a model
Statistical Science
Deterministic model
relative likelihood
Bayesian statistics
probabilistic model
Statistical Models
Statistical Models
Chapman & Hall
Mellenbergh, G. J.
Stochastic process
Statistical theory
Although formally
mathematical model
distributions are
is the intercept,
723:
699:
644:
586:
545:
461:
441:
386:
362:
338:
310:
290:
mathematical model
10.1214/10-STS330
Statistica Sinica
Multilevel models
Bernoulli process
{\displaystyle S}
{\displaystyle S}
{\displaystyle S}
linear regression
{\displaystyle S}
{\displaystyle S}
Formal definition
statistical model
World Scientific
(5): 1225–1310,
Scientific model
Predictive model
Effective theory
Conceptual model
Comparing models
random variables
Further reading
General remarks
A statistical model is usually specified as a mathematical relationship between one or more random variables and other non-random variables. As such, a statistical model is "a formal representation of a theory" (Herman Adèr quoting Kenneth Bollen).

All statistical hypothesis tests and all statistical estimators are derived via statistical models. More generally, statistical models are part of the foundation of statistical inference.
References

Adèr, H. J. (2008), "Modelling", in Adèr, H. J.; Mellenbergh, G. J. (eds.), Advising on Research Methods: A consultant's companion, Johannes van Kessel Publishing, pp. 271–304.
Cox, D. R. (2006), Principles of Statistical Inference, Cambridge University Press.
Friendly, M.; Meyer, D. (2016), Discrete Data Analysis with R, CRC Press.
Konishi, S.; Kitagawa, G. (2008), Information Criteria and Statistical Modeling, Springer.
Kroese, D. P.; Chan, J. C. C. (2014), Statistical Modeling and Computation, Springer.
McCullagh, P. (2002), "What is a statistical model?", Annals of Statistics, 30 (5): 1225–1310.
R², Bayes factor, Akaike information criterion, and the likelihood-ratio test together with its generalization, the relative likelihood.

Another way of comparing two statistical models is through the notion of deficiency, introduced by Lucien Le Cam.
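The comparison of a nested pair of models can be made concrete with a short sketch. The code below is illustrative only: the simulated data-generating line (intercept 2, slope 3, unit Gaussian noise) and the AIC helper are assumptions of this example, not part of the article. It fits a linear model (the nested model, with b2 constrained to 0) and a quadratic model, then compares them with the Akaike information criterion, which penalizes the quadratic model's extra parameter.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 3.0 * x + rng.normal(0.0, 1.0, n)  # assumed: truly linear process

def aic(y, yhat, k):
    # Gaussian log-likelihood up to an additive constant gives
    # AIC = n*log(RSS/n) + 2k, where k counts the fitted coefficients
    # plus the noise variance.
    rss = float(np.sum((y - yhat) ** 2))
    return len(y) * np.log(rss / len(y)) + 2 * k

lin = np.polyval(np.polyfit(x, y, 1), x)   # nested model: b2 constrained to 0
quad = np.polyval(np.polyfit(x, y, 2), x)  # full quadratic model

# The quadratic fit always has residual sum of squares at least as small,
# but on linear data the AIC penalty usually favors the linear model.
print(aic(y, lin, 3), aic(y, quad, 4))
```

Because the quadratic model nests the linear one, its in-sample fit can never be worse; criteria such as AIC exist precisely to decide whether the improvement justifies the extra dimension.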
Nested models

As a second example, the quadratic model y = b0 + b1x + b2x², with i.i.d. Gaussian (zero-mean) residuals, has, nested within it, the linear model y = b0 + b1x, with i.i.d. Gaussian (zero-mean) residuals: we constrain the parameter b2 to equal 0.
Dimension of a model

Suppose that we have a statistical model (S, P) with P = {F_θ : θ ∈ Θ}. In notation, we write that Θ ⊆ ℝ^k, where k is a positive integer (ℝ denotes the real numbers; other sets can be used, in principle). Here, k is called the dimension of the model. The model is said to be parametric if Θ has finite dimension.

As an example, if we assume that data arise from a univariate Gaussian distribution, then we are assuming that

  P = { F_{μ,σ}(x) ≡ (1/(√(2π) σ)) exp(−(x − μ)² / (2σ²)) : μ ∈ ℝ, σ > 0 }.

In this example, the dimension, k, equals 2.

A statistical model is nonparametric if the parameter set Θ is infinite dimensional. A statistical model is semiparametric if it has both finite-dimensional and infinite-dimensional parameters. Formally, if k is the dimension of Θ and n is the number of samples, semiparametric and nonparametric models have k → ∞ as n → ∞. If k/n → 0 as n → ∞, then the model is semiparametric; otherwise, the model is nonparametric.

Parametric models are by far the most commonly used statistical models. Regarding semiparametric and nonparametric models, Sir David Cox has said, "These typically involve fewer assumptions of structure and distributional form but usually contain strong assumptions about independencies".
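The two-parameter Gaussian family above can be illustrated numerically. This is a minimal sketch: the simulated sample (assumed drawn from a Gaussian with μ = 5 and σ = 2, values invented for illustration) is fitted by maximum likelihood, which for this family has a closed form.

```python
import math
import random

random.seed(1)
# Assumed for this sketch: 1000 draws from a Gaussian with mu=5, sigma=2.
sample = [random.gauss(5.0, 2.0) for _ in range(1000)]

# For the Gaussian family, the maximum-likelihood estimate of theta =
# (mu, sigma) is the sample mean and the uncorrected sample standard
# deviation; theta has dimension k = 2, matching the text above.
n = len(sample)
mu_hat = sum(sample) / n
sigma_hat = math.sqrt(sum((v - mu_hat) ** 2 for v in sample) / n)

print(mu_hat, sigma_hat)
```

With a sample this large, both estimates should land close to the assumed parameter values, reflecting that the model is fully specified by a point in a 2-dimensional parameter set.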
General remarks

A statistical model is a special class of mathematical model. What distinguishes a statistical model from other mathematical models is that a statistical model is non-deterministic. Thus, in a statistical model specified via mathematical equations, some of the variables do not have specific values, but instead have probability distributions; i.e. some of the variables are stochastic. Coin tossing is, in principle, a deterministic process; yet it is commonly modeled as stochastic (via a Bernoulli process).

There are three purposes for a statistical model, according to Konishi & Kitagawa: predictions; extraction of information; description of stochastic structures.

Choosing an appropriate statistical model to represent a given data-generating process is sometimes extremely difficult, and may require knowledge of both the process and relevant statistical analyses. Relatedly, the statistician Sir David Cox has said, "How [the] translation from subject-matter problem to statistical model is done is often the most critical part of an analysis".
Formal definition

In mathematical terms, a statistical model is a pair (S, P), where S is the set of possible observations, i.e. the sample space, and P is a set of probability distributions on S.

The set P is almost always parameterized: P = {F_θ : θ ∈ Θ}. The set Θ defines the parameters of the model. A parameterization is generally required to have distinct parameter values give rise to distinct distributions, i.e. F_{θ1} = F_{θ2} ⇒ θ1 = θ2 must hold (in other words, the mapping must be injective). A parameterization that meets the requirement is said to be identifiable.

An example

Suppose that we have a population of children, with the ages of the children distributed uniformly in the population. The height of a child will be stochastically related to the age: e.g. when we know that a child is of age 7, this influences the chance of the child being 1.5 meters tall. We could formalize that relationship in a linear regression model, like this: height_i = b0 + b1·age_i + ε_i, where b0 is the intercept, b1 is a parameter that age is multiplied by to obtain a prediction of height, ε_i is the error term, and i identifies the child. This implies that height is predicted by age, with some error.

An admissible model must be consistent with all the data points. Thus, a straight line (height_i = b0 + b1·age_i) cannot be admissible for a model of the data, unless it exactly fits all the data points, i.e. all the data points lie perfectly on the line. The error term, ε_i, must be included in the equation, so that the model is consistent with all the data points.

To do statistical inference, we would first need to assume some probability distributions for the ε_i. For instance, we might assume that the ε_i distributions are i.i.d. Gaussian, with zero mean. In this instance, the model would have 3 parameters: b0, b1, and the variance of the Gaussian distribution.

We can formally specify the model in the form (S, P) as follows. The sample space, S, of our model comprises the set of all possible pairs (age, height). Each possible value of θ = (b0, b1, σ²) determines a distribution on S; denote that distribution by F_θ. If Θ is the set of all possible values of θ, then P = {F_θ : θ ∈ Θ}. (The parameterization is identifiable, and this is easy to check.)

In this example, the model is determined by (1) specifying S and (2) making some assumptions relevant to P. There are two assumptions: that height can be approximated by a linear function of age, and that errors in the approximation are distributed as i.i.d. Gaussian. The assumptions are sufficient to specify P, as they are required to do.
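The regression model in this example can be sketched numerically. All concrete numbers below (the intercept, slope, noise level, and age range) are invented for illustration; the sketch simulates the assumed data-generating process and recovers θ = (b0, b1) by ordinary least squares.

```python
import random

random.seed(2)
# Hypothetical population: ages uniform on [2, 12] years; heights generated
# as height = b0 + b1*age + eps with i.i.d. zero-mean Gaussian errors.
# The values 75 cm and 6 cm/year are assumptions of this sketch.
b0_true, b1_true = 75.0, 6.0
ages = [random.uniform(2, 12) for _ in range(500)]
heights = [b0_true + b1_true * a + random.gauss(0.0, 3.0) for a in ages]

# Ordinary least squares for a single predictor, in closed form.
n = len(ages)
mean_a = sum(ages) / n
mean_h = sum(heights) / n
b1_hat = (sum((a - mean_a) * (h - mean_h) for a, h in zip(ages, heights))
          / sum((a - mean_a) ** 2 for a in ages))
b0_hat = mean_h - b1_hat * mean_a

print(b0_hat, b1_hat)
```

Note how the error term is essential: without ε_i, a single child whose height is off the fitted line would make the model inconsistent with the data.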
A statistical model is a mathematical model that embodies a set of statistical assumptions concerning the generation of sample data (and similar data from a larger population). A statistical model represents, often in considerably idealized form, the data-generating process.

Introduction

Informally, a statistical model can be thought of as a statistical assumption (or set of statistical assumptions) with a certain property: that the assumption allows us to calculate the probability of any event. As an example, consider a pair of ordinary six-sided dice. We will study two different statistical assumptions about the dice.

The first statistical assumption is this: for each of the dice, the probability of each face (1, 2, 3, 4, 5, and 6) coming up is 1/6. From that assumption, we can calculate the probability of both dice coming up 5: 1/6 × 1/6 = 1/36. More generally, we can calculate the probability of any event.

The alternative statistical assumption is this: for each of the dice, the probability of the face 5 coming up is 1/8 (because the dice are weighted). From that assumption, we can calculate the probability of both dice coming up 5: 1/8 × 1/8 = 1/64. We cannot, however, calculate the probability of any other nontrivial event, as the probabilities of the other faces are unknown.

The first statistical assumption constitutes a statistical model: because with the assumption alone, we can calculate the probability of any event; the alternative statistical assumption does not.
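The contrast between the two assumptions can be made concrete in a few lines. This is a minimal sketch (the helper name `prob_event` is invented for illustration): under the first assumption the full distribution over outcomes is known, so any event's probability is computable; under the second, only events built from the face 5 are determined.

```python
from fractions import Fraction
from itertools import product

# First assumption: each face of each die has probability 1/6.
# This is a full statistical model: it determines every event's probability.
uniform_die = {face: Fraction(1, 6) for face in range(1, 7)}

def prob_event(die, event):
    """Probability that a pair of i.i.d. dice lands in `event`, a set of (a, b) pairs."""
    return sum(die[a] * die[b]
               for a, b in product(die, repeat=2) if (a, b) in event)

assert prob_event(uniform_die, {(5, 5)}) == Fraction(1, 36)

# Second assumption: only P(face 5) = 1/8 is specified.  P(both dice come
# up 5) = 1/8 * 1/8 = 1/64 is still computable, but the probability of,
# say, the pair (1, 2) is not determined, so this is not a statistical model.
p_five = Fraction(1, 8)
print(p_five * p_five)  # 1/64
```

The point of the sketch is the asymmetry: the first dictionary specifies a complete distribution, while the second assumption pins down only a single marginal probability.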