
Estimator


Bias is a property of the estimator, not of the estimate. Often, people refer to a "biased estimate" or an "unbiased estimate", but they really are talking about an "estimate from a biased estimator" or an "estimate from an unbiased estimator". Also, people often confuse the "error" of a single estimate with the "bias" of an estimator. That the error for one estimate is large does not mean the estimator is biased. In fact, even if all estimates have astronomical absolute values for their errors, if the expected value of the error is zero, the estimator is unbiased. Also, an estimator's being biased does not preclude the error of an estimate from being zero in a particular instance. The ideal situation is to have an unbiased estimator with low variance, and also to limit the number of samples where the error is extreme (that is, to have few outliers). Yet unbiasedness is not essential. Often, if just a little bias is permitted, an estimator can be found with lower mean squared error and/or fewer outlier sample estimates.
Under this analogy, a high MSE means the average distance of the arrows from the bull's-eye is high, and a low MSE means the average distance from the bull's-eye is low. The arrows may or may not be clustered. For example, even if all arrows hit the same point yet grossly miss the target, the MSE is still relatively large. However, if the MSE is relatively low, then the arrows are likely more highly clustered (rather than highly dispersed) around the target.
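To make the arrow analogy concrete, here is a minimal Monte Carlo sketch (Python with NumPy; the true mean, spread, sample size and seed are arbitrary illustrative choices, not from the article). Each simulated sample plays the role of one arrow, and the MSE is the average squared distance of the arrows from the bull's-eye.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 5.0           # the "bull's-eye": true population mean (assumed known here)
n, shots = 30, 10_000

# Each row is one sample; each sample mean is one "arrow" (an estimate).
samples = rng.normal(loc=theta, scale=2.0, size=(shots, n))
arrows = samples.mean(axis=1)

mse = np.mean((arrows - theta) ** 2)   # average squared distance from the bull's-eye
print(f"empirical MSE of the sample mean: {mse:.4f}")
print(f"theoretical MSE (sigma^2 / n):    {2.0**2 / n:.4f}")
```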
If the parameter is the bull's-eye of a target and the arrows are estimates, then a relatively high absolute value for the bias means the average position of the arrows is off-target, and a relatively low absolute bias means the average position of the arrows is on target. They may be dispersed or clustered. The relationship between bias and variance is analogous to the relationship between accuracy and precision.
The variance of the good estimator (good efficiency) would be smaller than the variance of the bad estimator (bad efficiency). The squared bias of the good estimator would be smaller than that of the bad estimator. The MSE of the good estimator would be smaller than the MSE of the bad estimator. Suppose there are two estimators, θ₁ and θ₂, where θ₁ is the good estimator and θ₂ is the bad estimator.
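As an illustration of such a comparison, the sketch below (Python with NumPy; all names and numbers are hypothetical choices) contrasts two unbiased estimators of the centre of a normal distribution: the sample mean, which is the more efficient of the two in this setting, and the sample median, which has a larger variance and hence a larger MSE.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 0.0, 50, 20_000

x = rng.normal(loc=theta, scale=1.0, size=(reps, n))
est_good = x.mean(axis=1)          # theta_1: sample mean
est_bad = np.median(x, axis=1)     # theta_2: sample median (less efficient here)

for name, est in [("sample mean", est_good), ("sample median", est_bad)]:
    bias = est.mean() - theta
    var = est.var()
    mse = np.mean((est - theta) ** 2)
    print(f"{name:13s}  bias^2={bias**2:.2e}  var={var:.5f}  mse={mse:.5f}")
```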
E[(θ̂ − θ)²] = (E[θ̂] − θ)² + Var(θ̂). The first term is the mean squared error; the second term is the square of the estimator bias; and the third term is the variance of the estimator. The quality of an estimator can therefore be judged by comparing the variances, the squared biases, or the
The variance of θ̂ is used to indicate how far, on average, the collection of estimates is from the expected value of the estimates. (Note the difference between MSE and variance.) If the parameter is the bull's-eye of a target, and the arrows are estimates, then a relatively high variance means the arrows are dispersed, and a relatively low variance means the arrows are clustered.
A desired property for estimators is the unbiased trait, where an estimator is shown to have no systematic tendency to produce estimates larger or smaller than the true value of the parameter. Additionally, unbiased estimators with smaller variances are preferred over those with larger variances, because they will be closer, on average, to the "true" value of the parameter.
The mean squared error is used to indicate how far, on average, the collection of estimates is from the single parameter being estimated. Consider the following analogy: suppose the parameter is the bull's-eye of a target, the estimator is the process of shooting arrows at the target, and the individual arrows are the individual estimates (samples).
Estimation theory is concerned with the properties of estimators; that is, with defining properties that can be used to compare different estimators (different rules for creating estimates) for the same quantity, based on the same data. Such properties can be used to determine the best rules to use under given circumstances.
The efficiency of an estimator is used to estimate the quantity of interest in a "minimum error" manner. In reality, there is not an explicit best estimator; there can only be a better estimator. Whether an estimator's efficiency is good or bad is judged relative to the choice of a particular loss function.
Note that convergence will not necessarily have occurred for any finite n; therefore this value is only an approximation to the true variance of the estimator, while in the limit the asymptotic variance (V/n) is simply zero. To be more specific, the distribution of the estimator t_n converges weakly to a Dirac delta function centered at θ.
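A rough way to see asymptotic normality numerically is to simulate √n·(t_n − θ) for increasing n and watch its spread settle at the limiting standard deviation; the sketch below (Python with NumPy; exponential data chosen only for illustration) does this for the sample mean, where the limiting standard deviation equals the data standard deviation θ.

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 1.5              # mean (and standard deviation) of the exponential data
reps = 50_000

for n in (5, 50, 500):
    xbar = rng.exponential(scale=theta, size=(reps, n)).mean(axis=1)
    z = np.sqrt(n) * (xbar - theta)      # should approach N(0, theta^2)
    print(f"n={n:4d}  sd of sqrt(n)*(xbar - theta) = {z.std():.3f}  (limit: {theta:.3f})")
```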
Besides using a formula to identify the efficiency of an estimator, it can also be identified through a graph. If an estimator is efficient, the frequency vs. value graph shows a curve with high frequency at the center and low frequency on the two sides.
Even if the variance is low, the cluster of arrows may still be far off-target; and even if the variance is high, the diffuse collection of arrows may still be unbiased. Finally, even if all arrows grossly miss the target, if they nevertheless all hit the same point, the variance is zero.
With a median-unbiased estimator, the median of the distribution of estimates agrees with the true value; thus, in the long run half the estimates will be too low and half too high. While this applies immediately only to scalar-valued estimators, it can be extended to any measure of central tendency of a distribution: see median-unbiased estimators.
However, not all estimators are asymptotically normal; the simplest examples are found when the true value of a parameter lies on the boundary of the allowable parameter region.
Here T_n and T_θ denote the empirical and theoretical distribution functions, respectively. An easy example to see if something is Fisher consistent is to check the consistency of the mean and the variance: to check the mean, use μ̂ = X̄; to check the variance, confirm that σ̂² = SSD/n.
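The plug-in idea behind Fisher consistency can be sketched as follows (Python with NumPy; a hypothetical illustration, not from the article): the empirical distribution puts mass 1/n on each observation, and applying the same functional h to it reproduces the sample mean and the SSD/n variance.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=2.0, scale=1.5, size=1_000)
w = np.full(x.size, 1.0 / x.size)            # empirical distribution: mass 1/n per point

mu_hat = np.sum(w * x)                       # h(F_n) for h(F) = E_F[X]; equals the sample mean
ssd_over_n = np.sum(w * (x - mu_hat) ** 2)   # plug-in variance, SSD / n
print(mu_hat, ssd_over_n)
```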
The definition places virtually no restrictions on which functions of the data can be called the "estimators". The attractiveness of different estimators can be judged by looking at their properties, such as unbiasedness, mean square error, consistency, and asymptotic distribution.
The bias-variance tradeoff is used in reasoning about model complexity, over-fitting, and under-fitting. It is mainly used in the field of supervised learning and predictive modeling to diagnose the performance of algorithms.
An estimator can be considered Fisher consistent as long as the estimator is the same functional of the empirical distribution function as of the true distribution function, following the formula θ̂ = h(T_n), θ = h(T_θ).
However, in robust statistics, statistical theory goes on to consider the balance between having good properties, if tightly defined assumptions hold, and having worse properties that hold under wider conditions.

In these problems the estimates are functions that can be thought of as point estimates in an infinite dimensional space, and there are corresponding interval estimation problems.

A common way of phrasing it is "the estimator is the method selected to obtain an estimate of an unknown parameter". The parameter being estimated is sometimes called the estimand.
Similarly, when looking at quantities in the interest of variance for the model distribution, there is also an unbiased estimator that should satisfy the two equations below.
1. S_n² = (1/(n − 1)) Σ (X_i − X̄_n)², where the sum runs over i = 1, …, n
2. E[S_n²] = σ²
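As a quick numerical check of both sets of equations, the sketch below (Python with NumPy; the distribution, sample size and seed are arbitrary) estimates E[X̄_n] and E[S_n²] by simulation, and also shows the downward bias that appears if one divides by n instead of n − 1.

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma2 = 3.0, 4.0
n, reps = 10, 100_000

x = rng.normal(loc=mu, scale=np.sqrt(sigma2), size=(reps, n))
xbar = x.mean(axis=1)
s2_unbiased = x.var(axis=1, ddof=1)   # divide by n - 1
s2_biased = x.var(axis=1, ddof=0)     # divide by n

print("E[xbar]        ~", xbar.mean(), "        (mu =", mu, ")")
print("E[S^2], ddof=1 ~", s2_unbiased.mean(), " (sigma^2 =", sigma2, ")")
print("E[S^2], ddof=0 ~", s2_biased.mean(), " (biased low by factor (n-1)/n)")
```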
This is in contrast to an interval estimator, where the result would be a range of plausible values. "Single value" does not necessarily mean "single number", but includes vector-valued or function-valued estimators.
When looking at quantities in the interest of expectation for the model distribution, there is an unbiased estimator which should satisfy the two equations below.
1. X̄_n = (X₁ + X₂ + ⋯ + X_n) / n
2. E[X̄_n] = μ
There are two kinds of estimators: biased estimators and unbiased estimators. Whether an estimator is biased or not can be identified by the relationship between E(θ̂) − θ and zero: if E(θ̂) − θ ≠ 0, the estimator θ̂ is biased; if E(θ̂) − θ = 0, it is unbiased.
For a given sample x, the sampling deviation of the estimator θ̂ is defined as d(x) = θ̂(x) − E(θ̂(X)) = θ̂(x) − E(θ̂), where E(θ̂(X)) is the expected value of the estimator.
To put it simply, the good estimator has a narrow curve, while the bad estimator has a wide curve. Plotting these two curves on one graph with a shared y-axis makes the difference more obvious.
In other words, increasing the sample size increases the probability of the estimator being close to the population parameter.
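A minimal way to see this is to simulate the probability that the estimator lands within a fixed ε of the parameter for growing n; the sketch below (Python with NumPy; Bernoulli data and the tolerance ε = 0.05 are arbitrary choices) illustrates the idea for the sample mean.

```python
import numpy as np

rng = np.random.default_rng(5)
theta, eps, reps = 0.7, 0.05, 5_000

for n in (10, 100, 1_000, 10_000):
    xbar = rng.binomial(1, theta, size=(reps, n)).mean(axis=1)
    coverage = np.mean(np.abs(xbar - theta) < eps)
    print(f"n={n:6d}  P(|xbar - theta| < {eps}) ~ {coverage:.3f}")
```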
When the word "estimator" is used without a qualifier, it usually refers to point estimation. The estimate in this case is a single point in the parameter space.
It is often convenient to express the theory using the algebra of random variables: thus if X is used to denote a random variable corresponding to the observed data, the estimator (itself treated as a random variable) is symbolised as a function of that random variable, θ̂(X). The estimate for a particular observed data value x (that is, for X = x) is then θ̂(x), which is a fixed value.
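In code, the same distinction can be expressed very simply; the sketch below (plain Python, hypothetical names) treats the estimator as a rule, that is, a function of the data, and the estimate as the fixed value that rule returns for one observed sample.

```python
def theta_hat(x):
    """Estimator: the sample mean as a rule that can be applied to any data set x."""
    return sum(x) / len(x)

observed = [2.1, 1.9, 2.4, 2.0]   # one particular realisation X = x
estimate = theta_hat(observed)     # the estimate theta_hat(x), a fixed number
print(estimate)
```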
i.e. mean squared error = variance + square of bias. In particular, for an unbiased estimator, the variance equals the mean squared error.
4242: 4445: 2767: 2758: 1665: 1529: 551: 5414: 5391: 5198: 3545:{\displaystyle \operatorname {MSE} ({\widehat {\theta }})=\operatorname {Var} ({\widehat {\theta }})+(B({\widehat {\theta }}))^{2},} 5406: 5383: 5349: 3278: 1035: 5297: 4222: 4026: 1356:. It is the distance between the average of the collection of estimates, and the single parameter being estimated. The bias of 5047: 5109: 5081: 5023:
Among unbiased estimators, there often exists one with the lowest variance, called the minimum variance unbiased estimator (MVUE).
5015: 2211: 589: 5307: 5113: 3087: 4686:{\displaystyle \operatorname {E} =(\operatorname {E} ({\widehat {\theta }})-\theta )^{2}+\operatorname {Var} (\theta )\ } 248: 5302: 5134: 3663: 348: 4077: 2552: 5323: 5129: 5118: 3721: 5051: 4033: 716:
The mean squared error of θ̂ is defined as the expected value (probability-weighted average, over all samples) of the squared errors; that is, MSE(θ̂) = E[(θ̂(X) − θ)²].
4910:{\displaystyle |\operatorname {E} (\theta _{1})-\theta |<\left|\operatorname {E} (\theta _{2})-\theta \right|} 1821:{\displaystyle \operatorname {E} ({\widehat {\theta }})-\theta =\operatorname {E} ({\widehat {\theta }}-\theta )} 5043: 2873: 2005: 5097: 4998:
If an estimator is not efficient, then in the frequency vs. value graph there will be a relatively more gentle (flatter) curve.
2757:
The unbiased estimator with the smallest variance is known as the minimum-variance unbiased estimator (MVUE).
2002:. For example, if a genetic theory states there is a type of leaf (starchy green) that occurs with probability 1893: 5039: 5032: 446: 362: 5028: 4432: 4145: 3856: 3349:
If we divided by n instead of n − 1, we would obtain an estimator with a negative bias, which would thus produce estimates that are too small for σ². It should also be mentioned that even though S_n² is unbiased for σ², the reverse is not true.
210: 5343: 5164: 3875: 3670: 3615: 3562: 2820: 2273: 1956: 1838: 1829: 1718: 1633: 1477: 1408: 1359: 1259: 1105: 857: 690: 557: 484: 321: 163: 5315: 4502: 4379: 4195: 2459: 129: 125: 107: 2064: 5103: 4352: 4233: 4139: 3715: 2511: 1947: 1253: 206: 133: 5239:
Dekking, Frederik Michel; Kraaikamp, Cornelis; Lopuhaä, Hendrik Paul; Meester, Ludolf Erwin (2005). A Modern Introduction to Probability and Statistics. Springer Texts in Statistics. ISBN 978-1-85233-896-1.
3870:, namely the true value divided by the asymptotic value of the estimator. This occurs frequently in 5139: 5087: 4159: 4153: 3840:{\displaystyle \lim _{n\to \infty }\Pr \left\{\left|t_{n}-\theta \right|<\varepsilon \right\}=1} 2401: 675:, depends not only on the estimator (the estimation formula or procedure), but also on the sample. 43: 5267: 5031:
In some cases an unbiased efficient estimator exists, which, in addition to having the lowest variance among unbiased estimators, satisfies the Cramér–Rao bound, an absolute lower bound on the variance for statistics of a variable.
2612: 2143: 5444: 4727: 4700: 4496: 4419: 4415: 4001: 3556: 3411: 3352: 2958: 2935:
θ₂ is the only unbiased estimator in that figure. If the two distributions overlapped and were both centered around θ, however, the narrower distribution θ₁ would actually be the preferred estimator.
2911: 2728: 2681: 684: 244: 237: 74: 66: 4388: 3379: 5410: 5387: 5375: 5361: 5327: 5246: 5194: 5186: 5144: 5069: 2646: 214: 202: 111: 87: 81: 4358: 3644: 3591: 2938: 2853: 2708: 2302: 2191: 1985: 1871: 1457: 1388: 654: 301: 273: 139: 5353: 5240: 5063: 1943: 252: 121: 3974: 3267:{\displaystyle 1.\quad S_{n}^{2}={\frac {1}{n-1}}\sum _{i=1}^{n}(X_{i}-{\bar {X_{n}}})^{2}} 2116: 5191:
Mosteller, F.; Tukey, J. W. (1987). "Data Analysis, including Statistics". The Collected Works of John W. Tukey: Philosophy and Principles of Data Analysis 1965–1986. Vol. 4. CRC Press. pp. 601–720. ISBN 0-534-05101-4.
5124: 4442:, and it is reflected by two naturally desirable properties of estimators: to be unbiased 3871: 511: 356: 233: 218: 190: 70: 62: 55: 4751:
Here θ₁ is the good estimator and θ₂ is the bad estimator. The above relationship can be expressed by the following formulas: Var(θ₁) < Var(θ₂), |E(θ₁) − θ| < |E(θ₂) − θ|, and MSE(θ₁) < MSE(θ₂).
3866:
An estimator that converges to a multiple of a parameter can be made into a consistent estimator by multiplying the estimator by a scale factor, namely the true value divided by the asymptotic value of the estimator. This occurs frequently in estimation of scale parameters by measures of statistical dispersion.
2322: 420: 1349:{\displaystyle B({\widehat {\theta }})=\operatorname {E} ({\widehat {\theta }})-\theta } 5149: 4560: 4418:
More generally, maximum likelihood estimators are asymptotically normal under fairly weak regularity conditions; see the asymptotics section of the maximum likelihood article.
2096: 1888: 1865: 1506: 1437: 1239:. It is used to indicate how far, on average, the collection of estimates are from the 1083: 833: 533: 400: 99: 17: 4977:{\displaystyle \operatorname {MSE} (\theta _{1})<\operatorname {MSE} (\theta _{2})} 4811:{\displaystyle \operatorname {Var} (\theta _{1})<\operatorname {Var} (\theta _{2})} 5438: 5159: 5092: 4439: 226: 222: 2674: 5204: 3867: 1937:
An alternative to the version of "unbiased" above is "median-unbiased", where the median of the distribution of estimates agrees with the true value.
1232:{\displaystyle \operatorname {Var} ({\widehat {\theta }})=\operatorname {E} )^{2}]} 291: 198: 3076:{\displaystyle 1.\quad {\overline {X}}_{n}={\frac {X_{1}+X_{2}+\cdots +X_{n}}{n}}} 4148:
An asymptotically normal estimator is a consistent estimator whose distribution around the true parameter θ approaches a normal distribution, with standard deviation shrinking in proportion to 1/√n as the sample size n grows.
4383: 3725: 3609: 256: 106:(that is, a function of the data) that is used to infer the value of an unknown 51: 5357: 5154: 3851:
The consistency defined above may be called weak consistency. The sequence is strongly consistent if it converges almost surely to the true value.
2764:
To check whether an estimator is unbiased, it is easy to follow along the equation E(θ̂) − θ = 0.
811:{\displaystyle \operatorname {MSE} ({\widehat {\theta }})=\operatorname {E} .} 157: 31: 5035:, which is an absolute lower bound on variance for statistics of a variable. 213:, etc. The construction and comparison of estimators are the subjects of the 4991: 3747: 2188:
For example, suppose a genetic theory states that a type of leaf (starchy green) occurs with probability p₁ = ¼(θ + 2), with 0 < θ < 1. Then, for n leaves, the number of starchy green leaves N₁ can be modeled with a Bin(n, p₁) distribution. This count can be used to express the following estimator for θ: θ̂ = 4/n · N₁ − 2. One can show that θ̂ is an unbiased estimator for θ, since E[θ̂] = 4/n · E[N₁] − 2 = 4/n · np₁ − 2 = 4p₁ − 2 = (θ + 2) − 2 = θ.
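A simulation sketch of this example (Python with NumPy; the particular θ, n and seed are arbitrary) suggests the unbiasedness claim numerically: the average of θ̂ over many simulated samples sits close to the true θ.

```python
import numpy as np

rng = np.random.default_rng(7)
theta = 0.3                     # true parameter, 0 < theta < 1
p1 = (theta + 2) / 4            # probability of a starchy green leaf
n, reps = 200, 100_000

N1 = rng.binomial(n, p1, size=reps)   # starchy green counts in each simulated sample
theta_hat = 4 / n * N1 - 2            # the estimator 4/n * N1 - 2
print(f"E[theta_hat] ~ {theta_hat.mean():.4f}  (theta = {theta})")
```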
103: 5001: 3612:
The standard deviation of an estimator of θ (the square root of its variance), or an estimate of that standard deviation, is called the standard error of θ̂.
27:
Rule for calculating an estimate of a given quantity based on observed data
1099: 481:, which is a fixed value. Often an abbreviated notation is used in which 116: 47: 39: 290:
Suppose a fixed parameter θ needs to be estimated. Then an "estimator" is a function that maps the sample space to a set of sample estimates.
4563:). A function relates the mean squared error with the estimator bias. 2908:
When an estimator T of a parameter of interest θ satisfies the previous equation, so that E[T] = θ, the estimator is unbiased. Looking at the figure to the right, θ₂ is the only unbiased estimator of the two shown.
1623:{\displaystyle \operatorname {E} ({\widehat {\theta }})-\theta \neq 0} 3720:
A consistent sequence of estimators is a sequence of estimators that converge in probability to the quantity being estimated as the index (usually the sample size) grows without bound.
1938: 4280: 4204: 3961:{\displaystyle {\widehat {\theta }}=h(T_{n}),\theta =h(T_{\theta })} 50:) and its result (the estimate) are distinguished. For example, the 4311:{\displaystyle {\sqrt {n}}(t_{n}-\theta ){\xrightarrow {D}}N(0,V),} 1131:
The variance of θ̂ is the expected value of the squared sampling deviations; that is, Var(θ̂) = E[(θ̂ − E[θ̂])²].
5014: 5000: 4990: 4488:{\displaystyle \operatorname {E} ({\widehat {\theta }})-\theta =0} 2810:{\displaystyle \operatorname {E} ({\widehat {\theta }})-\theta =0} 2673: 1708:{\displaystyle \operatorname {E} ({\widehat {\theta }})-\theta =0} 5345:
Kosorok, Michael (2008). Introduction to Empirical Processes and Semiparametric Inference. Springer Series in Statistics. Springer. doi:10.1007/978-0-387-74978-5. ISBN 978-0-387-74978-5.
5024: 1566:{\displaystyle \operatorname {E} ({\widehat {\theta }})-\theta } 2140:, or the number of starchy green leaves, can be modeled with a 46:: thus the rule (the estimator), the quantity of interest (the 1090:, depends not only on the estimator, but also on the sample. 3331:{\displaystyle 2.\quad \operatorname {E} \left=\sigma ^{2}} 1075:{\displaystyle \operatorname {E} ({\widehat {\theta }}(X))} 3724:
Estimators in such a sequence get closer in probability to the quantity being estimated as the index (usually the sample size) grows without bound.
240:, where the estimates are subsets of the parameter space. 225:, and its performance may be evaluated through the use of 189:. Being a function of the data, the estimator is itself a 3444:
The mean squared error, variance, and bias are related: MSE(θ̂) = Var(θ̂) + (B(θ̂))².
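The identity can be checked numerically; the sketch below (Python with NumPy; the shrunken estimator 0.9·X̄ is just a convenient, deliberately biased example) compares Var + Bias² with the directly computed MSE.

```python
import numpy as np

rng = np.random.default_rng(6)
theta, n, reps = 10.0, 20, 100_000

x = rng.normal(loc=theta, scale=3.0, size=(reps, n))
est = 0.9 * x.mean(axis=1)          # a deliberately biased (shrunken) estimator

bias = est.mean() - theta
var = est.var()                      # ddof=0, so the identity below holds exactly
mse = np.mean((est - theta) ** 2)
print(f"var + bias^2 = {var + bias**2:.4f}   mse = {mse:.4f}")
```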
347:. It is often convenient to express the theory using the 1748:
The bias is also the expected value of the error, since E(θ̂) − θ = E(θ̂ − θ).
156:
If the parameter is denoted θ, then the estimator is traditionally written by adding a circumflex over the symbol: θ̂.
522:
The following definitions and attributes are relevant.
247:
The problem of density estimation arises in two applications: firstly, in estimating the probability density functions of random variables, and secondly, in estimating the spectral density function of a time series.
73:
The point estimators yield single-valued results. This is in contrast to an interval estimator.
5038:
Concerning such "best unbiased estimators", see also the Cramér–Rao bound, the Gauss–Markov theorem, the Lehmann–Scheffé theorem, and the Rao–Blackwell theorem.
4926: 4827: 4760: 4730: 4703: 4572: 4505: 4448: 4391: 4361: 4245: 4198: 4162: 4080: 4036: 4004: 3977: 3895: 3769: 3673: 3647: 3618: 3594: 3565: 3450: 3414: 3382: 3355: 3281: 3157: 3090: 2997: 2961: 2941: 2914: 2876: 2856: 2823: 2770: 2731: 2711: 2684: 2678:
Difference between estimators: an unbiased estimator θ₂ is centered around θ, whereas a biased estimator θ₁ is not.
2649: 2615: 2555: 2514: 2462: 2404: 2325: 2305: 2276: 2263:{\displaystyle {\widehat {\theta }}=4/n\cdot N_{1}-2} 2214: 2194: 2146: 2119: 2099: 2067: 2008: 1988: 1959: 1896: 1874: 1841: 1754: 1721: 1668: 1636: 1583: 1532: 1509: 1480: 1460: 1440: 1411: 1391: 1362: 1291: 1262: 1137: 1108: 1038: 892: 860: 836: 725: 693: 657: 641:{\displaystyle e(x)={\widehat {\theta }}(x)-\theta ,} 592: 560: 536: 487: 449: 423: 403: 365: 324: 304: 276: 166: 142: 3135:{\displaystyle 2.\quad \operatorname {E} \left=\mu } 2982:
In that case, distribution θ₁ would actually be the preferred unbiased estimator.
397:. The estimate for a particular observed data value 5242:
A Modern Introduction to Probability and Statistics
4156:with standard deviation shrinking in proportion to 251:of random variables and secondly in estimating the 4976: 4909: 4810: 4743: 4716: 4685: 4551: 4487: 4414:as an estimator of the true mean. More generally, 4406: 4367: 4336:of the estimator. However, some authors also call 4310: 4213: 4180: 4122: 4066: 4017: 3990: 3960: 3839: 3688: 3653: 3633: 3600: 3580: 3544: 3427: 3400: 3368: 3330: 3266: 3134: 3075: 2974: 2947: 2927: 2900: 2862: 2838: 2809: 2744: 2717: 2697: 2658: 2636: 2602: 2542: 2501: 2449: 2391: 2311: 2291: 2262: 2200: 2180: 2132: 2105: 2085: 2053: 1994: 1974: 1926: 1880: 1856: 1820: 1736: 1707: 1651: 1622: 1565: 1515: 1495: 1466: 1446: 1426: 1397: 1377: 1348: 1277: 1231: 1123: 1074: 1021: 875: 842: 810: 708: 663: 640: 575: 542: 502: 473: 435: 409: 389: 339: 310: 282: 181: 148: 3786: 3771: 2870:solving the previous equation so it is shown as 3376:. It should also be mentioned that even though 236:. There also exists another type of estimator: 4123:{\displaystyle {\widehat {\sigma }}^{2}=SSD/n} 2603:{\displaystyle =4\cdot 1/4\cdot (\theta +2)-2} 5193:. Vol. 4. CRC Press. pp. 601–720 . 1982:can always have functional relationship with 671:is the parameter being estimated. The error, 8: 5234: 5232: 5012:-axis, the difference becomes more obvious. 4067:{\displaystyle {\widehat {\mu }}={\bar {X}}} 5019:Comparison between good and bad estimator. 3345: − 1 because if we divided with 2901:{\displaystyle \operatorname {E} =\theta } 2054:{\displaystyle p_{1}=1/4\cdot (\theta +2)} 1086:of the estimator. The sampling deviation, 120:. It can be either finite-dimensional (in 4965: 4940: 4925: 4887: 4861: 4846: 4828: 4826: 4799: 4774: 4759: 4735: 4729: 4708: 4702: 4656: 4632: 4631: 4607: 4586: 4585: 4571: 4540: 4519: 4518: 4504: 4459: 4458: 4447: 4393: 4392: 4390: 4360: 4275: 4260: 4246: 4244: 4199: 4197: 4171: 4166: 4161: 4112: 4094: 4083: 4082: 4079: 4053: 4052: 4038: 4037: 4035: 4009: 4003: 3982: 3976: 3949: 3921: 3897: 3896: 3894: 3803: 3774: 3768: 3731:Mathematically, a sequence of estimators 3675: 3674: 3672: 3646: 3620: 3619: 3617: 3593: 3567: 3566: 3564: 3533: 3515: 3514: 3488: 3487: 3461: 3460: 3449: 3419: 3413: 3392: 3387: 3381: 3360: 3354: 3322: 3305: 3300: 3280: 3258: 3242: 3236: 3235: 3226: 3213: 3202: 3180: 3171: 3166: 3156: 3116: 3106: 3089: 3061: 3042: 3029: 3022: 3013: 3003: 2996: 2966: 2960: 2940: 2919: 2913: 2875: 2855: 2825: 2824: 2822: 2781: 2780: 2769: 2736: 2730: 2710: 2689: 2683: 2648: 2614: 2568: 2554: 2528: 2513: 2487: 2469: 2461: 2432: 2411: 2403: 2374: 2359: 2333: 2332: 2324: 2304: 2278: 2277: 2275: 2248: 2233: 2216: 2215: 2213: 2193: 2169: 2145: 2124: 2118: 2098: 2066: 2025: 2013: 2007: 1987: 1961: 1960: 1958: 1927:{\displaystyle B({\widehat {\theta }})=0} 1904: 1903: 1895: 1873: 1843: 1842: 1840: 1798: 1797: 1765: 1764: 1753: 1723: 1722: 1720: 1679: 1678: 1667: 1638: 1637: 1635: 1594: 1593: 1582: 1543: 1542: 1531: 1508: 1482: 1481: 1479: 1459: 1439: 1413: 1412: 1410: 1390: 1364: 1363: 1361: 1326: 1325: 1299: 1298: 1290: 1264: 1263: 1261: 1220: 1202: 1201: 1178: 1177: 1148: 1147: 1136: 1110: 1109: 1107: 1049: 1048: 1037: 1002: 1001: 969: 968: 942: 941: 909: 908: 891: 862: 861: 859: 835: 796: 766: 765: 736: 735: 724: 695: 694: 692: 656: 609: 608: 591: 562: 561: 559: 535: 489: 488: 486: 451: 450: 448: 422: 402: 367: 366: 364: 326: 325: 323: 303: 275: 168: 167: 165: 141: 5320:Probability Theory: The logic of science 5177: 4074:and to check for variance confirm that 474:{\displaystyle {\widehat {\theta }}(x)} 390:{\displaystyle {\widehat {\theta }}(X)} 5217:Kosorok 
Kosorok (2008), Section 3.1, pp. 35–39.
In some cases an unbiased efficient estimator exists.
5068:Further information: 5052:Rao–Blackwell theorem 5018: 5004: 4994: 4979: 4912: 4813: 4746: 4719: 4688: 4554: 4490: 4409: 4380:central limit theorem 4370: 4313: 4234:asymptotically normal 4216: 4183: 4146:asymptotically normal 4125: 4069: 4020: 3993: 3991:{\displaystyle T_{n}} 3963: 3842: 3705:Behavioral properties 3691: 3656: 3636: 3603: 3583: 3547: 3430: 3403: 3371: 3333: 3269: 3198: 3137: 3078: 2977: 2950: 2930: 2903: 2865: 2841: 2812: 2747: 2720: 2700: 2677: 2661: 2639: 2605: 2545: 2504: 2452: 2394: 2314: 2294: 2265: 2203: 2183: 2135: 2133:{\displaystyle N_{1}} 2108: 2088: 2056: 1997: 1977: 1929: 1883: 1859: 1823: 1739: 1710: 1654: 1625: 1568: 1518: 1498: 1469: 1454:means that for every 1449: 1429: 1400: 1380: 1351: 1280: 1234: 1126: 1077: 1024: 878: 845: 813: 711: 666: 643: 578: 545: 518:Quantified properties 505: 476: 438: 412: 392: 342: 313: 285: 184: 151: 134:non-parametric models 5266:Lauritzen, Steffen. 5104:Maximum a posteriori 5044:Gauss–Markov theorem 4924: 4825: 4758: 4728: 4701: 4570: 4503: 4446: 4389: 4359: 4353:dirac delta function 4328:In this formulation 4243: 4196: 4160: 4140:Asymptotic normality 4134:Asymptotic normality 4078: 4034: 4002: 3975: 3893: 3767: 3716:Consistent estimator 3671: 3645: 3616: 3592: 3563: 3448: 3412: 3380: 3353: 3279: 3155: 3088: 2995: 2959: 2939: 2912: 2874: 2854: 2821: 2768: 2729: 2709: 2682: 2647: 2613: 2553: 2512: 2460: 2402: 2323: 2303: 2274: 2270:. One can show that 2212: 2192: 2144: 2117: 2097: 2065: 2006: 1986: 1957: 1894: 1872: 1839: 1752: 1719: 1666: 1634: 1581: 1530: 1507: 1478: 1458: 1438: 1409: 1389: 1360: 1289: 1260: 1135: 1106: 1036: 890: 858: 834: 723: 691: 655: 590: 558: 534: 485: 447: 421: 401: 363: 355:is used to denote a 322: 302: 274: 217:. In the context of 164: 140: 5140:Shrinkage estimator 5088:Invariant estimator 5029:efficient estimator 4420:asymptotics section 4342:asymptotic variance 4334:asymptotic variance 4284: 4208: 4188:as the sample size 4154:normal distribution 3859:to the true value. 3853:strongly consistent 3397: 3310: 3176: 2705:is centered around 2392:{\displaystyle E=E} 830:For a given sample 554:" of the estimator 530:For a given sample 436:{\displaystyle X=x} 238:interval estimators 98:An "estimator" or " 67:interval estimators 5401:Shao, Jun (1998), 5021: 5006: 4996: 4974: 4907: 4808: 4741: 4714: 4683: 4549: 4497:mean squared error 4485: 4416:maximum likelihood 4404: 4365: 4332:can be called the 4308: 4211: 4178: 4120: 4064: 4015: 3988: 3958: 3882:Fisher consistency 3837: 3785: 3686: 3651: 3631: 3598: 3578: 3557:standard deviation 3542: 3425: 3398: 3383: 3366: 3328: 3296: 3264: 3162: 3132: 3073: 2972: 2955:then distribution 2945: 2925: 2898: 2860: 2836: 2807: 2754: 2742: 2715: 2695: 2656: 2634: 2600: 2540: 2499: 2447: 2389: 2309: 2289: 2260: 2198: 2178: 2130: 2103: 2083: 2051: 1992: 1972: 1924: 1878: 1866:unbiased estimator 1854: 1818: 1734: 1705: 1649: 1620: 1563: 1513: 1493: 1464: 1444: 1424: 1395: 1375: 1346: 1275: 1229: 1121: 1072: 1019: 873: 852:sampling deviation 840: 826:Sampling deviation 808: 706: 685:mean squared error 679:Mean squared error 661: 638: 573: 540: 500: 471: 433: 407: 387: 337: 308: 298:. 
An estimator of θ is usually denoted by the symbol θ̂.
2267: 2266: 2261: 2253: 2252: 2237: 2226: 2225: 2217: 2207: 2205: 2204: 2199: 2187: 2185: 2184: 2179: 2174: 2173: 2139: 2137: 2136: 2131: 2129: 2128: 2112: 2110: 2109: 2104: 2092: 2090: 2089: 2084: 2060: 2058: 2057: 2052: 2029: 2018: 2017: 2001: 1999: 1998: 1993: 1981: 1979: 1978: 1973: 1971: 1970: 1962: 1944:central tendency 1933: 1931: 1930: 1925: 1914: 1913: 1905: 1887: 1885: 1884: 1879: 1863: 1861: 1860: 1855: 1853: 1852: 1844: 1827: 1825: 1824: 1819: 1808: 1807: 1799: 1775: 1774: 1766: 1743: 1741: 1740: 1735: 1733: 1732: 1724: 1714: 1712: 1711: 1706: 1689: 1688: 1680: 1658: 1656: 1655: 1650: 1648: 1647: 1639: 1629: 1627: 1626: 1621: 1604: 1603: 1595: 1572: 1570: 1569: 1564: 1553: 1552: 1544: 1522: 1520: 1519: 1514: 1502: 1500: 1499: 1494: 1492: 1491: 1483: 1473: 1471: 1470: 1465: 1453: 1451: 1450: 1445: 1433: 1431: 1430: 1425: 1423: 1422: 1414: 1404: 1402: 1401: 1396: 1384: 1382: 1381: 1376: 1374: 1373: 1365: 1355: 1353: 1352: 1347: 1336: 1335: 1327: 1309: 1308: 1300: 1284: 1282: 1281: 1276: 1274: 1273: 1265: 1238: 1236: 1235: 1230: 1225: 1224: 1212: 1211: 1203: 1188: 1187: 1179: 1158: 1157: 1149: 1130: 1128: 1127: 1122: 1120: 1119: 1111: 1081: 1079: 1078: 1073: 1059: 1058: 1050: 1028: 1026: 1025: 1020: 1012: 1011: 1003: 979: 978: 970: 952: 951: 943: 919: 918: 910: 882: 880: 879: 874: 872: 871: 863: 849: 847: 846: 841: 817: 815: 814: 809: 801: 800: 776: 775: 767: 746: 745: 737: 715: 713: 712: 707: 705: 704: 696: 670: 668: 667: 662: 647: 645: 644: 639: 619: 618: 610: 582: 580: 579: 574: 572: 571: 563: 549: 547: 546: 541: 509: 507: 506: 501: 499: 498: 490: 480: 478: 477: 472: 461: 460: 452: 442: 440: 439: 434: 416: 414: 413: 408: 396: 394: 393: 388: 377: 376: 368: 346: 344: 343: 338: 336: 335: 327: 317: 315: 314: 309: 296:sample estimates 289: 287: 286: 281: 267:Suppose a fixed 188: 186: 185: 180: 178: 177: 169: 155: 153: 152: 147: 71:point estimators 21: 5460: 5459: 5455: 5454: 5453: 5451: 5450: 5449: 5435: 5434: 5426: 5417: 5400: 5394: 5374: 5368: 5341: 5334: 5314: 5295: 5292: 5290:Further reading 5287: 5286: 5276: 5274: 5270: 5265: 5264: 5260: 5253: 5238: 5237: 5230: 5225: 5221: 5216: 5212: 5201: 5184: 5183: 5179: 5174: 5169: 5125:Particle filter 5077: 5072: 5066: 5060: 4961: 4936: 4922: 4921: 4883: 4873: 4869: 4842: 4823: 4822: 4795: 4770: 4756: 4755: 4731: 4726: 4725: 4704: 4699: 4698: 4652: 4603: 4568: 4567: 4536: 4501: 4500: 4444: 4443: 4435: 4429: 4387: 4386: 4357: 4356: 4349: 4256: 4241: 4240: 4230: 4194: 4193: 4158: 4157: 4142: 4136: 4081: 4076: 4075: 4032: 4031: 4005: 4000: 3999: 3978: 3973: 3972: 3945: 3917: 3891: 3890: 3884: 3799: 3798: 3794: 3793: 3789: 3765: 3764: 3754: 3738: 3732: 3718: 3712: 3707: 3669: 3668: 3643: 3642: 3614: 3613: 3590: 3589: 3561: 3560: 3529: 3446: 3445: 3441: 3415: 3410: 3409: 3378: 3377: 3356: 3351: 3350: 3318: 3292: 3277: 3276: 3254: 3238: 3222: 3185: 3153: 3152: 3105: 3101: 3086: 3085: 3057: 3038: 3025: 3024: 3002: 2993: 2992: 2962: 2957: 2956: 2937: 2936: 2915: 2910: 2909: 2872: 2871: 2852: 2851: 2819: 2818: 2766: 2765: 2732: 2727: 2726: 2707: 2706: 2685: 2680: 2679: 2672: 2645: 2644: 2611: 2610: 2551: 2550: 2524: 2510: 2509: 2483: 2458: 2457: 2428: 2400: 2399: 2370: 2321: 2320: 2301: 2300: 2272: 2271: 2244: 2210: 2209: 2190: 2189: 2165: 2142: 2141: 2120: 2115: 2114: 2095: 2094: 2063: 2062: 2009: 2004: 2003: 1984: 1983: 1955: 1954: 1892: 1891: 1870: 1869: 1837: 1836: 1750: 1749: 1717: 1716: 1664: 1663: 1632: 1631: 1579: 1578: 1528: 1527: 1505: 1504: 1476: 1475: 1456: 1455: 1436: 1435: 1407: 1406: 1387: 1386: 1358: 1357: 1287: 1286: 1258: 1257: 
1250: 1216: 1133: 1132: 1104: 1103: 1096: 1034: 1033: 888: 887: 856: 855: 832: 831: 828: 792: 721: 720: 689: 688: 681: 653: 652: 588: 587: 556: 555: 532: 531: 528: 520: 512:random variable 483: 482: 445: 444: 419: 418: 399: 398: 361: 360: 357:random variable 320: 319: 300: 299: 272: 271: 265: 243:The problem of 234:parameter space 219:decision theory 191:random variable 162: 161: 138: 137: 130:semi-parametric 96: 56:population mean 28: 23: 22: 15: 12: 11: 5: 5458: 5456: 5448: 5447: 5437: 5436: 5433: 5432: 5425: 5424:External links 5422: 5421: 5420: 5415: 5398: 5392: 5376:Lehmann, E. L. 5372: 5366: 5339: 5332: 5322:(5 ed.). 5312: 5291: 5288: 5285: 5284: 5258: 5251: 5228: 5219: 5210: 5199: 5176: 5175: 5173: 5170: 5168: 5167: 5162: 5157: 5152: 5150:State observer 5147: 5142: 5137: 5132: 5127: 5122: 5116: 5107: 5101: 5095: 5090: 5085: 5078: 5076: 5073: 5062:Main article: 5059: 5056: 4985: 4984: 4973: 4968: 4964: 4960: 4957: 4954: 4951: 4948: 4943: 4939: 4935: 4932: 4929: 4918: 4917: 4905: 4901: 4898: 4895: 4890: 4886: 4882: 4879: 4876: 4872: 4868: 4864: 4860: 4857: 4854: 4849: 4845: 4841: 4838: 4835: 4831: 4819: 4818: 4807: 4802: 4798: 4794: 4791: 4788: 4785: 4782: 4777: 4773: 4769: 4766: 4763: 4738: 4734: 4711: 4707: 4694: 4693: 4679: 4676: 4673: 4670: 4667: 4664: 4659: 4655: 4651: 4648: 4645: 4639: 4636: 4630: 4627: 4624: 4621: 4618: 4615: 4610: 4606: 4602: 4599: 4593: 4590: 4584: 4581: 4578: 4575: 4561:estimator bias 4548: 4543: 4539: 4535: 4532: 4526: 4523: 4517: 4514: 4511: 4508: 4484: 4481: 4478: 4475: 4472: 4466: 4463: 4457: 4454: 4451: 4431:Main article: 4428: 4425: 4400: 4397: 4364: 4347: 4319: 4318: 4307: 4304: 4301: 4298: 4295: 4292: 4289: 4283: 4279: 4274: 4271: 4268: 4263: 4259: 4255: 4250: 4228: 4207: 4203: 4192:grows. Using 4175: 4169: 4165: 4138:Main article: 4135: 4132: 4119: 4115: 4111: 4108: 4105: 4102: 4097: 4090: 4087: 4060: 4057: 4051: 4045: 4042: 4012: 4008: 3985: 3981: 3969: 3968: 3957: 3952: 3948: 3944: 3941: 3938: 3935: 3932: 3929: 3924: 3920: 3916: 3913: 3910: 3904: 3901: 3883: 3880: 3849: 3848: 3836: 3833: 3829: 3825: 3822: 3818: 3814: 3811: 3806: 3802: 3797: 3792: 3788: 3783: 3780: 3777: 3773: 3736: 3714:Main article: 3711: 3708: 3706: 3703: 3702: 3701: 3697: 3682: 3679: 3664:standard error 3650: 3627: 3624: 3597: 3574: 3571: 3553: 3541: 3536: 3532: 3528: 3522: 3519: 3513: 3510: 3507: 3504: 3501: 3495: 3492: 3486: 3483: 3480: 3477: 3474: 3468: 3465: 3459: 3456: 3453: 3440: 3437: 3422: 3418: 3395: 3390: 3386: 3363: 3359: 3339: 3338: 3325: 3321: 3317: 3313: 3308: 3303: 3299: 3295: 3291: 3288: 3284: 3274: 3261: 3257: 3250: 3245: 3241: 3234: 3229: 3225: 3221: 3216: 3211: 3208: 3205: 3201: 3194: 3191: 3188: 3184: 3179: 3174: 3169: 3165: 3160: 3143: 3142: 3131: 3128: 3124: 3119: 3113: 3110: 3104: 3100: 3097: 3093: 3083: 3070: 3064: 3060: 3056: 3053: 3050: 3045: 3041: 3037: 3032: 3028: 3021: 3016: 3010: 3007: 3000: 2969: 2965: 2944: 2922: 2918: 2897: 2894: 2891: 2888: 2885: 2882: 2879: 2859: 2832: 2829: 2806: 2803: 2800: 2797: 2794: 2788: 2785: 2779: 2776: 2773: 2739: 2735: 2714: 2692: 2688: 2671: 2668: 2655: 2652: 2633: 2630: 2627: 2624: 2621: 2618: 2599: 2596: 2593: 2590: 2587: 2584: 2581: 2578: 2575: 2571: 2567: 2564: 2561: 2558: 2539: 2536: 2531: 2527: 2523: 2520: 2517: 2498: 2495: 2490: 2486: 2482: 2479: 2476: 2472: 2468: 2465: 2446: 2443: 2440: 2435: 2431: 2427: 2424: 2421: 2418: 2414: 2410: 2407: 2388: 2385: 2382: 2377: 2373: 2369: 2366: 2362: 2358: 2355: 2352: 2349: 2346: 2340: 2337: 2331: 2328: 2308: 2285: 2282: 2259: 2256: 2251: 2247: 2243: 2240: 2236: 2232: 
2229: 2223: 2220: 2197: 2177: 2172: 2168: 2164: 2161: 2158: 2155: 2152: 2149: 2127: 2123: 2102: 2082: 2079: 2076: 2073: 2070: 2050: 2047: 2044: 2041: 2038: 2035: 2032: 2028: 2024: 2021: 2016: 2012: 1991: 1968: 1965: 1923: 1920: 1917: 1911: 1908: 1902: 1899: 1889:if and only if 1877: 1850: 1847: 1835:The estimator 1817: 1814: 1811: 1805: 1802: 1796: 1793: 1790: 1787: 1784: 1781: 1778: 1772: 1769: 1763: 1760: 1757: 1746: 1745: 1730: 1727: 1704: 1701: 1698: 1695: 1692: 1686: 1683: 1677: 1674: 1671: 1660: 1645: 1642: 1619: 1616: 1613: 1610: 1607: 1601: 1598: 1592: 1589: 1586: 1562: 1559: 1556: 1550: 1547: 1541: 1538: 1535: 1512: 1489: 1486: 1463: 1443: 1420: 1417: 1394: 1371: 1368: 1345: 1342: 1339: 1333: 1330: 1324: 1321: 1318: 1315: 1312: 1306: 1303: 1297: 1294: 1285:is defined as 1271: 1268: 1249: 1246: 1241:expected value 1228: 1223: 1219: 1215: 1209: 1206: 1200: 1197: 1194: 1191: 1185: 1182: 1176: 1173: 1170: 1167: 1164: 1161: 1155: 1152: 1146: 1143: 1140: 1117: 1114: 1095: 1092: 1084:expected value 1071: 1068: 1065: 1062: 1056: 1053: 1047: 1044: 1041: 1030: 1029: 1018: 1015: 1009: 1006: 1000: 997: 994: 991: 988: 985: 982: 976: 973: 967: 964: 961: 958: 955: 949: 946: 940: 937: 934: 931: 928: 925: 922: 916: 913: 907: 904: 901: 898: 895: 883:is defined as 869: 866: 839: 827: 824: 819: 818: 807: 804: 799: 795: 791: 788: 785: 782: 779: 773: 770: 764: 761: 758: 755: 752: 749: 743: 740: 734: 731: 728: 702: 699: 680: 677: 660: 649: 648: 637: 634: 631: 628: 625: 622: 616: 613: 607: 604: 601: 598: 595: 583:is defined as 569: 566: 539: 527: 524: 519: 516: 496: 493: 470: 467: 464: 458: 455: 432: 429: 426: 406: 386: 383: 380: 374: 371: 333: 330: 307: 279: 264: 261: 227:loss functions 175: 172: 145: 100:point estimate 95: 92: 26: 24: 14: 13: 10: 9: 6: 4: 3: 2: 5457: 5446: 5443: 5442: 5440: 5431: 5428: 5427: 5423: 5418: 5416:0-387-98674-X 5412: 5408: 5404: 5399: 5395: 5393:0-387-98502-6 5389: 5385: 5381: 5377: 5373: 5369: 5363: 5359: 5355: 5351: 5347: 5346: 5340: 5335: 5329: 5325: 5321: 5317: 5316:Jaynes, E. T. 
5313: 5309: 5305: 5304: 5299: 5294: 5293: 5289: 5269: 5262: 5259: 5254: 5248: 5244: 5243: 5235: 5233: 5229: 5223: 5220: 5214: 5211: 5206: 5202: 5200:0-534-05101-4 5196: 5192: 5188: 5181: 5178: 5171: 5166: 5163: 5161: 5160:Wiener filter 5158: 5156: 5153: 5151: 5148: 5146: 5143: 5141: 5138: 5136: 5133: 5131: 5128: 5126: 5123: 5120: 5117: 5115: 5111: 5108: 5105: 5102: 5099: 5096: 5094: 5093:Kalman filter 5091: 5089: 5086: 5083: 5080: 5079: 5074: 5071: 5065: 5057: 5055: 5053: 5049: 5045: 5041: 5036: 5034: 5030: 5026: 5017: 5013: 5011: 5003: 4999: 4993: 4989: 4966: 4962: 4955: 4952: 4949: 4941: 4937: 4930: 4927: 4920: 4919: 4903: 4899: 4896: 4888: 4884: 4877: 4870: 4866: 4858: 4855: 4847: 4843: 4836: 4821: 4820: 4800: 4796: 4789: 4786: 4783: 4775: 4771: 4764: 4761: 4754: 4753: 4752: 4736: 4732: 4709: 4705: 4674: 4668: 4665: 4662: 4657: 4649: 4646: 4637: 4634: 4625: 4616: 4608: 4600: 4597: 4591: 4588: 4576: 4566: 4565: 4564: 4562: 4541: 4533: 4530: 4524: 4521: 4509: 4498: 4482: 4479: 4476: 4473: 4464: 4461: 4452: 4441: 4440:loss function 4434: 4426: 4424: 4421: 4417: 4395: 4385: 4381: 4376: 4362: 4354: 4350: 4343: 4339: 4335: 4331: 4326: 4324: 4305: 4299: 4296: 4293: 4287: 4281: 4277: 4269: 4266: 4261: 4257: 4248: 4239: 4238: 4237: 4235: 4231: 4224: 4205: 4201: 4191: 4173: 4167: 4163: 4155: 4152:approaches a 4151: 4147: 4141: 4133: 4131: 4117: 4113: 4109: 4106: 4103: 4100: 4095: 4088: 4085: 4055: 4049: 4043: 4040: 4028: 4010: 4006: 3983: 3979: 3950: 3946: 3939: 3936: 3933: 3930: 3922: 3918: 3911: 3908: 3902: 3899: 3889: 3888: 3887: 3881: 3879: 3877: 3873: 3869: 3865: 3860: 3858: 3854: 3834: 3831: 3827: 3823: 3820: 3816: 3812: 3809: 3804: 3800: 3795: 3790: 3775: 3763: 3762: 3761: 3757: 3752: 3749: 3743: 3739: 3729: 3727: 3723: 3717: 3709: 3704: 3698: 3680: 3677: 3666: 3665: 3648: 3625: 3622: 3611: 3595: 3572: 3569: 3558: 3554: 3539: 3534: 3520: 3517: 3508: 3502: 3493: 3490: 3481: 3478: 3475: 3466: 3463: 3454: 3451: 3443: 3442: 3438: 3436: 3420: 3416: 3393: 3388: 3384: 3361: 3357: 3348: 3344: 3323: 3319: 3315: 3311: 3306: 3301: 3297: 3293: 3289: 3282: 3275: 3259: 3243: 3239: 3232: 3227: 3223: 3214: 3209: 3206: 3203: 3199: 3192: 3189: 3186: 3182: 3177: 3172: 3167: 3163: 3158: 3151: 3150: 3149: 3147: 3129: 3126: 3122: 3117: 3108: 3102: 3098: 3091: 3084: 3068: 3062: 3058: 3054: 3051: 3048: 3043: 3039: 3035: 3030: 3026: 3019: 3014: 3005: 2998: 2991: 2990: 2989: 2987: 2983: 2967: 2963: 2942: 2920: 2916: 2895: 2892: 2886: 2880: 2857: 2849: 2830: 2827: 2804: 2801: 2798: 2795: 2786: 2783: 2774: 2762: 2760: 2737: 2733: 2712: 2690: 2686: 2676: 2669: 2667: 2653: 2650: 2631: 2628: 2625: 2622: 2619: 2616: 2597: 2594: 2588: 2585: 2582: 2576: 2573: 2569: 2565: 2562: 2559: 2556: 2537: 2534: 2529: 2525: 2521: 2518: 2515: 2496: 2493: 2488: 2484: 2480: 2477: 2474: 2470: 2466: 2463: 2444: 2441: 2433: 2429: 2422: 2419: 2416: 2412: 2408: 2405: 2383: 2380: 2375: 2371: 2367: 2364: 2360: 2356: 2350: 2347: 2338: 2335: 2326: 2306: 2283: 2280: 2257: 2254: 2249: 2245: 2241: 2238: 2234: 2230: 2227: 2221: 2218: 2195: 2170: 2166: 2162: 2159: 2153: 2150: 2147: 2125: 2121: 2100: 2080: 2077: 2074: 2071: 2068: 2045: 2042: 2039: 2033: 2030: 2026: 2022: 2019: 2014: 2010: 1989: 1966: 1963: 1951: 1949: 1945: 1940: 1935: 1921: 1918: 1909: 1906: 1897: 1890: 1875: 1867: 1848: 1845: 1833: 1831: 1812: 1809: 1803: 1800: 1791: 1785: 1782: 1779: 1770: 1767: 1758: 1728: 1725: 1702: 1699: 1696: 1693: 1684: 1681: 1672: 1661: 1643: 1640: 1617: 1614: 1611: 1608: 1599: 1596: 1587: 1576: 1575: 1574: 1560: 1557: 1548: 1545: 1536: 1524: 1510: 1487: 
1484: 1461: 1441: 1418: 1415: 1392: 1369: 1366: 1343: 1340: 1331: 1328: 1319: 1313: 1304: 1301: 1292: 1269: 1266: 1255: 1247: 1245: 1242: 1221: 1207: 1204: 1195: 1189: 1183: 1180: 1168: 1162: 1153: 1150: 1141: 1138: 1115: 1112: 1101: 1093: 1091: 1089: 1085: 1063: 1054: 1051: 1042: 1016: 1007: 1004: 995: 989: 983: 974: 971: 965: 956: 947: 944: 935: 929: 923: 914: 911: 905: 899: 893: 886: 885: 884: 867: 864: 853: 837: 825: 823: 805: 797: 789: 786: 780: 771: 768: 756: 750: 741: 738: 729: 726: 719: 718: 717: 700: 697: 686: 678: 676: 674: 658: 635: 632: 629: 623: 614: 611: 605: 599: 593: 586: 585: 584: 567: 564: 553: 537: 525: 523: 517: 515: 513: 494: 491: 465: 456: 453: 430: 427: 424: 404: 381: 372: 369: 358: 354: 350: 331: 328: 305: 297: 293: 277: 270: 262: 260: 258: 254: 250: 246: 241: 239: 235: 230: 228: 224: 223:decision rule 220: 216: 212: 208: 204: 200: 194: 192: 173: 170: 159: 143: 135: 131: 127: 123: 119: 118: 113: 109: 105: 101: 93: 91: 89: 84: 83: 78: 76: 72: 68: 64: 59: 57: 53: 49: 45: 44:observed data 41: 37: 33: 19: 5402: 5379: 5344: 5319: 5301: 5275:. Retrieved 5261: 5241: 5222: 5213: 5205:Google Books 5203:– via 5190: 5180: 5037: 5022: 5009: 5007: 4997: 4986: 4695: 4436: 4377: 4355:centered at 4345: 4341: 4337: 4333: 4329: 4327: 4322: 4320: 4226: 4189: 4149: 4143: 3970: 3885: 3868:scale factor 3863: 3861: 3852: 3850: 3755: 3750: 3741: 3734: 3730: 3719: 3662: 3346: 3342: 3340: 3145: 3144: 2985: 2984: 2847: 2763: 2755: 2093:. Then, for 1952: 1936: 1834: 1747: 1744:is unbiased. 1525: 1474:the bias of 1251: 1240: 1097: 1087: 1031: 851: 829: 820: 682: 672: 650: 529: 521: 352: 295: 294:to a set of 292:sample space 268: 266: 242: 231: 199:unbiasedness 195: 115: 97: 80: 79: 60: 35: 29: 4384:sample mean 3726:sample size 3710:Consistency 3700:algorithms. 3610:square root 2986:Expectation 443:) is then 257:time series 207:consistency 52:sample mean 5277:9 December 5172:References 5155:Testimator 5058:Robustness 4427:Efficiency 4221:to denote 1659:is biased. 417:(i.e. 
for 351:: thus if 263:Definition 158:circumflex 122:parametric 94:Background 61:There are 32:statistics 5445:Estimator 5308:EMS Press 4963:θ 4956:⁡ 4938:θ 4931:⁡ 4900:θ 4897:− 4885:θ 4878:⁡ 4859:θ 4856:− 4844:θ 4837:⁡ 4797:θ 4790:⁡ 4772:θ 4765:⁡ 4733:θ 4706:θ 4675:θ 4669:⁡ 4650:θ 4647:− 4638:^ 4635:θ 4626:⁡ 4601:θ 4598:− 4592:^ 4589:θ 4577:⁡ 4534:θ 4531:− 4525:^ 4522:θ 4510:⁡ 4477:θ 4474:− 4465:^ 4462:θ 4453:⁡ 4399:¯ 4363:θ 4321:for some 4270:θ 4267:− 4089:^ 4086:σ 4059:¯ 4044:^ 4041:μ 4011:θ 3951:θ 3934:θ 3903:^ 3900:θ 3824:ε 3813:θ 3810:− 3782:∞ 3779:→ 3748:parameter 3681:^ 3678:θ 3649:θ 3626:^ 3623:θ 3596:θ 3573:^ 3570:θ 3521:^ 3518:θ 3494:^ 3491:θ 3482:⁡ 3467:^ 3464:θ 3455:⁡ 3417:σ 3358:σ 3320:σ 3290:⁡ 3249:¯ 3233:− 3200:∑ 3190:− 3130:μ 3112:¯ 3099:⁡ 3052:⋯ 3009:¯ 2964:θ 2943:θ 2917:θ 2896:θ 2881:⁡ 2858:θ 2831:^ 2828:θ 2799:θ 2796:− 2787:^ 2784:θ 2775:⁡ 2734:θ 2713:θ 2687:θ 2654:θ 2629:− 2620:θ 2595:− 2583:θ 2577:⋅ 2563:⋅ 2535:− 2522:⋅ 2494:− 2478:⋅ 2442:− 2420:⋅ 2381:− 2368:⋅ 2339:^ 2336:θ 2307:θ 2284:^ 2281:θ 2255:− 2242:⋅ 2222:^ 2219:θ 2196:θ 2075:θ 2040:θ 2034:⋅ 1990:θ 1967:^ 1964:θ 1910:^ 1907:θ 1876:θ 1849:^ 1846:θ 1813:θ 1810:− 1804:^ 1801:θ 1792:⁡ 1783:θ 1780:− 1771:^ 1768:θ 1759:⁡ 1729:^ 1726:θ 1697:θ 1694:− 1685:^ 1682:θ 1673:⁡ 1644:^ 1641:θ 1615:≠ 1612:θ 1609:− 1600:^ 1597:θ 1588:⁡ 1561:θ 1558:− 1549:^ 1546:θ 1537:⁡ 1488:^ 1485:θ 1462:θ 1419:^ 1416:θ 1393:θ 1370:^ 1367:θ 1344:θ 1341:− 1332:^ 1329:θ 1320:⁡ 1305:^ 1302:θ 1270:^ 1267:θ 1208:^ 1205:θ 1196:⁡ 1190:− 1184:^ 1181:θ 1169:⁡ 1154:^ 1151:θ 1142:⁡ 1116:^ 1113:θ 1055:^ 1052:θ 1043:⁡ 1008:^ 1005:θ 996:⁡ 990:− 975:^ 972:θ 948:^ 945:θ 936:⁡ 930:− 915:^ 912:θ 868:^ 865:θ 790:θ 787:− 772:^ 769:θ 757:⁡ 742:^ 739:θ 730:⁡ 701:^ 698:θ 659:θ 633:θ 630:− 615:^ 612:θ 568:^ 565:θ 495:^ 492:θ 457:^ 454:θ 373:^ 370:θ 332:^ 329:θ 306:θ 278:θ 269:parameter 174:^ 171:θ 144:θ 108:parameter 104:statistic 36:estimator 5439:Category 5407:Springer 5384:Springer 5350:Springer 5318:(2007). 5075:See also 4278:→ 4202:→ 3864:multiple 3855:, if it 3146:Variance 2761:(MVUE). 2670:Unbiased 1100:variance 1094:Variance 117:estimand 48:estimand 40:estimate 4025:is the 2061:, with 1573:and 0: 1082:is the 550:, the " 102:" is a 5413:  5390:  5364:  5330:  5249:  5197:  5121:(MMSE) 5100:(MCMC) 5084:(BLUE) 4681:  4499:(MSE) 3971:Where 3758:> 0 1939:median 1864:is an 1032:where 850:, the 651:where 69:. The 5271:(PDF) 5106:(MAP) 3608:(the 552:error 526:Error 255:of a 110:in a 63:point 34:, an 5411:ISBN 5388:ISBN 5362:ISBN 5328:ISBN 5279:2023 5247:ISBN 5195:ISBN 5025:MVUE 4950:< 4867:< 4784:< 4378:The 4340:the 3998:and 3821:< 3555:The 2078:< 2072:< 1254:bias 1252:The 1248:Bias 1098:The 683:The 132:and 124:and 65:and 5354:doi 4953:MSE 4928:MSE 4787:Var 4762:Var 4666:Var 4330:V/n 4236:if 4232:is 4144:An 3874:by 3772:lim 3744:≥ 0 3667:of 3641:of 3588:of 3479:Var 3452:MSE 1868:of 1662:If 1577:If 1503:is 1434:is 1256:of 1139:Var 1102:of 727:MSE 687:of 30:In 5441:: 5409:, 5405:, 5386:. 5360:. 5352:. 5326:. 5306:, 5300:, 5231:^ 5189:. 5112:, 5054:. 5050:, 5046:, 5042:, 4375:. 4325:. 4225:, 4130:. 3878:. 3787:Pr 3740:; 3283:2. 3159:1. 3092:2. 2999:1. 2817:, 2666:. 2319:: 2208:: 1950:. 1832:. 1715:, 1630:, 1523:. 229:. 209:, 205:, 201:, 58:. 5396:. 5370:. 5356:: 5338:. 5336:. 5311:. 5281:. 5255:. 5207:. 
5010:y 4972:) 4967:2 4959:( 4947:) 4942:1 4934:( 4904:| 4894:) 4889:2 4881:( 4875:E 4871:| 4863:| 4853:) 4848:1 4840:( 4834:E 4830:| 4806:) 4801:2 4793:( 4781:) 4776:1 4768:( 4737:2 4710:1 4678:) 4672:( 4663:+ 4658:2 4654:) 4644:) 4629:( 4623:E 4620:( 4617:= 4614:] 4609:2 4605:) 4583:( 4580:[ 4574:E 4547:] 4542:2 4538:) 4516:( 4513:[ 4507:E 4483:0 4480:= 4471:) 4456:( 4450:E 4396:X 4348:n 4346:t 4338:V 4323:V 4306:, 4303:) 4300:V 4297:, 4294:0 4291:( 4288:N 4282:D 4273:) 4262:n 4258:t 4254:( 4249:n 4229:n 4227:t 4206:D 4190:n 4174:n 4168:/ 4164:1 4150:θ 4118:n 4114:/ 4110:D 4107:S 4104:S 4101:= 4096:2 4056:X 4050:= 4007:T 3984:n 3980:T 3956:) 3947:T 3943:( 3940:h 3937:= 3931:, 3928:) 3923:n 3919:T 3915:( 3912:h 3909:= 3847:. 3835:1 3832:= 3828:} 3817:| 3805:n 3801:t 3796:| 3791:{ 3776:n 3756:ε 3751:θ 3742:n 3737:n 3735:t 3733:{ 3696:. 3540:, 3535:2 3531:) 3527:) 3512:( 3509:B 3506:( 3503:+ 3500:) 3485:( 3476:= 3473:) 3458:( 3421:2 3394:2 3389:n 3385:S 3362:2 3347:n 3343:n 3324:2 3316:= 3312:] 3307:2 3302:n 3298:S 3294:[ 3287:E 3260:2 3256:) 3244:n 3240:X 3228:i 3224:X 3220:( 3215:n 3210:1 3207:= 3204:i 3193:1 3187:n 3183:1 3178:= 3173:2 3168:n 3164:S 3127:= 3123:] 3118:n 3109:X 3103:[ 3096:E 3069:n 3063:n 3059:X 3055:+ 3049:+ 3044:2 3040:X 3036:+ 3031:1 3027:X 3020:= 3015:n 3006:X 2968:1 2921:2 2893:= 2890:] 2887:T 2884:[ 2878:E 2848:T 2805:0 2802:= 2793:) 2778:( 2772:E 2752:. 2738:1 2691:2 2651:= 2632:2 2626:2 2623:+ 2617:= 2598:2 2592:) 2589:2 2586:+ 2580:( 2574:4 2570:/ 2566:1 2560:4 2557:= 2538:2 2530:1 2526:p 2519:4 2516:= 2497:2 2489:1 2485:p 2481:n 2475:n 2471:/ 2467:4 2464:= 2445:2 2439:] 2434:1 2430:N 2426:[ 2423:E 2417:n 2413:/ 2409:4 2406:= 2387:] 2384:2 2376:1 2372:N 2365:n 2361:/ 2357:4 2354:[ 2351:E 2348:= 2345:] 2330:[ 2327:E 2258:2 2250:1 2246:N 2239:n 2235:/ 2231:4 2228:= 2176:) 2171:1 2167:p 2163:, 2160:n 2157:( 2154:n 2151:i 2148:B 2126:1 2122:N 2101:n 2081:1 2069:0 2049:) 2046:2 2043:+ 2037:( 2031:4 2027:/ 2023:1 2020:= 2015:1 2011:p 1922:0 1919:= 1916:) 1901:( 1898:B 1816:) 1795:( 1789:E 1786:= 1777:) 1762:( 1756:E 1703:0 1700:= 1691:) 1676:( 1670:E 1618:0 1606:) 1591:( 1585:E 1555:) 1540:( 1534:E 1511:b 1442:b 1338:) 1323:( 1317:E 1314:= 1311:) 1296:( 1293:B 1227:] 1222:2 1218:) 1214:] 1199:[ 1193:E 1175:( 1172:[ 1166:E 1163:= 1160:) 1145:( 1088:d 1070:) 1067:) 1064:X 1061:( 1046:( 1040:E 1017:, 1014:) 999:( 993:E 987:) 984:x 981:( 966:= 963:) 960:) 957:X 954:( 939:( 933:E 927:) 924:x 921:( 906:= 903:) 900:x 897:( 894:d 838:x 806:. 803:] 798:2 794:) 784:) 781:X 778:( 763:( 760:[ 754:E 751:= 748:) 733:( 673:e 636:, 627:) 624:x 621:( 606:= 603:) 600:x 597:( 594:e 538:x 469:) 466:x 463:( 431:x 428:= 425:X 405:x 385:) 382:X 379:( 353:X 20:)

