Inverse-variance weighting

In statistics, inverse-variance weighting is a method of aggregating two or more random variables to minimize the variance of the weighted average. Each random variable is weighted in inverse proportion to its variance, i.e., proportionally to its precision.

Given a sequence of independent observations y_i with variances \sigma_i^2, the inverse-variance weighted average is given by

    \hat{y} = \frac{\sum_i y_i/\sigma_i^2}{\sum_i 1/\sigma_i^2}.

The inverse-variance weighted average has the least variance among all weighted averages, which can be calculated as

    Var(\hat{y}) = \frac{1}{\sum_i 1/\sigma_i^2}.

If the variances of the measurements are all equal, then the inverse-variance weighted average becomes the simple average.

Inverse-variance weighting is typically used in statistical meta-analysis or sensor fusion to combine the results from independent measurements.
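The two formulas above can be sketched in a few lines of NumPy. This is an illustrative sketch only: the function name `ivw_mean` and the example numbers are assumptions, not from the source.

```python
import numpy as np

def ivw_mean(y, var):
    """Inverse-variance weighted average of measurements y with variances var."""
    y = np.asarray(y, dtype=float)
    w = 1.0 / np.asarray(var, dtype=float)   # weights proportional to precision
    # Return the weighted average and the variance of the combined estimate.
    return np.sum(w * y) / np.sum(w), 1.0 / np.sum(w)

# Two measurements of the same quantity, the second four times noisier:
y_hat, var_hat = ivw_mean([10.0, 12.0], [1.0, 4.0])
print(y_hat, var_hat)  # 10.4 0.8
```

Note that the combined variance (0.8) is smaller than the variance of either individual measurement.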
Context

Suppose an experimenter wishes to measure the value of a quantity, say the acceleration due to gravity of Earth, whose true value happens to be \mu. A careful experimenter makes multiple measurements, which we denote with n random variables X_1, X_2, \ldots, X_n. If they are all noisy but unbiased, i.e., the measuring device does not systematically overestimate or underestimate the true value and the errors are scattered symmetrically, then the expectation value E[X_i] = \mu for all i. The scatter in the measurements is then characterised by the variance of the random variables, Var(X_i) := \sigma_i^2, and if the measurements are performed under identical scenarios, then all the \sigma_i are the same, which we shall refer to as \sigma.

Given the n measurements, a typical estimator for \mu, denoted \hat{\mu}, is the simple average

    \overline{X} = \frac{1}{n} \sum_i X_i.

Note that this empirical average is also a random variable: its expectation value E[\overline{X}] is \mu, but it also has a scatter. If the individual measurements are uncorrelated, the square of the error in the estimate is given by

    Var(\overline{X}) = \frac{1}{n^2} \sum_i \sigma_i^2 = \left(\frac{\sigma}{\sqrt{n}}\right)^2.

Hence, if all the \sigma_i are equal, the error in the estimate decreases with increasing n as 1/\sqrt{n}, making more observations preferable.
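The 1/\sqrt{n} behaviour is easy to confirm by simulation. A minimal sketch, assuming a hypothetical true value and noise level chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 9.81, 2.0   # illustrative true value and per-measurement noise

# Empirical scatter of the simple average shrinks as sigma/sqrt(n):
for n in (1, 4, 16, 64):
    means = rng.normal(mu, sigma, size=(100_000, n)).mean(axis=1)
    print(n, round(means.std(), 3), round(sigma / np.sqrt(n), 3))
```

Each printed line shows the simulated scatter next to the predicted \sigma/\sqrt{n}; the two columns agree closely.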
Instead of n repeated measurements with one instrument, if the experimenter makes n measurements of the same quantity with n different instruments with varying quality of measurements, then there is no reason to expect the different \sigma_i to be the same. Some instruments could be noisier than others. In the example of measuring the acceleration due to gravity, the different "instruments" could be measuring g from a simple pendulum, from analysing a projectile motion, etc. The simple average is then no longer an optimal estimator, since the error in \overline{X} might actually exceed the error in the least noisy measurement if different measurements have very different errors. Instead of discarding the noisy measurements that increase the final error, the experimenter can combine all the measurements with appropriate weights so as to give more importance to the least noisy measurements and vice versa.

Given the knowledge of \sigma_1^2, \sigma_2^2, \ldots, \sigma_n^2, an optimal estimator to measure \mu would be a weighted mean of the measurements,

    \hat{\mu} = \frac{\sum_i w_i X_i}{\sum_i w_i},

for the particular choice of the weights w_i = 1/\sigma_i^2. The variance of the estimator is

    Var(\hat{\mu}) = \frac{\sum_i w_i^2 \sigma_i^2}{\left(\sum_i w_i\right)^2},

which for the optimal choice of the weights becomes

    Var(\hat{\mu}_{opt}) = \left(\sum_i \sigma_i^{-2}\right)^{-1}.

Note that since Var(\hat{\mu}_{opt}) < \min_j \sigma_j^2, the estimator has a scatter smaller than the scatter in any individual measurement. Furthermore, the scatter in \hat{\mu}_{opt} decreases with adding more measurements, however noisier those measurements may be.
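A short numeric check of the two claims above, using hypothetical per-instrument variances chosen for illustration: the optimal weighted mean beats even the least noisy single instrument, while the simple average can be worse than it.

```python
import numpy as np

sigma2 = np.array([0.5, 2.0, 8.0])   # hypothetical per-instrument variances

w = 1.0 / sigma2                              # optimal weights w_i = 1/sigma_i^2
var_opt = 1.0 / w.sum()                       # variance of the weighted mean
var_simple = sigma2.sum() / len(sigma2)**2    # variance of the simple average

print(var_opt < sigma2.min())     # True: beats the least noisy instrument alone
print(var_simple > sigma2.min())  # True: the simple average is worse than it
```

Here var_opt is about 0.38, below the best single variance 0.5, whereas the simple average has variance about 1.17.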
Derivation

Consider a generic weighted sum Y = \sum_i w_i X_i, where the weights w_i are normalised such that \sum_i w_i = 1. If the X_i are all independent, the variance of Y is given by (see Bienaymé's identity)

    Var(Y) = \sum_i w_i^2 \sigma_i^2.

For optimality, we wish to minimise Var(Y), which can be done by equating the gradient of Var(Y) with respect to the weights to zero, while maintaining the constraint that \sum_i w_i = 1. Using a Lagrange multiplier w_0 to enforce the constraint, we express the variance as

    Var(Y) = \sum_i w_i^2 \sigma_i^2 - w_0 \left(\sum_i w_i - 1\right).

For k > 0,

    0 = \frac{\partial}{\partial w_k} Var(Y) = 2 w_k \sigma_k^2 - w_0,

which implies that

    w_k = \frac{w_0/2}{\sigma_k^2}.

The main takeaway here is that w_k \propto 1/\sigma_k^2. Since \sum_i w_i = 1,

    \frac{2}{w_0} = \sum_i \frac{1}{\sigma_i^2} := \frac{1}{\sigma_0^2}.

The individual normalised weights are

    w_k = \frac{1}{\sigma_k^2} \left(\sum_i \frac{1}{\sigma_i^2}\right)^{-1}.

It is easy to see that this extremum solution corresponds to the minimum from the second partial derivative test, by noting that the variance is a quadratic function of the weights. Thus, the minimum variance of the estimator is then given by

    Var(Y) = \sum_i \frac{\sigma_0^4}{\sigma_i^4} \sigma_i^2 = \sigma_0^4 \sum_i \frac{1}{\sigma_i^2} = \sigma_0^4 \frac{1}{\sigma_0^2} = \sigma_0^2 = \frac{1}{\sum_i 1/\sigma_i^2}.
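The derivation above can be spot-checked numerically: since Var(Y) is a convex quadratic in the weights, no other normalised weight vector should beat the inverse-variance weights. A sketch with assumed variances:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = np.array([0.5, 2.0, 8.0])          # hypothetical variances

w_opt = (1 / sigma2) / np.sum(1 / sigma2)   # normalised inverse-variance weights
var_opt = np.sum(w_opt**2 * sigma2)         # Var(Y) at the stationary point

# Try 10,000 random weight vectors on the simplex (they sum to one):
W = rng.dirichlet(np.ones(len(sigma2)), size=10_000)
var_rand = np.sum(W**2 * sigma2, axis=1)

print(np.isclose(var_opt, 1 / np.sum(1 / sigma2)))  # True: equals sigma_0^2
print((var_rand >= var_opt - 1e-12).all())          # True: nothing does better
```

The first check confirms the closed form 1/(\sum_i 1/\sigma_i^2); the second confirms the stationary point is indeed a minimum.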
Normal distributions

For normally distributed random variables, the inverse-variance weighted average can also be derived as the maximum likelihood estimate for the true value. Furthermore, from a Bayesian perspective, the posterior distribution for the true value given normally distributed observations y_i and a flat prior is a normal distribution with the inverse-variance weighted average as its mean and with variance

    Var(\hat{y}) = \frac{1}{\sum_i 1/\sigma_i^2}.
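The Bayesian view can be illustrated by updating a flat prior one Gaussian observation at a time: precisions add, and means combine precision-weighted, landing exactly on the inverse-variance weighted average. The observations and variances below are hypothetical.

```python
import numpy as np

y = np.array([10.0, 12.0, 11.0])      # hypothetical observations
sigma2 = np.array([1.0, 4.0, 2.0])    # their known variances

# Sequential Bayesian updating with Gaussian likelihoods, starting from a
# flat prior (zero precision): precisions add, means combine by precision.
mean, prec = 0.0, 0.0
for yi, s2 in zip(y, sigma2):
    p = 1.0 / s2
    mean = (prec * mean + p * yi) / (prec + p)
    prec += p

ivw = np.sum(y / sigma2) / np.sum(1.0 / sigma2)
print(np.isclose(mean, ivw))                    # True: posterior mean is the IVW average
print(np.isclose(1.0 / prec, 1.0 / np.sum(1.0 / sigma2)))  # True: variances match
```

The order of the observations does not matter; any sequence of updates yields the same posterior.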
1203:{\displaystyle \sigma _{1}^{2},\sigma _{2}^{2},...,\sigma _{n}^{2}} 3496: 485:. The scatter in the measurement is then characterised by the 725:{\displaystyle {\overline {X}}={\frac {1}{n}}\sum _{i}X_{i}} 291:
Multivariate case

For multivariate distributions, an equivalent argument leads to an optimal weighting based on the covariance matrices \mathbf{C}_i of the individual vector-valued estimates \mathbf{x}_i:

    \hat{\mathbf{x}} = \left(\sum_i \mathbf{C}_i^{-1}\right)^{-1} \sum_i \mathbf{C}_i^{-1} \mathbf{x}_i

    \hat{\mathbf{C}} = \left(\sum_i \mathbf{C}_i^{-1}\right)^{-1}

For multivariate distributions the term "precision-weighted" average is more commonly used.
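The matrix formulas translate directly into code. A minimal sketch fusing two hypothetical 2-D estimates (the vectors and covariances are assumptions for illustration):

```python
import numpy as np

# Two hypothetical 2-D estimates of the same vector, with their covariances:
x1, C1 = np.array([1.0, 2.0]), np.diag([1.0, 4.0])
x2, C2 = np.array([2.0, 1.0]), np.diag([2.0, 1.0])

P = np.linalg.inv(C1) + np.linalg.inv(C2)   # summed precision matrices
C_hat = np.linalg.inv(P)                    # fused covariance
x_hat = C_hat @ (np.linalg.inv(C1) @ x1 + np.linalg.inv(C2) @ x2)

print(x_hat)  # with diagonal covariances this reduces to component-wise IVW
```

Because both covariances here are diagonal, each component of x_hat equals the scalar inverse-variance weighted average of that component (approximately [1.333, 1.2]).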
3467:
See also

Weighted least squares
Portfolio theory
Cramér-Rao bound

References

Joachim Hartung; Guido Knapp; Bimal K. Sinha (2008). Statistical meta-analysis with applications. John Wiley & Sons. ISBN 978-0-470-29089-7.
