Mean squared prediction error

In statistics, the mean squared prediction error (MSPE), also known as the mean squared error of the predictions, of a smoothing, curve fitting, or regression procedure is the expected value of the squared prediction errors (PE), the squared differences between the fitted values implied by the predictive function ĝ and the values of the (unobservable) true function g. It is an inverse measure of the explanatory power of ĝ and can be used in the process of cross-validation of an estimated model. Knowledge of g would be required in order to calculate the MSPE exactly; in practice, MSPE is estimated.
Formulation

If the smoothing or fitting procedure has projection matrix (i.e., hat matrix) L, which maps the observed values vector y to the predicted values vector ŷ = Ly, then PE and MSPE are formulated as

\operatorname{PE}_i = g(x_i) - \widehat{g}(x_i),

\operatorname{MSPE} = \operatorname{E}\left[\operatorname{PE}_i^2\right] = \sum_{i=1}^{n} \operatorname{PE}_i^2 / n.

The MSPE can be decomposed into two terms: the squared bias (mean error) of the fitted values and the variance of the fitted values:

\operatorname{MSPE} = \operatorname{ME}^2 + \operatorname{VAR},

\operatorname{ME} = \operatorname{E}\left[\widehat{g}(x_i) - g(x_i)\right],

\operatorname{VAR} = \operatorname{E}\left[\left(\widehat{g}(x_i) - \operatorname{E}\left[\widehat{g}(x_i)\right]\right)^2\right].

The quantity SSPE = n · MSPE is called the sum squared prediction error, and the root mean squared prediction error is its square root: RMSPE = \sqrt{\operatorname{MSPE}}.
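To make these definitions concrete, the following Python sketch approximates the MSPE of a simple smoother by simulation and checks the decomposition into squared bias plus variance. Everything in it (the sine curve standing in for g, the Gaussian kernel smoother, the sample size, noise level, and bandwidth) is an illustrative assumption, not part of the article.

# Illustrative sketch (assumed example, not from the article): Monte Carlo
# estimate of MSPE for a kernel smoother, plus a check of the decomposition
# MSPE = ME^2 + VAR averaged over the design points.
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # True function; unobservable in practice, chosen here for illustration.
    return np.sin(2 * np.pi * x)

def smooth(x, y, bandwidth=0.1):
    # Nadaraya-Watson kernel smoother evaluated at the design points;
    # its fitted values are a linear map of y, i.e. y-hat = L y.
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w @ y) / w.sum(axis=1)

n, sigma, reps = 50, 0.3, 2000
x = np.linspace(0.0, 1.0, n)
fits = np.empty((reps, n))
for r in range(reps):
    y = g(x) + sigma * rng.standard_normal(n)  # fresh noise each replication
    fits[r] = smooth(x, y)

pe = g(x) - fits                                  # prediction errors PE_i
mspe = np.mean(pe ** 2)                           # average of E[PE_i^2] over i
rmspe = np.sqrt(mspe)                             # root mean squared prediction error
me2 = np.mean(np.mean(fits - g(x), axis=0) ** 2)  # squared-bias term
var = np.mean(np.var(fits, axis=0))               # variance term
print(mspe, me2 + var, rmspe)

The first two printed numbers should nearly coincide, up to Monte Carlo error, which is the decomposition above averaged over the n design points.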
Computation of MSPE over out-of-sample data

The mean squared prediction error can be computed exactly in two contexts. First, with a data sample of length n, the data analyst may run the regression over only q of the data points (with q < n), holding back the other n − q data points with the specific purpose of using them to compute the estimated model's MSPE out of sample (i.e., not using data that were used in the model estimation process). Since the regression process is tailored to the q in-sample points, normally the in-sample MSPE will be smaller than the out-of-sample one computed over the n − q held-back points. If the increase in the MSPE out of sample compared to in sample is relatively slight, the model is viewed favorably. And if two models are to be compared, the one with the lower MSPE over the n − q out-of-sample data points is viewed more favorably, regardless of the models' relative in-sample performances. The out-of-sample MSPE in this context is exact for the out-of-sample data points that it was computed over, but is merely an estimate of the model's MSPE for the mostly unobserved population from which the data were drawn.

Second, as time goes on more data may become available to the data analyst, and then the MSPE can be computed over these new data.
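A minimal sketch of the first context, assuming a straight-line regression on synthetic data (the 80/20 split, coefficients, and noise level are arbitrary choices). One caveat: because the true g is unobservable, the squared errors here are taken against the noisy held-back responses, so they estimate the MSPE plus the irreducible noise variance.

# Illustrative sketch (assumed data and split): in-sample vs. out-of-sample
# MSPE for a straight-line regression fit on q of n points.
import numpy as np

rng = np.random.default_rng(1)
n, q = 200, 160                              # hold back n - q = 40 points
x = rng.uniform(0.0, 10.0, n)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, n)  # synthetic data-generating process

idx = rng.permutation(n)
train, hold = idx[:q], idx[q:]

X = np.column_stack([np.ones(q), x[train]])
beta, *_ = np.linalg.lstsq(X, y[train], rcond=None)  # OLS on the q in-sample points

def predict(xs):
    return beta[0] + beta[1] * xs

mspe_in = np.mean((y[train] - predict(x[train])) ** 2)
mspe_out = np.mean((y[hold] - predict(x[hold])) ** 2)  # exact for these 40 points
print(mspe_in, mspe_out)                     # in-sample error is typically smaller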
Estimation of MSPE over the population

When the model has been estimated over all available data with none held back, the MSPE of the model over the entire population of mostly unobserved data can be estimated as follows.

For the model y_i = g(x_i) + σε_i with ε_i ~ N(0, 1), writing g for the vector (g(x_1), …, g(x_n))^T, one may write

n \cdot \operatorname{MSPE}(L) = g^{\text{T}} (I - L)^{\text{T}} (I - L) g + \sigma^2 \operatorname{tr}\left[L^{\text{T}} L\right].

Using in-sample data values, the first term on the right side is equivalent to

\sum_{i=1}^{n} \left(\operatorname{E}\left[g(x_i) - \widehat{g}(x_i)\right]\right)^2 = \operatorname{E}\left[\sum_{i=1}^{n} \left(y_i - \widehat{g}(x_i)\right)^2\right] - \sigma^2 \operatorname{tr}\left[(I - L)^{\text{T}}(I - L)\right].

Thus,

n \cdot \operatorname{MSPE}(L) = \operatorname{E}\left[\sum_{i=1}^{n} \left(y_i - \widehat{g}(x_i)\right)^2\right] - \sigma^2 \left(n - 2\operatorname{tr}\left[L\right]\right).

If σ² is known or well-estimated by σ̂², it becomes possible to estimate MSPE by

n \cdot \widehat{\operatorname{MSPE}}(L) = \sum_{i=1}^{n} \left(y_i - \widehat{g}(x_i)\right)^2 - \widehat{\sigma}^2 \left(n - 2\operatorname{tr}\left[L\right]\right).
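The behavior of this estimate can be checked numerically. The sketch below assumes ordinary least squares, where L = X(X^T X)^{-1} X^T, tr[L] = p, and (I − L)g = 0 whenever g is linear in the regressors; the design, coefficients, and known σ are illustrative assumptions.

# Illustrative sketch (assumed OLS setting with known sigma): estimating
# n * MSPE from the residual sum of squares and tr[L].
import numpy as np

rng = np.random.default_rng(2)
n, p, sigma = 100, 3, 0.5
X = np.column_stack([np.ones(n), rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)])
beta_true = np.array([1.0, -2.0, 0.5])
g_vec = X @ beta_true                        # true mean vector g
y = g_vec + sigma * rng.standard_normal(n)

L = X @ np.linalg.inv(X.T @ X) @ X.T         # hat matrix; tr[L] = p for OLS
g_hat = L @ y
rss = np.sum((y - g_hat) ** 2)

n_mspe_hat = rss - sigma**2 * (n - 2 * np.trace(L))  # the estimate from the text
resid_op = np.eye(n) - L
n_mspe_true = g_vec @ resid_op.T @ resid_op @ g_vec + sigma**2 * np.trace(L.T @ L)
print(n_mspe_hat, n_mspe_true)               # estimate scatters around the exact value

Here the exact value reduces to σ²p, because ordinary least squares reproduces any g that is linear in the regressors, and repeated runs of the script show the estimate varying around it.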
Colin Mallows advocated this method in the construction of his model selection statistic Cp, which is a normalized version of the estimated MSPE:

C_p = \frac{\sum_{i=1}^{n} \left(y_i - \widehat{g}(x_i)\right)^2}{\widehat{\sigma}^2} - n + 2p,

where p is the number of estimated parameters and σ̂² is computed from the version of the model that includes all possible regressors. This completes the derivation.
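As an illustration of Cp in use, the sketch below scores polynomial fits of increasing degree on synthetic data, taking σ̂² from the largest candidate model as the text prescribes; the true quadratic signal, the noise level, and the maximum degree are arbitrary assumptions.

# Illustrative sketch (assumed polynomial family and data): ranking nested
# models with Mallows' Cp, with sigma-hat^2 taken from the largest model.
import numpy as np

rng = np.random.default_rng(3)
n = 120
x = rng.uniform(-2.0, 2.0, n)
y = 1.0 + 0.8 * x - 0.4 * x**2 + rng.normal(0.0, 0.3, n)  # true degree is 2

def rss(deg):
    # Residual sum of squares of a degree-`deg` polynomial fit.
    X = np.vander(x, deg + 1)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

max_deg = 6
sigma2_hat = rss(max_deg) / (n - (max_deg + 1))  # from the fullest model
for deg in range(max_deg + 1):
    p = deg + 1                                  # number of estimated parameters
    cp = rss(deg) / sigma2_hat - n + 2 * p
    print(deg, p, round(cp, 1))                  # adequate models have Cp near p

In this setup, degrees below 2 should produce Cp values far above p, while degree 2 and higher stay near p; the smallest adequate model is then preferred.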
See also

Akaike information criterion
Bias-variance tradeoff
Errors and residuals in statistics
Law of total variance
Mallows's Cp
Mean squared error
Model selection

References

Pindyck, Robert S.; Rubinfeld, Daniel L. (1991). "Forecasting with Time-Series Models". Econometric Models & Economic Forecasts (3rd ed.). New York: McGraw-Hill. pp. 516–535. ISBN 0-07-050098-3.
