Integrated nested Laplace approximations

Integrated nested Laplace approximations (INLA) is a method for approximate Bayesian inference based on Laplace's method. It is designed for a class of models called latent Gaussian models (LGMs), for which it can be a fast and accurate alternative to Markov chain Monte Carlo methods to compute posterior marginal distributions. Due to its relative speed even with large data sets for certain problems and models, INLA has been a popular inference method in applied statistics, in particular spatial statistics, ecology, and epidemiology. It is also possible to combine INLA with a finite element method solution of a stochastic partial differential equation to study e.g. spatial point processes and species distribution models. The INLA method is implemented in the R-INLA R package.

Latent Gaussian models

Let y = (y_1, ..., y_n) denote the response variable (that is, the observations), which belongs to an exponential family, with the mean μ_i (of y_i) being linked to a linear predictor η_i via an appropriate link function. The linear predictor can take the form of a (Bayesian) additive model. All latent effects (the linear predictor, the intercept, coefficients of possible covariates, and so on) are collectively denoted by the vector x. The hyperparameters of the model are denoted by θ. As per Bayesian statistics, x and θ are random variables with prior distributions.

The observations are assumed to be conditionally independent given x and θ:

\pi(\boldsymbol{y} \mid \boldsymbol{x}, \boldsymbol{\theta}) = \prod_{i \in \mathcal{I}} \pi(y_{i} \mid \eta_{i}, \boldsymbol{\theta}),

where \mathcal{I} is the set of indices for observed elements of y (some elements may be unobserved, and for these INLA computes a posterior predictive distribution). Note that the linear predictor η is part of x.

For the model to be a latent Gaussian model, it is assumed that x | θ is a Gaussian Markov Random Field (GMRF), that is, a multivariate Gaussian with additional conditional independence properties, with probability density

\pi(\boldsymbol{x} \mid \boldsymbol{\theta}) \propto \left|\boldsymbol{Q}_{\boldsymbol{\theta}}\right|^{1/2} \exp\left(-\tfrac{1}{2}\, \boldsymbol{x}^{T} \boldsymbol{Q}_{\boldsymbol{\theta}}\, \boldsymbol{x}\right),

where Q_θ is a θ-dependent sparse precision matrix and |Q_θ| is its determinant. The precision matrix is sparse due to the GMRF assumption. The prior distribution π(θ) for the hyperparameters need not be Gaussian. However, the number of hyperparameters, m = dim(θ), is assumed to be small (say, less than 15).
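The GMRF prior is an ordinary multivariate Gaussian written in terms of its sparse precision matrix. As a minimal sketch of this ingredient (an assumed toy example, not code from the article or from the R-INLA package), the following Python snippet builds a sparse first-order random-walk precision matrix and evaluates the part of log π(x | θ) that depends on x; the precision parameter kappa stands in for a component of θ.

```python
# Sketch only (illustrative, not the R-INLA implementation): a first-order random-walk
# (RW1) precision matrix as an example of a sparse GMRF prior, and the x-dependent part of
# log pi(x | theta) = 0.5*log|Q_theta| - 0.5 * x^T Q_theta x + const.
import numpy as np
from scipy import sparse

def rw1_precision(n, kappa):
    """Sparse RW1 precision matrix Q_theta = kappa * D^T D, with D the first-difference matrix."""
    D = sparse.diags([-np.ones(n - 1), np.ones(n - 1)], offsets=[0, 1], shape=(n - 1, n))
    return kappa * (D.T @ D)

def gmrf_quadratic_form(x, Q):
    """-0.5 * x^T Q x; the 0.5*log|Q| term does not depend on x and is omitted here
    (it only matters when comparing different hyperparameter values theta)."""
    return -0.5 * float(x @ (Q @ x))

n = 50
kappa = 2.0                                   # example precision hyperparameter in theta
Q = rw1_precision(n, kappa)                   # sparse and banded: the GMRF structure
x = np.random.default_rng(0).normal(size=n)
print(gmrf_quadratic_form(x, Q))
```

For an intrinsic model such as the RW1, the determinant in the density would have to be read as a generalized determinant, which is one reason the normalizing term is left out of this sketch.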
Approximate Bayesian inference with INLA

In Bayesian inference, one wants to solve for the posterior distribution of the latent variables x and θ. Applying Bayes' theorem, the joint posterior distribution of x and θ is given by

\pi(\boldsymbol{x}, \boldsymbol{\theta} \mid \boldsymbol{y}) = \frac{\pi(\boldsymbol{y} \mid \boldsymbol{x}, \boldsymbol{\theta})\, \pi(\boldsymbol{x} \mid \boldsymbol{\theta})\, \pi(\boldsymbol{\theta})}{\pi(\boldsymbol{y})},

so that

\begin{aligned}
\pi(\boldsymbol{x}, \boldsymbol{\theta} \mid \boldsymbol{y}) &\propto \pi(\boldsymbol{\theta})\, \pi(\boldsymbol{x} \mid \boldsymbol{\theta}) \prod_{i} \pi(y_{i} \mid \eta_{i}, \boldsymbol{\theta}) \\
&\propto \pi(\boldsymbol{\theta}) \left|\boldsymbol{Q}_{\boldsymbol{\theta}}\right|^{1/2} \exp\left(-\tfrac{1}{2}\, \boldsymbol{x}^{T} \boldsymbol{Q}_{\boldsymbol{\theta}}\, \boldsymbol{x} + \sum_{i} \log\left(\pi(y_{i} \mid \eta_{i}, \boldsymbol{\theta})\right)\right).
\end{aligned}

Obtaining the exact posterior is generally a very difficult problem. In INLA, the main aim is to approximate the posterior marginals

\begin{aligned}
\pi(x_{i} \mid \boldsymbol{y}) &= \int \pi(x_{i} \mid \boldsymbol{\theta}, \boldsymbol{y})\, \pi(\boldsymbol{\theta} \mid \boldsymbol{y})\, d\boldsymbol{\theta}, \\
\pi(\theta_{j} \mid \boldsymbol{y}) &= \int \pi(\boldsymbol{\theta} \mid \boldsymbol{y})\, d\boldsymbol{\theta}_{-j},
\end{aligned}

where θ_{-j} = (θ_1, ..., θ_{j-1}, θ_{j+1}, ..., θ_m).

A key idea of INLA is to construct nested approximations given by

\begin{aligned}
\widetilde{\pi}(x_{i} \mid \boldsymbol{y}) &= \int \widetilde{\pi}(x_{i} \mid \boldsymbol{\theta}, \boldsymbol{y})\, \widetilde{\pi}(\boldsymbol{\theta} \mid \boldsymbol{y})\, d\boldsymbol{\theta}, \\
\widetilde{\pi}(\theta_{j} \mid \boldsymbol{y}) &= \int \widetilde{\pi}(\boldsymbol{\theta} \mid \boldsymbol{y})\, d\boldsymbol{\theta}_{-j},
\end{aligned}

where π̃(· | ·) is an approximated posterior density. The approximation to the marginal density π(x_i | y) is obtained in a nested fashion by first approximating π(θ | y) and π(x_i | θ, y), and then numerically integrating out θ as

\widetilde{\pi}(x_{i} \mid \boldsymbol{y}) = \sum_{k} \widetilde{\pi}(x_{i} \mid \boldsymbol{\theta}_{k}, \boldsymbol{y}) \times \widetilde{\pi}(\boldsymbol{\theta}_{k} \mid \boldsymbol{y}) \times \Delta_{k},

where the summation is over the values of θ, with integration weights given by Δ_k. The approximation of π(θ_j | y) is computed by numerically integrating θ_{-j} out from π̃(θ | y).
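The sum over θ_k is a plain finite quadrature over a small set of hyperparameter configurations. The sketch below (with made-up grid points, weights, and conditional marginals; an illustration only, not code from the article or from R-INLA) shows how conditional marginals π̃(x_i | θ_k, y) are combined with π̃(θ_k | y) and Δ_k into the approximate marginal π̃(x_i | y).

```python
# Sketch with toy inputs: pi~(x_i | y) = sum_k pi~(x_i | theta_k, y) * pi~(theta_k | y) * Delta_k.
import numpy as np
from scipy.stats import norm

theta_grid = np.array([0.5, 1.0, 2.0])     # grid points theta_k (assumed)
post_theta = np.array([0.2, 0.5, 0.3])     # pi~(theta_k | y) evaluated on the grid (assumed)
delta = np.array([0.5, 0.75, 1.0])         # integration weights Delta_k (assumed)

# Toy conditional marginals pi~(x_i | theta_k, y): Gaussians whose spread depends on theta_k.
cond_mean = np.array([0.1, 0.0, -0.2])
cond_sd = 1.0 / np.sqrt(theta_grid)

xs = np.linspace(-4.0, 4.0, 401)
marginal = np.zeros_like(xs)
for m, s, p, d in zip(cond_mean, cond_sd, post_theta, delta):
    marginal += norm.pdf(xs, loc=m, scale=s) * p * d

marginal /= np.trapz(marginal, xs)         # renormalize the approximate marginal pi~(x_i | y)
```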
To get the approximate distribution π̃(θ | y), one can use the relation

\pi(\boldsymbol{\theta} \mid \boldsymbol{y}) = \frac{\pi(\boldsymbol{x}, \boldsymbol{\theta}, \boldsymbol{y})}{\pi(\boldsymbol{x} \mid \boldsymbol{\theta}, \boldsymbol{y})\, \pi(\boldsymbol{y})}

as the starting point. Then π̃(θ_k | y) is obtained at a specific value of the hyperparameters θ = θ_k with Laplace's approximation

\begin{aligned}
\widetilde{\pi}(\boldsymbol{\theta}_{k} \mid \boldsymbol{y}) &\propto \left. \frac{\pi(\boldsymbol{x}, \boldsymbol{\theta}_{k}, \boldsymbol{y})}{\widetilde{\pi}_{G}(\boldsymbol{x} \mid \boldsymbol{\theta}_{k}, \boldsymbol{y})} \right|_{\boldsymbol{x} = \boldsymbol{x}^{*}(\boldsymbol{\theta}_{k})} \\
&\propto \left. \frac{\pi(\boldsymbol{y} \mid \boldsymbol{x}, \boldsymbol{\theta}_{k})\, \pi(\boldsymbol{x} \mid \boldsymbol{\theta}_{k})\, \pi(\boldsymbol{\theta}_{k})}{\widetilde{\pi}_{G}(\boldsymbol{x} \mid \boldsymbol{\theta}_{k}, \boldsymbol{y})} \right|_{\boldsymbol{x} = \boldsymbol{x}^{*}(\boldsymbol{\theta}_{k})},
\end{aligned}

where π̃_G(x | θ_k, y) is the Gaussian approximation to the full conditional π(x | θ_k, y), whose mode at a given θ_k is x*(θ_k). The mode can be found numerically, for example with the Newton-Raphson method.
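To make these steps concrete, the sketch below (an assumed toy Poisson count model, not the algorithm as implemented in R-INLA) finds the mode x*(θ) of the full conditional by Newton-Raphson, forms the Gaussian approximation π̃_G at that mode, and returns the unnormalized log of the Laplace approximation π̃(θ | y).

```python
# Sketch (assumed toy model): Laplace approximation of pi~(theta | y) for counts
# y_i ~ Poisson(exp(x_i)) with a GMRF prior x | theta ~ N(0, Q_theta^{-1}),
# where theta is a single precision parameter kappa whose prior is taken flat.
import numpy as np

def rw1_precision_dense(n, kappa, jitter=1e-6):
    # RW1 structure plus a small jitter so that Q_theta is non-singular in this illustration.
    D = np.diff(np.eye(n), axis=0)                 # (n-1) x n first-difference matrix
    return kappa * (D.T @ D + jitter * np.eye(n))

def log_laplace_theta(y, kappa, n_iter=30):
    """Unnormalized log pi~(theta | y) evaluated at the mode x*(theta) of pi(x | theta, y)."""
    n = len(y)
    Q = rw1_precision_dense(n, kappa)
    x = np.zeros(n)
    for _ in range(n_iter):                        # Newton-Raphson iterations for the mode x*(theta)
        grad = y - np.exp(x) - Q @ x               # gradient of log pi(x | theta, y)
        hess = Q + np.diag(np.exp(x))              # minus its Hessian (the precision of pi~_G)
        x = x + np.linalg.solve(hess, grad)
    log_lik = np.sum(y * x - np.exp(x))            # log pi(y | x*, theta), dropping log(y!) terms
    _, logdet_Q = np.linalg.slogdet(Q)
    log_prior_x = 0.5 * logdet_Q - 0.5 * x @ Q @ x # log pi(x* | theta) up to a constant
    _, logdet_H = np.linalg.slogdet(Q + np.diag(np.exp(x)))
    log_gauss = 0.5 * logdet_H                     # log pi~_G(x* | theta, y) up to a constant
    return log_lik + log_prior_x - log_gauss       # + log pi(theta), taken as 0 here

rng = np.random.default_rng(1)
y = rng.poisson(lam=np.exp(np.sin(np.linspace(0.0, 3.0, 40))))
for kappa in (0.5, 1.0, 2.0, 5.0):
    print(kappa, round(log_laplace_theta(y, kappa), 3))
```

Exploring the hyperparameter space then amounts to evaluating this quantity at a set of points θ_k. A simplified one-dimensional version of the grid strategy mentioned below, again only a sketch under the assumption of a unimodal posterior and an unconstrained (for example log-transformed) hyperparameter, could look as follows.

```python
# Sketch (simplified, one hyperparameter): grid points theta_k and combined weights
# pi~(theta_k | y) * Delta_k for the weighted sum shown earlier.
import numpy as np

def theta_grid(log_post, theta_mode, step=0.25, drop=2.5, max_steps=50):
    """Keep points around the mode while log pi~(theta | y) stays within `drop` of its maximum."""
    ref = log_post(theta_mode)
    thetas = [theta_mode]
    for direction in (+1.0, -1.0):                  # step out from the mode in both directions
        for k in range(1, max_steps + 1):
            theta = theta_mode + direction * k * step
            if log_post(theta) < ref - drop:
                break
            thetas.append(theta)
    thetas = np.sort(np.array(thetas))
    log_w = np.array([log_post(t) for t in thetas])
    combined = np.exp(log_w - log_w.max()) * step   # pi~(theta_k | y) * Delta_k, unnormalized
    return thetas, combined / combined.sum()

# Example use with the previous sketch, exploring theta = log(kappa):
#   thetas, weights = theta_grid(lambda t: log_laplace_theta(y, np.exp(t)), theta_mode=0.0)
```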
The trick in the Laplace approximation above is the fact that the Gaussian approximation is applied on the full conditional of x in the denominator, since it is usually close to a Gaussian due to the GMRF property of x. Applying the approximation here improves the accuracy of the method, since the posterior π(θ | y) itself need not be close to a Gaussian, and so the Gaussian approximation is not directly applied on π(θ | y). The second important property of a GMRF, the sparsity of the precision matrix Q_{θ_k}, is required for efficient computation of π̃(θ_k | y) for each value θ_k.

Obtaining the approximate distribution π̃(x_i | θ_k, y) is more involved, and the INLA method provides three options for this: the Gaussian approximation, the Laplace approximation, or the simplified Laplace approximation. For the numerical integration to obtain π̃(x_i | y), three options are also available: grid search, central composite design, or empirical Bayes.
4344: 4289: 4212: 4181: 4124: 4086: 4043: 4000: 3978: 3949: 3902: 3869: 3811:Gaussian approximation 3803: 3727: 3272: 3235: 3185: 3036: 2983: 2933: 2901: 2855: 2828: 2806: 2614: 2592: 2538: 2497: 2451: 2405: 2136: 2018: 1797: 1452: 1430: 1408: 1258: 1236: 1218:posterior distribution 1202: 1157: 1126: 1085: 1063: 1034: 894: 859: 837: 815: 793: 769: 652: 630: 605: 583: 561: 535: 509: 479: 452: 421: 363:Latent Gaussian models 273:Mathematics portal 216:Evidence approximation 4613:J. R. Statist. Soc. B 4380:J. R. Statist. Soc. B 4345: 4290: 4213: 4182: 4125: 4087: 4044: 4001: 3979: 3957:Newton-Raphson method 3950: 3903: 3870: 3804: 3728: 3273: 3236: 3186: 3037: 2984: 2934: 2902: 2856: 2829: 2807: 2615: 2593: 2539: 2498: 2452: 2406: 2137: 2019: 1798: 1453: 1431: 1409: 1259: 1237: 1203: 1158: 1127: 1086: 1064: 1035: 895: 860: 838: 816: 794: 770: 653: 631: 606: 584: 562: 536: 510: 480: 478:{\displaystyle y_{i}} 453: 422: 345:finite element method 177:Variational inference 4299: 4225: 4191: 4134: 4096: 4053: 4010: 3988: 3966: 3912: 3883: 3817: 3737: 3285: 3245: 3195: 3046: 2996: 2943: 2911: 2865: 2838: 2816: 2624: 2602: 2548: 2507: 2461: 2415: 2149: 2028: 1807: 1462: 1440: 1418: 1271: 1246: 1224: 1167: 1136: 1099: 1073: 1044: 904: 872: 847: 825: 803: 779: 662: 640: 618: 593: 571: 549: 523: 492: 485:) being linked to a 462: 435: 371: 255:Posterior predictive 224:Evidence lower bound 105:Likelihood principle 75:Bayesian probability 4674:2020NatSR..1018822L 515:via an appropriate 28:Bayesian statistics 22:Part of a series on 4772:Bayesian inference 4662:Scientific Reports 4340: 4285: 4208: 4177: 4120: 4082: 4039: 3996: 3974: 3945: 3898: 3865: 3799: 3723: 3721: 3268: 3231: 3181: 3179: 3032: 2979: 2929: 2897: 2851: 2824: 2802: 2800: 2681: 2610: 2588: 2534: 2493: 2447: 2401: 2399: 2132: 2014: 2012: 1793: 1791: 1722: 1555: 1448: 1426: 1404: 1254: 1232: 1198: 1153: 1122: 1091:-dependent sparse 1081: 1059: 1030: 890: 855: 833: 811: 789: 765: 719: 648: 626: 601: 579: 557: 531: 505: 475: 448: 429:exponential family 417: 333:spatial statistics 320:Bayesian inference 198:Bayesian estimator 146:Hierarchical model 70:Bayesian inference 4749:978-1-03-217453-2 4470:(11): 2227–2252. 4421:(10): 2266–2284. 
4311: 4237: 4146: 3750: 3669: 3619: 3449: 3399: 3301: 3207: 3172: 3008: 2955: 2751: 2691: 2672: 2640: 2427: 2352: 2304: 2259: 2213: 2165: 1713: 1679: 1546: 1399: 991: 700: 309: 308: 203:Credible interval 136:Linear regression 4779: 4753: 4725: 4724: 4722: 4720: 4714:"R-INLA Project" 4710: 4704: 4703: 4693: 4653: 4647: 4646: 4636: 4608: 4602: 4601: 4583: 4577: 4576: 4574: 4554: 4548: 4547: 4529: 4523: 4522: 4504: 4498: 4497: 4487: 4455: 4449: 4448: 4430: 4410: 4404: 4403: 4375: 4349: 4347: 4346: 4341: 4336: 4331: 4326: 4325: 4313: 4312: 4304: 4294: 4292: 4291: 4286: 4284: 4280: 4279: 4271: 4270: 4265: 4259: 4254: 4253: 4239: 4238: 4230: 4217: 4215: 4214: 4209: 4207: 4206: 4205: 4200: 4186: 4184: 4183: 4178: 4173: 4168: 4163: 4162: 4157: 4148: 4147: 4139: 4129: 4127: 4126: 4121: 4119: 4118: 4117: 4116: 4111: 4104: 4091: 4089: 4088: 4083: 4078: 4073: 4068: 4060: 4048: 4046: 4045: 4040: 4035: 4030: 4025: 4017: 4005: 4003: 4002: 3997: 3995: 3983: 3981: 3980: 3975: 3973: 3954: 3952: 3951: 3946: 3941: 3940: 3935: 3926: 3925: 3920: 3907: 3905: 3904: 3899: 3897: 3896: 3891: 3874: 3872: 3871: 3866: 3864: 3860: 3859: 3851: 3850: 3845: 3839: 3834: 3824: 3808: 3806: 3805: 3800: 3798: 3794: 3793: 3785: 3784: 3779: 3773: 3768: 3758: 3757: 3752: 3751: 3743: 3732: 3730: 3729: 3724: 3722: 3715: 3714: 3710: 3709: 3704: 3695: 3694: 3689: 3680: 3674: 3670: 3668: 3667: 3663: 3662: 3654: 3653: 3648: 3642: 3637: 3627: 3626: 3621: 3620: 3612: 3607: 3603: 3602: 3597: 3582: 3581: 3576: 3570: 3565: 3551: 3550: 3545: 3536: 3531: 3526: 3514: 3502: 3495: 3494: 3490: 3489: 3484: 3475: 3474: 3469: 3460: 3454: 3450: 3448: 3447: 3443: 3442: 3434: 3433: 3428: 3422: 3417: 3407: 3406: 3401: 3400: 3392: 3387: 3386: 3382: 3381: 3373: 3372: 3367: 3358: 3344: 3328: 3323: 3318: 3317: 3312: 3303: 3302: 3294: 3277: 3275: 3274: 3269: 3267: 3266: 3261: 3252: 3240: 3238: 3237: 3232: 3227: 3222: 3217: 3209: 3208: 3200: 3190: 3188: 3187: 3182: 3180: 3173: 3171: 3167: 3156: 3152: 3151: 3143: 3138: 3133: 3119: 3118: 3114: 3113: 3105: 3097: 3083: 3075: 3070: 3065: 3057: 3041: 3039: 3038: 3033: 3028: 3023: 3018: 3010: 3009: 3001: 2988: 2986: 2985: 2980: 2975: 2970: 2965: 2957: 2956: 2948: 2938: 2936: 2935: 2930: 2928: 2927: 2919: 2906: 2904: 2903: 2898: 2893: 2888: 2883: 2882: 2860: 2858: 2857: 2852: 2850: 2849: 2833: 2831: 2830: 2825: 2823: 2811: 2809: 2808: 2803: 2801: 2794: 2793: 2778: 2773: 2768: 2767: 2762: 2753: 2752: 2744: 2738: 2734: 2733: 2725: 2724: 2719: 2713: 2708: 2707: 2693: 2692: 2684: 2680: 2665: 2660: 2655: 2654: 2642: 2641: 2633: 2619: 2617: 2616: 2611: 2609: 2597: 2595: 2594: 2589: 2584: 2576: 2571: 2566: 2565: 2543: 2541: 2540: 2535: 2530: 2525: 2520: 2502: 2500: 2499: 2494: 2489: 2484: 2479: 2478: 2456: 2454: 2453: 2448: 2440: 2429: 2428: 2420: 2410: 2408: 2407: 2402: 2400: 2393: 2392: 2384: 2372: 2367: 2362: 2354: 2353: 2345: 2329: 2324: 2319: 2318: 2306: 2305: 2297: 2290: 2279: 2274: 2269: 2261: 2260: 2252: 2246: 2238: 2233: 2228: 2227: 2215: 2214: 2206: 2190: 2185: 2180: 2179: 2167: 2166: 2158: 2141: 2139: 2138: 2133: 2131: 2127: 2126: 2125: 2107: 2106: 2088: 2087: 2063: 2062: 2045: 2044: 2036: 2023: 2021: 2020: 2015: 2013: 2006: 2005: 1997: 1985: 1980: 1975: 1951: 1946: 1941: 1940: 1921: 1910: 1905: 1900: 1886: 1878: 1873: 1868: 1867: 1839: 1834: 1829: 1828: 1802: 1800: 1799: 1794: 1792: 1785: 1781: 1780: 1776: 1772: 1764: 1763: 1754: 1749: 1748: 1721: 1709: 1704: 1703: 1702: 1692: 1691: 1686: 1680: 1672: 1656: 1655: 1651: 1642: 1638: 1637: 1636: 1618: 1601: 1594: 1586: 1585: 1576: 1571: 1570: 1554: 1542: 1537: 1532: 
1518: 1497: 1492: 1487: 1479: 1457: 1455: 1454: 1449: 1447: 1435: 1433: 1432: 1427: 1425: 1413: 1411: 1410: 1405: 1400: 1398: 1394: 1382: 1378: 1364: 1359: 1354: 1340: 1332: 1327: 1322: 1310: 1302: 1297: 1292: 1284: 1263: 1261: 1260: 1255: 1253: 1241: 1239: 1238: 1233: 1231: 1207: 1205: 1204: 1199: 1194: 1186: 1162: 1160: 1159: 1154: 1149: 1131: 1129: 1128: 1123: 1121: 1117: 1116: 1115: 1093:precision matrix 1090: 1088: 1087: 1082: 1080: 1068: 1066: 1065: 1060: 1058: 1057: 1056: 1039: 1037: 1036: 1031: 1026: 1022: 1021: 1016: 1015: 1014: 1004: 1003: 998: 992: 984: 968: 967: 963: 954: 950: 949: 948: 927: 922: 917: 899: 897: 896: 891: 889: 884: 879: 864: 862: 861: 856: 854: 842: 840: 839: 834: 832: 820: 818: 817: 812: 810: 798: 796: 795: 790: 788: 787: 774: 772: 771: 766: 758: 750: 749: 740: 735: 734: 718: 717: 716: 693: 685: 680: 675: 657: 655: 654: 649: 647: 635: 633: 632: 627: 625: 610: 608: 607: 602: 600: 588: 586: 585: 580: 578: 566: 564: 563: 558: 556: 540: 538: 537: 532: 530: 514: 512: 511: 506: 504: 503: 487:linear predictor 484: 482: 481: 476: 474: 473: 457: 455: 454: 449: 447: 446: 431:, with the mean 426: 424: 423: 418: 413: 412: 394: 393: 378: 324:Laplace's method 301: 294: 287: 271: 270: 237:Model evaluation 38: 19: 4787: 4786: 4782: 4781: 4780: 4778: 4777: 4776: 4757: 4756: 4750: 4737: 4734: 4732:Further reading 4729: 4728: 4718: 4716: 4712: 4711: 4707: 4655: 4654: 4650: 4610: 4609: 4605: 4598: 4585: 4584: 4580: 4556: 4555: 4551: 4544: 4531: 4530: 4526: 4519: 4506: 4505: 4501: 4457: 4456: 4452: 4412: 4411: 4407: 4377: 4376: 4361: 4356: 4317: 4297: 4296: 4260: 4245: 4244: 4240: 4223: 4222: 4195: 4189: 4188: 4187:for each value 4152: 4132: 4131: 4106: 4099: 4094: 4093: 4051: 4050: 4008: 4007: 3986: 3985: 3964: 3963: 3930: 3915: 3910: 3909: 3886: 3881: 3880: 3840: 3829: 3825: 3815: 3814: 3774: 3763: 3759: 3740: 3735: 3734: 3720: 3719: 3699: 3684: 3643: 3632: 3628: 3609: 3608: 3592: 3571: 3540: 3515: 3510: 3509: 3500: 3499: 3479: 3464: 3423: 3412: 3408: 3389: 3388: 3362: 3353: 3349: 3345: 3340: 3339: 3332: 3307: 3283: 3282: 3256: 3243: 3242: 3193: 3192: 3178: 3177: 3128: 3124: 3120: 3092: 3088: 3084: 3044: 3043: 2994: 2993: 2941: 2940: 2914: 2909: 2908: 2874: 2863: 2862: 2841: 2836: 2835: 2814: 2813: 2799: 2798: 2785: 2757: 2714: 2699: 2698: 2694: 2646: 2622: 2621: 2600: 2599: 2557: 2546: 2545: 2505: 2504: 2470: 2459: 2458: 2413: 2412: 2398: 2397: 2379: 2338: 2333: 2310: 2292: 2291: 2219: 2199: 2194: 2171: 2147: 2146: 2117: 2092: 2073: 2054: 2053: 2049: 2031: 2026: 2025: 2011: 2010: 1992: 1960: 1955: 1932: 1923: 1922: 1859: 1848: 1843: 1820: 1805: 1804: 1790: 1789: 1755: 1740: 1733: 1729: 1694: 1681: 1667: 1663: 1628: 1623: 1622: 1599: 1598: 1577: 1562: 1501: 1460: 1459: 1438: 1437: 1416: 1415: 1383: 1311: 1269: 1268: 1244: 1243: 1222: 1221: 1214: 1165: 1164: 1134: 1133: 1107: 1102: 1097: 1096: 1071: 1070: 1048: 1042: 1041: 1006: 993: 979: 975: 940: 935: 934: 902: 901: 870: 869: 845: 844: 823: 822: 801: 800: 777: 776: 741: 726: 660: 659: 638: 637: 616: 615: 591: 590: 569: 568: 547: 546: 543:hyperparameters 521: 520: 495: 490: 489: 465: 460: 459: 438: 433: 432: 404: 385: 369: 368: 365: 305: 265: 250:Model averaging 229:Nested sampling 141:Empirical Bayes 131:Conjugate prior 100:Cromwell's rule 17: 12: 11: 5: 4785: 4783: 4775: 4774: 4769: 4759: 4758: 4755: 4754: 4748: 4733: 4730: 4727: 4726: 4705: 4648: 4619:(4): 423–498. 4603: 4596: 4578: 4549: 4542: 4524: 4517: 4499: 4450: 4405: 4386:(2): 319–392. 
4358: 4357: 4355: 4352: 4339: 4335: 4330: 4324: 4320: 4316: 4310: 4307: 4283: 4278: 4274: 4269: 4264: 4258: 4252: 4248: 4243: 4236: 4233: 4204: 4199: 4176: 4172: 4167: 4161: 4156: 4151: 4145: 4142: 4115: 4110: 4103: 4081: 4077: 4072: 4067: 4063: 4059: 4038: 4034: 4029: 4024: 4020: 4016: 3994: 3972: 3944: 3939: 3934: 3929: 3924: 3919: 3895: 3890: 3863: 3858: 3854: 3849: 3844: 3838: 3833: 3828: 3823: 3797: 3792: 3788: 3783: 3778: 3772: 3767: 3762: 3756: 3749: 3746: 3718: 3713: 3708: 3703: 3698: 3693: 3688: 3683: 3679: 3673: 3666: 3661: 3657: 3652: 3647: 3641: 3636: 3631: 3625: 3618: 3615: 3606: 3601: 3596: 3591: 3588: 3585: 3580: 3575: 3569: 3564: 3560: 3557: 3554: 3549: 3544: 3539: 3535: 3530: 3525: 3521: 3518: 3512: 3508: 3505: 3503: 3501: 3498: 3493: 3488: 3483: 3478: 3473: 3468: 3463: 3459: 3453: 3446: 3441: 3437: 3432: 3427: 3421: 3416: 3411: 3405: 3398: 3395: 3385: 3380: 3376: 3371: 3366: 3361: 3357: 3352: 3348: 3342: 3338: 3335: 3333: 3331: 3327: 3322: 3316: 3311: 3306: 3300: 3297: 3291: 3290: 3265: 3260: 3255: 3251: 3230: 3226: 3221: 3216: 3212: 3206: 3203: 3176: 3170: 3166: 3162: 3159: 3155: 3150: 3146: 3142: 3137: 3132: 3127: 3123: 3117: 3112: 3108: 3104: 3100: 3096: 3091: 3087: 3081: 3078: 3074: 3069: 3064: 3060: 3056: 3052: 3051: 3031: 3027: 3022: 3017: 3013: 3007: 3004: 2978: 2974: 2969: 2964: 2960: 2954: 2951: 2926: 2923: 2918: 2896: 2892: 2887: 2881: 2877: 2873: 2870: 2848: 2844: 2822: 2797: 2792: 2788: 2784: 2781: 2777: 2772: 2766: 2761: 2756: 2750: 2747: 2741: 2737: 2732: 2728: 2723: 2718: 2712: 2706: 2702: 2697: 2690: 2687: 2679: 2675: 2671: 2668: 2664: 2659: 2653: 2649: 2645: 2639: 2636: 2630: 2629: 2608: 2587: 2583: 2579: 2575: 2570: 2564: 2560: 2556: 2553: 2533: 2529: 2524: 2519: 2515: 2512: 2492: 2488: 2483: 2477: 2473: 2469: 2466: 2446: 2443: 2439: 2435: 2432: 2426: 2423: 2396: 2391: 2388: 2383: 2378: 2375: 2371: 2366: 2361: 2357: 2351: 2348: 2342: 2339: 2337: 2334: 2332: 2328: 2323: 2317: 2313: 2309: 2303: 2300: 2294: 2293: 2289: 2285: 2282: 2278: 2273: 2268: 2264: 2258: 2255: 2249: 2245: 2241: 2237: 2232: 2226: 2222: 2218: 2212: 2209: 2203: 2200: 2198: 2195: 2193: 2189: 2184: 2178: 2174: 2170: 2164: 2161: 2155: 2154: 2130: 2124: 2120: 2116: 2113: 2110: 2105: 2102: 2099: 2095: 2091: 2086: 2083: 2080: 2076: 2072: 2069: 2066: 2061: 2057: 2052: 2048: 2043: 2040: 2035: 2009: 2004: 2001: 1996: 1991: 1988: 1984: 1979: 1974: 1970: 1967: 1964: 1961: 1959: 1956: 1954: 1950: 1945: 1939: 1935: 1931: 1928: 1925: 1924: 1920: 1916: 1913: 1909: 1904: 1899: 1895: 1892: 1889: 1885: 1881: 1877: 1872: 1866: 1862: 1858: 1855: 1852: 1849: 1847: 1844: 1842: 1838: 1833: 1827: 1823: 1819: 1816: 1813: 1812: 1788: 1784: 1779: 1775: 1771: 1767: 1762: 1758: 1753: 1747: 1743: 1739: 1736: 1732: 1728: 1725: 1720: 1716: 1712: 1708: 1701: 1697: 1690: 1685: 1678: 1675: 1670: 1666: 1662: 1659: 1654: 1650: 1646: 1641: 1635: 1631: 1626: 1621: 1617: 1613: 1610: 1607: 1604: 1602: 1600: 1597: 1593: 1589: 1584: 1580: 1575: 1569: 1565: 1561: 1558: 1553: 1549: 1545: 1541: 1536: 1531: 1527: 1524: 1521: 1517: 1513: 1510: 1507: 1504: 1502: 1500: 1496: 1491: 1486: 1482: 1478: 1474: 1471: 1468: 1467: 1446: 1424: 1403: 1397: 1393: 1389: 1386: 1381: 1377: 1373: 1370: 1367: 1363: 1358: 1353: 1349: 1346: 1343: 1339: 1335: 1331: 1326: 1321: 1317: 1314: 1308: 1305: 1301: 1296: 1291: 1287: 1283: 1279: 1276: 1266:Bayes' theorem 1252: 1230: 1213: 1210: 1197: 1193: 1189: 1185: 1182: 1179: 1175: 1172: 1152: 1148: 1144: 1141: 1120: 1114: 1110: 1105: 1079: 1055: 1051: 1029: 1025: 1020: 1013: 1009: 1002: 997: 990: 987: 982: 978: 
974: 971: 966: 962: 958: 953: 947: 943: 938: 933: 930: 926: 921: 916: 912: 909: 888: 883: 878: 853: 831: 809: 786: 764: 761: 757: 753: 748: 744: 739: 733: 729: 725: 722: 715: 710: 707: 703: 699: 696: 692: 688: 684: 679: 674: 670: 667: 646: 624: 599: 577: 555: 529: 502: 498: 472: 468: 445: 441: 416: 411: 407: 403: 400: 397: 392: 388: 384: 381: 377: 364: 361: 347:solution of a 307: 306: 304: 303: 296: 289: 281: 278: 277: 276: 275: 260: 259: 258: 257: 252: 247: 239: 238: 234: 233: 232: 231: 226: 218: 217: 213: 212: 211: 210: 205: 200: 192: 191: 187: 186: 185: 184: 179: 174: 169: 164: 156: 155: 151: 150: 149: 148: 143: 138: 133: 125: 124: 123:Model building 120: 119: 118: 117: 112: 107: 102: 97: 92: 87: 82: 80:Bayes' theorem 77: 72: 64: 63: 59: 58: 40: 39: 31: 30: 24: 23: 15: 13: 10: 9: 6: 4: 3: 2: 4784: 4773: 4770: 4768: 4765: 4764: 4762: 4751: 4745: 4741: 4736: 4735: 4731: 4715: 4709: 4706: 4701: 4697: 4692: 4687: 4683: 4679: 4675: 4671: 4667: 4663: 4659: 4652: 4649: 4644: 4640: 4635: 4630: 4626: 4622: 4618: 4614: 4607: 4604: 4599: 4597:9780367357955 4593: 4589: 4582: 4579: 4573: 4568: 4564: 4560: 4553: 4550: 4545: 4543:9781118326558 4539: 4535: 4528: 4525: 4520: 4518:9781498727259 4514: 4510: 4503: 4500: 4495: 4491: 4486: 4481: 4477: 4473: 4469: 4465: 4461: 4454: 4451: 4446: 4442: 4438: 4434: 4429: 4424: 4420: 4416: 4409: 4406: 4401: 4397: 4393: 4389: 4385: 4381: 4374: 4372: 4370: 4368: 4366: 4364: 4360: 4353: 4351: 4322: 4318: 4308: 4305: 4281: 4272: 4267: 4250: 4246: 4241: 4234: 4231: 4219: 4202: 4159: 4143: 4140: 4113: 4057: 4014: 3960: 3958: 3937: 3922: 3893: 3878: 3861: 3852: 3847: 3826: 3821: 3812: 3795: 3786: 3781: 3760: 3754: 3747: 3744: 3716: 3706: 3691: 3681: 3671: 3664: 3655: 3650: 3629: 3623: 3616: 3613: 3599: 3586: 3578: 3555: 3547: 3537: 3516: 3506: 3504: 3496: 3486: 3471: 3461: 3451: 3444: 3435: 3430: 3409: 3403: 3396: 3393: 3383: 3374: 3369: 3359: 3350: 3346: 3336: 3334: 3314: 3298: 3295: 3281: 3263: 3253: 3204: 3201: 3174: 3157: 3153: 3144: 3125: 3121: 3115: 3106: 3098: 3089: 3085: 3079: 3054: 3005: 3002: 2990: 2952: 2949: 2924: 2921: 2879: 2875: 2868: 2846: 2795: 2790: 2782: 2764: 2748: 2745: 2739: 2735: 2726: 2721: 2704: 2700: 2695: 2688: 2685: 2677: 2673: 2669: 2651: 2647: 2637: 2634: 2577: 2562: 2558: 2551: 2510: 2475: 2471: 2464: 2441: 2433: 2424: 2421: 2394: 2389: 2386: 2376: 2349: 2346: 2340: 2335: 2315: 2311: 2301: 2298: 2283: 2256: 2253: 2239: 2224: 2220: 2210: 2207: 2201: 2196: 2176: 2172: 2162: 2159: 2143: 2128: 2122: 2118: 2114: 2111: 2108: 2103: 2100: 2097: 2093: 2089: 2084: 2081: 2078: 2074: 2070: 2067: 2064: 2059: 2055: 2050: 2046: 2041: 2038: 2007: 2002: 1999: 1989: 1965: 1962: 1957: 1937: 1933: 1926: 1914: 1890: 1879: 1864: 1860: 1853: 1850: 1845: 1825: 1821: 1814: 1786: 1782: 1777: 1765: 1760: 1756: 1745: 1741: 1734: 1730: 1726: 1723: 1718: 1714: 1710: 1688: 1676: 1673: 1668: 1664: 1660: 1657: 1652: 1648: 1644: 1639: 1624: 1608: 1605: 1603: 1587: 1582: 1578: 1567: 1563: 1556: 1551: 1547: 1522: 1508: 1505: 1503: 1480: 1469: 1401: 1384: 1368: 1344: 1333: 1312: 1306: 1285: 1274: 1267: 1219: 1211: 1209: 1173: 1170: 1139: 1118: 1103: 1094: 1027: 1023: 1000: 988: 985: 980: 976: 972: 969: 964: 960: 956: 951: 936: 931: 907: 866: 762: 751: 746: 742: 731: 727: 720: 708: 705: 701: 697: 686: 665: 612: 544: 518: 517:link function 500: 496: 488: 470: 466: 443: 439: 430: 409: 405: 401: 398: 395: 390: 386: 379: 362: 360: 358: 354: 350: 346: 342: 338: 334: 329: 325: 321: 317: 313: 302: 297: 295: 290: 288: 283: 282: 280: 279: 274: 269: 264: 263: 262: 261: 256: 253: 251: 
248: 246: 243: 242: 241: 240: 235: 230: 227: 225: 222: 221: 220: 219: 214: 209: 206: 204: 201: 199: 196: 195: 194: 193: 188: 183: 180: 178: 175: 173: 170: 168: 165: 163: 160: 159: 158: 157: 152: 147: 144: 142: 139: 137: 134: 132: 129: 128: 127: 126: 121: 116: 113: 111: 108: 106: 103: 101: 98: 96: 95:Cox's theorem 93: 91: 88: 86: 83: 81: 78: 76: 73: 71: 68: 67: 66: 65: 60: 57: 53: 49: 45: 42: 41: 37: 33: 32: 29: 25: 21: 20: 4739: 4717:. Retrieved 4708: 4668:(1): 18822. 4665: 4661: 4651: 4616: 4612: 4606: 4587: 4581: 4562: 4558: 4552: 4533: 4527: 4508: 4502: 4467: 4463: 4453: 4418: 4414: 4408: 4383: 4379: 4220: 3961: 2991: 2144: 1458:is given by 1215: 867: 613: 366: 341:epidemiology 315: 311: 310: 245:Bayes factor 171: 3879:at a given 1264:. Applying 843:is part of 4761:Categories 4572:1708.02723 4354:References 190:Estimators 62:Background 48:Likelihood 4643:120949984 4565:: 62–85. 4428:1202.1738 4309:~ 4306:π 4263:θ 4235:~ 4232:π 4198:θ 4155:θ 4144:~ 4141:π 4109:θ 4066:θ 4058:π 4023:θ 4015:π 3933:θ 3923:∗ 3889:θ 3843:θ 3822:π 3777:θ 3748:~ 3745:π 3702:θ 3692:∗ 3646:θ 3617:~ 3614:π 3595:θ 3587:π 3574:θ 3556:π 3543:θ 3517:π 3507:∝ 3482:θ 3472:∗ 3426:θ 3397:~ 3394:π 3365:θ 3347:π 3337:∝ 3310:θ 3299:~ 3296:π 3259:θ 3250:θ 3215:θ 3205:~ 3202:π 3158:π 3141:θ 3122:π 3103:θ 3086:π 3063:θ 3055:π 3016:θ 3006:~ 3003:π 2963:θ 2953:~ 2950:π 2939:out from 2922:− 2917:θ 2876:θ 2869:π 2843:Δ 2821:θ 2787:Δ 2783:× 2760:θ 2749:~ 2746:π 2740:× 2717:θ 2689:~ 2686:π 2674:∑ 2638:~ 2635:π 2607:θ 2574:θ 2552:π 2518:θ 2511:π 2465:π 2442:⋅ 2434:⋅ 2425:~ 2422:π 2387:− 2382:θ 2360:θ 2350:~ 2347:π 2341:∫ 2312:θ 2302:~ 2299:π 2288:θ 2267:θ 2257:~ 2254:π 2236:θ 2211:~ 2208:π 2202:∫ 2163:~ 2160:π 2119:θ 2112:… 2094:θ 2082:− 2075:θ 2068:… 2056:θ 2039:− 2034:θ 2000:− 1995:θ 1973:θ 1966:π 1963:∫ 1934:θ 1927:π 1919:θ 1898:θ 1891:π 1876:θ 1854:π 1851:∫ 1815:π 1770:θ 1757:η 1735:π 1727:⁡ 1715:∑ 1700:θ 1669:− 1661:⁡ 1634:θ 1616:θ 1609:π 1606:∝ 1592:θ 1579:η 1557:π 1548:∏ 1540:θ 1523:π 1516:θ 1509:π 1506:∝ 1485:θ 1470:π 1445:θ 1385:π 1376:θ 1369:π 1362:θ 1345:π 1338:θ 1313:π 1290:θ 1275:π 1251:θ 1192:θ 1147:θ 1140:π 1113:θ 1078:θ 1054:θ 1012:θ 981:− 973:⁡ 946:θ 932:∝ 925:θ 908:π 887:θ 830:η 756:θ 743:η 721:π 709:∈ 702:∏ 691:θ 666:π 645:θ 598:θ 554:θ 497:η 440:μ 399:… 359:package. 322:based on 90:Coherence 44:Posterior 4719:21 April 4700:33139744 4494:29200537 4445:88511801 56:Evidence 4691:7606447 4670:Bibcode 4485:5708893 4400:1657669 3809:is the 337:ecology 4746:  4698:  4688:  4641:  4594:  4540:  4515:  4492:  4482:  4443:  4398:  3875:whose 3733:where 2411:where 2024:where 1040:where 775:where 541:. The 339:, and 4639:S2CID 4567:arXiv 4441:S2CID 4423:arXiv 4396:S2CID 3278:with 1069:is a 52:Prior 4744:ISBN 4721:2022 4696:PMID 4592:ISBN 4538:ISBN 4513:ISBN 4490:PMID 3877:mode 2544:and 1436:and 1242:and 1095:and 636:and 589:and 458:(of 367:Let 316:INLA 4686:PMC 4678:doi 4629:hdl 4621:doi 4563:158 4480:PMC 4472:doi 4433:doi 4388:doi 3959:. 3908:is 3813:to 2620:as 1724:log 1658:exp 970:exp 4763:: 4694:. 4684:. 4676:. 4666:10 4664:. 4660:. 4637:. 4627:. 4617:73 4615:. 4561:. 4488:. 4478:. 4468:87 4466:. 4462:. 4439:. 4431:. 4419:84 4417:. 4394:. 4384:71 4382:. 4362:^ 4218:. 2989:. 2142:. 865:. 658:: 335:, 54:Ă· 50:Ă— 46:= 4752:. 4723:. 4702:. 4680:: 4672:: 4645:. 4631:: 4623:: 4600:. 4575:. 4569:: 4546:. 4521:. 4496:. 4474:: 4447:. 4435:: 4425:: 4402:. 
4390:: 4338:) 4334:y 4329:| 4323:i 4319:x 4315:( 4282:) 4277:y 4273:, 4268:k 4257:| 4251:i 4247:x 4242:( 4203:k 4175:) 4171:y 4166:| 4160:k 4150:( 4114:k 4102:Q 4080:) 4076:y 4071:| 4062:( 4037:) 4033:y 4028:| 4019:( 3993:x 3971:x 3943:) 3938:k 3928:( 3918:x 3894:k 3862:) 3857:y 3853:, 3848:k 3837:| 3832:x 3827:( 3796:) 3791:y 3787:, 3782:k 3771:| 3766:x 3761:( 3755:G 3717:, 3712:) 3707:k 3697:( 3687:x 3682:= 3678:x 3672:| 3665:) 3660:y 3656:, 3651:k 3640:| 3635:x 3630:( 3624:G 3605:) 3600:k 3590:( 3584:) 3579:k 3568:| 3563:x 3559:( 3553:) 3548:k 3538:, 3534:x 3529:| 3524:y 3520:( 3497:, 3492:) 3487:k 3477:( 3467:x 3462:= 3458:x 3452:| 3445:) 3440:y 3436:, 3431:k 3420:| 3415:x 3410:( 3404:G 3384:) 3379:y 3375:, 3370:k 3360:, 3356:x 3351:( 3330:) 3326:y 3321:| 3315:k 3305:( 3264:k 3254:= 3229:) 3225:y 3220:| 3211:( 3175:, 3169:) 3165:y 3161:( 3154:) 3149:y 3145:, 3136:| 3131:x 3126:( 3116:) 3111:y 3107:, 3099:, 3095:x 3090:( 3080:= 3077:) 3073:y 3068:| 3059:( 3030:) 3026:y 3021:| 3012:( 2977:) 2973:y 2968:| 2959:( 2925:j 2895:) 2891:y 2886:| 2880:j 2872:( 2847:k 2796:, 2791:k 2780:) 2776:y 2771:| 2765:k 2755:( 2736:) 2731:y 2727:, 2722:k 2711:| 2705:i 2701:x 2696:( 2678:k 2670:= 2667:) 2663:y 2658:| 2652:i 2648:x 2644:( 2586:) 2582:y 2578:, 2569:| 2563:i 2559:x 2555:( 2532:) 2528:y 2523:| 2514:( 2491:) 2487:y 2482:| 2476:i 2472:x 2468:( 2445:) 2438:| 2431:( 2395:, 2390:j 2377:d 2374:) 2370:y 2365:| 2356:( 2336:= 2331:) 2327:y 2322:| 2316:j 2308:( 2284:d 2281:) 2277:y 2272:| 2263:( 2248:) 2244:y 2240:, 2231:| 2225:i 2221:x 2217:( 2197:= 2192:) 2188:y 2183:| 2177:i 2173:x 2169:( 2129:) 2123:m 2115:, 2109:, 2104:1 2101:+ 2098:j 2090:, 2085:1 2079:j 2071:, 2065:, 2060:1 2051:( 2047:= 2042:j 2008:, 2003:j 1990:d 1987:) 1983:y 1978:| 1969:( 1958:= 1953:) 1949:y 1944:| 1938:j 1930:( 1915:d 1912:) 1908:y 1903:| 1894:( 1888:) 1884:y 1880:, 1871:| 1865:i 1861:x 1857:( 1846:= 1841:) 1837:y 1832:| 1826:i 1822:x 1818:( 1787:. 1783:) 1778:] 1774:) 1766:, 1761:i 1752:| 1746:i 1742:y 1738:( 1731:[ 1719:i 1711:+ 1707:x 1696:Q 1689:T 1684:x 1677:2 1674:1 1665:( 1653:2 1649:/ 1645:1 1640:| 1630:Q 1625:| 1620:) 1612:( 1596:) 1588:, 1583:i 1574:| 1568:i 1564:y 1560:( 1552:i 1544:) 1535:| 1530:x 1526:( 1520:) 1512:( 1499:) 1495:y 1490:| 1481:, 1477:x 1473:( 1423:x 1402:, 1396:) 1392:y 1388:( 1380:) 1372:( 1366:) 1357:| 1352:x 1348:( 1342:) 1334:, 1330:x 1325:| 1320:y 1316:( 1307:= 1304:) 1300:y 1295:| 1286:, 1282:x 1278:( 1229:x 1196:) 1188:( 1184:m 1181:i 1178:d 1174:= 1171:m 1151:) 1143:( 1119:| 1109:Q 1104:| 1050:Q 1028:, 1024:) 1019:x 1008:Q 1001:T 996:x 989:2 986:1 977:( 965:2 961:/ 957:1 952:| 942:Q 937:| 929:) 920:| 915:x 911:( 882:| 877:x 852:x 808:y 785:I 763:, 760:) 752:, 747:i 738:| 732:i 728:y 724:( 714:I 706:i 698:= 695:) 687:, 683:x 678:| 673:y 669:( 623:x 576:x 528:x 501:i 471:i 467:y 444:i 415:) 410:n 406:y 402:, 396:, 391:1 387:y 383:( 380:= 376:y 357:R 314:( 300:e 293:t 286:v
