The posterior probability is a type of conditional probability that results from updating the prior probability with information summarized by the likelihood, via an application of Bayes' rule. From an epistemological perspective, the posterior probability contains everything there is to know about an uncertain proposition (such as a scientific hypothesis, or parameter values), given prior knowledge and a mathematical model describing the observations available at a particular time. After the arrival of new information ("posterior" meaning later in time), the current posterior probability may serve as the prior in another round of Bayesian updating.
Suppose there is a school with 60% boys and 40% girls as students. The girls wear trousers or skirts in equal numbers; all boys wear trousers. An observer sees a (random) student from a distance; all the observer can see is that this student is wearing trousers. What is the probability this student is a girl?

An intuitive way to solve this is to assume the school has N students, so that the number of boys is 0.6N and the number of girls is 0.4N. If N is sufficiently large, the total number of trouser wearers is 0.6N + 50% of 0.4N, and the number of girl trouser wearers is 50% of 0.4N. Therefore, among trouser wearers, the proportion of girls is (50% of 0.4N)/(0.6N + 50% of 0.4N) = 25%. In other words, if you separated out the group of trouser wearers, a quarter of that group would be girls. So if you see trousers, the most you can deduce is that you are looking at a single sample from a subset of students of whom 25% are girls; by definition, the chance of this random student being a girl is 25%. Every Bayes'-theorem problem can be solved in this way.
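The counting argument above can be checked directly. The following sketch (plain Python; the variable names are my own) applies Bayes' theorem to the numbers stated in the example:

```python
# Quantities stated in the example.
p_girl = 0.4          # P(G): 40% of students are girls
p_boy = 0.6           # P(B): 60% are boys
p_t_given_girl = 0.5  # P(T|G): girls wear trousers half the time
p_t_given_boy = 1.0   # P(T|B): all boys wear trousers

# Law of total probability: P(T) = P(T|G)P(G) + P(T|B)P(B)
p_t = p_t_given_girl * p_girl + p_t_given_boy * p_boy

# Bayes' theorem: P(G|T) = P(T|G)P(G) / P(T)
p_girl_given_t = p_t_given_girl * p_girl / p_t
print(p_girl_given_t)  # ~0.25, matching the counting argument
```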
methods by definition generate posterior probabilities, machine
learners usually supply membership values which do not induce any probabilistic confidence. It is desirable to transform or rescale membership values to class-membership probabilities, since these are comparable and additionally more
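One common rescaling (an illustrative choice on my part; the text does not prescribe a particular transform) is the softmax function, which maps arbitrary real-valued membership scores to non-negative values summing to one:

```python
import math

def softmax(scores):
    """Map real-valued membership scores to class-membership
    probabilities (non-negative, summing to 1)."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
# probs sums to 1 and preserves the ordering of the scores
```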
, or the probability that the student is a girl regardless of any other information. Since the observer sees a random student, meaning that all students have the same probability of being observed, and the percentage of girls among the students is 40%, this probability equals 0.4.
Posterior probability is a conditional probability conditioned on randomly observed data; hence it is a random variable, and it is important to summarize its amount of uncertainty. One way to achieve this goal is to provide a
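For instance, a 95% equal-tailed credible interval can be read off from posterior draws. The sketch below uses a Beta(8, 4) posterior as a hypothetical stand-in for the output of a sampler:

```python
import numpy as np

rng = np.random.default_rng(0)
draws = rng.beta(8, 4, size=100_000)  # pretend these are posterior samples

# Equal-tailed 95% credible interval: the central 95% of the posterior mass.
lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"95% credible interval: [{lo:.2f}, {hi:.2f}]")
```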
(HPDI). While conceptually simple, the posterior distribution is generally not tractable and therefore needs to be approximated either analytically or numerically.
of the observer having spotted a girl given that the observed student is wearing trousers can be computed by substituting these values in the formula:
, or the probability of the student wearing trousers given that the student is a girl. Since girls are as likely to wear skirts as trousers, this is 0.5.
{\displaystyle f_{X\mid Y=y}(x)={f_{X}(x){\mathcal {L}}_{X\mid Y=y}(x) \over {\int _{-\infty }^{\infty }f_{X}(u){\mathcal {L}}_{X\mid Y=y}(u)\,du}}}
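As a sketch of how a formula of this shape is used in practice (for a hypothetical model of my choosing: a uniform prior on [0, 1] and a Binomial(10, x) likelihood with y = 7 observed), the posterior density can be approximated on a grid, dividing the prior-times-likelihood product by the normalizing integral:

```python
import numpy as np
from math import comb

x = np.linspace(0.0, 1.0, 10_001)
dx = x[1] - x[0]

prior = np.ones_like(x)                       # f_X(x): uniform prior on [0, 1]
likelihood = comb(10, 7) * x**7 * (1 - x)**3  # L_{X|Y=7}(x) = f_{Y|X=x}(7)

unnorm = prior * likelihood
posterior = unnorm / (unnorm.sum() * dx)      # divide by the normalizing integral

# The mode sits at x = 0.7, matching the exact Beta(8, 4) posterior.
```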
Chapter 8 Introduction to
Continuous Prior and Posterior Distributions | An Introduction to Bayesian Reasoning and Methods
, or the probability of a (randomly selected) student wearing trousers regardless of any other information. Since
, posterior probabilities reflect the uncertainty of assigning an observation to a particular class; see also
, or the probability of the student wearing trousers given that the student is a boy. This is given as 1.
Andrew Gelman, John B. Carlin, Hal S. Stern, David B. Dunson, Aki
Vehtari and Donald B. Rubin (2014).
, or the probability that the student is not a girl (i.e., is a boy) regardless of any other information (
Clyde, Merlise; Çetinkaya-Rundel, Mine; Rundel, Colin; Banks, David; Chai, Christine; Huang, Lizzy.
conditional on a collection of observed data. From a given posterior distribution, various
"Linear Discriminant Analysis for Prediction of Group Membership: A User-Friendly Primer"
is that the student observed is wearing trousers. To compute the posterior probability
Press, S. James (1989). "Approximations, Numerical
Methods, and Computer Programs".
{\displaystyle P(G|T)={\frac {P(T|G)P(G)}{P(T)}}={\frac {0.5\times 0.4}{0.8}}=0.25.}
{\displaystyle \int _{-\infty }^{\infty }f_{X}(u){\mathcal {L}}_{X\mid Y=y}(u)\,du}
Chapter 1 The Basics of
Bayesian Statistics | An Introduction to Bayesian Thinking
Gill, Jeff (2014). "Summarizing
Posterior Distributions with Intervals".
Lambert, Ben (2018). "The posterior – the goal of
Bayesian inference".
is a girl? The correct answer can be computed using Bayes' theorem.
{\displaystyle p(\theta |x)={\frac {p(x|\theta )}{p(x)}}p(\theta )}
, which is the probability of the evidence given the parameters:
, the posterior probability is the probability of the parameters
Bayesian
Statistics : Principles, Models, and Applications
{\displaystyle {\mathcal {L}}_{X\mid Y=y}(x)=f_{Y\mid X=x}(y)}
Inferences from observations to simple statistical hypotheses
Bayesian
Methods: A Social and Behavioral Sciences Approach
Advances in
Methods and Practices in Psychological Science
"Understanding Bayes: Updating priors via the likelihood"
{\displaystyle p(x)=\int p(x|\theta )p(\theta )\,d\theta }
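A sketch of evaluating an integral of this form by simple Monte Carlo, for a hypothetical model of my choosing (θ ~ Uniform(0, 1), one Bernoulli observation x = 1, so p(x|θ) = θ and the exact answer is 1/2):

```python
import random

random.seed(0)
n = 100_000

# Average the likelihood p(x=1 | θ) = θ over draws θ ~ p(θ) = Uniform(0, 1).
p_x = sum(random.random() for _ in range(n)) / n
print(p_x)  # close to the exact value 0.5
```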
(Third ed.). Chapman & Hall. pp. 42–48.
is that the student observed is a girl, and the event
. New York: John Wiley & Sons. pp. 69–102.
Conditional probability used in Bayesian statistics
given the value of another can be calculated with
{\displaystyle P(T)=0.5\times 0.4+1\times 0.6=0.8}
usually describes the epistemic uncertainty about
Boedeker, Peter; Kearns, Nathan T. (2019-07-09).
is the normalizing constant and is calculated as
, then the posterior probability is defined as
An Introduction to Modern Bayesian Econometrics
The posterior probability distribution of one
"Posterior probability - formulasearchengine"
is the likelihood function as a function of
Bayesian Statistics : An Introduction
{\displaystyle P(T)=P(T|G)P(G)+P(T|B)P(B)}
Pattern Recognition and Machine Learning
A Student's Guide to Bayesian Statistics
easily applicable for post-processing.
The posterior probability is therefore
{\displaystyle p(x|\theta )p(\theta )}
Definition in the distributional case
(PhD thesis). University of Sydney.
with information summarized by the
highest posterior density interval
probability distribution function
"Bayes' theorem - C o r T e x T"
{\displaystyle f_{X\mid Y=y}(x)}
is the normalizing constant, and
Given all this information, the
The two are related as follows:
of the posterior probability.
Christopher M. Bishop (2006).
class-membership probabilities
prior probability distribution
is the complementary event to
Likelihood · Prior probability
Metropolis–Hastings algorithm
. Springer. pp. 21–24.
is the posterior density of
probability density function
over all possible values of
{\displaystyle p(x|\theta )}
{\displaystyle p(X|\theta )}
{\displaystyle p(\theta |X)}
variational Bayesian methods
can be derived, such as the
Bernstein–von Mises theorem
, and then dividing by the
epistemological perspective
. Sage. pp. 121–140.
statistical classification
and that the observations
{\displaystyle p(\theta )}
, we first need to know:
Lancaster, Tony (2004).
10.1177/2515245919849378
. CRC Press. p. 7.
Etz, Alex (2015-07-25).
Grossman, Jason (2005).
is the prior density of
{\displaystyle f_{X}(x)}
law of total probability
probability distribution
formulasearchengine.com
). This is 60%, or 0.6.
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
conditional probability
Lee, Peter M. (2004).
Bayesian Data Analysis
Probability of success
for a random variable
{\displaystyle P(T|B)}
{\displaystyle P(T|G)}
{\displaystyle P(G|T)}
476:It contrasts with the
467:
430:
410:
statistical parameters
via an application of
. Oxford: Blackwell.
Bayesian epistemology
posterior probability
posterior probability
gives the posterior
normalizing constant
{\displaystyle P(T)}
{\displaystyle P(B)}
{\displaystyle P(G)}
{\displaystyle p(x)}
maximum a posteriori
Prediction interval
{\displaystyle Y=y}
{\displaystyle Y=y}
likelihood function
by multiplying the
likelihood function
given the evidence
Bayesian statistics
have a likelihood
interval estimates
In the context of
that results from
2421:978-1-4398-4095-5
2396:978-0-387-31073-2
2346:978-1-4398-6248-3
2273:978-1-4739-1636-4
credible interval
Credible interval
{\displaystyle X}
{\displaystyle x}
{\displaystyle X}
{\displaystyle X}
{\displaystyle G}
{\displaystyle B}
{\displaystyle T}
{\displaystyle G}
, or by summing
{\displaystyle x}
, and is denoted
{\displaystyle X}
prior probability
(3rd ed.).
sites.google.com
Further reading
given the data
given the data
random variable
proportional to
for continuous
(3): 250–263.
classification
Classification
, as follows:
Bayes' theorem
belief that a
2616:0-340-81405-5
2593:1-4051-1720-6
Ross, Kevin.
2371:0-471-63729-7
The Etz-Files
for discrete
(MAP) or the
is a type of
Calculation
1348:), this is
344:Bayes' rule
2500:2022-08-19
2476:2022-08-18
2321:2022-08-18
References
The event
posterior
. From an
likelihood
2565:199007973
2557:2515-2459
2430:cite book
2297:2123/9107
. While
See also
1817:, where
523:Given a
332:updating
67:Evidence
1561:by the
Example