A Bernoulli distribution models two possible outcomes. It has a single parameter: the probability of one outcome, in this case the probability of landing on heads. Devising a good model for the data is central in Bayesian inference. In most cases, models only approximate the true process and may not take into account certain factors influencing the data. In Bayesian inference, probabilities can be assigned to model parameters; parameters can be represented as random variables.
During much of the 20th century, Bayesian methods were viewed unfavorably by many statisticians due to philosophical and practical considerations. Many Bayesian methods required a great deal of computation, and most methods that were widely used during the century were based on the frequentist interpretation. With the advent of powerful computers and new algorithms, however, this began to change.
1655:
In frequentist inference, model parameters and hypotheses are considered to be fixed, and probabilities are not assigned to them. For example, it would not make sense in frequentist inference to directly assign a probability to an event that can only happen once, such as the result of the next flip of a fair coin.
473:
Pierre-Simon Laplace developed the Bayesian interpretation of probability and used methods that would now be considered Bayesian to solve a number of statistical problems. Many Bayesian methods were developed by later authors, but the term was not commonly used to describe such methods until the 1950s.
1758:
The inference process generates a posterior distribution, which has a central role in Bayesian statistics, together with other distributions such as the posterior predictive distribution and the prior predictive distribution. The correct visualization, analysis, and interpretation of these distributions is key to properly answering the questions that motivate the inference process.
1503:
The evidence is often difficult to calculate, as doing so would involve sums or integrals that are time-consuming to evaluate, so often only the product of the prior and likelihood is considered; since the evidence does not change within a given analysis, the posterior is proportional to this product:
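This proportionality can be sketched numerically: evaluate the product of prior and likelihood on a grid of parameter values and normalize only at the end, never computing the evidence analytically. The flat prior and the coin-flip numbers below are invented for illustration.

```python
# Grid sketch of "posterior is proportional to prior times likelihood"
# for a coin's heads-probability; the evidence is never computed analytically.
from math import comb

heads, flips = 7, 10
grid = [i / 100 for i in range(1, 100)]        # candidate parameter values
prior = [1.0] * len(grid)                       # flat prior (illustrative choice)
likelihood = [comb(flips, heads) * t**heads * (1 - t)**(flips - heads)
              for t in grid]

unnormalized = [p * l for p, l in zip(prior, likelihood)]
total = sum(unnormalized)                       # numerical stand-in for the evidence
posterior = [u / total for u in unnormalized]   # now sums to one

mode = grid[posterior.index(max(posterior))]    # peaks at 0.7 here (7 heads in 10)
```

With a flat prior the posterior mode coincides with the maximum of the likelihood, 7/10 in this example.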
2166:
Hajiramezanali, E.; Dadaneh, S. Z.; Karbalayghareh, A.; Zhou, Z.; Qian, X. (2018). "Bayesian multi-domain learning for cancer subtype discovery from next-generation sequencing count data". 32nd Conference on Neural Information Processing Systems (NIPS 2018), Montréal, Canada.
1437:
1749:
Exploratory data analysis seeks to reveal structure, or simple descriptions, in data. We look at numbers or graphs and try to find patterns. We pursue leads suggested by background information, imagination, patterns perceived, and experience with other data analyses.
1783:
All these tasks are part of the exploratory analysis of Bayesian models approach, and successfully performing them is central to the iterative and interactive modeling process. These tasks require both numerical and visual summaries.
1666:
Statistical models specify a set of statistical assumptions and processes that represent how the sample data are generated. They have a number of parameters that can be modified. For example, a coin can be represented as samples from a Bernoulli distribution.
1209:
2425:
Vehtari, Aki; Gelman, Andrew; Simpson, Daniel; Carpenter, Bob; Bürkner, Paul-Christian (2021). "Rank-Normalization, Folding, and Localization: An Improved R̂ for Assessing Convergence of MCMC (With Discussion)". Bayesian Analysis.
2211:
van de Schoot, Rens; Depaoli, Sarah; King, Ruth; Kramer, Bianca; Märtens, Kaspar; Tadesse, Mahlet G.; Vannucci, Marina; Gelman, Andrew; Veen, Duco; Willemsen, Joukje; Yau, Christopher (January 14, 2021).
664:
Since Bayesian statistics treats probability as a degree of belief, Bayes' theorem can directly assign a probability distribution that quantifies the belief to the parameter or set of parameters.
466:
The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. This differs from a number of other interpretations of probability.
techniques to include the outcome of earlier experiments in the design of the next experiment. This is achieved by updating 'beliefs' through the use of prior and posterior distributions.
The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy
For reporting the results of a Bayesian statistical analysis, Bayesian analysis reporting guidelines (BARG) are provided in an open-access article by John K. Kruschke.
2184:
Lee, Se Yoon; Mallick, Bani (2021). "Bayesian
Hierarchical Modeling: Application Towards Production Results in the Eagle Ford Shale of South Texas".
182:
431:
of the relative frequency of an event after many trials. More concretely, analysis in Bayesian methods codifies prior knowledge in the form of a prior distribution.
Bayes' theorem is used in Bayesian methods to update probabilities, which are degrees of belief, after obtaining new data. Given two events A and B, the conditional probability of A given that B is true is expressed as follows:
2312:
Diaconis, Persi (2011) Theories of Data Analysis: From Magical Thinking Through Classical Statistics. John Wiley & Sons, Ltd 2:e55
of an event based on data as well as prior information or beliefs about the event or conditions related to the event. For example, in
The general set of statistical techniques can be divided into a number of activities, many of which have special Bayesian versions.
2147:; Vanpaemel, W (2015). "Bayesian Estimation in Hierarchical Models". In Busemeyer, J R; Wang, Z; Townsend, J T; Eidels, A (eds.).
192:
{\displaystyle P(B)=P(B\mid A_{1})P(A_{1})+P(B\mid A_{2})P(A_{2})+\dots +P(B\mid A_{n})P(A_{n})=\sum _{i}P(B\mid A_{i})P(A_{i})}
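The sum above is easy to verify numerically. The partition and numbers below are invented for illustration: a coin is drawn from a pair (fair or biased), and the overall probability of heads is computed over that partition.

```python
# Law of total probability: P(B) = sum_i P(B | A_i) P(A_i)
# for the two-event partition {fair coin drawn, biased coin drawn}.
p_pick = {"fair": 0.5, "biased": 0.5}          # P(A_i): which coin is drawn
p_heads_given = {"fair": 0.5, "biased": 0.8}   # P(B | A_i): heads per coin

p_heads = sum(p_heads_given[c] * p_pick[c] for c in p_pick)
print(p_heads)  # 0.25 + 0.40 = 0.65
```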
95:
2380:
Gabry, Jonah; Simpson, Daniel; Vehtari, Aki; Betancourt, Michael; Gelman, Andrew (2019). "Visualization in Bayesian workflow".
When working with Bayesian models, there is a series of related tasks that need to be addressed besides inference itself:
1691:
for any unknown parameters. Indeed, parameters of prior distributions may themselves have prior distributions, leading to Bayesian hierarchical modeling.
Bayesian Analysis with Python: Introduction to statistical modeling and probabilistic programming using PyMC3 and ArviZ
2506:
751:
represents the evidence, or new data that is to be taken into account (such as the result of a series of coin flips).
420:
824:
before evidence is taken into account. The prior probability may also quantify prior knowledge or information about
This allows the design of experiments to make good use of resources of all types. An example of this is the multi-armed bandit problem.
2502:
2049:
Lee, Se Yoon (2021). "Gibbs sampler and coordinate ascent variational inference: A set-theoretical review".
Bayesian inference uses Bayes' theorem to update probabilities after more evidence is obtained or known.
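A toy sketch of such updating, with hypotheses and numbers invented for illustration: a coin is either fair or biased towards heads, and the belief in each hypothesis is revised flip by flip using Bayes' theorem.

```python
# Update P(hypothesis) after each observed flip, renormalizing so the
# two posterior probabilities always sum to one.
belief = {"fair": 0.5, "biased": 0.5}       # prior over the two hypotheses
p_heads = {"fair": 0.5, "biased": 0.8}      # P(heads | hypothesis)

for flip in "HHTHH":
    for h in belief:                         # multiply in the likelihood of this flip
        belief[h] *= p_heads[h] if flip == "H" else 1 - p_heads[h]
    total = sum(belief.values())             # evidence for the flips seen so far
    belief = {h: v / total for h, v in belief.items()}

# After four heads and one tail, the biased hypothesis is favoured.
```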
methods, remains the same. The posterior can be approximated even without computing the exact value of the evidence.
278:
1702:
For conducting a Bayesian statistical analysis, best practices are discussed by van de Schoot et al.
Diagnoses of the quality of the inference, needed when using numerical methods such as Markov chain Monte Carlo techniques
1745:
approach to the needs and peculiarities of Bayesian modeling. In the words of Persi Diaconis:
The Bayesian Choice : From Decision-Theoretic Foundations to Computational Implementation
published in 1763. In several papers spanning from the late 18th to the early 19th centuries, Laplace refined and extended these methods.
to compute and update probabilities after obtaining new data. Bayes' theorem describes the
using Bayesian statistics has the identifying feature of requiring the specification of prior distributions for any unknown parameters.
Model criticism, including evaluations of both model assumptions and model predictions
Bayesian methods have seen increasing use within statistics in the 21st century.
it has a specific interpretation in Bayesian statistics. In the above equation,
2664:
2472:
731:(such as the statement that a coin lands on heads fifty percent of the time) and
1656:
a fair coin. However, it would make sense to state that the proportion of heads approaches one-half as the number of coin flips increases.
728:
408:
2331:"ArviZ a unified library for exploratory analysis of Bayesian models in Python"
where uncertainty in inferences is quantified using probability. In classical
400:
2666:
Bayesians Versus Frequentists A Philosophical Debate on Statistical Reasoning
1741:
Exploratory analysis of Bayesian models is an adaptation or extension of the
Statistical Rethinking : A Bayesian Course with Examples in R and Stan
1721:
includes a concept called 'influence of prior beliefs'. This approach uses
2382:
Journal of the Royal Statistical Society, Series A (Statistics in Society)
Kumar, Ravin; Carroll, Colin; Hartikainen, Ari; Martin, Osvaldo (2019).
into account. Essentially, Bayes' theorem updates one's prior beliefs after considering the new evidence.
2006:
1989:
1584:
of the posterior and is often computed in Bayesian statistics using
923:
is true. The likelihood quantifies the extent to which the evidence supports the proposition.
Comparison of models, including model selection or model averaging
1441:
When there is an infinite number of outcomes, it is necessary to integrate over all outcomes to calculate the evidence.
2150:
The Oxford Handbook of Computational and Mathematical Psychology
2028:(2nd ed.). Providence, RI: American Mathematical Society.
1937:
Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan
1798:
For a list of mathematical logic notation used in this article
883:, which can be interpreted as the probability of the evidence
383:
380:
Bayes' theorem can be used to estimate the parameters of a probability distribution or statistical model.
answer the questions that motivate the inference process.
2634:
Bayes Rules! An Introduction to Applied Bayesian Modeling
{\displaystyle P(A\mid B)={\frac {P(B\mid A)P(A)}{P(B)}}}
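A quick numeric check of the theorem, with invented numbers from the usual textbook setup: A is "the biased coin was drawn" and B is "the flip came up heads".

```python
p_a = 0.5            # P(A): prior probability the biased coin was drawn
p_b_given_a = 0.8    # P(B | A): probability of heads given the biased coin
p_b = 0.65           # P(B): overall probability of heads (the evidence)

p_a_given_b = p_b_given_a * p_a / p_b   # Bayes' theorem
# Seeing heads raises the probability of the biased coin from 0.50 to about 0.62.
```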
2213:
This is also known as multi-level modeling. A special case is Bayesian networks.
2631:
Johnson, Alicia A.; Ott, Miles Q.; Dogucu, Mine. (2022)
, who formulated a specific case of Bayes' theorem in a paper
703:. Although Bayes' theorem is a fundamental result of
Preparation of the results for a particular audience
2051:Communications in Statistics - Theory and Methods
1872:; Stern, Hal S.; Dunson, David B.; Vehtari, Aki;
1617:with methods such as Markov chain Monte Carlo or
2024:Grinstead, Charles M.; Snell, J. Laurie (2006).
1990:"When Did Bayesian Inference Become "Bayesian"?"
{\displaystyle P(A\mid B)\propto P(B\mid A)P(A)}
1747:
427:interpretation, which views probability as the
2573:A First Course in Bayesian Statistical Methods
2528:Bolstad, William M.; Curran, James M. (2016).
303:
8:
2698:"A Gentle Introduction to Bayesian Analysis"
2156:. Oxford University Press. pp. 279–299.
1899:
1897:
{\displaystyle \{A_{1},A_{2},\dots ,A_{n}\}}
1179:
1134:
2096:Bayesian and frequentist regression methods
1474:using the law of total probability. Often,
2649:"A Gentle Tutorial in Bayesian Statistics"
2553:Think Bayes: Bayesian Statistics in Python
2266:"Bayesian Analysis Reporting Guidelines"
1967:(First ed.). Chapman and Hall/CRC.
1880:(Third ed.). Chapman and Hall/CRC.
183:Integrated nested Laplace approximations
2692:and examples available for downloading.
1820:
Exploratory analysis of Bayesian models
1660:as the number of coin flips increases.
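That long-run statement about the proportion of heads is easy to simulate; the sketch below is purely illustrative and not from the article.

```python
# Running proportion of heads from a simulated fair coin: it settles
# towards one-half as the number of flips grows.
import random

random.seed(0)
flips = 100_000
heads = sum(random.random() < 0.5 for _ in range(flips))
proportion = heads / flips   # close to 0.5 for large flip counts
```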
1911:(2nd ed.). Chapman and Hall/CRC.
1802:Notation in probability and statistics
405:Bayesian interpretation of probability
2636:. Chapman and Hall ISBN 9780367255398
1002:, the probability of the proposition
7:
2616:(2nd ed.). New York: Springer.
2592:Bayesian Statistics: An Introduction
2575:(2nd ed.). New York: Springer.
804:which expresses one's beliefs about
2530:Introduction to Bayesian Statistics
2214:"Bayesian statistics and modelling"
1071:after considering the new evidence
Bayesian statistics is named after Thomas Bayes
25:
576:is true is expressed as follows:
536:, the conditional probability of
438:Bayesian statistical methods use
27:Theory in the field of statistics
1939:(2nd ed.). Academic Press.
1094:The probability of the evidence
2708:Bayesian A/B Testing Calculator
2335:Journal of Open Source Software
1445:over all outcomes to calculate
219:Maximum a posteriori estimation
2218:Nature Reviews Methods Primers
1834:Merriam-Webster.com Dictionary
1719:Bayesian design of experiments
1693:Bayesian hierarchical modeling
421:interpretations of probability
399:) is a theory in the field of
1:
2685:doi:10.4249/scholarpedia.5230
2073:10.1080/03610926.2021.1921214
1988:Fienberg, Stephen E. (2006).
1643:Bayesian inference refers to
1619:variational Bayesian methods
1123:can be calculated using the
126:Principle of maximum entropy
2026:Introduction to probability
96:Bernstein–von Mises theorem
2740:
2556:(2nd ed.). O'Reilly.
2282:10.1038/s41562-021-01177-7
2230:10.1038/s43586-020-00001-2
2198:10.1007/s13571-020-00245-8
2121:Applied Bayesian modelling
2098:. New York, NY: Springer.
1731:multi-armed bandit problem
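One standard Bayesian treatment of the multi-armed bandit problem is Thompson sampling, which is not detailed in this article; the sketch below, with invented payout rates, keeps a Beta posterior per arm and plays the arm whose sampled rate is highest.

```python
import random

random.seed(1)
true_rates = [0.3, 0.6]          # hidden Bernoulli payout rates (made up)
wins = [1, 1]                     # Beta(1, 1) prior parameters, one pair per arm
losses = [1, 1]
plays = [0, 0]

for _ in range(2000):
    # Draw one plausible payout rate per arm from its current posterior.
    samples = [random.betavariate(wins[i], losses[i]) for i in range(2)]
    arm = samples.index(max(samples))
    plays[arm] += 1
    if random.random() < true_rates[arm]:   # observe a payout or not
        wins[arm] += 1
    else:
        losses[arm] += 1

# Most plays concentrate on the better arm as its posterior sharpens,
# so the experiment spends its limited resources efficiently.
```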
1636:
1200:, which is the set of all
1022:after taking the evidence
{\displaystyle P(A\mid B)}
{\displaystyle P(B\mid A)}
{\displaystyle P(B)\neq 0}
489:
2318:10.1002/9781118150702.ch1
1743:exploratory data analysis
1586:mathematical optimization
943:supports the proposition
121:Principle of indifference
2477:. Packt Publishing Ltd.
2471:Martin, Osvaldo (2018).
1768:Markov chain Monte Carlo
1204:of an experiment, then,
1125:law of total probability
480:Markov chain Monte Carlo
452:probability distribution
173:Markov chain Monte Carlo
2594:(4th ed.). Wiley.
2571:Hoff, Peter D. (2009).
2532:(3rd ed.). Wiley.
2123:(2nd ed.). Wiley.
2119:Congdon, Peter (2014).
2094:Wakefield, Jon (2013).
444:conditional probability
178:Laplace's approximation
165:Posterior approximation
2690:Bayesian modeling book
2590:Lee, Peter M. (2012).
2270:Nature Human Behaviour
1878:Bayesian Data Analysis
1752:
1727:posterior distribution
1669:Bernoulli distribution
1807:List of logic symbols
1794:Bayesian epistemology
1713:Design of experiments
1649:frequentist inference
1645:statistical inference
1000:posterior probability
2696:Rens van de Schoot.
2610:Robert, Christian P.
1679:Statistical modeling
1610:{\displaystyle P(B)}
1592:
1578:maximum a posteriori
1510:
1496:{\displaystyle P(B)}
1478:
1467:{\displaystyle P(B)}
1449:
1210:
1131:
1116:{\displaystyle P(B)}
1098:
1075:
1064:{\displaystyle P(A)}
1046:
1026:
1006:
967:
947:
927:
907:
887:
848:
828:
808:
788:
773:{\displaystyle P(A)}
755:
735:
711:
672:
582:
560:
540:
520:
500:
471:Pierre-Simon Laplace
266:Posterior predictive
235:Evidence lower bound
116:Likelihood principle
86:Bayesian probability
2724:Bayesian statistics
2677:David Spiegelhalter
2674:Bayesian statistics
2513:. New York: Wiley.
2507:Smith, Adrian F. M.
2356:10.21105/joss.01143
2347:2019JOSS....4.1143K
1723:sequential analysis
1689:prior distributions
1683:The formulation of
1658:approaches one-half
881:likelihood function
323:Bayesian statistics
39:Bayesian statistics
33:Part of a series on
18:Bayesian Statistics
2404:10.1111/rssa.12378
1905:McElreath, Richard
1837:. Merriam-Webster.
1685:statistical models
1664:Statistical models
1639:Bayesian inference
1633:Bayesian inference
1607:
1564:
1493:
1464:
1429:
1384:
1182:
1113:
1081:
1061:
1032:
1012:
988:
953:
933:
913:
893:
869:
834:
814:
794:
770:
741:
717:
705:probability theory
693:
656:
566:
546:
526:
506:
448:Bayesian inference
433:prior distribution
209:Bayesian estimator
157:Hierarchical model
81:Bayesian inference
2663:Jordi Vallverdu.
2623:978-0-387-71598-8
2601:978-1-118-33257-3
2582:978-1-4419-2828-3
2563:978-1-4920-8946-9
2539:978-1-118-09156-2
2503:Bernardo, José M.
2450:10.1214/20-BA1221
2428:Bayesian Analysis
2276:(10): 1282–1291.
2105:978-1-4419-0924-4
2035:978-0-8218-9414-9
1994:Bayesian Analysis
1974:978-0-3001-8822-6
1946:978-0-12-405888-0
1918:978-0-367-13991-9
1887:978-1-4398-4095-5
1756:inference process
1697:Bayesian networks
1375:
1084:{\displaystyle B}
1035:{\displaystyle B}
1015:{\displaystyle A}
956:{\displaystyle A}
936:{\displaystyle B}
916:{\displaystyle A}
896:{\displaystyle B}
837:{\displaystyle A}
817:{\displaystyle A}
797:{\displaystyle A}
782:prior probability
744:{\displaystyle B}
720:{\displaystyle A}
654:
569:{\displaystyle B}
549:{\displaystyle A}
529:{\displaystyle B}
509:{\displaystyle A}
456:statistical model
320:
319:
214:Credible interval
147:Linear regression
16:(Redirected from
2731:
2704:
2702:
2670:
2659:
2657:
2656:
2627:
2605:
2586:
2567:
2548:Downey, Allen B.
2543:
2524:
2489:
2488:
2468:
2462:
2461:
2443:
2422:
2416:
2415:
2397:
2377:
2371:
2370:
2368:
2358:
2326:
2320:
2310:
2304:
2303:
2293:
2264:(Aug 16, 2021).
2258:
2252:
2251:
2241:
2208:
2202:
2201:
2181:
2175:
2164:
2158:
2157:
2155:
2141:
2135:
2134:
2116:
2110:
2109:
2091:
2085:
2084:
2066:
2057:(6): 1549–1568.
2046:
2040:
2039:
2021:
2012:
2011:
2009:
2007:10.1214/06-BA101
1985:
1979:
1978:
1961:McGrayne, Sharon
1957:
1951:
1950:
1929:
1923:
1922:
1901:
1892:
1891:
1874:Rubin, Donald B.
1862:
1839:
1838:
1825:
1707:John K. Kruschke
1673:random variables
1625:Bayesian methods
1616:
1614:
1613:
1608:
1573:
1571:
1570:
1565:
1502:
1500:
1499:
1494:
1473:
1471:
1470:
1465:
413:degree of belief
2700:
2695:
2679:, Kenneth Rice
2662:
2654:
2652:
2647:Theo Kypraios.
2646:
2643:
2624:
2608:
2602:
2589:
2583:
2570:
2564:
2546:
2540:
2527:
2521:
2511:Bayesian Theory
2501:
2498:
2496:Further reading
2493:
2492:
2485:
2470:
2469:
2465:
2424:
2423:
2419:
2379:
2378:
2374:
2328:
2327:
2323:
2311:
2307:
2260:
2259:
2255:
2210:
2209:
2205:
2183:
2182:
2178:
2165:
2161:
2153:
2143:
2142:
2138:
2131:
2118:
2117:
2113:
2106:
2093:
2092:
2088:
2048:
2047:
2043:
2036:
2023:
2022:
2015:
1987:
1986:
1982:
1975:
1959:
1958:
1954:
1947:
1931:
1930:
1926:
1919:
1903:
1902:
1895:
1888:
1870:Carlin, John B.
1864:
1863:
1842:
1827:
1826:
1822:
1817:
1790:
1739:
1715:
1681:
1641:
1635:
1627:
1590:
1589:
1580:, which is the
1508:
1507:
1476:
1475:
1447:
1446:
1416:
1397:
1359:
1340:
1306:
1287:
1259:
1240:
1208:
1207:
1169:
1150:
1137:
1129:
1128:
1096:
1095:
1073:
1072:
1044:
1043:
1024:
1023:
1004:
1003:
965:
964:
945:
944:
925:
924:
905:
904:
885:
884:
846:
845:
826:
825:
806:
805:
786:
785:
753:
752:
733:
732:
709:
708:
670:
669:
640:
608:
580:
579:
558:
557:
538:
537:
518:
517:
498:
497:
494:
488:
393:
370:
366:
359:
330:
326:
316:
276:
261:Model averaging
240:Nested sampling
152:Empirical Bayes
142:Conjugate prior
111:Cromwell's rule
28:
23:
22:
15:
12:
11:
5:
2737:
2735:
2727:
2726:
2716:
2715:
2712:
2711:
2705:
2693:
2687:
2671:
2660:
2642:
2641:External links
2639:
2638:
2637:
2628:
2622:
2606:
2600:
2587:
2581:
2568:
2562:
2544:
2538:
2525:
2519:
2497:
2494:
2491:
2490:
2483:
2463:
2417:
2388:(2): 389–402.
2372:
2321:
2305:
2253:
2203:
2176:
2159:
2136:
2130:978-1119951513
2129:
2111:
2104:
2086:
2041:
2034:
2013:
1980:
1973:
1952:
1945:
1933:Kruschke, John
1924:
1917:
1893:
1886:
1866:Gelman, Andrew
1840:
1819:
1818:
1816:
1813:
1812:
1811:
1810:
1809:
1804:
1796:
1789:
1786:
1781:
1780:
1777:
1774:
1771:
1738:
1735:
1714:
1711:
1680:
1677:
1637:Main article:
1634:
1631:
1626:
1623:
492:Bayes' theorem
490:Main article:
487:
486:Bayes' theorem
484:
440:Bayes' theorem
423:, such as the
2710:Dynamic Yield
2709:
2706:
2699:
2694:
2691:
2688:
2686:
2682:
2678:
2675:
2672:
2668:
2667:
2661:
2650:
2645:
2644:
2640:
2635:
2632:
2629:
2625:
2619:
2615:
2611:
2607:
2603:
2597:
2593:
2588:
2584:
2578:
2574:
2569:
2565:
2559:
2555:
2554:
2549:
2545:
2541:
2535:
2531:
2526:
2522:
2520:0-471-92416-4
2516:
2512:
2508:
2504:
2500:
2499:
2495:
2486:
2484:9781789341652
2480:
2476:
2475:
2467:
2464:
2459:
2455:
2451:
2447:
2442:
2437:
2433:
2429:
2421:
2418:
2413:
2409:
2405:
2401:
2396:
2391:
2387:
2383:
2376:
2373:
2367:
2362:
2357:
2352:
2348:
2344:
2340:
2336:
2332:
2325:
2322:
2319:
2315:
2309:
2306:
2301:
2297:
2292:
2287:
2283:
2279:
2275:
2271:
2267:
2263:
2262:Kruschke, J K
2257:
2254:
2249:
2245:
2240:
2235:
2231:
2227:
2223:
2219:
2215:
2207:
2204:
2199:
2195:
2191:
2187:
2180:
2177:
2174:
2170:
2163:
2160:
2152:
2151:
2146:
2145:Kruschke, J K
2140:
2137:
2132:
2126:
2122:
2115:
2112:
2107:
2101:
2097:
2090:
2087:
2082:
2078:
2074:
2070:
2065:
2060:
2056:
2052:
2045:
2042:
2037:
2031:
2027:
2020:
2018:
2014:
2008:
2003:
1999:
1995:
1991:
1984:
1981:
1976:
1970:
1966:
1962:
1956:
1953:
1948:
1942:
1938:
1934:
1928:
1925:
1920:
1914:
1910:
1906:
1900:
1898:
1894:
1889:
1883:
1879:
1875:
1871:
1867:
1841:
1836:
1835:
1830:
1824:
1821:
1814:
1808:
1805:
1803:
1800:
1799:
1797:
1795:
1792:
1791:
1787:
1785:
1778:
1775:
1772:
1769:
1765:
1764:
1763:
1760:
1757:
1751:
1746:
1744:
1736:
1734:
1732:
1728:
1724:
1720:
1712:
1710:
1708:
1703:
1700:
1698:
1694:
1690:
1686:
1678:
1676:
1674:
1670:
1665:
1661:
1659:
1654:
1650:
1646:
1640:
1632:
1630:
1624:
1622:
1620:
1601:
1595:
1587:
1583:
1579:
1574:
403:based on the
2681:Scholarpedia
2665:
2653:. Retrieved
2630:
2613:
2591:
2572:
2552:
2529:
2510:
2473:
2466:
2431:
2427:
2420:
2385:
2381:
2375:
2366:11336/114615
2341:(33): 1143.
2338:
2334:
2324:
2308:
2273:
2269:
2256:
2221:
2217:
2206:
2189:
2185:
2179:
2162:
2149:
2139:
2120:
2114:
2095:
2089:
2054:
2050:
2044:
2025:
1997:
1993:
1983:
1964:
1955:
1936:
1927:
1908:
1877:
1832:
1823:
1782:
1761:
1753:
1748:
1740:
1716:
1704:
1701:
1682:
1662:
1642:
1628:
1575:
1506:
1440:
1206:
1198:sample space
1093:
667:
578:
495:
463:Thomas Bayes
460:
437:
412:
411:expresses a
322:
321:
256:Bayes factor
38:
2683:4(8):5230.
2239:1874/415909
2224:(1): 1–26.
2000:(1): 1–40.
903:given that
729:proposition
556:given that
425:frequentist
409:probability
2655:2013-11-03
2441:1903.08008
2395:1709.01449
2173:1810.09433
2064:2008.01006
1829:"Bayesian"
1815:References
1770:techniques
1653:parameters
476:algorithms
401:statistics
201:Estimators
73:Background
59:Likelihood
2248:234108684
2186:Sankhya B
2081:220935477
1544:∣
1532:∝
1523:∣
1443:integrate
1395:∣
1377:∑
1338:∣
1323:⋯
1285:∣
1238:∣
1194:partition
1164:…
980:∣
861:∣
688:≠
619:∣
595:∣
101:Coherence
55:Posterior
2718:Category
2612:(2007).
2550:(2021).
2509:(2000).
2458:88522683
2412:26590874
2300:34400814
2192:: 1–43.
1963:(2012).
1935:(2014).
1907:(2020).
1876:(2013).
1788:See also
1750:analyses
1651:, model
1202:outcomes
407:, where
67:Evidence
2343:Bibcode
2291:8526359
1196:of the
998:is the
879:is the
780:is the
467:a paper
362:-zee-ən
668:where
415:in an
2701:(PDF)
2651:(PDF)
2454:S2CID
2436:arXiv
2434:(2).
2408:S2CID
2390:arXiv
2244:S2CID
2169:arXiv
2154:(PDF)
2077:S2CID
2059:arXiv
1192:is a
1127:. If
478:like
429:limit
417:event
396:-zhən
63:Prior
1754:The
1717:The
1582:mode
1576:The
516:and