either 1) dismisses a positive result as a false positive (i.e., assumes it is false) because the actual incidence rate in the population (or a previously tested population) is low relative to the false positive rate, or 2) dismisses the possibility of a false positive result (i.e., assumes it is true) because the actual incidence rate in the population (or previously tested population) is high relative to the false positive rate. In other words, Base Rate
Fallacy is when one has a bias toward the rate of incidence in a population or previously experienced population when evaluating the Base Rate Error, rather than taking the normal, more objective precautions to compensate for Base Rate Error. Base Rate Fallacy is also a factor in Confirmation Biases (a feedback loop where the base rate fallacy can falsely reinforce the perception of the incidence rate). Additionally, False Positive Paradox was merged into this article, but it is an example of Base Rate Error, not Base Rate Fallacy. —
{\displaystyle {\begin{aligned}&{}\quad P(\mathrm {terrorist} \mid \mathrm {bell} )\\&={\frac {P(\mathrm {bell} \mid \mathrm {terrorist} )P(\mathrm {terrorist} )}{P(\mathrm {bell} )}}\\&={\frac {P(\mathrm {bell} \mid \mathrm {terrorist} )\times P(\mathrm {terrorist} )}{P(\mathrm {bell} \mid \mathrm {terrorist} )\times P(\mathrm {terrorist} )+P(\mathrm {bell} \mid \mathrm {nonterrorist} )\times P(\mathrm {nonterrorist} )}}\\&={\frac {0.99\cdot (100/1,000,000)}{{\frac {0.99\cdot 100}{1,000,000}}+{\frac {0.01\cdot 999,900}{1,000,000}}}}\\&=1/102\approx 1\%\end{aligned}}}
would be about the same as the false-positive rate of the device. These special conditions do sometimes hold: for instance, about half the women undergoing a pregnancy test are actually pregnant, and some pregnancy tests give about the same rates of false positives and of false negatives. In this case, the rate of false positives per positive test will be nearly equal to the rate of false positives per non-pregnant woman. This is why it is very easy to fall into this fallacy: by coincidence it gives the correct answer in many common situations.
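That coincidence is easy to verify numerically. A minimal Python sketch (the 50% prior and the symmetric 5% error rate are illustrative assumptions, not figures from the comment above):

```python
# When the prior is ~50% and the false-positive and false-negative rates are
# equal, P(condition | positive test) comes out equal to the test's accuracy,
# so the fallacious shortcut happens to give roughly the right answer.
prior = 0.5    # assumed: about half of those tested actually have the condition
error = 0.05   # assumed: equal false-positive and false-negative rates

sensitivity = 1 - error
p_positive = sensitivity * prior + error * (1 - prior)
posterior = sensitivity * prior / p_positive
print(round(posterior, 2))  # 0.95 -- same as the sensitivity here
```

With a much smaller prior the same arithmetic gives a posterior far below the sensitivity, which is the fallacy the rest of this page discusses.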
consider what happens if an identical alarm system were set up in a second city with no terrorists at all. As in the first city, the alarm sounds for 1 out of every 100 non-terrorist inhabitants detected, but unlike in the first city, the alarm never sounds for a terrorist. Therefore 100% of all occasions of the alarm sounding are for non-terrorists, but a false negative rate cannot even be calculated. The 'number of non-terrorists per 100 bells' in that city is 100, yet P(
believe
Knowledge articles should be written in a way that someone with only an elementary knowledge of math and statistics could read them (though there are always exceptions, as there are to all rules, of course). And to answer your question, yes, I do believe there is "a happy medium that simplifies without taking so much away". --Spannerjam 16:31, 2 September 2013 (UTC)
"In some experiments, students were asked to estimate the Grade Point Averages of hypothetical students. When given relevant statistics about GPA distribution, students tended to ignore them if given descriptive information about the particular student, even if the new descriptive information did not seem to have anything to do with school performance."
Throughout this article, I found "Incidence" to be defined as "the proportion of diseased people in the population", while in
Epidemiology this definition actually pertains to "Prevalence". Incidence would be the proportion of people getting the condition in a condition-free population over a certain
Isn't this article confusing Base Rate Error with Base Rate
Fallacy/Bias? The first is something that can be calculated, it is not a cognitive bias, but rather a statistical factor one needs to be aware of when evaluating the results of a test in a population. The second is a cognitive bias where one
In many real-world situations, though, particularly problems like detecting criminals in a largely law-abiding population, the small proportion of targets in the large population makes the base rate fallacy very applicable. Even a very low false-positive rate will result in so many false alarms as to
The base rate fallacy is so misleading in this example because there are many more non-terrorists than terrorists. If, instead, the city had about as many terrorists as non-terrorists, and the false-positive rate and the false-negative rate were nearly equal, then the probability of misidentification
Its opposite, the Iron Law of
Phlegmatism, which I have just concocted, is equally true: some people are so impressed with the hum-drum ordinary that they can't open their eyes to see something new. Here I have shown an example of Somebody-Or-Other's Principle, which states that a Really Big Truth
Imagine that the city's entire population of one million people pass in front of the camera. About 99 of the 100 terrorists will trigger the alarm — and so will about 9,999 of the 999,900 non-terrorists. Therefore, about 10,098 people will trigger the alarm, among which about 99 will be terrorists.
I thought that base rate neglect involved ignoring priors in a
Bayesian context. For example, if a medical test with a 5% false positive rate is applied to a population whose background incidence of the disease is, say, 1%, the great majority of test results will be faulty: they will indicate disease
Given its prominence in psychology discussions of base rate neglect, I'm surprised that the taxi problem is not discussed anywhere on this page. Some discussion of the problem could be added to the "Findings in psychology" section, or the problem could be added as the first of the examples in the
The fallacy arises from confusing the natures of two different failure rates. The 'number of non-bells per 100 terrorists' and the 'number of non-terrorists per 100 bells' are unrelated quantities. One does not necessarily equal the other, and they don't even have to be almost equal. To show this,
I assume you are referring to me. My guiding philosophy is "We should not write so that it is possible for the reader to understand us, but so that it is impossible for him to misunderstand us." The reason I deleted or dramatically altered much of the content was that it to me seemed perplexing. I
Understanding the effects of base rates is critical for properly interpreting the results from cancer screenings. Unfortunately, very few people understand that in the context of the general screening of asymptomatic patients, most of those who receive a positive (indicative of cancer) screening
There is no reason why you should not write a better intro para. I have difficulty following this stuff, but my principal objection to the intro para as it stands is that it doesn't describe the Base rate fallacy. At least, not in terms that the interested generalist can follow. A good wiki
Have you ever sat through a boring lecture on some topic where you were thinking something like "Why is this even relevant? How would I ever use this knowledge in real life"? Well, we have an important topic, and nowadays
Knowledge articles are one of the most common ways to learn about a topic
The false positive paradox is a common example of the base rate fallacy and probably doesn't deserve its own article. However, the text of the other article does a better job of explaining the idea in plain language (as opposed to just throwing math at you) than this one, and thus should not be
Very sorry to get your name wrong, Spannerjam: too many of my tabs open at once and I got confused. It looks like you're still working on the article, so I'll withhold further comment until you've done more. The section on mathematical formalism was correct, and probability is relatively simple
The article currently has an example that reads: "A group of policemen have breathalyzers displaying false drunkness in 5% of the cases. 1/1000 chauffeurs are drunk drivers. Suppose the policemen then stop a driver and force him to take a breathalyzer test. How high is the probability that a
There are far more salespeople, but those salespeople are not usually "very shy and quiet", so why is the assumption unjustified? Isn't it like hearing that somebody speaks
Finnish and (justifiably) thinking it likely that they are from Finland, even though most people are not from there?
Where are the reliable sources identifying Base rate fallacy as a cognitive bias? More neutral to describe it as an error. Also, I agree that the article needed simplifying, that the initial example was too complicated and that the article had too much unreferenced content, but it seems
), the probability that a terrorist has been detected given the ringing of the bell? Someone making the 'base rate fallacy' would infer that there is a 99% chance that the detected person is a terrorist. Although the inference seems to make sense, it is actually bad reasoning, and a
Thus, in the example the probability was overestimated by more than 100 times due to the failure to take into account the fact that there are about 10000 times more nonterrorists than terrorists (a.k.a. failure to take into account the 'prior probability' of being a terrorist).
entry para should answer (1) "What is it?" and (2) "Why should I care?", say I. If this one also starts out with an assertion which - according to you (and who should doubt it?) - is plain wrong, and if you know what the base rate fallacy is, please share? Please? Success
For example, if someone hears that a friend is very shy and quiet, they might think the friend is more likely to be a librarian than a salesperson, even though there are far more salespeople than librarians overall - hence making it more likely that their friend is actually a
and I've copied it here to give other editors a chance to make it more accessible and consistent. I think the article needs a section towards the end on Bayes'
Theorem, but it needs to be explained adequately for those who don't understand the formalism.
. As the article mentions, people have been imprisoned because of it. Second, the reason there are examples of medical testing and law enforcement is because those areas are where the fallacy often appears – and most impactful when it does – in real life.
I agree that the lede is not readily accessible to general readers. Base rate is not clearly defined, and no examples are provided to illustrate why base rate neglect is such an important phenomenon. I'll work on a revision and will post it here for
took out a lot of content which is actually quite useful, eg that it's also known as Base rate neglect and that the error involves confusing two different failure rates. Is there a happy medium that simplifies without taking so much away? Cheers,
"This finding has been used to argue that interviews are an unnecessary part of the college admissions process because empirical evidence shows that interviewers are unable to pick successful candidates better than basic statistics."
outcome do not have cancer or would not die from the cancer if they had not been screened. I would like to add a section to the page that discusses base rate neglect in the context of the interpretation of cancer screening results.
maths, so I think it belongs in the article, but a long way down since it's the least accessible content, and it needs to be made consonant with the given example. "Error in thinking" is much preferable, so thanks for that. Cheers,
Example #1 of Drunk
Drivers is still wrong. The example states the breathalyzer only gives false positives 5% of the time and yet the example claims the probability a positive test is accurate is only 2%. This is patently absurd.
, the base rate fallacy is committed by assuming that P(terrorist | bell) equals P(bell | terrorist) and then adding the premise that P(bell | terrorist)=99%. Now, is it true that P(terrorist | bell) equals P(bell | terrorist)?
Hi IP editor 219.25.218.88. Upon thinking about it, it seems the definition should be expanded, because the typical error actually excludes both. I added a mathematical formalism section to try to clarify the matter.
a formal fallacy (also called deductive fallacy) is a pattern of reasoning rendered invalid by a flaw in its logical structure that can neatly be expressed in a standard logic system, for example propositional
whether we like it or not. By giving engaging examples that are relevant to people's lives, people can absorb the statistical concepts and connect them to the practical situations they might be used in.
In a city of 1 million inhabitants let there be 100 terrorists and 999,900 non-terrorists. To simplify the example, it is assumed that all people present in the city are inhabitants. Thus, the
probability of that same inhabitant being a non-terrorist is 0.9999. In an attempt to catch the terrorists, the city installs an alarm system with a surveillance camera and automatic
You are correct: thanks for pointing this out. Although it's unoriginal and used by a lot of books, maybe the article should lead with Tversky and Kahneman's blue/green taxi example?
So, the probability that a person triggering the alarm actually is a terrorist, is only about 99 in 10,098, which is less than 1%, and very, very far below our initial guess of 99%.
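The arithmetic in this example can be double-checked with a few lines of code. A minimal Python sketch using only the figures given above (1,000,000 inhabitants, 100 terrorists, 99% detection rate, 1% false-alarm rate):

```python
# Terrorist-detection example from this thread: compute P(terrorist | bell)
# as a ratio of expected counts, which is Bayes' theorem in disguise.
population = 1_000_000
terrorists = 100
non_terrorists = population - terrorists   # 999,900

true_alarms = 0.99 * terrorists            # ~99 terrorists trigger the alarm
false_alarms = 0.01 * non_terrorists       # ~9,999 non-terrorists trigger it

p_terrorist_given_bell = true_alarms / (true_alarms + false_alarms)
print(round(p_terrorist_given_bell, 4))    # 0.0098, i.e. about 1%, not 99%
```

The result, 99/10,098 = 1/102, agrees with the formula quoted elsewhere on this page.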
Is it really necessary to use examples relating to Covid, drink driving, surveillance etc. that arouse strong political feelings, to illustrate a philosophical concept?
to take into account the prior probability of any randomly selected inhabitant in the city being a terrorist and the total probability of the bell ringing:
{\displaystyle P(\mathrm {terrorist} \mid \mathrm {bell} )\,{\overset {\underset {\mathrm {?} }{}}{=}}\,P(\mathrm {bell} \mid \mathrm {terrorist} ).}
No, it's not absurd. Calculate all parameters passionately. Or may be you are trying to say that conditions of this example is ridiculous?
The first fallacy assumes that students and college interviewers possess the same lack of skill in judging relevance of information.
The false positive rate: If the camera scans a non-terrorist, a bell will not ring 99% of the time, but it will ring 1% of the time.
The false negative rate: If the camera scans a terrorist, a bell will ring 99% of the time, and it will fail to ring 1% of the time.
It's cocktail-party social science, like the Peter Principle and the Dunning-Kruger Effect. Great fun, and sometimes useful.
Suppose now that an inhabitant triggers the alarm. What is the chance that the person is a terrorist? In other words, what is P(
I agree with the previous poster. This article seems totally bogus after reading other explanations of the base rate fallacy at
This base rate fallacy is not a "formal fallacy" as claimed in the opening paragraph. Utter nonsense. From the click-through:
If you have some other example in mind that would get this across better with lower risk of polarization, please feel free to
It looks like the text has been changed, but I still can't make heads or tails of it; maybe it needs a concrete example?
on 4 November 2018. For the contribution history and old versions of the redirected page, please see
B. Statistics are a better judge of suitability for academic placement than descriptive information.
chauffeur who fails the test (assuming you don't know anything else about him or her) is driving drunk?"
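For what it's worth, the quoted example does yield a determinate answer if one extra assumption is granted — that the breathalyzer never gives a false negative (the example does not state this). A quick Python sketch under that assumption:

```python
# Breathalyzer example: 1 in 1,000 drivers is drunk; the device shows a false
# positive 5% of the time. ASSUMPTION (not in the example): it never fails to
# flag a genuinely drunk driver.
p_drunk = 1 / 1000
p_pos_given_drunk = 1.0        # assumed: no false negatives
p_pos_given_sober = 0.05       # 5% false positive rate

p_pos = p_pos_given_drunk * p_drunk + p_pos_given_sober * (1 - p_drunk)
p_drunk_given_pos = p_pos_given_drunk * p_drunk / p_pos
print(round(p_drunk_given_pos, 3))  # 0.02 -- only ~2% of flagged drivers are drunk
```

Without that assumption (or some stated sensitivity), the objection below that the example is underspecified stands.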
Therefore, college interviews are irrelevant when considering suitability of an academic candidate.
Not A Formal Fallacy -- and starting an article with an obviously false claim is a Bad Idea(tm.)
For instance, appealing to vivid examples should not be taken as knowledge of prior probabilities.
859:) = 0%. There is zero chance that a terrorist has been detected given the ringing of the bell.
probability of a randomly selected inhabitant of the city being a terrorist is 0.0001, and the
Therefore, college interviewers will ignore statistics in favour of descriptive information.
will show that the chances they are a terrorist are actually near 1%, not near 99%.
A. Statistics are always relevant, and descriptive information is always irrelevant.
The descriptive information given to students in the experiment was irrelevant.
This example cannot be solved, because it contains too little information. --
claimed, I am saying that I have observed somewhat the same thing. It's a
could help with making the examples more intuitive and easy to understand.
https://en.wikipedia.org/Representativeness_heuristic#The_taxicab_problem
http://www.ted.com/talks/peter_donnelly_shows_how_stats_fool_juries.html
http://www.schneier.com/blog/archives/2006/07/terrorists_data.html
I think this is confusing for a reader and should be corrected.
Students ignore statistics in favour of descriptive information.
WP has some coverage here that might be cross-referenced:
College interviews give a form of descriptive information.
College interviews give a form of descriptive information.
That is not true. Instead, the correct calculation uses
where there is none. It's not that the medical test is
Thank you, I have retitled the relevant subsections.
is one the opposite of which is equally true. FWIW.
Yea, that makes no sense. I'll fix it in a minute. --
First of all, base rate fallacy is more than just a
though: rather its significance can be overweighed.
It is a plausibly claimed empirical observation.
http://citeseer.ist.psu.edu/axelsson00baserate.html
I think including tree diagrams like the ones in
. Somebody should really correct this article.
Here's another example (starts after 11 min):
I don't understand what this statement means?
. The software has two failure rates of 1%:
The following was removed from the article
; for the discussion at that location, see
Ya see that sort of thing pretty often...
(Section heading) Mathematical formalism
make such a system useless in practice.
2A00:23C5:FE1C:3701:64AA:CCF2:BB1C:5B9E
I don't get the librarian example
Politically contentious examples
23:12, 13 September 2024 (UTC)
"Examples" section. Thoughts?
Tversky/Kahneman taxi problem?
In the above example, where P(
This is fallacious in itself:
13:26, 14 September 2013 (UTC)
11:36, 14 September 2013 (UTC)
07:09, 16 December 2019 (UTC)
13:31, 7 September 2013 (UTC)
19:29, 2 September 2013 (UTC)
15:13, 2 September 2013 (UTC)
21:26, 12 December 2006 (UTC)
03:50, 19 January 2024 (UTC)
03:50, 19 January 2024 (UTC)
03:36, 11 January 2023 (UTC)
17:51, 9 February 2018 (UTC)
07:45, 5 December 2017 (UTC)
11:28, 4 November 2018 (UTC)
21:12, 21 August 2023 (UTC)
"Incidence" vs "Prevalence"
21:38, 1 January 2019 (UTC)
21:38, 1 January 2019 (UTC)
21:38, 1 January 2019 (UTC)
) means the probability of
facial recognition software
23:21, 21 August 2009 (UTC)
08:01, 2 January 2008 (UTC)
The second fallacy assumes:
18:29, 16 August 2023 (UTC)
12:02, 12 April 2017 (UTC)
02:02, 16 July 2017 (UTC)
18:18, 10 June 2010 (UTC)
21:28, 11 July 2007 (UTC)
19:44, 17 Jun 2004 (UTC)
False positive paradox
False positive paradox
and add it yourself.
philosophical concept
Proposed merger from
(end of copied text)
The contents of the
Fallacies in example
deleted entirely.
Improve the lede?
David Lloyd-Jones
calculation below
A cognitive bias?
Incorrect example
Base rate fallacy
Cancer screening
Derek Di My Mind
kind of truth.
Sasuke Sarutobi
By calling it
Bayes' theorem
Something else
Visualization
MartinPoulter
MartinPoulter
MartinPoulter
89.142.146.23
MartinPoulter
its talk page
salesperson.
this article
in this edit
Removed text
Coolclawcat
Subspace345
Coolclawcat
its history
timespan.
al-Shimoni
Definition
irrelevant
page were
It says:
DKEdwards
Charles01
plausibly
complete.
base rate
base rate
this edit
Regutten
Regutten
Regutten
comment.
Anka.213
WavePart
Reyemile
be bold
Klbrain
RonCram
Merger
Jode32
given
AND/OR
merged
logic
Jbl26
CSTAR
Kaba3
into
Done
Taak