Cognitive bias mitigation

Cognitive bias mitigation is the prevention and reduction of the negative effects of cognitive biases – unconscious, automatic influences on human judgment and decision making that reliably produce reasoning errors.
The 'Gimli Glider' incident, in which a July 23, 1983 Air Canada flight from Montreal to Edmonton ran out of fuel at 41,000 feet over Manitoba because of a measurement error during refueling, an outcome later determined to be the result of a series of unchecked assumptions made by ground personnel. Without power to operate the radio, radar or other navigation aids, and with only manual operation of the aircraft's control surfaces, the flight crew managed to locate an abandoned Royal Canadian Air Force landing strip near Gimli, Manitoba. Without engine power, and with only manual wheel braking, the pilot put the aircraft down with 61 passengers plus crew aboard and brought it safely to a stop. This outcome was the result of skill (the pilot had glider experience) and luck (the co-pilot happened to know about the airstrip); there were no deaths, the damage to the aircraft was modest, and there were knowledgeable survivors to inform modifications to fueling procedures at all Canadian airports.

In this view, System 2 is slow, simple-minded and lazy, usually defaulting to System 1's advisories and overriding them only when intensively trained to do so or when cognitive dissonance would result. In this view, our 'heuristic toolkit' resides largely in System 1, conforming to the view of cognitive biases as unconscious, automatic and very difficult to detect and override. Evolutionary psychology practitioners emphasize that our heuristic toolkit, despite the apparent abundance of 'reasoning errors' attributed to it, actually performs exceptionally well, given the rate at which it must operate, the range of judgments it produces, and the stakes involved. The System 1/2 view of the human reasoning mechanism appears to have empirical plausibility and thus may contribute to a theory and practice of cognitive bias mitigation.

Practitioners such as Cosmides, Tooby, Haselton, Confer and others posit that cognitive biases are more properly referred to as cognitive heuristics, and should be viewed as a toolkit of cognitive shortcuts selected for by evolutionary pressure: features rather than flaws, as assumed in the prevalent view. Theoretical models and analyses supporting this view are plentiful.

One approach to mitigation originally suggested by Daniel Kahneman and Amos Tversky, expanded upon by others, and applied in real-life situations, is reference class forecasting. This approach involves three steps: with a specific project in mind, identify a number of past projects that share a large number of elements with the project under scrutiny; for this group of projects, establish a probability distribution of the parameter being forecast; and compare the specific project with the group of similar projects, in order to establish the most likely value of the selected parameter for the specific project. This simply stated method masks potential complexity in application to real-life projects: few projects can be characterized by a single parameter; multiple parameters complicate the process exponentially; gathering sufficient data on which to build robust probability distributions is problematic; and project outcomes are rarely unambiguous, and their reporting is often skewed by stakeholders' interests. Nonetheless, this approach has merit as part of a cognitive bias mitigation protocol when the process is applied with a maximum of diligence, in situations where good data is available and all stakeholders can be expected to cooperate.
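As an illustration only, the following Python sketch walks through the three steps above for a project characterized by a single parameter (cost). The function name, the data layout and the 0.7 similarity threshold are assumptions introduced for this example, not elements of any published formulation of the method.

```python
import statistics

def reference_class_forecast(candidate_elements, past_projects, overlap_threshold=0.7):
    """Illustrative sketch of reference class forecasting for one parameter (cost).

    candidate_elements: set of descriptive elements of the project being forecast.
    past_projects: list of dicts such as {"elements": {...}, "cost": float}.
    """
    # Step 1: identify past projects sharing a large number of elements with the candidate.
    reference_class = [
        p for p in past_projects
        if len(candidate_elements & p["elements"]) / len(candidate_elements) >= overlap_threshold
    ]
    if len(reference_class) < 3:
        raise ValueError("Too few comparable projects to build a usable distribution.")

    # Step 2: establish an empirical distribution of the parameter being forecast.
    costs = sorted(p["cost"] for p in reference_class)

    # Step 3: position the candidate within that distribution; report the median as the
    # most likely value, with a rough 10th-90th percentile spread.
    i10, i90 = int(0.1 * (len(costs) - 1)), int(0.9 * (len(costs) - 1))
    return {"most_likely_cost": statistics.median(costs),
            "p10": costs[i10], "p90": costs[i90],
            "reference_class_size": len(costs)}
```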
The evolutionary view sketched above suggests that negative reasoning outcomes arise primarily because the reasoning challenges faced by modern humans, and the social and political context within which they are presented, make demands on our ancient 'heuristic toolkit' that at best create confusion as to which heuristics to apply in a given situation, and at worst generate what adherents of the prevalent view call 'reasoning errors'.

In human reliability engineering, although some attention is paid to the human reasoning mechanism itself, the dominant approach is to anticipate problematic situations, constrain human operations through process mandates, and guide human decisions through fixed response protocols specific to the domain involved. While this approach can produce effective responses to critical situations under stress, the protocols involved must be viewed as having limited generalizability beyond the domain for which they were developed, with the implication that solutions in this discipline may provide only generic frameworks for a theory and practice of cognitive bias mitigation.

In a similar vein, Mercier and Sperber describe a theory for confirmation bias, and possibly other cognitive biases, that is a radical departure from the prevalent view, which holds that human reasoning is intended to assist individual economic decisions. Their view suggests that reasoning evolved as a social phenomenon and that the goal was argumentation, i.e. to convince others and to be careful when others try to convince us. It is too early to tell whether this idea applies more generally to other cognitive biases, but the point of view supporting the theory may be useful in the construction of a theory and practice of cognitive bias mitigation.

Atul Gawande, an accomplished professional in the medical field, recounts the results of an initiative at a major US hospital, in which a test run showed that doctors skipped at least one of only five steps in a third of certain surgery cases; nurses were subsequently given the authority and responsibility to catch doctors missing any step in a simple checklist aimed at reducing central line infections. In the subsequent 15-month period, infection rates went from 11% to 0%, eight deaths were avoided and some $2 million in avoidable costs were saved.

The London Ambulance Service failures, in which several Computer Aided Dispatch (CAD) system failures resulted in out-of-specification service delays and reports of deaths attributed to these delays. A 1992 system failure was particularly impactful, with service delays of up to 11 hours resulting in an estimated 30 unnecessary deaths in addition to hundreds of delayed medical procedures. This incident is one example of how large computer system development projects exhibit major flaws in planning, design, execution, testing, deployment and maintenance.

Game theory, a discipline with roots in economics and system dynamics, is a method of studying strategic decision making in situations involving multi-step interactions with multiple agents with or without perfect information. As with decision theory, the theoretical underpinning of game theory assumes that all decision makers are rational agents trying to maximize the economic expected value/utility of their choices, and that to accomplish this they utilize formal analytical methods such as mathematics, probability, statistics, and logic, under cognitive resource constraints.
A study of software engineering professionals suggests that for the task of estimating software projects, despite the strong analytical aspect of the task, standards of performance focusing on workplace social context were much more dominant than formal analytical methods. This finding, if generalizable to other tasks and disciplines, would discount the potential of expert-level training as a cognitive bias mitigation approach, and could contribute a narrow but important idea to a theory and practice of cognitive bias mitigation.
Representativeness heuristic, the tendency to judge something as belonging to a class based on a few salient characteristics, without accounting for the base rates of those characteristics. For example, the belief that one will not become an alcoholic because one lacks some characteristic of an alcoholic stereotype, or that one has a higher probability of winning the lottery because one buys tickets from the same kind of vendor as several known big winners.

Availability heuristic, the tendency to estimate that what is easily remembered is more likely than that which is not. For example, estimating that an information meeting on municipal planning will be boring because the last such meeting you attended (on a different topic) was, or not believing your Member of Parliament's promise to fight for women's equality because he did not show up to your home bake-sale fundraiser for him.
The games, which elicited the biases and provided participants with individualized feedback, mitigating strategies, and practice, produced an immediate reduction of more than 30% in commission of the biases and a long-term (two- to three-month delayed) reduction of more than 20%. The instructional videos were also effective, but less effective than the games.
Despite analytical proofs of the existence of at least one equilibrium in a wide range of scenarios, game theory predictions, like those of decision theory, often do not match actual human choices. As with decision theory, practitioners tend to view such deviations as 'irrational' and, rather than attempting to model such behavior, by implication hold that cognitive bias mitigation can only be achieved by decision makers becoming more like rational agents.
Five people, including both expedition leaders, lost their lives despite explicit warnings in briefings prior to and during the ascent of Everest. In addition to the leaders' mistakes, most team members, though they recognized their leaders' faulty judgments, failed to insist on following through on the established ascent rules.
Studies and anecdotes reported in popular-audience media of firefighter captains, military platoon leaders and others making correct, snap judgments under extreme duress suggest that these responses are likely not generalizable, and may contribute to a theory and practice of cognitive bias mitigation only the general idea of intensive, domain-specific training.
Framing effect, the tendency to react to how information is framed, beyond its factual content. For example, choosing no surgery when told it has a 10% failure rate, where one would have opted for the surgery if told it has a 90% success rate, or opting not to choose organ donation as part of a driver's license renewal when the default is 'No'.
However, even when theoretical equilibria exist, i.e. when optimal decision strategies are available for all agents, real-life decision makers often do not find them; indeed, they sometimes apparently do not even try to find them, suggesting that some agents are not consistently 'rational'. Game theory does not appear to accommodate any kind of agent other than the rational agent.
Challenges to realizing this potential include: accumulating the considerable amount of appropriate real-world 'training sets' for the neural network portion of such models; characterizing real-life decision-making situations and outcomes well enough to drive the models effectively; and the lack of a direct mapping from a neural network's internal structure to the components of the human reasoning mechanism.
Anchoring bias, the tendency to produce an estimate near a cue amount that may or may not have been intentionally offered. For example, producing a quote based on a manager's preferences, or negotiating a house purchase price from the starting amount suggested by a real estate agent rather than from an objective assessment of value.
Positive, or descriptive, decision theory concerns itself with what people actually do; its practitioners tend to acknowledge the persistent existence of 'irrational' behavior, and while some mention human motivation and biases as possible contributors to such behavior, these factors are not made explicit in their models.
An increasing number of academic and professional disciplines are identifying means of cognitive bias mitigation. What follows is a characterization of the assumptions, theories, methods and results in disciplines concerned with the efficacy of human reasoning that plausibly bear on a theory and/or practice of cognitive bias mitigation.
The Sullivan Mine incident of May 18, 2006, in which two mining professionals and two paramedics at the closed Sullivan mine in British Columbia, Canada, all specifically trained in safety measures, lost their lives by failing to understand a life-threatening situation that in hindsight was obvious.
Another study takes a step back from focusing on cognitive biases and describes a framework for identifying 'Performance Norms', criteria by which reasoning outcomes are judged correct or incorrect, so as to determine when cognitive bias mitigation is required, to guide identification of the biases that may be 'in play' in a real-world situation, and subsequently to prescribe their mitigations.
Similarly, expert-level training in foundational disciplines such as mathematics, statistics, probability and logic can be useful for cognitive bias mitigation when the expected standard of performance reflects such formal analytical methods.
Daniel Kahneman and others have authored recent articles in business and trade magazines addressing the notion of cognitive bias mitigation in a limited form. These contributions assert that cognitive bias mitigation is necessary and offer general suggestions for how to achieve it, though the guidance is limited to only a few cognitive biases and is not self-evidently generalizable to others.
A recent research effort by Morewedge and colleagues (2015) found evidence for domain-general forms of debiasing. In two longitudinal experiments, debiasing training techniques featuring interactive games and instructional videos targeted six cognitive biases: anchoring, bias blind spot, confirmation bias, fundamental attribution error, projection bias, and representativeness.
Intensive situational training is capable of providing individuals with what appears to be cognitive bias mitigation in decision making, but it amounts to a fixed strategy of selecting the single best response to recognized situations, regardless of the 'noise' in the environment.
One line of Gigerenzer's work led to the 'Fast and Frugal' framing of the human reasoning mechanism, which focuses on the primacy of 'recognition' in decision making, backed up by tie-resolving heuristics operating in a low-cognitive-resource environment.
In a series of objective tests, models based on this approach outperformed models based on rational agents maximizing their utility using formal analytical methods. One contribution to a theory and practice of cognitive bias mitigation from this approach is that it addresses mitigation without explicitly targeting individual cognitive biases, focusing instead on the reasoning mechanism itself so as to avoid the manifestation of cognitive biases.
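The following sketch illustrates, under stated assumptions, the general shape of a recognition-first, cue-second choice rule of the kind the 'Fast and Frugal' framing describes; the function name and data structures are invented for this example and are not Gigerenzer's own models.

```python
def fast_frugal_choice(option_a, option_b, recognized, cue_values):
    """Minimal sketch of a 'fast and frugal' choice between two options.

    recognized: set of options the agent has heard of.
    cue_values: dict mapping an option to a single tie-breaking cue (higher is better).
    """
    a_known, b_known = option_a in recognized, option_b in recognized
    # Recognition comes first: if exactly one option is recognized, choose it outright.
    if a_known and not b_known:
        return option_a
    if b_known and not a_known:
        return option_b
    # Otherwise fall back to a simple tie-resolving cue comparison.
    return option_a if cue_values.get(option_a, 0) >= cue_values.get(option_b, 0) else option_b

# Example: a two-alternative judgment where only one option is recognized.
print(fast_frugal_choice("Heidelberg", "Erlangen", recognized={"Heidelberg"}, cue_values={}))
```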
Normative, or prescriptive, decision theory concerns itself with what people should do, given the goal of maximizing expected value/utility; in this approach there is no explicit representation in practitioners' models of unconscious factors such as cognitive biases, i.e. all factors are considered conscious choice parameters for all agents.
In principle, such models are capable of modeling decision making that takes account of human needs and motivations within social contexts, and they suggest their consideration in a theory and practice of cognitive bias mitigation.
This, along with the findings of evolutionary psychology above, suggests that our cognitive heuristics are at their best when operating in a social, political and economic environment most like that of the Paleolithic/Holocene. If this is true, then one possible means of achieving at least some cognitive bias mitigation is to mimic, as much as possible, Paleolithic/Holocene social, political and economic scenarios when performing a reasoning task that could attract negative cognitive bias effects.
One major difference between decision theory and game theory is the notion of 'equilibrium', a situation in which all agents agree on a strategy because any deviation from that strategy punishes the deviating agent.
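A minimal sketch, assuming a two-player game given as a pair of payoff matrices, of what 'equilibrium' means here: a cell from which neither player can profitably deviate on their own. The helper below is illustrative only and is not drawn from any particular game-theory library.

```python
from itertools import product

def pure_nash_equilibria(payoffs_row, payoffs_col):
    """Return all pure-strategy equilibria of a two-player game given as payoff matrices.

    payoffs_row[i][j] / payoffs_col[i][j]: payoffs when the row player picks strategy i
    and the column player picks strategy j.
    """
    n_rows, n_cols = len(payoffs_row), len(payoffs_row[0])
    equilibria = []
    for i, j in product(range(n_rows), range(n_cols)):
        # Neither player can gain by deviating unilaterally from (i, j).
        row_best = all(payoffs_row[i][j] >= payoffs_row[k][j] for k in range(n_rows))
        col_best = all(payoffs_col[i][j] >= payoffs_col[i][l] for l in range(n_cols))
        if row_best and col_best:
            equilibria.append((i, j))
    return equilibria

# Prisoner's dilemma: strategy 0 = cooperate, 1 = defect; (1, 1) is the sole equilibrium,
# even though both players would prefer the outcome at (0, 0).
print(pure_nash_equilibria([[3, 0], [5, 1]], [[3, 5], [0, 1]]))  # -> [(1, 1)]
```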
The inference is that this part of the human brain is implicated in creating the deviations from rational-agent choices noted in emotionally valenced economic decision making. Practitioners in this discipline have demonstrated correlations between activity in this part of the brain and prospection activity, and neuronal activation has been shown to have measurable, consistent effects on decision making.
In the full range of game theory models there are many that do not guarantee the existence of equilibria; that is, there are conflict situations in which there is no set of agents' strategies that all agents agree are in their best interests.
The first person to succumb failed to accurately discern an anoxic environment at the bottom of a sump within a sampling shed, accessed by a ladder. After the first fatality, three other co-workers, all trained in hazardous operational situations, one after the other lost their lives in exactly the same manner, each apparently discounting the evidence of the previous victims' fate.
Coherent, comprehensive theories of cognitive bias mitigation are lacking. This article describes debiasing tools, methods, proposals and other initiatives, in academic and professional disciplines concerned with the efficacy of human reasoning, associated with the concept of cognitive bias mitigation; most address mitigation tacitly rather than explicitly.
Anthropologists have provided generally accepted scenarios of how our progenitors lived and what was important in their lives. These scenarios of social, political and economic organization are not uniform throughout history or geography, but there is a degree of stability throughout the Paleolithic era, and the Holocene in particular.
There is an emerging convergence between evolutionary psychology and the concept of our reasoning mechanism being segregated, approximately, into 'System 1' and 'System 2'. In this view, System 1 is the 'first line' of cognitive processing of all perceptions, including internally generated 'pseudo-perceptions'.
In a 2010 study, German researchers examined the role that certain cognitive biases may have had in the global financial crisis beginning in 2007. Their conclusion was that the expertise level of stock analysts and traders made them highly resistant to signals that did not conform to their beliefs in the continuation of the status quo. In the grip of strong confirmation bias reinforced by the overconfidence effect and the status quo bias, they apparently could not see the signals of financial collapse, even after those signals had become evident to non-experts.
Calikli, G., Bener, A., Arslan, B. (2010). "An Analysis of the Effects of Company Culture, Education and Experience on Confirmation Bias Levels of Software Developers and Testers." ACM/IEEE 32nd International Conference on Software Engineering (ICSE 2010), Volume 2.
Gambler's fallacy (also known as the sunk cost bias), the failure to reset one's expectations based on one's current situation. For example, refusing to pay again to purchase a replacement for a lost ticket to a desired entertainment, or refusing to sell a sizable long stock position in a rapidly falling market.
Prospect theory was an early inspiration for this discipline, and has been further developed by its practitioners. It is one of the earliest economic theories that explicitly acknowledge the notion of cognitive bias, though the model itself accounts for only a few, including loss aversion, the anchoring and adjustment bias, the endowment effect, and perhaps others.
Hindsight bias, the tendency to assess one's previous decisions as more effective than they were. For example, 'recalling' one's prediction that Vancouver would lose the 2011 Stanley Cup, or 'remembering' to have identified the proximate cause of the 2007 Great Recession.

Halo effect, the tendency to attribute unverified capabilities to a person based on an observed capability. For example, believing an Oscar-winning actor's assertion regarding the harvest of Atlantic seals, or assuming that a tall, handsome man is intelligent and kind.
Stephenson, Arthur G.; LaPiana, Lia S.; Mulville, Daniel R.; Rutledge, Peter J.; Bauer, Frank H.; Folta, David; Dukeman, Greg A.; Sackheim, Robert; et al. (1999-11-10). "Mars Climate Orbiter Mishap Investigation Board Phase I Report." National Aeronautics and Space Administration.
Practitioners tend to treat deviations from what a rational agent would do as 'errors of irrationality', with the implication that cognitive bias mitigation can only be achieved by decision makers becoming more like rational agents, though no explicit measures for achieving this are proffered.
System 1 automatically, subconsciously and near-instantaneously produces emotionally valenced judgments of the probable effect of those perceptions on the individual's well-being. By contrast, System 2 is responsible for 'executive control', taking System 1's judgments as advisories, making future predictions, via prospection, of their actualization, and then choosing which advisories, if any, to act on.
Decision theory is explicitly focused on human reasoning, judgment, choice and decision making, primarily in 'one-shot games' between two agents with or without perfect information. The theoretical underpinning of decision theory assumes that all decision makers are rational agents.

Confirmation bias, the tendency to seek out only that information which supports one's preconceptions, and to discount that which does not. For example, hearing only one side of a political debate, or failing to accept the evidence that one's job has become redundant.

Bandwagon effect, the tendency to do or believe what others do or believe. For example, voting for a political candidate because your father unfailingly voted for that candidate's party, or not objecting to a bully's harassment because the rest of your peers don't.

Kahneman, a Nobel Laureate in Economics, reports in a peer-reviewed study that highly experienced financial managers performed 'no better than chance', largely due to factors similar to those reported in the study above, an effect he termed the 'illusion of skill'.
Wright J. R., Leyton-Brown, K., Behavioral Game-Theoretic Models: A Bayesian Framework For Parameter Analysis, to appear in Proceedings of the 11th International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2012), (8 pages),
1336:
Kahneman, D. (2000). "Experienced Utility and Objective Happiness: A Moment-Based Approach." Chapter 37 in: D. Kahneman and A. Tversky (Eds.) "Choices, Values and Frames." New York: Cambridge University Press and the Russell Sage Foundation,
This debate contrasts the rational economic agent standard for decision making with one grounded in human social needs and motivations. The debate also contrasts the methods used to analyze and predict human decision making, i.e. formal analysis emphasizing intellectual capacities versus methods emphasizing emotional states.
693:
This discipline, though not focused on improving human reasoning outcomes as an end goal, is one in which the need for such improvement has been explicitly recognized, though the term "cognitive bias mitigation" is not universally used.
This study concluded that several cognitive biases were 'in play' on the mountain, along with other human dynamics. This was a case of highly trained, experienced people breaking their own rules, apparently under the influence of the overconfidence effect, the sunk cost fallacy, the availability heuristic, and perhaps other cognitive biases.

In most cases this is based on explicit reference to cognitive biases or their mitigation, in others on unstated but self-evident applicability. This characterization is organized along lines reflecting historical segmentation of disciplines, though in practice there is a significant amount of overlap.
A large body of evidence has established that a defining characteristic of cognitive biases is that they manifest automatically and unconsciously over a wide range of human reasoning, so even those aware of the existence of the phenomenon are unable to detect, let alone mitigate, their manifestation via awareness only.
This study refers to a broad research program with the goal of moving toward a theory and practice of cognitive bias mitigation.
Roth, E. et al. (1994). An empirical investigation of operator performance in cognitively demanding simulated emergencies. NUREG/CR-6208, Westinghouse Science and Technology Center. Report prepared for the Nuclear Regulatory Commission.
While this notion must remain speculative until further work is done, it appears to be a productive basis for conceiving options for constructing a theory and practice of cognitive bias mitigation.
No mention is made in formal prospect theory of cognitive bias mitigation, and there is no evidence of peer-reviewed work on cognitive bias mitigation in other areas of this discipline.

Practitioners tend to treat deviations from what a rational agent would do as evidence of important, but as yet not understood, decision-making variables, and they have as yet no explicit or implicit contributions to make to a theory and practice of cognitive bias mitigation.
2135:
Fiedler, K. & HĂĽtter, M. (2014). The limits of automaticity. J. Sherman, B. Gawronski, & Y. Trope (Eds.), Dual Processes in Social Psychology (pp. 497-513). New York: Guilford Publications, Inc.
1777:
Conroy, P., Kruchten, P. (2012). "Performance Norms: An Approach to Reducing Rework in Software Development", to appear in IEEE Xplore re 2012 Canadian Conference on Electrical and Computing Engineering.
2137: 1364:
Broseta, B., Costa-Gomes, M., Crawford, V. (2000). "Cognition and Behavior in Normal-Form Games: An Experimental Study." Department of Economics, University of California at San Diego, Permalink:
2093:
Garcia-Marques, L.; Ferreira, M. B. (2011). "Friends and foes of theory construction in psychological science: Vague dichotomies, unified theories of cognition, and the new experimentalism".
One study explicitly focused on cognitive bias as a potential contributor to a disaster-level event; it examined the causes of the loss of several members of two expedition teams on Mount Everest on two consecutive days in 1996.
This discipline explicitly challenges the prevalent view that humans are rational agents maximizing expected value/utility and using formal analytical methods to do so.
1889:
Haselton, M. G.; Bryant, G. A.; Wilke, A.; Frederick, D. A.; Galperin, A.; Frankenhuis, W. E.; Moore, T. (2009). "Adaptive Rationality: An Evolutionary Perspective on Cognitive Bias".
Neuroscience offers empirical support for the concept of segregating the human reasoning mechanism into System 1 and System 2, as described above, based on brain-activity imaging experiments using fMRI.
Decision makers are assumed to be rational agents trying to maximize the expected value/utility of their choices, and to accomplish this they utilize formal analytical methods such as mathematics, probability, statistics, and logic, under cognitive resource constraints.
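To make the rational-agent assumption concrete, the short sketch below computes an expected utility for each available action and picks the maximizing one; the action names, probabilities and utilities are hypothetical numbers chosen purely for illustration.

```python
def expected_utility(action, outcome_probabilities, utility):
    """Expected utility of one action: sum of p(outcome) * u(outcome)."""
    return sum(p * utility[outcome] for outcome, p in outcome_probabilities[action].items())

def rational_choice(actions, outcome_probabilities, utility):
    """Pick the action a rational agent would take: the one maximizing expected utility."""
    return max(actions, key=lambda a: expected_utility(a, outcome_probabilities, utility))

# Hypothetical surgery decision: the same facts yield the same expected utility whether
# described as a 90% success rate or a 10% failure rate, which is exactly the equivalence
# that the framing effect violates.
probs = {"surgery": {"recover": 0.9, "complication": 0.1},
         "no_surgery": {"recover": 0.5, "complication": 0.5}}
utils = {"recover": 1.0, "complication": 0.0}
print(rational_choice(["surgery", "no_surgery"], probs, utils))  # -> surgery
```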
NASA described the systemic cause of this mishap as an organizational failure, with the specific, proximate cause being unchecked assumptions across mission teams regarding the mix of metric and United States customary units used in different systems on the craft.
1065:
Roberto, M. A. (2002). "Lessons from Everest: The Interaction of Cognitive Bias, Psychological, Safety and System Complexity." California Management Review (2002) 45(1): 136–158.
1787:
Weinstein, N. D. (1980). "Unrealistic Optimism About Future Life Events". Department of Human Ecology and Social Sciences, Cook College, Rutgers, The State University".
Neural network models, in which weights govern the contribution of signals to each connection, allow very small models to perform rather complex decision-making tasks at high fidelity.
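A toy sketch of that structure follows: input signals are propagated through weighted connections to a small hidden layer and collected into a single output used as a binary decision. The layer sizes and random weights are arbitrary assumptions for illustration; a real model would be trained on data.

```python
import numpy as np

rng = np.random.default_rng(0)

# A deliberately tiny two-layer network: 3 input signals, 4 hidden units, 1 output.
w_hidden = rng.normal(size=(3, 4))
w_output = rng.normal(size=(4, 1))

def decide(signals):
    """Propagate input signals through weighted connections and return a yes/no decision."""
    hidden = np.tanh(signals @ w_hidden)     # each connection contributes via its weight
    score = (hidden @ w_output).item()       # collected into a single output signal
    return score > 0.0

print(decide(np.array([0.2, -1.0, 0.7])))
```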
The approximately 250 cognitive biases known to date can also produce negative outcomes in our everyday lives, though rarely as serious as in the examples above. An illustrative selection, recounted in multiple studies, follows:
The power of confirmation bias alone would be sufficient to explain why this happened, but other cognitive biases probably manifested as well.
Other initiatives aimed directly at a theory and practice of cognitive bias mitigation may exist within other disciplines under labels different from those employed here.
1355:
Camerer, C. F., Ho T.-H., Chong, J.-K. (2002). "A Cognitive Hierarchy Theory of One-Shot Games and Experimental Analysis." Forth, Quarterly Journal of Economics."
2138:
https://www.researchgate.net/profile/Mandy_Huetter/publication/308789747_The_limits_of_automaticity/links/58ad4fd24585155ae77aefac/The-limits-of-automaticity.pdf
2362:
Wilson, J.R. (1993). SHEAN (Simplified Human Error Analysis code) and automated THERP. United States Department of Energy Technical Report Number WINCO–11908.
2157:
Changeux, J.-P. P., A. Damasio, et al., Eds. (2007). Neurobiology of Human Values (Research and Perspectives in Neurosciences). Heidelberg, Germany, Springer.
3036: 2353:
Wiegmann, D. & Shappell, S. (2003). A human error approach to aviation accident analysis: The human factors analysis and classification system.. Ashgate.
Unlike neo-classical economics and decision theory, behavioral economics and the related field of behavioral finance explicitly consider the effects of social, cognitive and emotional factors on individuals' economic decisions. These disciplines combine insights from psychology and neo-classical economics to achieve this.
406:, explicitly consider the effects of social, cognitive and emotional factors on individuals' economic decisions. These disciplines combine insights from 46: 1077:"The Illogicality of Stock-Brokers: Psychological Experiments on the Effects of Prior Knowledge and Belief Biases on Logical Reasoning in Stock Trading" 1925:
Haselton, M. G., D. Nettie, et al. (2005). "The Evolution of Cognitive Bias." Handbook of Evolutionary Psychology. D. M. Buss. Hoboken, Wiley: 724–746.
A long-standing debate regarding human decision making bears on the development of a theory and practice of cognitive bias mitigation.
3066: 593:, next, and for empirical and theoretical arguments against, see ) and thus may contribute to a theory and practice of cognitive bias mitigation. 1765:
Shermer, M. (2010). A review of Paul Thagard's "The Brain and the Meaning of Life". Skeptic Magazine. Altadena, CA, Skeptics Society. 16: 60–61.
One 1980 study explored the notion of reducing the optimism bias by showing subjects other subjects' outputs from a reasoning task, with the result that their subsequent decision making was somewhat debiased.
1639:
The Common Neural Basis of Autobiographical Memory, Prospection, Navigation, Theory of Mind and the Default Mode: A Quantitative Meta-Analysis.
Laboratory experiments in which cognitive bias mitigation is an explicit goal are rare.
491:
Several streams of investigation in this discipline are noteworthy for their possible relevance to a theory of cognitive bias mitigation.
876:
Gigerenzer, G. (2006). "Bounded and Rational." Contemporary Debates in Cognitive Science. R. J. Stainton, Blackwell Publishing: 115–133.
1447:
Mullainathan, Sendhil, and Richard Thaler. "Behavioral Economics." MIT Department of Economics Working Paper 00-27. (September 2000).
1438:
Kahneman, D. "Maps of Bounded Rationality: Psychology for Behavioral Economics." American Economic Review (December 2003): 1449–1475.
2256:
Kuhn, S. L.; Stiner, M. C. (2006). "What's a Mother To Do? The Division of Labor among Neanderthals and Modern Humans in Eurasia".
A number of paradigms, methods and tools for improving human performance reliability have been developed within the discipline of human reliability engineering.
Other disaster-level examples of negative outcomes resulting from human error, possibly including multiple cognitive biases, include the Three Mile Island nuclear meltdown, the loss of the Space Shuttle Challenger, the Chernobyl nuclear reactor fire, the downing of an Iran Air passenger aircraft, the ineffective response to the Hurricane Katrina weather event, and many more.
1872: 3056: 2866: 888:
Gilovich, T. (1991). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. New York, NY, The Free Press.
1178:
Beynon-Davies, P., "Information systems `failure': case of the LASCAD project", European Journal of Information Systems, 1995.
Neuroeconomics is a discipline made possible by advances in brain-activity imaging technologies. This discipline merges some of the ideas in experimental economics, behavioral economics, cognitive science and social science.
2985: 2371:
Sutton, R. S., Barto, A. G. (1998). MIT CogNet Ebook Collection; MITCogNet 1998, Adaptive Computation and Machine Learning,
These results must be considered speculative and preliminary, but they are nonetheless suggestive of the possibility of real-time identification of brain states associated with cognitive bias manifestation, and of the possibility of purposeful interventions at the neuronal level to achieve cognitive bias mitigation.
There are few studies explicitly linking cognitive biases to real-world incidents with highly negative outcomes. Examples:
2871: 2525: 158: 2520: 277: 1501:
Kahneman, D., Lovallo, D., Sibony, O. (2011). "Before You Make That Big Decision." Harvard Business Review, June, 2011.
2643: 2024: 752: 503: 249: 75: 1980:
Chudek, M.; Henrich, J. (2011). "Culture–Gene Coevolution, Norm-Psychology and the Emergence of Human Prosociality".
110:, they apparently could not see the signals of financial collapse, even after they had become evident to non-experts. 2325:
Gertman, D., Blackman, H., Marble, J., Byers, J. and Smith, C. (2005). The SPAR-H human reliability analysis method.
2979: 2505: 209: 2611: 33:– unconscious, automatic influences on human judgment and decision making that reliably produce reasoning errors. 3159: 2833: 2633: 2082: 670: 666: 1169:
British Columbia Ministry of Energy, Mines and Petroleum Resources: Sullivan Mine Accident Report, May 17, 2006.
935:
Henrich; et al. (2010). "Markets, Religion, Community Size, and the Evolution of Fairness and Punishment".
831:
Epley, N.; Gilovich, T. (2006). "The Anchoring-and-Adjustment Heuristic: Why the Adjustments are Insufficient".
3138: 3133: 3046: 2928: 2693: 2673: 2569: 233: 213: 194: 1935:
Confer; et al. (2010). "Evolutionary Psychology: Controversies, Questions, Prospects, and Limitations".
1824:
Morewedge, C. K.; Yoon, H.; Scopelliti, I.; Symborski, C. W.; Korris, J. H.; Kassam, K. S. (13 August 2015).
1712:
Gigerenzer, G.; Goldstein, D. G. (1996). "Reasoning the Fast and Frugal Way: Models of Bounded Rationality".
1756:
Gladwell, M. (2006). Blink: The Power of Thinking Without Thinking. New York, NY, Little, Brown and Company.
1249: 757: 659: 581: 3119: 2803: 2783: 2564: 2542: 1944: 1898: 1796: 1721: 1466: 952: 453: 301: 87: 1023:
Schacter, D. L. (1999). "The Seven Sins of Memory: Insights From Psychology and Cognitive Neuroscience".
2898: 2813: 2788: 2733: 2401: 79: 2411: 818:
Ariely, D. (2008). Predictably Irrational: The Hidden Forces That Shape Our Decisions, Harper Collins.
161:
A host of cognitive biases can be imagined in this situation: confirmation bias, hindsight bias, the overconfidence effect, availability bias, and even the meta-bias bias blind spot.
2851: 2703: 2579: 2456: 1825: 1088: 944: 527: 513:
Bounded rationality is one concept that may inform significant advances in cognitive bias mitigation. Originally conceived of by Herbert A. Simon in the 1960s, it led to the concept of satisficing.
475: 399: 265: 145: 1949: 1903: 1801: 1726: 1471: 1263: 957: 3006: 2923: 2823: 2758: 2698: 2688: 2683: 2547: 1591:"The Representation of Economic Value in the Orbitofrontal Cortex is Invariant for Changes of Menu" 797: 510: 221: 566: 125:
Numerous investigations of incidents have determined that human error was central to highly negative potential or actual real-world outcomes, and that the manifestation of cognitive biases is a plausible component. Examples:
2903: 2888: 2648: 2638: 2621: 2308: 2273: 2148:
Damasio, A. (2010). Self Comes to Mind: Constructing the Conscious Brain. New York, NY, Pantheon.
2118: 2052: 2005: 1848: 1571: 1484: 1421: 1319: 1226: 1048: 978: 856: 777: 403: 3016: 2953: 2938: 2861: 2843: 2778: 2574: 2490: 2372: 2238: 2110: 2044: 1997: 1962: 1739: 1620: 1563: 1528: 1116: 1040: 970: 848: 767: 744: 730: 698: 644: 574: 457: 241: 225: 162: 103: 83: 2170:, University of Pittsburgh Of the Commonwealth System of Higher Education, 17(4), pp 439–448. 149:, which on September 23, 1999 "encountered Mars at an improperly low altitude" and was lost. 3083: 2943: 2883: 2808: 2793: 2653: 2606: 2515: 2510: 2495: 2300: 2265: 2228: 2220: 2189: 2102: 2036: 1989: 1954: 1908: 1840: 1806: 1731: 1694: 1667: 1610: 1602: 1555: 1546:
Rustichini, A (2009). "Neuroeconomics: What have we found, and what should we search for?".
1520: 1476: 1413: 1311: 1275: 1216: 1106: 1096: 1032: 962: 840: 655: 514: 430: 323:
309: 1524: 3051: 3041: 2818: 2798: 2713: 2616: 2591: 2586: 2559: 2537: 2449: 1876: 901:
Hammond, J. S.; Keeney, R. L.; et al. (2006). "The Hidden Traps in Decision Making".
762: 522: 495: 437: 417: 411: 379: 344: 336: 170: 114: 107: 1652: 1457:
Kahneman, D.; Tversky, A. (1979). "Prospect Theory: An Analysis of Decision Under Risk".
677:. The multilayer, cross-connected signal collection and propagation structure typical of 1398: 1346:
Binmore, K. (2007). "A Very Short Introduction to Game Theory." Oxford University Press.
1296: 1092: 948: 49:
This article identifies elements relevant to this debate.
3093: 3088: 3078: 3001: 2918: 2878: 2828: 2773: 2763: 2748: 2743: 2708: 2663: 2628: 2532: 2481: 2334:
Hollnagel, E. (1998). Cognitive reliability and error analysis method: CREAM. Elsevier.
2233: 2208: 1615: 1590: 1511:
Loewenstein, George; Rick, Scott; Cohen, Jonathan D. (January 2008). "Neuroeconomics".
1111: 1076: 678: 461: 449: 348: 324: 293: 257: 166: 126: 98: 58: 41: 30: 2193: 3153: 3031: 3011: 2974: 2948: 2933: 2913: 2893: 2856: 2768: 2728: 2723: 2718: 2596: 2500: 2180:
Gabow, S. L. (1977). "Population Structure and the Rate of Hominid Brain Evolution".
1869: 1377:
Myerson, R. B. (1991). "Game Theory: Analysis of Conflict." Harvard University Press.
844: 543: 471: 422: 178: 2312: 2277: 2009: 1425: 1323: 1052: 1010:
Nozick, R. (1993). The Nature of Rationality. Ewing, NJ, Princeton University Press.
860: 2991: 2753: 2738: 2056: 1852: 1826:"Debiasing Decisions: Improved Decision Making With a Single Training Intervention" 1575: 1230: 982: 601: 499: 202: 154: 135: 2122: 1101: 521:
Satisficing, as opposed to optimizing, found experimental expression in the work of Gerd Gigerenzer and others.
2908: 2678: 2668: 2658: 2554: 1735: 1190:
Mann, C. C. (2002). "Why Software is So Bad." Technology Review, MIT, July 2002.
782: 674: 619: 586: 570: 518: 479: 375: 285: 122: 2426: 1993: 1912: 1810: 1671: 1559: 1417: 1365: 1315: 1280: 1036: 509:
Bounded rationality is a concept rooted in considerations of the actual machinery of human reasoning.
464:
in an attempt to better understand the neural basis for human decision making.
441:
3026: 3021: 2996: 2083:
https://cloudfront.escholarship.org/dist/prd/content/qt76d4d629/qt76d4d629.pdf
2040: 1147:
"Gimli Glider Accident Report." Aviation Safety Network, accident ID 19830723-0 (1983).
720: 407: 2207:
Hamilton, M. J.; Milne, B. T.; Walker, R.S.; Burger, O.; Brown, J.H. (2007).
2106: 1844: 474:
is consistently involved in resolving economic decision situations that have
354:
3114: 2601: 966: 772: 426: 340: 270: 217: 20: 2242: 2224: 2114: 2048: 2001: 1966: 1624: 1567: 1532: 1120: 1044: 974: 852: 526:
362:
57:
1743: 1221: 1204: 998:
Lerher, J. (2009). How We Decide. New York, NY, Houghton Mifflin Harcourt.
3073: 2958: 1698: 623: 559: 2074:
Evans, J. B. T. (2006). Dual system theories of cognition: Some issues.
1685:
Simon, H. A. (1991). "Bounded Rationality and Organizational Learning".
669:, an approach inspired by the imagined structure and function of actual 185: 2304: 1488: 787: 665:
One technique particularly applicable to cognitive bias mitigation is neural network learning and choice selection.
1958: 2436: 1480: 37:
2416: 2269: 1606: 662:, has been used to investigate human learning and decision making. 502:, expanded upon by others, and applied in real-life situations, is 169:, overconfidence effect, availability bias, and even the meta-bias 2421: 792: 2472: 1868:
Cosmides, L., Tooby, J. "Evolutionary Psychology: A Primer." at
605: 467: 150: 121:
There are numerous investigations of incidents determining that
2445: 2291:
Marlowe, F. W. (2005). "Hunter-Gatherers and Human Evolution".
2079:
Proceedings of the Annual Meeting of Cognitive Science Society'
1641:
Journal of Cognitive Neuroscience, (Epub ahead of print)(2010).
1138:
Kahneman, D. (2011). Thinking, Fast and Slow, Doubleday Canada.
2431: 697:
One study identifies specific steps to counter the effects of confirmation bias in certain phases of the software engineering lifecycle.
1148: 2406: 1252:: How to Get Things Right. New York, NY, Metropolitan Books. 1075:
Knauff, M.; Budeck, C.; Wolf, A. G.; Hamburger, K. (2010). "The Illogicality of Stock-Brokers: Psychological Experiments on the Effects of Prior Knowledge and Belief Biases on Logical Reasoning in Stock Trading". PLoS ONE 5(10): e13483.
29:
2209:"The complex Structure of Hunter–Gatherer Social Networks" 2441: 701:
97:
339:, a discipline with its roots grounded in neo-classical 1833:
Policy Insights from the Behavioral and Brain Sciences
16:
Reduction of the negative effects of cognitive biases
2293:
Evolutionary Anthropology: Issues, News, and Reviews
1399:"What Can Economists Learn from Happiness Research?" 1297:"What Can Economists Learn from Happiness Research?" 398:
Unlike neo-classical economics and decision theory,
224:
passenger aircraft, the ineffective response to the
3102: 2967: 2842: 2479: 494:One approach to mitigation originally suggested by 1870:http://www.psych.ucsb.edu/research/cep/primer.html 1637:Spreng, R. N., Mar, R. A., Kim, A. S. N. (2008). 626:in particular. This, along with the findings in 234:approximately 250 cognitive biases known to date 106:reinforced by the overconfidence effect and the 1264:"Utility Maximization and Experienced Utility" 2457: 76:Mount Everest on two consecutive days in 1996 8: 1789:Journal of Personality and Social Psychology 1366:http://www.escholarship.org/uc/item/0fp8278k 667:neural network learning and choice selection 627: 517:in the 1960s and leading to the concept of 378:, a discipline with roots in economics and 3120:Heuristics in judgment and decision-making 2464: 2450: 2442: 2437:Max Planck Institute for Human Development 1864: 1862: 1244: 1242: 1240: 1134: 1132: 1130: 2417:Institute of Ergonomics and Human Factors 2232: 1948: 1902: 1800: 1773: 1771: 1725: 1653:"Multiple Selves in Intertemporal Choice" 1614: 1470: 1279: 1220: 1110: 1100: 1018: 1016: 956: 580:There is an emerging convergence between 1589:Padoa-Schioppa, C.; Assad, J.A. (2007). 1205:"Cognitive Bias in Software Engineering" 220:nuclear reactor fire, the downing of an 19:For broader coverage of this topic, see 1006: 1004: 994: 992: 930: 928: 918: 916: 896: 894: 884: 882: 872: 870: 826: 824: 814: 812: 808: 2422:International Machine Learning Society 2402:Center for the Study of Neuroeconomics 1525:10.1146/annurev.psych.59.103006.093710 1198: 1196: 1186: 1184: 2095:Perspectives on Psychological Science 7: 129:is a plausible component. Examples: 66:Real-world effects of cognitive bias 2412:Journal of Evolutionary Psychology 2213:Proceedings of the Royal Society B 212:nuclear meltdown, the loss of the 14: 2023:Mercier, H.; Sperber, D. (2011). 1651:Jamison, J.; Wegener, J. (2010). 1262:Kahneman, D.; Thaler, R. (2006). 1203:Stacy, W.; MacMillan, J. (1995). 631: 590: 1268:Journal of Economic Perspectives 845:10.1111/j.1467-9280.2006.01704.x 737: 723: 347:trying to maximize the economic 184:failed to accurately discern an 1548:Current Opinion in Neurobiology 2432:Cognitive Neuroscience Society 1660:Journal of Economic Psychology 1406:Journal of Economic Literature 1397:Frey, B.; Stutzer, A. (2002). 1304:Journal of Economic Literature 1295:Frey, B.; Stutzer, A. (2002). 1: 2194:10.1016/s0047-2484(77)80136-x 2029:Behavioral and Brain Sciences 645:human reliability engineering 639:Human reliability engineering 470:experiments suggest that the 427:anchoring and adjustment bias 228:weather event, and many more. 
159:United States customary units 2427:Temporal Difference Learning 2168:Myths About Hunter-Gatherers 1982:Trends in Cognitive Sciences 1102:10.1371/journal.pone.0013483 278:Representativeness heuristic 2986:DĂ©formation professionnelle 1736:10.1037/0033-295x.103.4.650 1595:Nature Reviews Neuroscience 1513:Annual Review of Psychology 753:Cognitive bias modification 504:reference class forecasting 3176: 2980:Basking in reflected glory 2407:Fast and Frugal Heuristics 2182:Journal of Human Evolution 1994:10.1016/j.tics.2011.03.003 1913:10.1521/soco.2009.27.5.733 1811:10.1037/0022-3514.39.5.806 1672:10.1016/j.joep.2010.03.004 1560:10.1016/j.conb.2009.09.012 1418:10.1257/002205102320161320 1316:10.1257/002205102320161320 1281:10.1257/089533006776526076 1149:http://aviation-safety.net 1037:10.1037/0003-066x.54.3.182 671:biological neural networks 18: 3128: 3110:Cognitive bias mitigation 2041:10.1017/s0140525x10000968 1209:Communications of the ACM 27:Cognitive bias mitigation 2694:Illusion of transparency 2107:10.1177/1745691611400239 1845:10.1177/2372732215600886 528:maximizing their utility 456:, behavioral economics, 214:Space Shuttle Challenger 195:London Ambulance Service 1250:The Checklist Manifesto 967:10.1126/science.1182238 903:Harvard Business Review 758:Cognitive vulnerability 660:artificial intelligence 628:Evolutionary psychology 582:evolutionary psychology 554:Evolutionary psychology 402:and the related field, 42:rational economic agent 2225:10.1098/rspb.2007.0564 2025:"Argumentative Theory" 573:describe a theory for 454:experimental economics 349:expected value/utility 302:Availability heuristic 88:availability heuristic 3062:Arab–Israeli conflict 2789:Social influence bias 2734:Out-group homogeneity 2166:Ember, C. R. (1978). 1937:American Psychologist 1222:10.1145/203241.203256 1025:American Psychologist 833:Psychological Science 80:overconfidence effect 2704:Mere-exposure effect 2634:Extrinsic incentives 2580:Selective perception 2258:Current Anthropology 1714:Psychological Review 1699:10.1287/orsc.2.1.125 1687:Organization Science 1248:Gawande, A. (2010). 689:Software engineering 487:Cognitive psychology 400:behavioral economics 394:Behavioral economics 146:Mars Climate Orbiter 2929:Social desirability 2824:von Restorff effect 2699:Mean world syndrome 2674:Hostile attribution 1093:2010PLoSO...513483K 949:2010Sci...327.1480H 943:(5972): 1480–1484. 798:Unstated assumption 565:In a similar vein, 511:bounded rationality 2844:Statistical biases 2622:Curse of knowledge 2305:10.1002/evan.20046 2219:(274): 2195–2203. 1875:2009-02-28 at the 778:Freedom of thought 410:and neo-classical 404:behavioral finance 3147: 3146: 2784:Social comparison 2565:Choice-supportive 2377:978-0-262-19398-6 768:Critical thinking 745:Psychology portal 731:Philosophy portal 699:confirmation bias 575:confirmation bias 476:emotional valence 458:cognitive science 414:to achieve this. 
266:Gambler's fallacy 242:Confirmation bias 226:Hurricane Katrina 210:Three Mile Island 163:confirmation bias 104:confirmation bias 84:sunk cost fallacy 3167: 3160:Cognitive biases 2944:Systematic error 2899:Omitted-variable 2814:Trait ascription 2654:Frog pond effect 2482:Cognitive biases 2466: 2459: 2452: 2443: 2390: 2386: 2380: 2369: 2363: 2360: 2354: 2351: 2345: 2341: 2335: 2332: 2326: 2323: 2317: 2316: 2288: 2282: 2281: 2253: 2247: 2246: 2236: 2204: 2198: 2197: 2177: 2171: 2164: 2158: 2155: 2149: 2146: 2140: 2133: 2127: 2126: 2090: 2075: 2070: 2061: 2060: 2020: 2014: 2013: 1977: 1971: 1970: 1959:10.1037/a0018413 1952: 1932: 1926: 1923: 1917: 1916: 1906: 1891:Social Cognition 1886: 1880: 1866: 1857: 1856: 1830: 1821: 1815: 1814: 1804: 1784: 1778: 1775: 1766: 1763: 1757: 1754: 1748: 1747: 1729: 1709: 1703: 1702: 1682: 1676: 1675: 1657: 1648: 1642: 1635: 1629: 1628: 1618: 1586: 1580: 1579: 1543: 1537: 1536: 1508: 1502: 1499: 1493: 1492: 1474: 1454: 1448: 1445: 1439: 1436: 1430: 1429: 1403: 1394: 1388: 1384: 1378: 1375: 1369: 1362: 1356: 1353: 1347: 1344: 1338: 1334: 1328: 1327: 1301: 1292: 1286: 1285: 1283: 1259: 1253: 1246: 1235: 1234: 1224: 1200: 1191: 1188: 1179: 1176: 1170: 1167: 1161: 1157: 1151: 1145: 1139: 1136: 1125: 1124: 1114: 1104: 1072: 1066: 1063: 1057: 1056: 1020: 1011: 1008: 999: 996: 987: 986: 960: 932: 923: 920: 911: 910: 898: 889: 886: 877: 874: 865: 864: 828: 819: 816: 747: 742: 741: 740: 733: 728: 727: 726: 656:Machine learning 651:Machine learning 515:Herbert A. Simon 431:endowment effect 325:cognitive biases 310:Bandwagon effect 144:The Loss of the 127:cognitive biases 99:cognitive biases 59:cognitive biases 31:cognitive biases 3175: 3174: 3170: 3169: 3168: 3166: 3165: 3164: 3150: 3149: 3148: 3143: 3124: 3098: 2963: 2838: 2819:Turkey illusion 2587:Compassion fade 2484: 2475: 2470: 2398: 2393: 2387: 2383: 2370: 2366: 2361: 2357: 2352: 2348: 2342: 2338: 2333: 2329: 2324: 2320: 2290: 2289: 2285: 2255: 2254: 2250: 2206: 2205: 2201: 2179: 2178: 2174: 2165: 2161: 2156: 2152: 2147: 2143: 2134: 2130: 2092: 2091: 2087: 2073: 2064: 2022: 2021: 2017: 1979: 1978: 1974: 1950:10.1.1.601.8691 1934: 1933: 1929: 1924: 1920: 1904:10.1.1.220.6198 1888: 1887: 1883: 1877:Wayback Machine 1867: 1860: 1828: 1823: 1822: 1818: 1802:10.1.1.535.9244 1786: 1785: 1781: 1776: 1769: 1764: 1760: 1755: 1751: 1727:10.1.1.174.4404 1711: 1710: 1706: 1684: 1683: 1679: 1655: 1650: 1649: 1645: 1636: 1632: 1588: 1587: 1583: 1545: 1544: 1540: 1510: 1509: 1505: 1500: 1496: 1481:10.2307/1914185 1472:10.1.1.407.1910 1456: 1455: 1451: 1446: 1442: 1437: 1433: 1401: 1396: 1395: 1391: 1385: 1381: 1376: 1372: 1363: 1359: 1354: 1350: 1345: 1341: 1335: 1331: 1299: 1294: 1293: 1289: 1261: 1260: 1256: 1247: 1238: 1202: 1201: 1194: 1189: 1182: 1177: 1173: 1168: 1164: 1160:Administration. 
1158: 1154: 1146: 1142: 1137: 1128: 1074: 1073: 1069: 1064: 1060: 1022: 1021: 1014: 1009: 1002: 997: 990: 958:10.1.1.714.7830 934: 933: 926: 921: 914: 900: 899: 892: 887: 880: 875: 868: 830: 829: 822: 817: 810: 806: 763:Critical theory 743: 738: 736: 729: 724: 722: 719: 711: 691: 653: 641: 615: 599: 556: 523:Gerd Gigerenzer 496:Daniel Kahneman 489: 447: 438:Daniel Kahneman 418:Prospect theory 396: 380:system dynamics 373: 345:rational agents 337:Decision theory 334: 332:Decision theory 320: 171:bias blind spot 108:status quo bias 68: 55: 24: 17: 12: 11: 5: 3173: 3171: 3163: 3162: 3152: 3151: 3145: 3144: 3142: 3141: 3136: 3129: 3126: 3125: 3123: 3122: 3117: 3112: 3106: 3104: 3103:Bias reduction 3100: 3099: 3097: 3096: 3091: 3086: 3081: 3079:Political bias 3076: 3071: 3070: 3069: 3064: 3059: 3054: 3049: 3044: 3039: 3034: 3024: 3019: 3014: 3009: 3007:Infrastructure 3004: 2999: 2994: 2989: 2982: 2977: 2971: 2969: 2965: 2964: 2962: 2961: 2956: 2951: 2946: 2941: 2936: 2931: 2926: 2924:Self-selection 2921: 2916: 2911: 2906: 2901: 2896: 2891: 2886: 2881: 2876: 2875: 2874: 2864: 2859: 2854: 2848: 2846: 2840: 2839: 2837: 2836: 2831: 2826: 2821: 2816: 2811: 2806: 2801: 2796: 2791: 2786: 2781: 2776: 2771: 2766: 2761: 2759:Pro-innovation 2756: 2751: 2746: 2744:Overton window 2741: 2736: 2731: 2726: 2721: 2716: 2711: 2706: 2701: 2696: 2691: 2686: 2681: 2676: 2671: 2666: 2661: 2656: 2651: 2646: 2641: 2636: 2631: 2626: 2625: 2624: 2614: 2612:Dunning–Kruger 2609: 2604: 2599: 2594: 2589: 2584: 2583: 2582: 2572: 2567: 2562: 2557: 2552: 2551: 2550: 2540: 2535: 2530: 2529: 2528: 2526:Correspondence 2523: 2521:Actor–observer 2513: 2508: 2503: 2498: 2493: 2487: 2485: 2480: 2477: 2476: 2471: 2469: 2468: 2461: 2454: 2446: 2440: 2439: 2434: 2429: 2424: 2419: 2414: 2409: 2404: 2397: 2396:External links 2394: 2392: 2391: 2381: 2364: 2355: 2346: 2336: 2327: 2318: 2283: 2270:10.1086/507197 2264:(6): 953–981. 2248: 2199: 2188:(7): 643–665. 2172: 2159: 2150: 2141: 2128: 2101:(2): 192–201. 2085: 2062: 2015: 1988:(5): 218–226. 1972: 1943:(2): 110–126. 1927: 1918: 1897:(5): 733–763. 1881: 1858: 1839:(1): 129–140. 1816: 1795:(5): 806–820. 1779: 1767: 1758: 1749: 1720:(4): 650–669. 1704: 1693:(1): 125–134. 1677: 1666:(5): 832–839. 1643: 1630: 1607:10.1038/nn2020 1581: 1554:(6): 672–677. 1538: 1519:(1): 647–672. 1503: 1494: 1465:(2): 263–291. 1449: 1440: 1431: 1389: 1379: 1370: 1357: 1348: 1339: 1329: 1287: 1274:(1): 221–234. 1254: 1236: 1192: 1180: 1171: 1162: 1152: 1140: 1126: 1087:(10): e13483. 1067: 1058: 1031:(3): 182–203. 1012: 1000: 988: 924: 912: 890: 878: 866: 839:(4): 311–318. 
820: 807: 805: 802: 801: 800: 795: 790: 785: 780: 775: 770: 765: 760: 755: 749: 748: 734: 718: 715: 710: 707: 690: 687: 679:neural network 658:, a branch of 652: 649: 640: 637: 614: 611: 598: 595: 555: 552: 488: 485: 462:social science 450:Neuroeconomics 446: 445:Neuroeconomics 443: 395: 392: 372: 369: 333: 330: 319: 316: 315: 314: 306: 298: 294:Hindsight bias 290: 282: 274: 271:sunk cost bias 262: 258:Anchoring bias 254: 250:Framing effect 246: 230: 229: 206: 200: 190: 174: 167:hindsight bias 141: 119: 118: 111: 91: 67: 64: 54: 51: 15: 13: 10: 9: 6: 4: 3: 2: 3172: 3161: 3158: 3157: 3155: 3140: 3137: 3135: 3131: 3130: 3127: 3121: 3118: 3116: 3113: 3111: 3108: 3107: 3105: 3101: 3095: 3092: 3090: 3087: 3085: 3082: 3080: 3077: 3075: 3072: 3068: 3065: 3063: 3060: 3058: 3057:United States 3055: 3053: 3050: 3048: 3045: 3043: 3040: 3038: 3035: 3033: 3032:False balance 3030: 3029: 3028: 3025: 3023: 3020: 3018: 3015: 3013: 3010: 3008: 3005: 3003: 3000: 2998: 2995: 2993: 2990: 2988: 2987: 2983: 2981: 2978: 2976: 2973: 2972: 2970: 2966: 2960: 2957: 2955: 2952: 2950: 2947: 2945: 2942: 2940: 2937: 2935: 2932: 2930: 2927: 2925: 2922: 2920: 2917: 2915: 2912: 2910: 2907: 2905: 2904:Participation 2902: 2900: 2897: 2895: 2892: 2890: 2887: 2885: 2882: 2880: 2877: 2873: 2872:Psychological 2870: 2869: 2868: 2865: 2863: 2860: 2858: 2855: 2853: 2850: 2849: 2847: 2845: 2841: 2835: 2832: 2830: 2827: 2825: 2822: 2820: 2817: 2815: 2812: 2810: 2807: 2805: 2802: 2800: 2797: 2795: 2792: 2790: 2787: 2785: 2782: 2780: 2777: 2775: 2772: 2770: 2767: 2765: 2762: 2760: 2757: 2755: 2752: 2750: 2747: 2745: 2742: 2740: 2737: 2735: 2732: 2730: 2727: 2725: 2722: 2720: 2717: 2715: 2712: 2710: 2707: 2705: 2702: 2700: 2697: 2695: 2692: 2690: 2687: 2685: 2682: 2680: 2677: 2675: 2672: 2670: 2667: 2665: 2662: 2660: 2657: 2655: 2652: 2650: 2647: 2645: 2642: 2640: 2639:Fading affect 2637: 2635: 2632: 2630: 2627: 2623: 2620: 2619: 2618: 2615: 2613: 2610: 2608: 2605: 2603: 2600: 2598: 2595: 2593: 2590: 2588: 2585: 2581: 2578: 2577: 2576: 2573: 2571: 2568: 2566: 2563: 2561: 2558: 2556: 2553: 2549: 2546: 2545: 2544: 2541: 2539: 2536: 2534: 2531: 2527: 2524: 2522: 2519: 2518: 2517: 2514: 2512: 2509: 2507: 2504: 2502: 2499: 2497: 2494: 2492: 2489: 2488: 2486: 2483: 2478: 2474: 2467: 2462: 2460: 2455: 2453: 2448: 2447: 2444: 2438: 2435: 2433: 2430: 2428: 2425: 2423: 2420: 2418: 2415: 2413: 2410: 2408: 2405: 2403: 2400: 2399: 2395: 2385: 2382: 2378: 2374: 2368: 2365: 2359: 2356: 2350: 2347: 2340: 2337: 2331: 2328: 2322: 2319: 2314: 2310: 2306: 2302: 2298: 2294: 2287: 2284: 2279: 2275: 2271: 2267: 2263: 2259: 2252: 2249: 2244: 2240: 2235: 2230: 2226: 2222: 2218: 2214: 2210: 2203: 2200: 2195: 2191: 2187: 2183: 2176: 2173: 2169: 2163: 2160: 2154: 2151: 2145: 2142: 2139: 2132: 2129: 2124: 2120: 2116: 2112: 2108: 2104: 2100: 2096: 2089: 2086: 2084: 2080: 2077: 2076: 2069: 2068: 2063: 2058: 2054: 2050: 2046: 2042: 2038: 2034: 2030: 2026: 2019: 2016: 2011: 2007: 2003: 1999: 1995: 1991: 1987: 1983: 1976: 1973: 1968: 1964: 1960: 1956: 1951: 1946: 1942: 1938: 1931: 1928: 1922: 1919: 1914: 1910: 1905: 1900: 1896: 1892: 1885: 1882: 1878: 1874: 1871: 1865: 1863: 1859: 1854: 1850: 1846: 1842: 1838: 1834: 1827: 1820: 1817: 1812: 1808: 1803: 1798: 1794: 1790: 1783: 1780: 1774: 1772: 1768: 1762: 1759: 1753: 1750: 1745: 1741: 1737: 1733: 1728: 1723: 1719: 1715: 1708: 1705: 1700: 1696: 1692: 1688: 1681: 1678: 1673: 1669: 1665: 1661: 1654: 1647: 1644: 1640: 1634: 1631: 1626: 1622: 1617: 1612: 1608: 1604: 1601:(1): 95–102. 
1600: 1596: 1592: 1585: 1582: 1577: 1573: 1569: 1565: 1561: 1557: 1553: 1549: 1542: 1539: 1534: 1530: 1526: 1522: 1518: 1514: 1507: 1504: 1498: 1495: 1490: 1486: 1482: 1478: 1473: 1468: 1464: 1460: 1453: 1450: 1444: 1441: 1435: 1432: 1427: 1423: 1419: 1415: 1412:(2): 402–35. 1411: 1407: 1400: 1393: 1390: 1383: 1380: 1374: 1371: 1367: 1361: 1358: 1352: 1349: 1343: 1340: 1333: 1330: 1325: 1321: 1317: 1313: 1310:(2): 402–35. 1309: 1305: 1298: 1291: 1288: 1282: 1277: 1273: 1269: 1265: 1258: 1255: 1251: 1245: 1243: 1241: 1237: 1232: 1228: 1223: 1218: 1214: 1210: 1206: 1199: 1197: 1193: 1187: 1185: 1181: 1175: 1172: 1166: 1163: 1156: 1153: 1150: 1144: 1141: 1135: 1133: 1131: 1127: 1122: 1118: 1113: 1108: 1103: 1098: 1094: 1090: 1086: 1082: 1078: 1071: 1068: 1062: 1059: 1054: 1050: 1046: 1042: 1038: 1034: 1030: 1026: 1019: 1017: 1013: 1007: 1005: 1001: 995: 993: 989: 984: 980: 976: 972: 968: 964: 959: 954: 950: 946: 942: 938: 931: 929: 925: 919: 917: 913: 909:(1): 118–126. 908: 904: 897: 895: 891: 885: 883: 879: 873: 871: 867: 862: 858: 854: 850: 846: 842: 838: 834: 827: 825: 821: 815: 813: 809: 803: 799: 796: 794: 791: 789: 786: 784: 781: 779: 776: 774: 771: 769: 766: 764: 761: 759: 756: 754: 751: 750: 746: 735: 732: 721: 716: 714: 708: 706: 702: 700: 695: 688: 686: 682: 680: 676: 672: 668: 663: 661: 657: 650: 648: 646: 638: 636: 633: 629: 625: 622:era, and the 621: 612: 610: 607: 603: 596: 594: 592: 588: 583: 578: 576: 572: 568: 563: 561: 553: 551: 547: 545: 544:optimism bias 540: 536: 532: 529: 524: 520: 516: 512: 507: 505: 501: 497: 492: 486: 484: 481: 477: 473: 472:limbic system 469: 465: 463: 459: 455: 451: 444: 442: 439: 434: 432: 428: 424: 423:loss aversion 419: 415: 413: 409: 405: 401: 393: 391: 387: 383: 381: 377: 370: 368: 365: 360: 357: 352: 350: 346: 342: 338: 331: 329: 326: 317: 312: 311: 307: 304: 303: 299: 296: 295: 291: 288: 287: 283: 280: 279: 275: 272: 268: 267: 263: 260: 259: 255: 252: 251: 247: 244: 243: 239: 238: 237: 235: 227: 223: 219: 215: 211: 207: 204: 201: 198: 196: 191: 187: 182: 180: 179:Sullivan Mine 175: 172: 168: 164: 160: 156: 152: 148: 147: 142: 139: 137: 132: 131: 130: 128: 124: 116: 112: 109: 105: 100: 96: 92: 89: 85: 81: 77: 73: 72: 71: 65: 63: 60: 52: 50: 48: 43: 38: 34: 32: 28: 22: 3109: 3017:In education 2984: 2968:Other biases 2954:Verification 2939:Survivorship 2889:Non-response 2862:Healthy user 2804:Substitution 2779:Self-serving 2575:Confirmation 2543:Availability 2491:Acquiescence 2384: 2367: 2358: 2349: 2339: 2330: 2321: 2299:(2): 54–67. 2296: 2292: 2286: 2261: 2257: 2251: 2216: 2212: 2202: 2185: 2181: 2175: 2167: 2162: 2153: 2144: 2131: 2098: 2094: 2088: 2078: 2072: 2071: 2067: 2066: 2035:(2): 57–74. 2032: 2028: 2018: 1985: 1981: 1975: 1940: 1936: 1930: 1921: 1894: 1890: 1884: 1836: 1832: 1819: 1792: 1788: 1782: 1761: 1752: 1717: 1713: 1707: 1690: 1686: 1680: 1663: 1659: 1646: 1638: 1633: 1598: 1594: 1584: 1551: 1547: 1541: 1516: 1512: 1506: 1497: 1462: 1459:Econometrica 1458: 1452: 1443: 1434: 1409: 1405: 1392: 1382: 1373: 1360: 1351: 1342: 1332: 1307: 1303: 1290: 1271: 1267: 1257: 1215:(6): 57–63. 
1212: 1208: 1174: 1165: 1155: 1143: 1084: 1080: 1070: 1061: 1028: 1024: 940: 936: 906: 902: 836: 832: 712: 703: 696: 692: 683: 664: 654: 642: 632:Neuroscience 616: 613:Anthropology 602:Neuroscience 600: 597:Neuroscience 591:Neuroscience 579: 564: 557: 548: 541: 537: 533: 508: 500:Amos Tversky 493: 490: 466: 448: 435: 416: 397: 388: 384: 374: 363: 361: 355: 353: 335: 321: 308: 300: 292: 284: 276: 264: 256: 248: 240: 232:Each of the 231: 203:Atul Gawande 192: 176: 143: 136:Gimli Glider 133: 120: 94: 69: 56: 39: 35: 26: 25: 3084:Publication 3037:Vietnam War 2884:Length time 2867:Information 2809:Time-saving 2669:Horn effect 2659:Halo effect 2607:Distinction 2516:Attribution 2511:Attentional 2344:Commission. 783:Freethought 675:human brain 620:Paleolithic 587:prospection 519:satisficing 480:prospection 376:Game theory 371:Game theory 286:Halo effect 123:human error 113:Similarly, 3047:South Asia 3022:Liking gap 2834:In animals 2799:Status quo 2714:Negativity 2617:Egocentric 2592:Congruence 2570:Commitment 2560:Blind spot 2548:Mean world 2538:Automation 2389:pp187-190. 2081:, 28(28). 804:References 408:psychology 138:' Incident 95:MarketBeat 93:In a 2010 47:heuristics 3115:Debiasing 3094:White hat 3089:Reporting 3002:Inductive 2919:Selection 2879:Lead time 2852:Estimator 2829:Zero-risk 2794:Spotlight 2774:Restraint 2764:Proximity 2749:Precision 2709:Narrative 2664:Hindsight 2649:Frequency 2629:Emotional 2602:Declinism 2533:Authority 2506:Anchoring 2496:Ambiguity 1945:CiteSeerX 1899:CiteSeerX 1797:CiteSeerX 1722:CiteSeerX 1467:CiteSeerX 953:CiteSeerX 773:Debiasing 436:However, 412:economics 341:economics 218:Chernobyl 21:Debiasing 3154:Category 3012:Inherent 2975:Academic 2949:Systemic 2934:Spectrum 2914:Sampling 2894:Observer 2857:Forecast 2769:Response 2729:Optimism 2724:Omission 2719:Normalcy 2689:In-group 2684:Implicit 2597:Cultural 2501:Affinity 2313:53489209 2278:42981328 2243:17609186 2115:26162138 2049:21447233 2010:16710885 2002:21482176 1967:20141266 1873:Archived 1625:18066060 1568:19896360 1533:17883335 1426:13967611 1324:13967611 1121:20976157 1081:PLOS ONE 1053:14882268 1045:10199218 975:20299588 861:10279390 853:16623688 717:See also 624:Holocene 560:Cosmides 364:actually 222:Iran Air 197:Failures 181:Incident 115:Kahneman 3134:General 3132:Lists: 3067:Ukraine 2992:Funding 2754:Present 2739:Outcome 2644:Framing 2234:2706200 2057:5669039 1853:4848978 1744:8888650 1616:2646102 1576:2281817 1489:1914185 1231:1505473 1112:2956684 1089:Bibcode 983:4803905 945:Bibcode 937:Science 788:Inquiry 673:in the 571:Sperber 567:Mercier 318:To date 53:Context 3139:Memory 3052:Sweden 3042:Norway 2909:Recall 2679:Impact 2555:Belief 2473:Biases 2375:  2311:  2276:  2241:  2231:  2123:118743 2121:  2113:  2055:  2047:  2008:  2000:  1965:  1947:  1901:  1851:  1799:  1742:  1724:  1623:  1613:  1574:  1566:  1531:  1487:  1469:  1424:  1322:  1229:  1119:  1109:  1051:  1043:  981:  973:  955:  859:  851:  356:should 216:, the 186:anoxic 155:metric 86:, the 82:, the 3027:Media 2997:FUTON 2309:S2CID 2274:S2CID 2119:S2CID 2053:S2CID 2006:S2CID 1849:S2CID 1829:(PDF) 1656:(PDF) 1572:S2CID 1485:JSTOR 1422:S2CID 1402:(PDF) 1387:2012. 1337:1999. 
1320:S2CID 1300:(PDF) 1227:S2CID 1049:S2CID 979:S2CID 857:S2CID 793:Logic 709:Other 269:(aka 134:The ' 2373:ISBN 2239:PMID 2217:2007 2111:PMID 2045:PMID 1998:PMID 1963:PMID 1740:PMID 1621:PMID 1564:PMID 1529:PMID 1117:PMID 1041:PMID 971:PMID 849:PMID 630:and 606:fMRI 569:and 498:and 468:fMRI 460:and 193:The 177:The 157:and 151:NASA 3074:Net 2959:Wet 2301:doi 2266:doi 2229:PMC 2221:doi 2190:doi 2103:doi 2037:doi 1990:doi 1955:doi 1909:doi 1841:doi 1807:doi 1732:doi 1718:103 1695:doi 1668:doi 1611:PMC 1603:doi 1556:doi 1521:doi 1477:doi 1414:doi 1312:doi 1276:doi 1217:doi 1107:PMC 1097:doi 1033:doi 963:doi 941:327 841:doi 3156:: 2307:. 2297:14 2295:. 2272:. 2262:47 2260:. 2237:. 2227:. 2215:. 2211:. 2184:. 2117:. 2109:. 2097:. 2051:. 2043:. 2033:34 2031:. 2027:. 2004:. 1996:. 1986:15 1984:. 1961:. 1953:. 1941:65 1939:. 1907:. 1895:27 1893:. 1879:." 1861:^ 1847:. 1835:. 1831:. 1805:. 1793:39 1791:. 1770:^ 1738:. 1730:. 1716:. 1689:. 1664:31 1662:. 1658:. 1619:. 1609:. 1599:11 1597:. 1593:. 1570:. 1562:. 1552:19 1550:. 1527:. 1517:59 1515:. 1483:. 1475:. 1463:47 1461:. 1420:. 1410:40 1408:. 1404:. 1318:. 1308:40 1306:. 1302:. 1272:20 1270:. 1266:. 1239:^ 1225:. 1213:38 1211:. 1207:. 1195:^ 1183:^ 1129:^ 1115:. 1105:. 1095:. 1083:. 1079:. 1047:. 1039:. 1029:54 1027:. 1015:^ 1003:^ 991:^ 977:. 969:. 961:. 951:. 939:. 927:^ 915:^ 907:84 905:. 893:^ 881:^ 869:^ 855:. 847:. 837:17 835:. 823:^ 811:^ 429:, 425:, 165:, 2465:e 2458:t 2451:v 2379:. 2315:. 2303:: 2280:. 2268:: 2245:. 2223:: 2196:. 2192:: 2186:6 2125:. 2105:: 2099:6 2065:' 2059:. 2039:: 2012:. 1992:: 1969:. 1957:: 1915:. 1911:: 1855:. 1843:: 1837:2 1813:. 1809:: 1746:. 1734:: 1701:. 1697:: 1691:2 1674:. 1670:: 1627:. 1605:: 1578:. 1558:: 1535:. 1523:: 1491:. 1479:: 1428:. 1416:: 1368:. 1326:. 1314:: 1284:. 1278:: 1233:. 1219:: 1123:. 1099:: 1091:: 1085:5 1055:. 1035:: 985:. 965:: 947:: 863:. 843:: 173:. 23:.