Chapter 13: Decision Making, Biases & Cognitive Illusions

While many people evaluate the quality of a choice by its outcome, psychologists emphasize rationality: the thorough, fair consideration of all relevant goals and evidence, rather than just the most immediate or obvious ones. The decision-making process is generally organized into several non-linear phases, including goal establishment, information gathering, structuring the decision, making a final choice, and performing a retrospective evaluation. Central to understanding these choices is the concept of probability; however, individuals struggle to estimate intermediate likelihoods accurately and frequently deviate from objective mathematical benchmarks such as Bayes's theorem. This departure from optimality is often attributed to cognitive overload: when the volume of information exceeds human processing capacity, people fall back on heuristics, or mental shortcuts.

These shortcuts frequently produce cognitive illusions, such as the availability heuristic, where the ease of recalling specific examples inflates judgments of frequency, and the representativeness heuristic, where people rely on stereotypes or expect small samples to mirror larger populations perfectly. Additional systematic biases include framing effects, where describing a situation as either a gain or a loss alters perceived value and risk; anchoring, where initial, often arbitrary values bias final numerical estimates; and the sunk cost effect, an irrational tendency to persist in failing endeavors simply because time, money, or effort has already been invested. People also fall prey to illusory correlations, perceiving relationships between variables where none exist, and to hindsight bias, the false belief that an event was predictable once its outcome is known. Confirmation bias further complicates rational thought by leading individuals to seek only information that supports their existing hypotheses. Perhaps most significantly, overconfidence acts as a major barrier to improvement, because people typically hold an inflated sense of the accuracy of their own judgments.

To model these behaviors, the text distinguishes between normative and prescriptive theories, such as Expected Utility (EU) Theory and Multiattribute Utility Theory (MAUT), which describe the ideal integration of various factors, and descriptive models, such as elimination by aspects and image theory, which detail how people actually simplify complex choices. For experts in high-stakes, time-pressured environments, the recognition-primed decision model suggests that intuition and mental simulation are prioritized over formal calculation. Ultimately, the chapter suggests that while human judgment is naturally limited, decision making can be improved through statistical training, objective feedback, and the use of formal decision analysis tools.
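
To make the Bayesian benchmark mentioned above concrete, here is a minimal sketch for a hypothetical screening problem; the 1% base rate, 90% hit rate, and 10% false-alarm rate are invented purely for illustration. Intuitive estimates tend to land near the 90% hit rate, whereas the normative posterior is far lower, which is one way base rates get neglected.

```python
def bayes_posterior(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) via Bayes's theorem:
    P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
    numerator = p_e_given_h * prior
    denominator = numerator + p_e_given_not_h * (1 - prior)
    return numerator / denominator

# Hypothetical screening scenario: 1% base rate, 90% hit rate, 10% false-alarm rate.
posterior = bayes_posterior(prior=0.01, p_e_given_h=0.90, p_e_given_not_h=0.10)
print(f"P(condition | positive result) = {posterior:.3f}")  # ≈ 0.083, not 0.90
```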
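
On the normative side, MAUT's ideal integration is simply a weighted sum of single-attribute utilities. The attributes, importance weights, and 0-1 scores below are hypothetical placeholders; the only point is the additive form, overall utility = Σ (weight × attribute utility).

```python
# Minimal MAUT sketch with hypothetical attributes, importance weights
# (summing to 1), and single-attribute utilities scaled to 0-1.
weights = {"price": 0.5, "reliability": 0.3, "comfort": 0.2}
options = {
    "option_a": {"price": 0.9, "reliability": 0.6, "comfort": 0.4},
    "option_b": {"price": 0.5, "reliability": 0.9, "comfort": 0.8},
}

def maut_utility(scores, weights):
    """Overall utility as the weighted sum of single-attribute utilities."""
    return sum(weights[attr] * scores[attr] for attr in weights)

for name, scores in options.items():
    print(name, round(maut_utility(scores, weights), 2))
print("normative choice:", max(options, key=lambda n: maut_utility(options[n], weights)))
```

Expected Utility Theory has the same additive structure, except the weights are the probabilities of uncertain outcomes rather than attribute importances.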
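
By contrast, elimination by aspects, one of the descriptive models, screens options one attribute at a time in order of importance and discards anything below a cutoff, often without ever integrating the rest of the information. The attribute ordering and cutoffs here are again hypothetical.

```python
def elimination_by_aspects(options, attribute_order, cutoffs):
    """Screen options attribute by attribute, dropping any option that
    falls below the cutoff on the attribute currently under consideration."""
    remaining = dict(options)
    for attr in attribute_order:
        remaining = {name: scores for name, scores in remaining.items()
                     if scores[attr] >= cutoffs[attr]}
        if len(remaining) <= 1:
            break  # stop once a single option (or none) survives
    return list(remaining)

# Hypothetical options scored 0-1 on each attribute, as in the MAUT sketch.
options = {
    "option_a": {"price": 0.9, "reliability": 0.6, "comfort": 0.4},
    "option_b": {"price": 0.5, "reliability": 0.9, "comfort": 0.8},
}
survivors = elimination_by_aspects(
    options,
    attribute_order=["price", "reliability", "comfort"],
    cutoffs={"price": 0.6, "reliability": 0.5, "comfort": 0.5},
)
print("surviving option(s):", survivors)  # option_b is screened out on price alone
```

With these numbers the heuristic never examines reliability or comfort at all, which is exactly the kind of simplification the descriptive models are meant to capture.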