Cognitive Biases: How They Shape Decisions and What We Can Do About Them

Cognitive biases are systematic patterns of deviation from rationality in judgment and decision-making. They are not random errors but predictable, consistent ways in which human thinking diverges from what a purely rational agent would do. The research on cognitive biases, developed primarily through the work of Daniel Kahneman, Amos Tversky, and generations of subsequent researchers, has transformed understanding of human judgment and has generated practical implications for everything from healthcare to finance to public policy.
The dual-process theory of cognition, popularized by Kahneman's Thinking, Fast and Slow, provides a useful organizing framework. System 1 thinking is automatic, fast, intuitive, and largely unconscious. System 2 thinking is deliberate, slow, effortful, and conscious. Most cognitive biases arise from System 1 heuristics that are useful shortcuts in many contexts but that produce systematic errors in others. The goal of bias awareness is not to eliminate System 1 thinking, which is impossible and often valuable, but to recognize when to invoke System 2 checking.
Confirmation bias, the tendency to seek, interpret, and remember information in ways that confirm pre-existing beliefs, is among the most thoroughly documented and consequential biases. Research documents confirmation bias in clinical diagnosis, financial investment, political opinion formation, and scientific hypothesis testing. The internet and social media environments may amplify confirmation bias by providing personalized information streams that reflect users' existing preferences.
Anchoring is the tendency to rely heavily on the first piece of information encountered when making estimates and decisions. Experimental research shows that arbitrary anchors, numbers presented to participants before a judgment task, influence subsequent estimates even when participants know the anchors are random. Anchoring affects negotiation outcomes, medical diagnosis, legal sentencing, and financial decision-making. The first price offered in a negotiation, the first diagnosis considered in a clinical workup, and the first option presented in a choice set all have disproportionate influence on subsequent judgments.
The availability heuristic leads people to judge the probability of events based on how easily examples come to mind. Dramatic or emotionally salient events are more available in memory and are therefore judged as more probable than events that are actually more frequent but less memorable. Research documents that people overestimate deaths from dramatic causes like plane crashes and shark attacks while underestimating deaths from mundane but frequent causes like cardiovascular disease. Medical professionals overestimate the probability of diagnoses they have encountered recently.
Overconfidence bias is the tendency to overestimate the accuracy of one's own judgments and predictions. Research across multiple domains finds that people's stated confidence in their judgments substantially exceeds their accuracy. Experts are overconfident, sometimes more so than novices, and overconfidence is particularly pronounced for difficult judgments and predictions about complex systems. Research on physician diagnosis finds overconfidence; research on financial analyst predictions finds that highly confident predictions are not more accurate than less confident ones.
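The gap between stated confidence and accuracy can be made concrete with a simple calibration check. The sketch below uses entirely hypothetical judgment data; the point is only the comparison it computes: overconfidence appears when average stated confidence exceeds the fraction of judgments that were actually correct.

```python
# Minimal calibration sketch (hypothetical data): each tuple is
# (stated confidence, whether the judgment turned out correct).
judgments = [
    (0.95, True), (0.90, False), (0.90, True), (0.85, False),
    (0.80, True), (0.80, False), (0.75, False), (0.70, True),
]

mean_confidence = sum(conf for conf, _ in judgments) / len(judgments)
accuracy = sum(correct for _, correct in judgments) / len(judgments)

# Overconfidence: confidence (~0.83) exceeds accuracy (0.50).
print(f"confidence {mean_confidence:.2f} vs accuracy {accuracy:.2f}")
```

A well-calibrated judge would show the two numbers roughly matching within each confidence band; the overconfidence finding is that they systematically do not.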
Loss aversion, identified in Kahneman and Tversky's prospect theory, refers to the finding that losses feel approximately twice as painful as equivalent gains feel pleasurable. Loss aversion explains a wide range of behavioral patterns: the endowment effect, in which people value objects they own more than identical objects they do not own; status quo bias, in which people prefer the default option partly because changing it feels like a loss; and risk preferences that vary depending on whether outcomes are framed as gains or losses.
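The asymmetry can be written down directly. Tversky and Kahneman's 1992 parameterization of the prospect-theory value function uses exponents near 0.88 for gains and losses and a loss-aversion coefficient near 2.25; the sketch below uses those published estimates purely for illustration.

```python
# Prospect-theory value function, Tversky & Kahneman (1992) parameterization.
ALPHA = 0.88   # diminishing sensitivity for gains
BETA = 0.88    # diminishing sensitivity for losses
LAMBDA = 2.25  # loss-aversion coefficient: losses loom ~2x larger

def value(x: float) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0) relative to a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

gain = value(100)    # ~57.5: subjective value of gaining $100
loss = value(-100)   # ~-129.5: subjective value of losing $100
print(abs(loss) / gain)  # ~2.25: the asymmetry driving loss aversion
```

The same function also captures diminishing sensitivity: because the exponents are below 1, the difference between $0 and $100 feels larger than the difference between $1,000 and $1,100, which is why framing the same outcome as a gain or a loss shifts risk preferences.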
The sunk cost fallacy leads people to continue investing in projects or commitments based on past investments rather than future prospects. Research on sunk cost reasoning documents it in military strategy, business investment, sports roster decisions, and personal commitments. The economically rational approach is to make decisions based on prospective costs and benefits, treating past costs as irrelevant to future decisions. The psychological reality is that past investments create emotional attachments that distort prospective evaluation.
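The prospective logic above can be reduced to a one-line decision rule. In this sketch (all dollar figures hypothetical), the sunk cost is passed in only to make the point explicit: a rational rule accepts it and then ignores it.

```python
# Rational decision rule: only prospective costs and benefits matter.
def should_continue(future_benefit: float, future_cost: float,
                    sunk_cost: float = 0.0) -> bool:
    """sunk_cost is accepted as an argument but deliberately unused."""
    return future_benefit > future_cost

# Hypothetical: $9M already spent; finishing costs $2M more,
# but the finished project is only worth $1.5M.
print(should_continue(future_benefit=1.5e6, future_cost=2e6, sunk_cost=9e6))
# False: abandon. The $9M is gone either way; only the prospective -$0.5M matters.
```

The sunk cost fallacy amounts to letting that unused argument change the answer, continuing because $9M has already been spent even though every forward-looking path from here loses money.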
Debiasing, the project of reducing the influence of cognitive biases on judgment, has proven to be difficult. Research shows that simply learning about biases does not reliably reduce their influence on subsequent judgment. More effective approaches include decision processes designed to force consideration of alternative hypotheses, checklists that prompt attention to neglected considerations, reference class forecasting that grounds predictions in base rates rather than case-specific detail, and structured decision-making processes that separate analysis from advocacy.
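Reference class forecasting, one of the debiasing techniques above, can be sketched in a few lines. The idea is to anchor a forecast on the distribution of outcomes in a reference class of similar past cases rather than on the case-specific "inside view"; the overrun figures below are hypothetical.

```python
# Hedged sketch of reference-class forecasting. The reference class holds
# cost overruns (as multipliers of initial budget) from similar past projects.
import statistics

reference_class = [1.1, 1.4, 1.0, 1.8, 1.3, 1.6, 1.2, 2.0]

def outside_view_forecast(inside_estimate: float) -> float:
    """Scale the case-specific estimate by the reference class's median overrun."""
    return inside_estimate * statistics.median(reference_class)

# The inside view says $10M; the base rates suggest planning for ~$13.5M.
print(outside_view_forecast(10.0))
```

The correction works precisely because it ignores the persuasive case-specific detail that fuels overconfidence and anchors instead on how projects like this one have actually turned out.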
The policy implications of cognitive bias research have generated an entire field of behavioral economics and behavioral public policy, sometimes called libertarian paternalism or choice architecture. If people's decisions are systematically biased in predictable ways, policy design that changes the decision environment, for example by making beneficial choices the default rather than requiring active selection, can improve outcomes without restricting choices. The practical applications range from retirement savings to organ donation to cafeteria layouts that promote healthier eating.