The most dangerous mistakes in decision making are not the ones that feel like mistakes at the time. They are the ones that feel completely rational. Daniel Kahneman spent decades cataloging these patterns, and the central finding is humbling: our brains are designed to produce quick, confident answers rather than accurate ones. The errors are systematic, predictable, and nearly invisible from the inside.
The first and most pervasive mistake is anchoring. Any number you encounter before making a judgment pulls your estimate toward it, even when the number is completely irrelevant. Kahneman and Tversky demonstrated this with a rigged roulette wheel: participants who watched the wheel stop on a high number gave higher answers to an unrelated factual question (the percentage of African nations in the UN) than those who saw a low number. Real estate agents, judges, and salary negotiators are all susceptible. The practical defense is simple but requires discipline: before entering any negotiation or evaluation, form your own estimate independently, before looking at any reference numbers.
The second mistake is what Kahneman calls WYSIATI (What You See Is All There Is). Our minds build the most coherent story possible from whatever information is available, without pausing to ask what might be missing. This produces overconfidence: our confidence in a judgment tracks the coherence of the story we can tell, while accuracy depends on the completeness of the evidence behind it. The corrective is to ask actively: what do I not know? What information would change my mind if I had it?
Loss aversion creates a third category of errors. Kahneman and Tversky showed that losses hurt roughly twice as much as equivalent gains feel good. This asymmetry makes us irrationally conservative when protecting what we have and irrationally reckless when trying to avoid losses. It explains why people hold losing investments too long, why negotiations stall over small concessions, and why we resist changes that have positive expected value simply because they involve some risk of loss.
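The asymmetry can be made concrete with a small sketch of the value function from Kahneman and Tversky's prospect theory. The parameter values below are illustrative (a loss-aversion coefficient of 2 and a curvature of 0.88, close to their published estimates), and the gamble amounts are hypothetical:

```python
# Stylized prospect-theory value function (Kahneman & Tversky).
# lam is the loss-aversion coefficient: losses are weighted lam times
# as heavily as gains. alpha bends the curve toward diminishing
# sensitivity. Both values here are illustrative, not canonical.

def subjective_value(x, lam=2.0, alpha=0.88):
    """Felt value of a gain or loss x relative to the status quo."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

def feels_worth_taking(gain, loss, p_gain=0.5):
    """Does a gamble feel attractive under the value function?"""
    felt = p_gain * subjective_value(gain) + (1 - p_gain) * subjective_value(-loss)
    return felt > 0

# A coin flip that wins $120 or loses $100 has positive expected
# value (+$10), yet the loss looms larger, so it feels like a bad bet:
print(feels_worth_taking(120, 100))  # False

# Only when the upside roughly doubles the downside does it feel fair:
print(feels_worth_taking(220, 100))  # True
```

This is why a change with positive expected value can still feel like a bad deal: the possible loss is weighted about twice as heavily as the equivalent gain.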
The availability heuristic distorts our sense of probability. We judge how likely something is by how easily examples come to mind, not by actual statistics. Dramatic but rare events — plane crashes, shark attacks — feel more probable than common but undramatic risks like heart disease or diabetes. This is not a failure of intelligence. It is a feature of how memory works, amplified by media coverage that favors vivid, unusual stories over representative ones.
Perhaps the most practical insight comes from recognizing confirmation bias: our tendency to seek information that supports what we already believe and to ignore or discount information that contradicts it. Naval Ravikant captures the deeper issue when he describes judgment as knowing what to do in situations where effort alone is not enough. Good judgment requires deliberately seeking out the perspectives most likely to prove you wrong. The people who make the best decisions are not the smartest or most experienced. They are the ones who have learned to distrust their own certainty and build in systematic checks against the biases that everyone shares.
