I like to think of myself as a rational person. But of course I’m not.
In fact, we’re all irrational, we all make mental errors, and the portrayal of humans as consistently rational agents, homo economicus, is only a fairy tale used in naive applications of economic theory. The human psyche is a difficult thing to change, even when we’re aware of the mistakes we make and the biases we inherit, and have given them all sorts of fancy names like “narrative fallacy” and “reductive bias”. And with all the cognitive biases that psychologists have identified over recent decades, experiment after experiment, working against us, it’s no wonder that we so often get seduced away from investing intelligently.
But that doesn’t mean we can’t work on it. Knowing thyself is critical if you want to turn rationality into a lifelong project—which is actually an incredibly fun one.
So start here: 19 cognitive biases that can lead you astray in investing.
1. Hot hand fallacy
We believe in winning streaks and that recent events matter, even if these events hold no signal—like flipping coins.
2. Hyperbolic discounting
We prefer immediate payoffs over larger but later rewards, and we are impressed by short-term success even when it may be due to short-term luck, like quarterly hedge fund performance or earnings reports.
3. Commitment and consistency bias
We stick with what we’ve said we’re going to do. Changing one’s mind is difficult, especially in the public eye.
4. Confirmation bias
Once we’re committed, we look for data that supports our initial impression and overweight its significance.
5. Sunk cost fallacy
We follow through on activities we shouldn’t because of the resources and effort we’ve already invested in them, which is especially dangerous if we’ve spent weeks working on a single investment idea.
6. Anchoring
We tend to use an initial idea, fact, or number as a reference point for future decisions even if we know it’s not relevant for the present situation.
7. Overconfidence
We are poorly calibrated in judging our own decisions and overestimate our ability to make them well.
8. Herding
We seek comfort in social proof by putting the “wisdom of the crowd” over our own ability to think logically.
A Buffett story:
An oil prospector, moving to his heavenly reward, was met by St. Peter with bad news. “You’re qualified for residence,” said St. Peter, “but, as you can see, the compound reserved for oil men is packed. There’s no way to squeeze you in.” After thinking a moment, the prospector asked if he might say just four words to the present occupants. That seemed harmless to St. Peter, so the prospector cupped his hands and yelled, “Oil discovered in hell.” Immediately, the gate to the compound opened and all of the oil men marched out to head for the nether regions. Impressed, St. Peter invited the prospector to move in and make himself comfortable. The prospector paused. “No,” he said, “I think I’ll go along with the rest of the boys. There might be some truth to that rumor after all.”
9. Authority bias
We adopt the views of “experts” without question and discredit our own perspectives if they differ.
10. Chauffeur knowledge
We confuse familiarity with knowledge and understanding.
A Munger story:
I frequently tell the apocryphal story about how Max Planck, after he won the Nobel Prize, went around Germany giving the same standard lecture on the new quantum mechanics.
Over time, his chauffeur memorized the lecture and said, “Would you mind, Professor Planck, because it’s so boring to stay in our routine. [What if] I gave the lecture in Munich and you just sat in front wearing my chauffeur’s hat?” Planck said, “Why not?” And the chauffeur got up and gave this long lecture on quantum mechanics. After which a physics professor stood up and asked a perfectly ghastly question. The speaker said, “Well, I’m surprised that in an advanced city like Munich I get such an elementary question. I’m going to ask my chauffeur to reply.”
11. Recency effect
We overreact to good news and to bad news because everything looks bigger up close.
12. Survivorship bias
We tend to focus on and learn from the winners in a particular area while forgetting about the losers who employed the same strategy.
13. Mental accounting
We apply different subjective values to the same amount of money. Given the need to sell a stock, we would rather sell a winner than a loser, even though selling the loser is usually the rational decision.
14. Endowment effect
We place a higher value on what we already own than we would if we didn’t own it. This isn’t all bad, since the endowment effect can also help you hold on to a truly great business, even if the price gets a little silly.
15. Hindsight bias
We mistakenly inflate our ability to predict events by forcing our minds to understand the past as a cohesive narrative, leading us to neglect probability in the future.
16. Neglect of probability
We disregard the probabilities inherent in our decisions when uncertainty is involved, so small risks are either neglected entirely or hugely overrated. And we round probabilities up to 100% or down to 0%, even when we know that “nothing is ever certain”.
17. Narrative fallacy
We love stories, and we let our preference for the good ones cloud the facts, inflating our impression of understanding.
18. Law of small numbers
We take small samples and assume they represent the general population, which is closely related to the hot hand fallacy.
19. Availability heuristic
We judge the probability of an event based on the ease with which examples come to mind. We extrapolate our personal experiences and consider them to be the market reality.
These cognitive biases are all part of my collection of mental models.