Thinking slow with Daniel Kahneman
Rationality is not the demonstration of reasonableness, but an application of internal consistency. In his book Thinking, Fast and Slow, Nobel laureate Daniel Kahneman shares his research on the heuristics, biases and quirks that guide human behaviour. It offers an accessible and insightful framework for checking our intuition, rationality and logic.
It’s probably the most popular book in behavioural psychology and economics, but we thought it worthwhile to share some of our notes here. In this post, we’ll review some of Kahneman’s major ideas, including his observations on the systems and heuristics that shape our decisions. For brevity, we’ll avoid detailed discussion of his empirical studies.
Skip ahead:
- Our two systems for thinking
- Thinking fast with heuristics
- Coherence, overconfidence and illusions
- A bad intuition for probability
- Thinking about prospect theory
- Narrow framing and life’s choices
Our two systems for thinking
In Thinking, Fast and Slow, Daniel Kahneman opens with an explanation of the two systems, System 1 and System 2, that guide our behaviour and decision-making processes. System 1, our first system, behaves automatically and quickly, often without effort or obvious control. We use it for the decisions and actions that come naturally or unconsciously to us. By contrast, System 2, our second system, engages in effortful mental activity and attention. It’s often associated with deliberateness, agency, choice, concentration, computation and self-control.
It’s important to note that System 1 and System 2 are fictitious constructs; the brain is a far more complex system. Nevertheless, the two-system model serves as a useful summary of, and reminder about, how we tend to think.
Ego depletion
Effortful thinking is hard, and we’re lazy by nature. The principle of least effort suggests that when there are several ways to achieve an objective, we’ll select the action of lowest physical or cognitive cost. Self-control and thoughtfulness, for example, require effort, and both draw from a finite reserve of energy. The greater or longer the cognitive demands of an activity, the more likely we are to quit or break discipline. Kahneman refers to this as ego depletion. It’s one of many factors contributing to our tendency to make intuitive errors.
Cognitive ease
Our thinking, understanding and memories are far more superficial than we like to believe. We often conflate familiarity with understanding, which Kahneman calls an illusion of remembering. This familiarity bias or exposure effect, combined with a lazy System 2, may encourage us to accept simple or accessible beliefs as fact, an illusion of truth. We like to jump to conclusions because it is easy; dealing with ambiguity is cognitively taxing.
Philip Fernbach and Steven Sloman share similar evidence in The Knowledge Illusion, describing our evolutionary dependence on a community of knowledge and on thinking as a collective. While this helps with the division of labour and eases individual cognitive load, it can also contribute to illusions of understanding and explanatory depth, which in turn may produce intuitive thinking errors.
Thinking fast with heuristics
Kahneman suggests that we store our biases and beliefs in System 1, a system that is difficult to turn off. It also has poor intuition for logic and statistics, and a tendency to answer difficult questions by substituting easier ones. We tend to bring in System 2 only when decisions become more difficult and effortful.
Although System 1 is important to day-to-day life, it can falter in more complex environments such as policy, economic or investment decision making (where the correct answer is not always intuitive). While Kahneman provides rich descriptions of both systems and their functions in his book, this post will focus more on their limitations, heuristics and biases:
Associations
Associations are networks embedded in System 1 that enable one idea to trigger further ideas or automatic behaviours. Reciprocal links, like common gestures, are an example of such associations and associative networks. These networks can self-reinforce cognitive, emotional and physical responses within and among people.
Priming and anchors
Similarly, certain ideas can activate concepts, words and emotions that influence our decisions and actions, also known as a priming effect. We have a tendency, for example, to match intensity, associating the qualities of one entity with those of another. Such associations are sometimes faulty, particularly when the analogies chosen are poor descriptors of the subject in reality (e.g. using linear analogies to describe exponential processes).
We also tend to look for anchors to form our impression of a result or outcome. This initial anchor is a type of priming effect that System 1 selects automatically and intuitively. A poor choice of anchor may lead to incorrect or nonsensical conclusions. A common example is the investor who decides whether a stock is cheap or expensive based entirely on its price today relative to the price observed yesterday or a month ago.
Halo effect
The order in which we receive information can change our interpretation, a bias also known as the halo effect. As a result, we are prone to overweighting the earliest pieces of information when forming impressions, opinions and decisions. The bias can also operate in reverse: our memory might fixate on our most recent experience, neglecting our overall experience or performance over time.
Path dependency bias
There’s a similar issue, also known as path-dependency bias, which World Series of Poker champion Annie Duke describes in Thinking in Bets. Different paths to an identical outcome can produce very different analytical and emotional conclusions. For example, some investors feel upset when the price of their stock surges before falling back to its original purchase price (regret at not selling early). By contrast, the same investors might feel elated when the price of their stock plunges before returning to the same purchase price (relief at losses avoided). While the investors break even in both scenarios, their emotional experiences are very different. Kahneman and Duke’s insight is that the path, order and recall of information shape our conclusions.
Hindsight bias
We have a tendency to take the past as a given, neglecting the many ways in which it might have evolved. This simplified narrative of history and past events can lead to a myopic view of possible futures. Hindsight bias in particular leads many to associate good (bad) outcomes with good (bad) decision making processes. Annie Duke refers to this tendency as resulting. It’s a common problem, not only in poker, but in fields like investing, policy making and scientific inquiry.
Coherence, overconfidence and illusions
Kahneman notes that System 1 is responsible for maintaining, updating and interpreting our model of the world. Heuristics like associations and anchors influence our perceptions and expectations. We’re also narrative-minded, seeking coherence and causal explanations to build an automatic story of our immediate world. This is reflected, for example, in the explanations the news media generates for day-to-day fluctuations in the stock market. We tend to seek and apply causal thinking even where no causal link exists. Kahneman refers to this as an illusion of causality.
Narrative fallacies
We also have trouble assessing the quantity and quality of evidence when forming impressions, and we’re sometimes willing to suspend disbelief to keep an elegant story intact. Nassim Taleb described this tendency as the narrative fallacy in The Black Swan. In The Psychology of Human Misjudgement, Charlie Munger attributes the narrative fallacy to our desire to avoid uncertainty and inconsistency, and notes that other tendencies, like psychological denial, availability mis-weighing or social proof, can compound these effects.
Illusion of validity
Then there’s the illusion of validity, which describes our false sense of confidence in our ability to interpret a given set of data. This is best demonstrated in stock picking, where most buyers and sellers operate with the same public information but arrive at very different conclusions. Very rarely do buyers (sellers) thoroughly question why they know more about the stock than the person who is selling (buying), and whether their insights are already embedded in prices. The illusion of rigour and high-level skills, and the culture and confidence of the financial community, may feed into this cognitive bias.
Complex systems produce unpredictable outcomes because countless factors interact over time. Errors in prediction are unavoidable, and one’s confidence is an unreliable indicator of accuracy. Those who do not know what they do not know are operating under an illusion of validity. Atul Gawande shares a similar observation in Complications, describing how we sometimes fail to consider whether our problems are more like ice cubes (predictable) or hurricanes (complex and chaotic).
Overconfidence
We’re prone to overconfidence, comfortable placing untested faith in our intuitions. Related to this is our tendency to reach conclusions first and to seek evidence and rationalisations afterwards. Kahneman also describes how we’re more likely to succumb to cognitive ease when in a positive mood, and more distrusting of our intuitions when uncomfortable or unhappy. These behaviours stem from the effort and motivation needed to verify our intuitions and beliefs.
It’s neither easy nor practical to check everything we know. We’re quick to form impressions from personal, sampled or conditional descriptions, without adjusting for the general population or base probabilities. But remaining alert and engaged when it matters can help us reject superficial answers that we would otherwise accept intuitively.
Kahneman also suggests that some overconfidence and illusion may be necessary for everyday life. Consider the prospects of innovation without any optimism. Sometimes, misplaced confidence and entrepreneurial delusion can motivate us and help us overcome setbacks when the odds are against us. Confidence, ability and the likelihood of success are not independent of one another. But confidence, taken to either extreme, can be dangerous.
A bad intuition for probability
System 1 is not equipped to handle statistics well. We have difficulty grappling with very small numbers, very large numbers, sample sizes and exponential processes. For example, we often ignore sample sizes, forgetting that extreme outcomes are far more likely in small samples than in large ones. Kahneman believes this emerges from our natural tendency to favour certainty and causality over sustaining doubt. We like to look for patterns, regularity and explanations, even in completely random events. Hence, it is quite common for us to confuse correlation with causation.
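As a minimal illustration (not from the book), the short simulation below flips a fair coin in samples of different sizes and counts how often the sample average drifts far from 50%. The 0.7 cut-off and the sample sizes are arbitrary assumptions chosen to make the point visible.

```python
import random

random.seed(0)

def share_of_extreme_samples(sample_size, trials=10_000, cutoff=0.7):
    """Fraction of trials where a fair coin's sample average drifts past the cutoff."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        mean = heads / sample_size
        extreme += (mean >= cutoff) or (mean <= 1 - cutoff)
    return extreme / trials

for n in (10, 100, 1000):
    print(n, share_of_extreme_samples(n))  # extreme averages are common at n=10, vanish by n=1000
```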
Conjunction fallacy
Similarly, we tend to find vivid narratives and detailed stories more compelling, due in part to our desire for coherence. However, stories with more details and descriptions are less probable from a statistical standpoint. Assuming that a joint event is more likely than a single event is known as the conjunction fallacy. This desire for coherence can lead to a misjudgement of probabilities, a common trap for forecasters: those who add more detail to their economic models for the sake of ‘realism’ may actually make their forecasts less probable.
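A quick back-of-the-envelope check makes the point. The probabilities below are made-up numbers for an imagined forecast, not figures from the book.

```python
# Hypothetical, illustrative probabilities for a forecaster's scenario.
p_recession = 0.30                      # P(recession next year)
p_rate_cut_given_recession = 0.60       # P(central bank cuts rates | recession)

# The more "detailed" joint story can never be more likely than the single event it contains.
p_recession_and_cut = p_recession * p_rate_cut_given_recession   # 0.18
print(p_recession_and_cut <= p_recession)                        # True for any valid probabilities
```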
Availability bias
As with the halo effect, we tend to weight information by the ease and frequency with which instances come to mind, also known as the availability heuristic. For example, the volume of media coverage can warp our perception of base rates and of how representative something is of reality.
Affect heuristic
Similarly, we tend to form impressions based on the emotions we feel at the time, also known as the affect heuristic. It’s a form of question substitution that stems from our desire for associative coherence. Together, the availability and affect heuristics can be a powerful force for cascading and reinforcing the attitudes and beliefs of individuals and groups.
Base and causal rates
When it is difficult to judge the quality and quantity of evidence, it helps to anchor our impressions to a plausible base rate and then review the evidence for the causal (conditional) rate. The former refers to facts about a population, to which we often give insufficient weight. The latter refers to facts about a specific case within that population, from which we sometimes over-extrapolate or stereotype. To paraphrase a key passage from Kahneman’s book: we should be more willing to deduce the particular from the general, and more careful when inferring the general from the particular.
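A worked example helps show how far the conditional evidence should move us from the base rate. The screening-test numbers below are illustrative assumptions, not data from the book.

```python
# Illustrative numbers for a rare condition and an imperfect test.
base_rate = 0.01        # P(condition) in the general population
sensitivity = 0.90      # P(test positive | condition)
false_positive = 0.05   # P(test positive | no condition)

# Bayes' rule: weigh the causal (conditional) evidence against the base rate.
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
p_condition_given_positive = sensitivity * base_rate / p_positive
print(round(p_condition_given_positive, 3))   # ~0.154: far lower than the 0.90 intuition suggests
```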
Extremes and regressions
Intuitive predictions tend to be overconfident and extreme. We have a tendency to overestimate the likelihood of improbable events and to overweight their importance in our decision making. This stems in part from our tendency to seek cognitive ease and to ignore base rates, making little attempt to specify the likelihood of alternatives. Availability bias, confirmation bias and vivid representations can add further to this.
Like extreme predictions, we sometimes ignore regression to the mean in our analysis of problems and trajectories. For example, if the outcome of an independent event lies within a bell curve of possibilities, the likelihood that an even more extreme event follows a prior extreme one is low (but not zero). We nevertheless tend to extrapolate trends from historical instances and fail to recognise when regression to the mean might be at work.
This is why investors such as Charlie Munger and Howard Marks like to think in tendencies, cycles and ecosystems. It helps them to consider the accelerants and impediments of a system, and the alternatives that could arise.
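A small skill-plus-luck simulation, again illustrative rather than from the book, shows the effect: the best performers in one period owe part of their result to luck, so their next-period results drift back toward their underlying skill.

```python
import random

random.seed(1)
n = 1_000
skill = [random.gauss(0, 1) for _ in range(n)]
period1 = [s + random.gauss(0, 1) for s in skill]   # observed result = skill + luck
period2 = [s + random.gauss(0, 1) for s in skill]   # same skill, fresh luck

top = sorted(range(n), key=lambda i: period1[i], reverse=True)[:50]   # top 5% in period 1
avg1 = sum(period1[i] for i in top) / len(top)
avg2 = sum(period2[i] for i in top) / len(top)
print(round(avg1, 2), round(avg2, 2))   # the second figure is lower: regression to the mean
```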
Thinking about prospect theory
Kahneman highlights the trouble we have noticing and addressing flaws in our mental models. To demonstrate this, he spends a good part of the book documenting the limitations of expected utility theory (EUT), a theory that economists use and teach widely to this day.
While EUT offers a mathematical description of our aversion to risk, Kahneman describes how its assumptions ignore the realities of human decision making. For example, common applications of EUT in finance (or insurance) plot our utility, or happiness, against an absolute level of expected wealth. In reality, our emotional states are often tied to changes in wealth, or to wealth relative to some reference point (e.g. comparing one’s wealth to that of one’s siblings or neighbours).
With regard to the limitations of EUT, Kahneman’s point is that history, context and reference points matter. He also raises a broader question about the unwillingness of mainstream economists to engage with the complexities and nuances of human decision making. One potential reason is the appealing simplicity, elegance and ‘mathiness’ of models like EUT.
Kahneman and Amos Tversky expanded on, and departed from, the ideas of EUT, culminating in the development of prospect theory. Their theory suggests that we dislike losing more than we like winning (loss aversion), and that we make our evaluations relative to a reference point. These assumptions help to explain several behaviours that EUT does not, which Kahneman summarises in what he calls the fourfold pattern.
Fourfold pattern
Predictions from the fourfold pattern stem from several assumptions. Firstly, we tend to make decisions based on the near-term emotional impact of potential gains and losses, rather than the prospects for our long-term wealth and wellbeing. Secondly, we are quite insensitive to changes in probability at the extremes (e.g. 0.0001 vs 0.00001). As a result, we tend to overweight improbable outcomes (the possibility effect) and to give too little weight to outcomes that are almost certain (the certainty effect).
Combining these effects with different reference points (gains and losses), Kahneman and Tversky explain our decision making under a fourfold pattern:
| Prospect / probability | Gains | Losses |
| --- | --- | --- |
| Low probability | Risk seeking (e.g. partake in lotteries) | Risk averse (e.g. seek insurance) |
| High probability | Risk averse (e.g. make safe bets) | Risk seeking (e.g. make desperate gambles) |
Of course, these behaviours are not set in stone; they depend on the preferences, probabilities, payouts and reference points involved. Still, this simple model does a good job of explaining, for example, our simultaneous preference for lotteries and insurance. Kahneman notes that our aversion to losses and our fear of disappointment are powerful behavioural forces. We work harder to prevent or avoid losses than to generate gains, and these biases can sometimes lead to suboptimal decisions.
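To make the pattern concrete, here is a minimal sketch of a prospect-theory-style evaluation. The value and weighting functions below follow the general shape Kahneman and Tversky describe, but the specific parameter values (and the use of a single weighting function for both gains and losses) are simplifying assumptions for illustration.

```python
def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss x, measured from the reference point."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha   # losses loom larger (lam > 1)

def weight(p, gamma=0.61):
    """Decision weight: small probabilities are overweighted, near-certainties underweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def prospect(p, x):
    """Weighted value of a simple gamble: gain/lose x with probability p, else nothing."""
    return weight(p) * value(x)

# Low-probability gain: the gamble beats its expected-value equivalent (risk seeking, lotteries).
print(prospect(0.01, 1_000), value(10))
# Low-probability loss: the gamble feels worse than a sure small loss (risk averse, insurance).
print(prospect(0.01, -1_000), value(-10))
```

Running the same comparison with high probabilities reverses both preferences, reproducing the other half of the table.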
Endowment effect
When evaluating change, we tend to focus more on potential losses than on potential gains. These preferences can bias us towards the status quo. Kahneman discusses a similar phenomenon in the endowment effect, which describes our tendency to value goods we own more highly than identical goods on the market. The effect is more pronounced in goods for use (e.g. wine, houses, etc.) than goods for exchange (e.g. money). Scrutinising our reference points, imagining counterfactuals and anticipating our emotional reactions under different outcomes can help us to make wiser decisions.
Narrow framing and life’s choices
Kahneman shares an idea about probability that left a strong impression on me. It’s the idea of narrow framing, a behaviour that can cost us dearly over the long run.
Here’s a simple example: consider a 50:50 coin flip that pays you $100K on heads but costs you $75K on tails. If the game involved only one flip, most of us would avoid playing for fear of the potential loss. However, if the game involved 100, 1,000 or 10,000 independent flips, most of us would play. The repeated version is more attractive because each flip has a positive expected value, and across many flips the likelihood of an overall loss becomes small.
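A rough calculation, using the numbers in the example above, shows why the repeated game feels so different: each flip has a positive expected value, and the probability of ending with an overall loss shrinks rapidly as flips accumulate.

```python
from math import comb

win, lose, p = 100_000, -75_000, 0.5
print(p * win + (1 - p) * lose)                 # expected value per flip: +12,500

def prob_net_loss(n):
    """Probability that n independent flips end in an overall net loss."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(n + 1)
               if k * win + (n - k) * lose < 0)

for n in (1, 10, 100):
    print(n, round(prob_net_loss(n), 3))        # odds of a net loss fall from 0.5 toward zero
```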
Decision making in life resembles this second gamble. We have to make many decisions across our lifetime. However, we might reject every favourable gamble in life if we review every decision under uncertainty in isolation. This strategy is potentially suboptimal across one’s lifetime. By contrast, if we review our decisions in life as a set of risky gambles, we can construct a decision portfolio with higher expected returns and lower lifetime risk. This outside view may help us to make better decisions in the long-run.
Of course, decision making in life is rarely clear cut. It’s difficult to judge our expected payoffs against potential choices, or to even know our entire set of options itself. Regardless, the idea of narrow framing is a helpful concept to keep in mind. An extended consideration of our choices and opportunity set may help us to make decisions with better lifetime odds.
Further reading
“Nothing in life is as important as you think it is, while you are thinking about it”
Daniel Kahneman – Thinking, Fast and Slow
If you enjoyed Thinking, Fast and Slow by Daniel Kahneman, there’s a chance you will enjoy some of the following works as well:
- Influence – Robert Cialdini on the psychology of persuasion
- The Psychology of Human Misjudgement – Charlie Munger on the causes of poor judgement
- The Tipping Point – Malcolm Gladwell on small things that make big differences
- The Knowledge Illusion – Philip Fernbach and Steven Sloman on ignorance and irrationality
- Thinking in Bets – Annie Duke on decision making
- Irrational Exuberance – Robert Shiller on stock market bubbles and manias
- The Mental Game of Poker – Jared Tendler with lessons from sports psychology
References
- Kahneman, D. (2011). Thinking, Fast and Slow. More details, publications and articles available at <https://scholar.princeton.edu/kahneman/publications-0>
- Stanovich, K. (2011). Rationality and the Reflective Mind.
- Taleb, N. (2007). The Black Swan: The Impact of the Highly Improbable.
- Surowiecki, J. (2004). The Wisdom of Crowds: Why the Many are Smarter Than the Few.
- Gawande, A. (2009). The Checklist Manifesto: How to Get Things Right.
- Duke, A. (2018). Thinking in Bets: Making Smarter Decisions When You Don’t Have All the Facts. More at <https://www.annieduke.com/>
- Fernbach, P. & Sloman, S. (2017). The Knowledge Illusion: Why We Never Think Alone.