Thinking Fast and Slow, by Daniel Kahneman

I finally read this book, which had been on my to-read list since it came out because of its discussion of cognitive biases. I hadn’t been in a particular hurry to read it, especially since I’d listened to Kahneman’s Long Now talk, which covered the main themes of the book. But I finally got it from the library, and it was a nice summary of the psychology of decision making.

Kahneman sums up the book in the first paragraph of his final conclusions chapter:

I began this book by introducing two fictitious characters, spent some time discussing two species, and ended with two selves. The two characters were the intuitive System 1, which does the fast thinking, and the effortful and slower System 2, which does the slow thinking, monitors System 1, and maintains control as best it can within its limited resources. The two species were the fictitious Econs, who live in the land of theory, and the Humans, who act in the real world. The two selves are the experiencing self, which does the living, and the remembering self, which keeps score and makes the choices.

System 1 and System 2 are the most discussed portion of the book – System 1 is our intuition, which leaps to conclusions; System 2 is our logical, rational brain, which has to be effortfully engaged. Kahneman is quick to point out that System 2 is not superior to System 1 – somebody with only System 2 would never get anything done, because they would spend all their time weighing possibilities (as suggested by the anecdote of the man who could no longer make decisions after he lost his emotions).

So the question is: when do we engage System 2? Kahneman says in the conclusion: “The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2”. However, he follows up by saying it’s difficult to recognize those signs in the moment – System 1 is very good at substituting easy answers for questions that weren’t asked (e.g. substituting “I like this person” for the harder question “Is this person qualified for this job?”), and those substitutions are hard to catch as they happen. The first part of the book is an excellent tour of several of the cognitive shortcuts that System 1 uses:

  • Anchoring – exposure to an arbitrary number before estimating a quantity biases the estimate toward that number.
  • Law of small numbers – assuming that small sample sizes represent the population and leaping to conclusions from samples as small as one.
  • Availability bias – things that come to mind more easily are judged more likely, which is related to recency bias.
  • Specificity and story – System 1 likes situations which are specific and vivid, e.g. when asked “Which is more probable? (a) Linda is a bank teller or (b) Linda is a bank teller and active in the feminist movement”, most people choose (b), which makes no sense logically – (b) is a strict subset of (a) – but System 1 can construct a specific story for (b) whereas (a) is too generic (the sketch after this list makes the arithmetic concrete).
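
To make the Linda logic concrete, here is a minimal Python sketch; the population counts are invented purely for illustration, but the inequality they demonstrate holds for any numbers.

```python
# Conjunction rule behind the Linda problem: for any two events A and B,
# P(A and B) can never exceed P(A), since adding a condition only removes cases.
# The counts below are hypothetical, chosen purely for illustration.

population = 10_000
bank_tellers = 500        # assumed number of bank tellers
feminist_tellers = 50     # assumed subset who are also active feminists

p_a = bank_tellers / population            # P(Linda is a bank teller)
p_a_and_b = feminist_tellers / population  # P(bank teller AND feminist)

print(f"P(a) = {p_a:.3f}, P(a and b) = {p_a_and_b:.3f}")
assert p_a_and_b <= p_a  # holds no matter what counts you plug in
```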

Kahneman has one chapter where he discusses co-writing an article with Gary Klein, the author of Sources of Power (a book I quite liked as a summary of why we should trust our intuition, or System 1 in Kahneman’s terminology). Kahneman believes we shouldn’t trust System 1, so it was interesting to read about their work to figure out where they agreed:

“Our conclusion was that for the most part it is possible to distinguish intuitions that are likely to be valid from those that are likely to be bogus. … If the environment is sufficiently regular and if the judge has had a chance to learn its regularities, the associative machinery will recognize situations and generate quick and accurate predictions and decisions. You can trust someone’s intuitions if these conditions are met. … In a less regular, or low-validity, environment, the heuristics of judgment are invoked. System 1 is often able to produce quick answers to difficult questions by substitution, creating coherence where there is none. The question that is answered is not the one that was intended, but the answer is produced quickly and may be sufficiently plausible to pass the lax and lenient review of System 2.”

In other words, you cannot trust your intuition just because you feel confident – you can only do so in regular environments where the subconscious pattern-building machinery has had a chance to learn and be effective.

In the second part of the book, Kahneman moves on to the pitfalls of assuming that humans make rational decisions. These are pretty well-known, e.g. there is a huge difference in organ donation rates between countries where you have to opt in (< 15%) and countries where you have to opt out (> 85%). Since the question is exactly the same, clearly something else is going on (in this case, the bias toward inaction overwhelms any rational consideration). He also discusses his work on prospect theory, where he and Amos Tversky measured loss aversion and determined that a potential gain has to be roughly twice as big as the potential loss before we will take the gamble (in other words, we need to be able to win $200 on a coin flip to risk losing $100). He also discusses our inability to estimate probabilities, where we overweight rare events (treating 1% chances as 5%) and underweight near-certainties (treating 99% chances as 91%), and therefore want inordinate protection against the rare events; a sketch of these distortions follows below. In all of these cases, he suggests taking the “outside view” (the perspective of an unknowledgeable observer) and/or reversing the situation (opt-in vs. opt-out) to see whether our intuition changes – a test of whether we are truly being rational or just going with our intuition. He uses the acronym WYSIATI (What You See Is All There Is) to remind us that we often put blinders on ourselves to make our decision making easier.
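
As a rough illustration of the numbers above, here is a minimal Python sketch of prospect theory’s value and probability-weighting functions. The functional forms and parameters (α = 0.88, λ = 2.25, γ = 0.61) are the median estimates from Tversky and Kahneman’s 1992 cumulative prospect theory paper, not something quoted in this review, and for simplicity the same weighting function is applied to gains and losses.

```python
# Sketch of prospect-theory value and probability-weighting functions,
# using the Tversky & Kahneman (1992) median parameter estimates.

ALPHA = 0.88   # diminishing sensitivity for gains and losses
LAMBDA = 2.25  # loss aversion: losses loom roughly twice as large as gains
GAMMA = 0.61   # curvature of the probability-weighting function

def value(x: float) -> float:
    """Subjective value of a gain or loss relative to the reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

def weight(p: float) -> float:
    """Decision weight: overweights small probabilities, underweights large ones."""
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

# The coin flip from the review: 50% chance to win $200, 50% chance to lose $100.
flip = weight(0.5) * value(200) + weight(0.5) * value(-100)
print(f"prospect value of the flip: {flip:+.1f}")  # slightly negative: 2:1 is near indifference

# Probability distortion: a 1% chance feels like ~5.5%, a 99% chance like ~91%.
print(f"w(0.01) = {weight(0.01):.3f}, w(0.99) = {weight(0.99):.3f}")
```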

Lastly, Kahneman delves into the remembering vs. experiencing self. He describes an experiment where people are asked to endure pain for a set duration by holding a hand in ice-cold water. There are two scenarios: 60 seconds in the cold water, or the same 60 seconds followed by an additional 30 seconds during which the water is warmed slightly. People consistently said the second scenario was preferable and chose to repeat it rather than the first. This makes no logical sense – they experience strictly more pain and discomfort in the second scenario. But Kahneman says that we remember only the average of the peak pain and the end pain, and are oblivious to the actual duration (see the toy model below). The experiencing self, which suffers through 90 seconds instead of 60, is ignored. You can play with this to design optimal experiences for yourself, e.g. when planning vacations (always end on a high note and take pictures to remember the peak experiences).
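
A toy model makes the peak-end rule easy to see. The assumption here – that remembered pain is exactly the average of the peak moment and the final moment – is the simplification Kahneman describes, and the per-second pain scores are made up.

```python
# Toy model of the peak-end rule from the cold-water experiment:
# remembered discomfort is the average of the worst moment and the last
# moment, while total duration is ignored. Pain scores are invented.

def remembered_pain(samples: list[float]) -> float:
    """Peak-end rule: memory keeps only the peak and the end."""
    return (max(samples) + samples[-1]) / 2

short_trial = [8.0] * 60              # 60 s of cold water at pain level 8
long_trial = [8.0] * 60 + [6.0] * 30  # same, plus 30 s of slightly warmer water

print(remembered_pain(short_trial))       # 8.0
print(remembered_pain(long_trial))        # 7.0 -- remembered as less painful
print(sum(short_trial), sum(long_trial))  # 480 vs 660: yet more total pain endured
```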

All in all, I don’t think I learned anything particularly new from reading the book, but it is a great summary of the different quirks that keep our brains from being perfectly rational. By having a mental list of such quirks, we can be more aware of them and try to combat them by bringing the slower and more effortful System 2 into play. We won’t always succeed, and that’s where friends are helpful in giving us the outside view and a perspective beyond whatever we are seeing at the moment (Kahneman calls this the focusing illusion: “Nothing in life is as important as you think it is when you are thinking about it.”). Let me know when you see me leading myself astray, and I’ll try to do likewise.
