Thinking, Fast and Slow Book Summary


In this summary, you will find the most important ideas from Daniel Kahneman’s bestseller, Thinking, Fast and Slow. Do you want to know why this psychologist received the Nobel Prize in Economic Sciences? Then keep reading.

Before we start, we would like to make one thing clear: the word heuristics is important and needs clarifying before we begin our summary. Although it looks a bit daunting, it’s pretty simple: heuristics are rules and mental shortcuts people use to process information more efficiently. They are like a set of presumptions that usually lead to good predictions. The most important idea you will draw from this book is that the human intellect has two “modes” of functioning.

The first is fast and intuitive, but sometimes irrational and hasty. The second is rational, slow, calculative, and prudent. The point of this book is not to downgrade one mode and praise the other; rather, the goal is to point out the good as well as the bad sides of each. Let’s start with the rational side. One may be tempted to jump to conclusions and declare the rational side always superior to the intuitive one, but this is not the case. The exact differences are explained later in the text; for now, it is enough to say that the intuitive side saves our energy in everyday, routine situations, when there is little need to employ the slow, calculative mode. Of course, sometimes we need to sit down and think carefully about a problem. These are the situations in which we employ our rational thinking.

Finally, the rational, slow half and the intuitive, fast one aren’t necessarily rivals. They often work together: “raw” data can be gathered and organized by the fast, intuitive system, after which the more careful, slower side takes over. Heuristics and biases are the main focus of this summary. Heuristics are sets of rules that, in most situations, lead us to good, satisfying results. They are not always relevant, however, and sometimes we make very silly mistakes by relying on them. The fast system leans heavily on heuristics and biases, and, because it is unable to experience doubt, it presumes that these assumptions are always and inevitably true. Unfortunately, heuristics and biases sometimes backfire, and this is the moment when the slow system should take over. This book will help you recognize the situations in which the usual presumptions no longer work and slower, deeper processing is needed.

Anchoring is a heuristic we employ when we have a certain reference point. For example, suppose someone asks you, “Did Nelson Mandela spend more or less than 30 years in prison?” You may not know and simply guess. Interestingly, though, if you are asked immediately afterwards to estimate the number of years Mandela spent in prison, you will most likely name a number close to 30, because the first question biased you in that direction. Everything works well when anchors aren’t set randomly. In the Mandela example, the anchor wasn’t random, because Mandela did spend close to 30 years in jail; an anchor like that helps you give the best estimate possible. Now let’s see a situation in which the anchoring heuristic may be used against you. Some people like to go around pawn shops and sell things they deem valuable. Steve, for example, did this often, and sometimes brought home significant amounts of money. This time, however, he stumbled upon a very greedy pawn-shop owner, a genius at haggling.

Steve brought an item that looked almost trivial, the kind of trifle you usually find in an attic: a little figurine of a soldier. It looked interesting, and Steve wanted to see if he could get a few dollars for it, so he brought it to the pawn shop and started negotiating the price. Out of the blue, the owner asked him, “What do you think, does this little figurine cost more or less than 25 dollars?” Steve was intrigued and jokingly said that it must cost more than 25 dollars. The owner didn’t respond and simply asked how much he wanted for the little toy. Steve was a little taken aback, but he mumbled something like, “30 dollars.” The owner paid it without trying to haggle the price down at all. This looked a bit suspicious, but Steve took the money and left. The next day, he saw the same toy in the shop window, priced at 200 dollars. It became obvious that the owner had tricked him into believing the real price was around 25 dollars, and this deceitful anchor ultimately fooled Steve.
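To make the mechanics of Steve’s mistake concrete, here is a minimal Python sketch of the classic “anchor-and-adjust” account of anchoring: the estimator starts from the anchor and adjusts only partway toward their own independent guess. The prices and the 40% adjustment rate are illustrative assumptions for this example, not figures from the book.

```python
import random

def anchored_estimate(true_value, anchor, adjustment=0.4, noise=5.0):
    # Anchor-and-adjust: start at the anchor and move only partway
    # (insufficient adjustment) toward one's own noisy private guess.
    own_guess = true_value + random.gauss(0, noise)
    return anchor + adjustment * (own_guess - anchor)

random.seed(0)
true_price = 200  # what the figurine actually sells for in the shop
low = [anchored_estimate(true_price, anchor=25) for _ in range(1000)]
high = [anchored_estimate(true_price, anchor=500) for _ in range(1000)]

print(sum(low) / len(low))    # ~95: dragged down toward the 25-dollar anchor
print(sum(high) / len(high))  # ~380: dragged up toward a 500-dollar anchor
```

The same noisy guesses land in very different places depending solely on which number was mentioned first, which is exactly what the pawn-shop owner exploited.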

Availability works by the rule, “If you can think of it, it must be important.” Like any other heuristic the fast system relies on, the availability heuristic is very useful most of the time. For example, when somebody asks, “What’s the biggest city in Europe?” you may instantly say “London,” not because you know for sure, but simply because it came to mind first. And you wouldn’t be far off, because London is the second biggest city in Europe, after Moscow. This is a situation in which the availability heuristic works in your favor. On the other hand, there are numerous examples of this heuristic catching people off guard and rushing them to unnecessary and irrational conclusions. Large plane crashes are among the most shocking and most heavily covered events in the world. For days and weeks after such a catastrophe, the news is full of reports about it. And when you ask people, especially those who regularly watch the news, about the likelihood of car crashes versus plane crashes, they tend to underestimate the probability of a car crash and overestimate that of a plane crash.

This is because pictures of plane crashes are more readily available in their memories, and it is much easier to retrieve those images than images of car crashes, even though car crashes are far more common and deadly. Let’s see another example. We all know that sharks are dangerous. Even if we set aside the negative impact Spielberg’s film “Jaws” had on beliefs about sharks, people’s perception of sharks’ deadliness compared to other hazards illustrates the availability heuristic. Most of us would badly misjudge the likelihood of a shark attack. As in the first example, because shark attacks are so dramatic, graphic, and bloody, news agencies push them hard, since they attract more viewers. Because of this, the memory of a shark attack sits ready in our brain, ultimately making us greatly overestimate the chance of a shark attacking us, when we are in fact more likely to die from falling plane parts.
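A quick way to see how far availability-driven intuitions drift from the real base rates is to put rough annual death tolls side by side. The numbers below are rounded, order-of-magnitude figures chosen for illustration (they are not from the book); the point is the ratios, not the exact counts.

```python
# Rough, illustrative worldwide annual death tolls (order of magnitude
# only; not figures from the book).
annual_deaths = {
    "road traffic": 1_300_000,
    "commercial aviation": 300,
    "shark attacks": 6,
}

baseline = annual_deaths["shark attacks"]
for cause, deaths in annual_deaths.items():
    print(f"{cause}: about {deaths // baseline:,}x shark-attack deaths")
```

Even if these figures are off by a factor of two or three, the gap between what dominates the news and what actually kills people spans several orders of magnitude.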

The sunk-cost fallacy is, simply put, the tendency to keep investing in a failing venture because of what has already been spent, even when every additional investment only deepens the loss. Otherwise very sane people often do this to avoid the regret of admitting a mistake; it is, of course, a way to salvage a little pride and keep the ego intact. There is even a term, “the Concorde fallacy,” named after the real-life story of the then-revolutionary supersonic airliner, the Concorde, a joint project of the British and French governments. Both governments continued investing in the project even after it became obvious that it was not economically viable and that demand for Concorde flights would never cover its costs. Because even the initial investments were astronomical, it was almost impossible for high officials to admit failure and abort the project altogether.
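The rational rule is easy to state in code: compare only what is still to be spent against what is still to be gained, and give past spending no role at all. The Concorde-flavored numbers below are made up purely for illustration.

```python
def should_continue(remaining_cost, expected_future_payoff, sunk_cost=0):
    # Rational stopping rule: sunk_cost is accepted as an argument but
    # deliberately ignored -- money already spent is gone either way,
    # so it must not affect the decision.
    return expected_future_payoff > remaining_cost

# Hypothetical figures in millions:
print(should_continue(remaining_cost=500, expected_future_payoff=100))                  # False
print(should_continue(remaining_cost=500, expected_future_payoff=100, sunk_cost=2000))  # still False
```

The fallacy consists precisely in letting that third argument change the answer.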

Framing is one of the most interesting heuristics and biases, and one widely employed by news agencies and media magnates. One of the best examples of framing is slightly altering a question to emphasize or downplay whatever we want. For example, people respond very differently to these two essentially identical questions:

  1. “Would you consent to a medical procedure that had a 90% chance of survival?”
  2. “Would you consent to a medical procedure that had a 10% chance of death?”

When we use the first phrasing, we make the listener focus on the good side. Because of this, when we ask people whether they would accept an operation with a 90% chance of survival, they are more likely to accept. The second phrasing emphasizes the bad side, the mortality rate; people become focused on the negative and are thus more likely to reject the operation. In short, the way we ask a question is very important and can alter the response we get. Let’s take another example. Kris loved going to parties, but he was a bit skeptical before going anywhere because he wanted to “get the most out of it.” His friends found it difficult to coax him into going with them until, eventually, the solution occurred to them: they simply asked him to come to the club and talked about how much fun they had had the last time they went out, choosing not to mention the bad things that had happened.
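The two medical questions are arithmetically identical, and a few lines of Python make both the identity and one plausible source of the different reactions explicit. The twofold loss weight is an illustrative assumption in the spirit of “losses loom larger than gains”; it is not a figure from the book.

```python
# Both questions describe the same procedure; only the highlighted
# outcome differs.
p_survival = 0.90
p_death = 0.10  # equals 1 - p_survival: exactly the same information

# Toy model of the different reactions: a highlighted loss "weighs"
# roughly twice as much as a highlighted gain (illustrative assumption).
LOSS_WEIGHT = 2.0

def felt_value(p_highlighted, is_loss):
    # Subjective pull of whichever outcome the frame puts in focus.
    return -LOSS_WEIGHT * p_highlighted if is_loss else p_highlighted

print(felt_value(p_survival, is_loss=False))  # 0.9  -- survival frame
print(felt_value(p_death, is_loss=True))      # -0.2 -- death frame
```

Same facts, opposite emotional sign: that is all a frame needs in order to steer a decision.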

Let’s take another example, probably more important for the real world than the last one. The way news is conveyed to us is crucial for the effect it causes. Say a rescue mission in an occupied town was undertaken. It went well, and most of the townspeople were rescued, although some were killed by terrorists. Now, there are at least two ways of spreading the message about this event. If we want to praise the rescue team, we could proclaim, “The town was liberated, and more than 10,000 lives were saved!” On the other hand, if we want to downgrade the importance of the mission, we could say, without uttering a single lie, something like this:

“Do we save lives through the deaths of innocent people? More than 50 civilians killed in a mission of doubtful success.” Nothing was changed or misrepresented: when there are civilian casualties, a mission is never 100% successful. It is the selection of facts, and slight exaggeration, that plays the most important role here. So the next time you watch the news, keep these tips in mind. Facts can be wrapped in very different forms, and they are purposefully handpicked to convey exactly the message a news agency wants. In doing this, news and marketing agencies skirt moral and legal norms in a very deceitful way: on the one hand, they don’t lie, they simply spread (purposefully selected) facts; on the other, they achieve exactly the effect they want.

Optimism and overconfidence are closely linked with the sunk-cost fallacy. To cut a long story short, when planning an entrepreneurial endeavor, people tend to take an unrepresentative, narrow set of factors into consideration and underestimate the importance of other, less pleasant ones. Furthermore, random events are almost always omitted in the planning phase, which only exacerbates an already bad situation. More specifically, overconfidence can lead straight to the sunk-cost fallacy. Let’s use the Concorde example once again. One of the defining features of overconfidence is overestimating the likelihood of beneficial events and underestimating the likelihood of bad ones. It is quite possible that the Concorde’s designers made this mistake before making the sunk-cost mistake.

For example, it is possible that the Concorde’s designers were overwhelmed by the possibilities of their new plane: it was exceptionally fast and luxurious, it could reach twice the speed of sound, and it could get you almost anywhere on earth in a matter of hours. At the same time, the likelihood of bad outcomes was underestimated: operating costs and the actual demand for tickets were both neglected, which ultimately led to utter failure. And then, instead of admitting their overconfidence, the backers plunged into an even worse mistake and continued to invest in an asset that had already failed.

Affective forecasting is a pretty complex phenomenon. Simply put, it happens when we try to predict how we will feel in the future. There are numerous ways we err in these predictions, and here we will mention some of them. Impact bias is best seen in the planning of vacations. We are all euphoric before going on our well-deserved break from work. Partly because we have an idealized picture of what a good vacation should look like, and partly because we are happy not to have to work for quite some time, we overestimate the happiness we are going to feel.

This is one of the main causes of disappointment and unfulfilled expectations. It is best seen in the so-called “Paris syndrome,” or “Pari shōkōgun,” most prominent among Japanese tourists. They hold such idealized expectations of their vacation in France that those expectations cannot possibly be fulfilled. As a result, some Japanese tourists experience symptoms such as depression, dizziness, tachycardia, sweating, and even vomiting. They have been working hard and putting in long hours expecting a perfect vacation in “The City of Light,” only to be let down by dirty streets, long queues, (sometimes) bad weather, astronomical prices, and so on.

Immune neglect is the other side of impact bias. It concerns the psychological immune system, the set of coping mechanisms that softens our emotional responses to bad events. The term “neglect” is used because people are usually unaware of this system, and so they exaggerate their predicted future emotional responses. Interestingly, it has been shown that people who “catastrophize” and dwell on the bad aspects of an upcoming event show a reduced stress response when the real event arrives: even thinking about the possibility of a bad event alerts the body, and physiological coping strategies begin to act. As a result, people are better prepared to face the real event when it happens. So a pessimistic judgment, even a wrong one, alerts our psychological immune system and sometimes saves us a lot of trouble. We’ll now mention a quite frequent situation, just to make this complex process easier to understand.

We all know someone who is utterly afraid of dentists. Even the characteristic smell of a dentist’s office sparks intense fear in such individuals. For them, even thinking about going to the dentist feels like a fate worse than death. They ponder the worst possibilities: “What if something goes wrong? What if I get an infection? What if the dentist makes a mistake?” The list goes on and on. However, if these individuals summon the courage to go and sort their dental problem out, they may realize, during the procedure, that their estimations were overblown and that it isn’t so bad after all. Their expectations prepared them for the worst: in other words, their psychological immune system acted preemptively and readied their bodies for the most stressful event possible.

Focalism is another way people err when predicting future events. Unlike the aforementioned immune neglect, however, it usually isn’t beneficial.

Focalism occurs when a person focuses on one specific detail of an event so much that it becomes the most prominent one, at the expense of all other aspects. Consider people deciding whether to move to California or stay in the Midwest: most focus on the difference in weather and stick solely to that factor. Because this one detail is the center of attention, other, equally important factors are neglected, such as housing prices or overall quality of life.


There you have it: the most important ideas from Kahneman’s book Thinking, Fast and Slow, extracted and illustrated with examples. Here are some of the most important things you can take away from this summary:

  1. Thinking is not a unified process. There are at least two types of thinking, and they differ in many respects.
  2. The slow type of thinking is highly rational and calculative. It considers all possibilities and usually isn’t influenced by emotions. This is the type of thinking we employ when we try to solve a complex mathematical problem, or when we contemplate our future life plans.
  3. Fast thinking is very good in most situations because it acts according to presumptions that are at least partially based on real experience. These presumptions aren’t always true, but they work in most circumstances, which is why they are so useful. Fast thinking is automatic and subconscious, and it is very hard to become aware of these fast processes.
  4. Slow thinking and fast thinking aren’t opposed to each other. They quite often act together.
  5. Some marketing and news agencies may purposefully try to influence your fast thinking processes precisely because those processes are hidden from your consciousness. However, you can detect these attempts relatively easily if you keep the tips mentioned in this text in mind.
  6. Whenever you read something, especially news articles or something like that, be aware of the so-called framing. The form of the question is sometimes more important than the content.

Thank you for your valuable time.
