Thinking, Fast and Slow (Part 1: a better understanding of how decisions are made and why certain judgment errors are so common)


In this two-part blog we will look at 'Thinking, Fast and Slow' by Nobel laureate Daniel Kahneman, a book that has made a significant contribution to our understanding of the human mind: how decisions are made, why certain judgment errors are so common, and how we can improve ourselves.


Of two minds: how our behavior is determined by two different systems – one automatic and the other deliberate.


There is a compelling drama going on in our minds, a filmlike plot between two main characters with twists, dramas and tensions. These two characters are the impulsive, automatic, intuitive System 1, and the thoughtful, deliberate, calculating System 2. As they play off against each other, their interactions determine how we think, make judgments and decisions, and act.

System 1 is the part of our brain that operates intuitively and suddenly, often without our conscious control. You can experience this system at work when you hear a very loud and unexpected sound. What do you do? You probably immediately and automatically shift your attention toward the sound. That’s System 1.

This system is a legacy of our evolutionary past: there are inherent survival advantages in being able to make such rapid actions and judgments.

System 2 is what we think of when we visualize the part of the brain responsible for our individual decision-making, reasoning and beliefs. It deals with conscious activities of the mind, such as self-control, choices and more deliberate focus of attention.

For instance, imagine you’re looking for a woman in a crowd. Your mind deliberately focuses on the task: it recalls characteristics of the person and anything that would help locate her. This focus helps eliminate potential distractions, and you barely notice other people in the crowd. If you maintain this focused attention, you might spot her within a matter of minutes, whereas if you’re distracted and lose focus, you’ll have trouble finding her.

As we'll see in the following part, the relationship between these two systems determines how we behave.
 

The lazy mind: how laziness can lead to errors and affect our intelligence

To see how the two systems work, try solving this famous bat-and-ball problem:

A bat and ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?

The price that most likely came to your mind, $0.10, is a result of the intuitive and automatic System 1, and it’s wrong! Take a second and do the math now. Do you see your mistake? The correct answer is $0.05. What happened was that your impulsive System 1 took control and automatically answered by relying on intuition. But it answered too fast.
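To see why, let the ball cost x: the bat then costs x + $1.00, so x + (x + $1.00) = $1.10, which gives x = $0.05. Here’s a minimal Python check of that arithmetic (my own illustrative sketch, not from the book), working in whole cents to keep the numbers exact:

```python
# Bat-and-ball arithmetic in integer cents.
# Let the ball cost x cents; the bat costs x + 100, and together they cost 110:
#   x + (x + 100) = 110  =>  2x = 10  =>  x = 5 cents
ball = 5                       # $0.05
bat = ball + 100               # one dollar more than the ball
print(bat + ball)              # 110 cents = $1.10 -- checks out

wrong_ball = 10                # System 1's intuitive $0.10 answer
wrong_bat = wrong_ball + 100
print(wrong_bat + wrong_ball)  # 120 cents = $1.20 -- fails the check
```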

Usually, when faced with a situation it can’t comprehend, System 1 calls on System 2 to work out the problem, but in the bat-and-ball problem, System 1 is tricked. It perceives the problem as simpler than it is, and incorrectly assumes it can handle it on its own.

The issue the bat-and-ball problem exposes is our innate mental laziness. When we use our brain, we tend to use the minimum amount of energy possible for each task. This is known as the law of least effort. Because checking the answer with System 2 would use more energy, our mind won’t do it when it thinks it can just get by with System 1.

This laziness is unfortunate, because using System 2 is an important aspect of our intelligence. Research shows that practicing System 2 tasks, such as focus and self-control, leads to higher intelligence scores. The bat-and-ball problem illustrates this: our minds could have checked the answer by using System 2 and thereby avoided this common error. By being lazy and avoiding System 2, our mind limits the strength of our intelligence.


Autopilot: why we are not always in conscious control of our thoughts and actions

What do you think when you see the word fragment “SO_P”? Probably nothing. What if you first consider the word “EAT”? Now, when you look again at the word “SO_P,” you would probably complete it as “SOUP.” This process is known as priming. We’re primed when exposure to a word, concept or event causes us to summon related words and concepts. If you had seen the word “SHOWER” instead of “EAT” above, you probably would’ve completed the letters as “SOAP.”

Such priming not only affects the way we think but also the way we act. Just as the mind is affected by hearing certain words and concepts, the body can be affected as well. A great example of this can be found in a study in which participants primed with words associated with being elderly, such as “Florida” and “wrinkle,” responded by walking at a slower pace than usual. Incredibly, the priming of actions and thoughts is completely unconscious; we do it without realizing it. What priming shows, therefore, is that despite what many argue, we are not always in conscious control of our actions, judgments and choices. We are instead constantly primed by certain social and cultural conditions. For example, research by Kathleen Vohs suggests that the concept of money primes individualistic actions.

People primed with the idea of money - for example, through being exposed to images of money - act more independently and are less willing to be involved with, depend on or accept demands from others. One implication of Vohs’s research is that living in a society filled with triggers that prime money could nudge our behavior away from altruism. Priming, just like other societal elements, can influence an individual's thoughts and therefore choices, judgment and behavior – and these reflect back into the culture and heavily affect the kind of society we all live in.  



Snap judgments: how the mind makes quick choices, even when it lacks enough information to make a rational decision


Imagine you meet someone named Ben at a party, and you find him easy to talk to. Later, someone asks if you know anybody who might want to contribute to their charity. You think of Ben, even though the only thing you know about him is that he is easy to talk to. In other words, you liked one aspect of Ben’s character, and so you assumed you would like everything else about him. We often approve or disapprove of a person even when we know little about them.

Our mind’s tendency to oversimplify things without sufficient information often leads to judgment errors. This is called exaggerated emotional coherence, also known as the halo effect: positive feelings about Ben’s approachability cause you to place a halo on Ben, even though you know very little about him. But this is not the only way our minds take shortcuts when making judgments. There is also confirmation bias, which is the tendency for people to agree with information that supports their previously held beliefs, as well as to accept whatever information is suggested to them.

This can be shown if we ask the question, “Is James friendly?” Studies have shown that, faced with this question but no other information, we’re very likely to consider James friendly – because the mind automatically confirms the suggested idea.

The halo effect and confirmation bias both occur because our minds are eager to make quick judgments. But this often leads to mistakes, because we don’t always have enough data to make an accurate call. Our minds rely on false suggestions and oversimplifications to fill in the gaps in the data, leading us to potentially wrong conclusions.

Like priming, these cognitive phenomena happen without our conscious awareness and affect our choices, judgments and actions.
 

Heuristics: how the mind uses shortcuts to make quick decisions

Often we find ourselves in situations where we need to make a quick judgment. To help us do this, our minds have developed little shortcuts that let us immediately make sense of our surroundings. These are called heuristics.

Most of the time, these processes are very helpful, but the trouble is that our minds tend to overuse them. Applying them in situations for which they aren’t suited can lead us to make mistakes. To get a better understanding of what heuristics are and what mistakes they can lead to, we can examine two of their many types: the substitution heuristic and the availability heuristic.

The substitution heuristic is where we answer an easier question than the one that was actually posed. Take this question, for example: “That woman is a candidate for sheriff. How successful will she be in office?” We automatically substitute the question we’re supposed to answer with an easier one, like, “Does this woman look like someone who will make a good sheriff?”

This heuristic means that instead of researching the candidate’s background and policies, we merely ask ourselves the far easier question of whether this woman matches our mental image of a good sheriff. Unfortunately, if the woman does not fit our image of a sheriff, we could reject her – even if she has years of crime-fighting experience that make her the ideal candidate.

Next, there is the availability heuristic, which is where you overestimate the probability of something you hear often or find easy to remember. For example, strokes cause many more deaths than accidents do, but one study found that 80 percent of respondents considered an accidental death a more likely fate. This is because we hear of accidental deaths more in the media, and because they make a stronger impression on us; we remember horrific accidental deaths more readily than deaths from strokes, and so we may react inappropriately to these dangers.
 

No head for numbers: why we struggle to understand statistics and make avoidable mistakes because of it

How can you make predictions about whether certain things will happen?
One effective way is to keep the base rate in mind. This refers to the underlying statistical rate on which other estimates depend. For example, imagine a large taxi company whose fleet is 20 percent yellow cabs and 80 percent red cabs. That means the base rate for yellow cabs is 20 percent and the base rate for red cabs is 80 percent. If you order a cab and want to guess its color, remember the base rates and you’ll make a fairly accurate prediction: guessing red means being right about 80 percent of the time.
We should therefore always remember the base rate when predicting an event, but unfortunately this doesn’t happen. In fact, base-rate neglect is extremely common.
One of the reasons we find ourselves ignoring the base rate is that we focus on what we expect rather than what is most likely. For example, imagine those cabs again: If you were to see five red cabs pass by, you’d probably start to feel it’s quite likely that the next one will be yellow for a change. But no matter how many cabs of either color go by, the probability that the next cab will be red will still be around 80 percent – and if we remember the base rate we should realize this. But instead we tend to focus on what we expect to see, a yellow cab, and so we will likely be wrong.
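A quick simulation makes this concrete. In the sketch below (my own illustration; only the 80/20 split comes from the example above), cabs arrive independently, and we record the color of the cab that follows a run of at least five red ones:

```python
import random

random.seed(42)
P_RED = 0.8                    # base rate for red cabs, from the example above

def draw_cab():
    return "red" if random.random() < P_RED else "yellow"

streak = 0                     # current run of consecutive red cabs
after_streak = []              # colors seen right after five or more red cabs
for _ in range(200_000):
    cab = draw_cab()
    if streak >= 5:            # at least five red cabs just went by
        after_streak.append(cab)
    streak = streak + 1 if cab == "red" else 0

red_share = after_streak.count("red") / len(after_streak)
print(f"share of red cabs right after a five-red streak: {red_share:.2f}")  # ~ 0.80
```

However long the red streak, the next cab is still red about 80 percent of the time; the streak carries no information, because each cab is an independent draw from the same base rates.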
Base-rate neglect is a common mistake connected with the wider problem of working with statistics. We also struggle to remember that everything regresses to the mean: every process has an average state, and deviations from that average eventually drift back toward it.

For example, if a football striker who averages five goals per month scores ten goals in September, her coach will be ecstatic; but if she then goes on to score around five goals per month for the rest of the year, her coach will probably criticize her for not continuing her “hot streak.” The striker doesn’t deserve this criticism, though; she is simply regressing to the mean!
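The same point can be simulated. In the sketch below (a hypothetical model; only the five-goals-per-month average comes from the example above), each month’s tally is an independent Poisson draw with mean 5, and we look at what typically follows an unusually good month:

```python
import math
import random

random.seed(0)
MEAN_GOALS = 5.0               # the striker's long-run average, per the example

def goals_in_a_month(mean=MEAN_GOALS):
    # Poisson sample via Knuth's multiplication method
    threshold, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

october_after_hot_september = []
for _ in range(100_000):
    september = goals_in_a_month()
    october = goals_in_a_month()
    if september >= 10:        # a "hot" month: double the average
        october_after_hot_september.append(october)

avg = sum(october_after_hot_september) / len(october_after_hot_september)
print(f"average goals in the month after a 10+ goal month: {avg:.1f}")  # ~ 5.0
```

Because the months are independent draws, the month after a hot streak averages about five goals no matter how good the streak was: the extra goals were mostly luck, and luck does not carry over.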
