Thinking, Fast and Slow (Part 2: how decisions are made, and why certain judgment errors are so common)


In this two-part blog we look at the book 'Thinking, Fast and Slow' by Nobel Prize winner Daniel Kahneman, which has made a significant contribution to a new understanding of the human mind. Thanks to it, we now have a better understanding of how decisions are made, why certain judgment errors are so common, and how we can improve ourselves.

Past imperfect: why we remember events from hindsight rather than from experience
Our minds don’t remember experiences in a straightforward way. We have two different apparatuses, called memory selves, both of which remember situations differently.
First, there is the experiencing self, which records how we feel in the present moment. It asks the question: “How does it feel now?”
Then there is the remembering self, which records how the entire event unfolded after the fact. It asks, “How was it on the whole?”
The experiencing self gives a more accurate account of what occurred, because it records our feelings as they happen. But the remembering self, which is less accurate because it registers memories only after the situation is finished, dominates our memory.
There are two reasons why the remembering self dominates the experiencing self. The first of these is called duration neglect, where we ignore the total duration of the event in favor of a particular memory from it. Second is the peak-end rule, where we overemphasize what occurs at the end of an event.
For an example of this dominance of the remembering self, take this experiment, which measured people’s memories of a painful colonoscopy. Before the colonoscopy, the people were put into two groups: the patients in one group were given long, rather drawn-out colonoscopies, while those in the other group were given much shorter procedures, but where the level of pain increased towards the end.
You’d think the most unhappy patients would be those who endured the longer process, as their pain was endured for longer. This was certainly what they felt at the time. During the process, when each patient was asked about the pain, their experiencing self gave an accurate answer: those who had the longer procedures felt worse. However, after the experience, when the remembering self took over, those who went through the shorter process with the more painful ending felt the worst. This survey offers us a clear example of duration neglect, the peak-end rule, and our faulty memories.
 
Mind over matter: how adjusting the focus of our minds can dramatically affect our thoughts and behaviours
Our minds use different amounts of energy depending on the task. When there’s no need to mobilize attention and little energy is needed, we are in a state of cognitive ease. Yet, when our minds must mobilize attention, they use more energy and enter a state of cognitive strain.
These changes in the brain’s energy levels have dramatic effects on how we behave.
In a state of cognitive ease, the intuitive System 1 is in charge of our minds, and the logical and more energy-demanding System 2 is weakened. This means we are more intuitive, creative and happier, yet we’re also more likely to make mistakes.
In a state of cognitive strain, our awareness is more heightened, and so System 2 is put in charge. System 2 is more ready to double-check our judgments than System 1, so although we are far less creative, we will make fewer mistakes.
You can consciously influence the amount of energy the mind uses to get in the right frame of mind for certain tasks. If you want a message to be persuasive, for example, try promoting cognitive ease.
One way to do this is to expose ourselves to repetitive information. If information is repeated to us, or made more memorable, it becomes more persuasive. This is because our minds have evolved to react positively when repeatedly exposed to the same clear messages. When we see something familiar, we enter a state of cognitive ease.
Cognitive strain, on the other hand, helps us succeed at things like statistical problems.
We can get into this state by exposing ourselves to information that is presented to us in a confusing way, for example, via hard-to-read type. Our minds perk up and increase their energy levels in an effort to comprehend the problem, and therefore we are less likely to simply give up.
 
Taking chances: the way probabilities are presented to us affects our judgment of risk
The way we judge ideas and approach problems is heavily determined by the way they are expressed to us. Slight changes to the details or focus of a statement or question can dramatically alter the way we address it.
A great example of this can be found in how we assess risk.
You may think that once we can determine the probability of a risk occurring, everyone will approach it in the same way. Yet, this isn't the case. Even for carefully calculated probabilities, just changing the way the figure is expressed can change how we approach it.
For example, people will consider a rare event as more likely to occur if it’s expressed in terms of relative frequency rather than as a statistical probability.
In what’s known as the Mr. Jones experiment, two groups of psychiatric professionals were asked if it was safe to discharge Mr. Jones from the psychiatric hospital. The first group were told that patients like Mr. Jones had a “10 percent probability of committing an act of violence,” and the second group were told that “of every 100 patients similar to Mr. Jones, 10 are estimated to commit an act of violence.” Of the two groups, almost twice as many respondents in the second group denied his discharge.
Another way our attention is distracted from what is statistically relevant is called denominator neglect. This occurs when we ignore plain statistics in favor of vivid mental images that influence our decisions.
Take these two statements: “This drug protects children from disease X but has a 0.001 percent chance of permanent disfigurement” versus “One of 100,000 children who take this drug will be permanently disfigured.” Even though the two statements are statistically equivalent, the latter brings to mind a disfigured child and is far more influential, which is why it would make us less likely to administer the drug.
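The equivalence of the two framings above is simple arithmetic. The short Python sketch below (Python is my choice here, since the post contains no code) checks it exactly using rational numbers, so floating-point rounding can't muddy the comparison:

```python
from fractions import Fraction

# "0.001 percent chance of permanent disfigurement":
# 0.001% means 0.001 per 100, i.e. (1/1000) / 100.
risk_as_percentage = Fraction(1, 1000) / 100

# "One of 100,000 children who take this drug will be permanently disfigured."
risk_as_frequency = Fraction(1, 100_000)

# Both framings describe the identical probability of 1 in 100,000 --
# yet the frequency version conjures a vivid image of one real child.
print(risk_as_percentage == risk_as_frequency)  # True
```

The same check applies to the Mr. Jones experiment: a "10 percent probability" and "10 of every 100 patients" are the same number, `Fraction(10, 100) == Fraction(1, 10)`, differently dressed.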
 
Gut feeling: why we are often swayed by emotional factors rather than purely rational considerations when making decisions
If utility theory, the classical view that we weigh options purely by their expected outcomes, doesn’t describe how we actually choose, then what does?
One alternative is prospect theory, developed by Kahneman together with Amos Tversky.
Prospect theory challenges utility theory by showing that when we make choices, we don’t always act in the most rational way.
Imagine these two scenarios for example: In the first scenario, you’re given $1,000 and then must choose between receiving a definite $500 or taking a 50 percent chance to win another $1,000. In the second scenario, you’re given $2,000 and must then choose between a sure loss of $500 or taking a 50 percent chance on losing $1,000.
If we made purely rational choices, we would choose the same way in both cases. But we don’t: in the first scenario, most people take the guaranteed $500, while in the second, most people gamble.
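The claim that a rational agent would be indifferent can be checked directly. The sketch below works out the expected final wealth for all four options, using the dollar figures from the scenarios above:

```python
# Expected final wealth in the two prospect-theory scenarios.

# Scenario 1: start with $1,000.
sure_1 = 1000 + 500                       # take the sure $500        -> $1,500
gamble_1 = 1000 + 0.5 * 1000 + 0.5 * 0    # 50% chance of winning $1,000 -> expected $1,500

# Scenario 2: start with $2,000.
sure_2 = 2000 - 500                       # accept the sure $500 loss -> $1,500
gamble_2 = 2000 - (0.5 * 1000 + 0.5 * 0)  # 50% chance of losing $1,000  -> expected $1,500

# All four options have the same expected outcome of $1,500, so a purely
# rational, expected-value-maximizing agent would be indifferent between them.
print(sure_1, gamble_1, sure_2, gamble_2)  # 1500 1500.0 1500 1500.0
```

That people nonetheless prefer the sure thing in scenario 1 and the gamble in scenario 2 is exactly the irrationality prospect theory sets out to explain.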
Prospect theory helps to explain why this is the case. It highlights at least two reasons why we don’t always act rationally. Both of them feature our loss aversion — the fact that we fear losses more than we value gains.
The first reason is that we value things based on reference points. Starting with $1,000 or $2,000 in the two scenarios changes whether we’re willing to gamble, because the starting point affects how we value our position. The reference point in the first scenario is $1,000 and $2,000 in the second, which means ending up at $1,500 feels like a win in the first, but a distasteful loss in the second. Even though our reasoning here is clearly irrational, we understand value as much by our starting point as by the actual objective value at the time.
Second, we’re influenced by the diminishing sensitivity principle: the value we perceive may be different from its actual worth. For instance, going from $1,000 to $900 doesn’t feel as bad as going from $200 to $100, despite the monetary value of both losses being equal. Similarly in our example, the perceived value lost when going from $1,500 to $1,000 is greater than when going from $2,000 to $1,500.
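Diminishing sensitivity can be illustrated with any concave value function. The logarithm below is a stand-in of my own choosing for illustration only; it is not Kahneman and Tversky's actual prospect-theory value function, which is defined over gains and losses relative to a reference point:

```python
import math

def felt_loss(start, end):
    """Perceived magnitude of dropping from `start` to `end` dollars,
    under a simple concave (logarithmic) value function.

    This is only an illustration of diminishing sensitivity, not the
    real prospect-theory value function."""
    return math.log(start) - math.log(end)

# The same $100 loss feels far worse from a smaller starting amount:
print(round(felt_loss(1000, 900), 3))  # 0.105
print(round(felt_loss(200, 100), 3))   # 0.693
```

Under any concave curve, equal dollar losses shrink in felt magnitude as the amounts involved grow, which is the pattern the paragraph above describes.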
 
False images: why the mind builds complete pictures to explain the world, but they lead to overconfidence and mistakes
In order to understand situations, our minds naturally use cognitive coherence; we construct complete mental pictures to explain ideas and concepts. For example, we have many images in our brain for the weather. We have an image for, say, summer weather, which might be a picture of a bright, hot sun bathing us in heat.
As well as helping us to understand things, we also rely on these images when making a decision.
When we make decisions, we refer to these pictures and build our assumptions and conclusions based on them. For example, if we want to know what clothes to wear in summer, we base our decisions on our image of that season’s weather.
The problem is that we place too much confidence in these images. Even when available statistics and data disagree with our mental pictures, we still let the images guide us. In summer, the weather forecaster might predict relatively cool weather, yet you might still go out in shorts and a T-shirt, as that’s what your mental image of summer tells you to wear. You may then end up shivering outside!
We are, in short, massively overconfident of our often faulty mental images. But there are ways to overcome this overconfidence and start making better predictions.
One way to avoid mistakes is to utilize reference class forecasting. Instead of making judgments based on your rather general mental images, use specific historical examples to make a more accurate forecast. For example, think of a previous occasion when you went out on a cold summer day. What did you wear then?
In addition, you can devise a long-term risk policy that plans specific measures in the case of both success and failure in forecasting. Through preparation and protection, you can rely on evidence instead of general mental pictures and make more accurate forecasts. In the case of our weather example, this could mean bringing along a sweater just to be safe.
