Thinking, Fast and Slow
In his book Thinking, Fast and Slow, Daniel Kahneman describes two separate systems we use to make decisions – fast and slow thinking. But our minds also harbour cognitive errors that lead to irrationality and inaccurate judgements. These errors follow systematic patterns, which means that if we learn to recognize them we can start making smarter decisions.
The Two Systems
System 1. Intuition
System 1 is the fast, instinctive and unconscious way of thinking. Based on our emotions and feelings, it guides our everyday decisions. It is also the system responsible for answering simple sums such as 2 + 2, or completing phrases such as "bread and... butter".
System 2. Rationality
System 2 is the slower, deliberate thinking system. It is based on logical thought and complex problem solving: answering difficult calculations such as 27 × 42, or thinking through a problem extensively with a high level of concentration. This system requires more effort, but it is also more reliable than System 1.
System blind spots
Blind spots can lead to incorrect decisions. This can be due to a cognitive bias, or to an environmental influence affecting our emotional state and thus our decision.
Cognitive biases can arise from low availability of information, incorrect perception, or when we use mental shortcuts to come to a conclusion. This is because the mind can construct quite an illogical, inaccurate, incomplete or unreliable story – but as long as it sees it as coherent it will accept it.
Kahneman calls this "What You See Is All There Is" (WYSIATI). The mind only takes in and evaluates the things it knows, not the things it doesn't know. This implies that decisions are always made with some type of information, even though that information isn't always correct.
Here are five types of biases that are widely spread in daily life.
- The Confirmation bias; interpreting new information as confirmation of our beliefs
- Base-Rate neglect; ignoring evidence / undervaluing statistics
- The Framing effect; seeing identical choices as different choices
- The Rush effect; using the wrong system to answer a question
- Substitution; simplifying the question to create an answer
The Confirmation bias
The Confirmation bias implies that new information is made compatible with the existing beliefs we hold about the world. It is selective perception of information – we tend to see what we already believe.
The Confirmation bias is deeply rooted in our opinion of the world. Facts are interpreted in a biased way, leading us to uphold only information that aligns with what we already think.
For example, people who believe in ESP (Extra-Sensory Perception) think they can perceive and sense things with the mind. They will keep close track of instances where they thought of their mother and, three seconds later, the phone rang with their mother calling. It is the constant search for validation of what we believe to be true.
To counter this bias, ask yourself;
Am I trying to reinterpret things to maintain a previous attitude or belief?
Am I overvaluing evidence because of my own experience?
Am I seeing a pattern where there isn’t one?
The Base-rate neglect
The base-rate neglect implies that we forget or ignore how frequently an event actually occurs, because we ignore important data (What You See Is All There Is).
Imagine Steve. His description is that he is mild-mannered and detail-oriented. Is Steve a librarian or a farmer? Most people would answer a librarian. This answer feels logical. But the mistake, as Daniel Kahneman explains, comes from the image we hold in our minds.
We fail to take into account that for every librarian there are many more farmers fitting this description. Based on these proportions, the chances of Steve being a farmer are actually higher than of him being a librarian. This cognitive bias that we fail to see through is part of how our mind reacts and responds to daily life. And it happens all the time.
Another example is a girl sitting on the metro reading the New York Times. Would you bet on her having a PhD, or on her not having a college degree? Because she reads the New York Times, stereotyping would probably lead us to judge that she has a PhD.
But this doesn't take the base rate into account: far more people riding the metro have no college degree than hold a PhD. Given this likelihood, the chances of the girl having a PhD become much smaller. It is this invisibility of evidence, combined with the representativeness bias, that pushes us toward wrong decisions and judgement calls.
Another example is the claim by insurance companies that most car accidents occur within a five-mile radius of home.
How is it possible that the majority of accidents occur so quickly, only a few minutes' drive from home? The real question is: how often do you drive further than five miles from home? Probably only when you take the car to work. So the five-mile radius is simply where most of the driving occurs. It is all about taking in all the relevant information before drawing a conclusion.
When in doubt if you are neglecting a base rate, ask yourself; is there any relevant information that I might be ignoring which can influence my conclusion?
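To see how a base rate can overwhelm a description, here is a minimal sketch with made-up numbers – the population sizes and percentages below are illustrative assumptions, not figures from the book:

```python
# Illustrative, assumed numbers -- not from the book.
librarians = 200_000       # assumed number of librarians
farmers = 4_000_000        # assumed number of farmers
p_fit_librarian = 0.40     # assumed share of librarians fitting "mild-mannered"
p_fit_farmer = 0.10        # assumed share of farmers fitting it

# Count the people in each group who fit Steve's description.
fitting_librarians = librarians * p_fit_librarian   # 80,000
fitting_farmers = farmers * p_fit_farmer            # 400,000

# Among everyone who fits the description, what fraction are librarians?
p_librarian = fitting_librarians / (fitting_librarians + fitting_farmers)
print(f"P(librarian | description) = {p_librarian:.2f}")
```

Even if a librarian is four times as likely to fit the description, the far larger number of farmers means that, under these assumptions, only about one in six people fitting it is a librarian.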
The Framing effect
Presenting the same information in different ways can evoke different emotions. So your opinion can change depending on how the question is asked or how the statement is framed.
A prominent example of this effect is food labels such as "20% fat" versus "80% fat-free". We are drawn to the choice that expresses the positive side of things.
This effect also works with major negatives.
Imagine two types of surgery. Which would have your preference?
One method states that the survival rate after one month is 90%.
The other method states that the mortality rate after one month is 10%.
Research points out that we are more likely to choose the 90% survival rate method than the other, even though they are identical. We are somehow attracted to the positive frame even though there is no actual difference.
When you feel the framing effect is present, ask yourself;
What if I present this situation in the opposite way?
How does that change my perception?
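The two surgery descriptions are arithmetically identical – a one-line check using the percentages from the example above:

```python
# "90% survive after one month" and "10% die after one month"
# describe exactly the same outcome distribution.
survival_pct = 90
mortality_pct = 10
assert survival_pct == 100 - mortality_pct  # same facts, opposite frame
print("Identical outcomes, different frames")
```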
The Rush effect
System 1 is the "automatic thinker" and System 2 is the "effortful thinker". Most of the time we rely on System 1, because it requires less energy than System 2.
Here, System 2 acts as the Lazy Controller: it monitors and checks suggestions from System 1, but it often places too much faith in intuition and lets System 1 answer on its own.
Take an example:
“A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?”
Most people jump to conclusions using System 1. In a snap moment the 'good enough' answer is 10 cents. But after careful consideration with System 2, you find that this is actually wrong: if the ball were 10 cents, the bat would cost $1.10 and the total would be $1.20.
If the bat is one dollar and five cents, the ball is five cents, for a grand total of one dollar and ten cents. The error is letting System 1 answer a question that is better suited for System 2.
This is the concept of cognitive ease – answering questions as soon as we think we adequately perceive the problem. Put differently, we sometimes perceive problems as simpler than they are. We 'add up' too fast because we think System 1 can handle the problem.
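A sketch of the System 2 arithmetic behind the bat-and-ball problem, working in cents to keep the numbers exact:

```python
# bat + ball = 110 cents, and bat - ball = 100 cents.
total = 110
difference = 100

# Adding the two equations gives: 2 * ball = total - difference.
ball = (total - difference) // 2
bat = ball + difference
print(f"ball = {ball} cents, bat = {bat} cents")  # ball = 5, bat = 105

# Sanity checks: the correct answer adds up, the intuitive one does not.
assert ball + bat == total               # 5 + 105 = 110
assert 10 + (10 + difference) != total   # the '10 cents' answer gives 120
```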
Errors of judgement and reasoning are present in many subtleties of our everyday life. But due to the subconscious nature of System 1, its involvement is rarely noticed. To become more mindful of these effects, it's a good idea to take time to contemplate the validity and source of your decisions. When you think you are answering a question intuitively, try to defend the answer with logical thought. If you can't, it is probably something to question. Double-checking means getting the reasoning system involved in the cognitive process.
Substitution
Substitution implies that when we are asked a difficult question, we respond with the answer to an easier one.
If we are asked; “What is your opinion about the skills of colleague X?” – the question is roughly translated to “How much do I like this person?“.
In other words, we substitute the difficult target question with a related question that is much easier to answer based on our feelings. This intuitive opinion is based on a heuristic – meaning that you are using a mental shortcut to make a quick judgement call. We turn to these mental shortcuts when we need a simple and fast solution.
Each target question is transformed by basing the answer on feelings and prejudices – in other words, on evidence that you can neither explain nor defend. It does help in getting things done, because it reduces the effort required for many decisions while speeding up the decision-making process. Just be aware that it can also lead to errors and inaccurate judgements, because the evidence isn't in tune with reality. You can always ask yourself; are my feelings contributing to or influencing my evaluation?
Effects of the environment
The environment can also have a great deal of effect on the subconscious level of the mind – and this can lead to an influenced decision.
- The Priming effect; leading to an association or situational response
- The Anchoring effect; leading to an influenced response
- The Risk effect; creating risky or safe behaviour
- The Endowment effect; increasing the perceived value of owned goods
The Priming effect
The mind can be easily primed with words and thought patterns, leading to altered decisions.
In 1997, NASA landed a probe on Mars.
As a result, worldwide sales of Mars bars increased dramatically.
That is the power of priming.
Take the following word: SO_P.
A person who recently heard the word “food” would complete the word as SOUP. But a person who recently heard the word “shower” would complete the word as SOAP.
The mind is not determined exclusively by internal thought patterns, but also by external factors. Words can influence the subconscious and create an associative response – shaping our associations and, ultimately, our decisions.
In an experiment about life satisfaction, students were divided into two groups and asked the same two simple questions in a different order:
Group A:
How happy are you these days?
How many dates did you have last month?
Group B:
How many dates did you have last month?
How happy are you these days?
The results are astounding. In the first group, there was no correlation between happiness and the number of dates students had. In the second group, however, participants with many dates reported higher levels of happiness than participants with few.
The explanation is straightforward – the emotion aroused by the dating question was still on the minds of the participants in group B when the question about general happiness came up. So priming the students with a question about their dating life ended up influencing their perceived levels of happiness.
The Anchoring effect
The Anchoring effect states that people are over-reliant on the first piece of information they hear. The first value that is considered strongly influences the estimate that will be made afterwards. Put differently, the first number ‘anchors’ the possible values and draws the second estimate towards that first given number.
That is why most prices start high: this number becomes anchored as the base for the rest of the negotiation. The high starting price also motivates consumers to buy when there is a big price reduction, for fear of missing out.
Anchoring can also be used as a decoy.
When we compare two bottles of wine, one at $10 and the other at $30, the $30 bottle looks expensive. That's why a third bottle at $50 is added – to make the $30 price seem reasonable.
This means that most behaviour isn't driven by the price or the offer itself, but by our perception of how good a deal we are getting overall.
The Risk effect
Life is full of decisions where we have to make trade-offs. We try to create a clear and tidy image of those decisions, but often they contain a mix of risk and return, or cost and benefit.
The decision we make is whether to gamble or to play it safe. Systems 1 and 2 are heavily involved in this process. Our perception of the probabilities, gains and losses creates a fourfold pattern: in two cases we seek risk, and in two others we prefer to avoid it.
Risk Averse – High probability + Significant gains
- Win $700 for sure
- 80% chance to win $1000
Most people will be more comfortable taking the sure thing: $700 – meaning they are risk averse, even though the gamble has a higher expected value ($800). This way gains are locked in; better to have something than nothing.
Another example here is a plaintiff in a lawsuit. He or she will accept the sure gain of a settlement instead of risking the possibility of getting nothing.
Risk seeking – High probability + Significant losses
- Lose $700 for sure
- 80% chance to lose $1000
This is a gamble between two bad options. In terms of expected value, the rational choice is the sure loss of $700 (the gamble's expected loss is $800). Yet many people become risk seeking here, taking the gamble for the 20% chance of losing nothing at all.
Another example is a plaintiff who is likely to lose a large case. He or she will hold on to glimmers of hope that luck will fall in their favour. Instead of accepting a reasonable settlement, they risk having to pay much more for the small chance of not having to pay anything.
Risk Averse – Low probability + Significant losses
- 1% chance to lose $100,000
- Pay $1,100 for insurance against that loss
In this case, a high percentage of people choose option B out of fear of the large loss, even though the expected loss ($1,000) is smaller than the premium. The insurance makes them sleep better at night: people prefer an unfavourable settlement to risking a whole lot more.
Risk seeking – Low probability + Significant gains
- Do nothing
- Bet $10 for a 0.1% chance to win $9,000
Most people choose option B, hoping for a large gain: the $10 stake is small, even though the chance of winning is smaller still. This implies that we are risk seeking when the possible gains are high but the probability is low. Think of buying a lottery ticket, or of a plaintiff rejecting a small settlement in hopes of winning big.
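The four cases can be checked against their expected values – a short sketch using the numbers above:

```python
def expected_value(p, amount):
    """Probability-weighted outcome of a simple one-outcome gamble."""
    return p * amount

# High probability + gains: sure $700 vs 80% chance of $1,000
print(expected_value(0.80, 1000))     # 800.0 -- worth more than $700, yet we take the sure thing

# High probability + losses: sure -$700 vs 80% chance of -$1,000
print(expected_value(0.80, -1000))    # -800.0 -- worse than -$700, yet we gamble

# Low probability + losses: $1,100 premium vs 1% chance of -$100,000
print(expected_value(0.01, -100000))  # -1000.0 -- the premium costs more, yet we insure

# Low probability + gains: $10 bet on a 0.1% chance of $9,000
print(expected_value(0.001, 9000))    # 9.0 -- less than the $10 stake, yet we bet
```

In all four cases the popular choice goes against the raw expected value – which is exactly the fourfold pattern.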
It becomes clear that decisions are not always driven by pure rationality. System 1 and System 2 both have a say, but psychological pressures and mental biases affect our choices.
In moments of risk, ask yourself;
What are the objective upsides and downsides here?
Am I overweighting the downside, or the fear of loss?
By becoming aware of how you frame opportunities, you can see clearly why you seek risk or avoid it.
The Endowment Effect
The endowment effect states that we naturally assign more value to things just because we own them.
A classic experiment shows how ownership creates a difference in value: the price at which we are willing to sell a mug rises once it is ours. The same applies to trading one product for another. For example, participants who were given a Swiss chocolate bar were not willing to trade it for a coffee mug. Vice versa, participants who received the coffee mug were unwilling to trade it for the chocolate bar.
Another example with money shows how this effect works when we can either win money or possibly lose it.
Imagine these scenarios:
You receive $1,000.
Then you have the choice to receive an additional $500, or take a 50% gamble to win another $1,000.
What would you do? Most people choose the sure $500 extra. They play it risk averse and end up with $1,500.
Now turn it around.
You receive $2,000.
Then you have the choice to lose a fixed $500, or take a 50% chance to lose $1,000. A bigger percentage of people opts for a risk-seeking strategy here, hoping to keep the whole $2,000. The strange thing is the discrepancy between the two scenarios, because the sure option leaves us with the same amount in both cases ($1,500).
The punch line is that we pay attention to how choices are expressed – as winning something or as losing something. We are risk averse when outcomes are framed as potential gains, but risk seeking when outcomes are framed as potential losses;
“A bird in the hand is worth two in the bush “
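The arithmetic behind the two scenarios confirms this – both the sure option and the gamble's expected value come out the same, so only the framing differs:

```python
# Scenario 1: start with $1,000; sure +$500 vs a 50% chance of +$1,000
sure_1 = 1000 + 500
ev_gamble_1 = 1000 + 0.5 * 1000

# Scenario 2: start with $2,000; sure -$500 vs a 50% chance of -$1,000
sure_2 = 2000 - 500
ev_gamble_2 = 2000 - 0.5 * 1000

# Both sure options and both expected values equal $1,500.
assert sure_1 == sure_2 == ev_gamble_1 == ev_gamble_2 == 1500
print("Both scenarios are worth $1,500 -- only the framing differs")
```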
This example shows that the endowment effect is powerful, but you have to take the first step yourself. The effect only works when you are on the right side of it – are you trying to win something, or are you determined not to lose something?
Essentially, find ways to put yourself on the $2,000 side: imagine how grateful you would feel for having it, and then imagine losing it. From this side you will be psychologically invested to seek risk and to hold on to what you have received.
Is the mind not to be trusted?
Can the mind (specifically System 1) be trusted for its instincts? The answer is yes, but it requires awareness and vigilance. With awareness of the soundness of our decisions, those decisions become smarter and more grounded in objective reality;
Am I examining this situation rationally? Or using intuition?
Am I being critical of myself, and does this cloud my judgement?
Am I trying to shape this into a story?
Time to become aware of the blind spots in your own decisions. Slow down and ask yourself whether you can defend your opinion with logical reasoning. Build the habit of questioning your own thought and belief patterns. Because better decisions lead to better actions – and a better life.