Thinking Fast and Slow
The Two Systems
System 1. Intuition
System 1 is the fast, instinctive and unconscious way of thinking. Rooted in our emotions and feelings, it guides our everyday decisions. It is also the system responsible for answering simple sums such as 2 + 2, or completing sentences like "bread and… butter".
System 2. Rationality
System 2 is the slower, deliberate thinking system. It handles logical thought and complex problem solving: answering a difficult multiplication such as 27 × 42, or working through a problem with a high level of concentration. This system requires more effort, but it is also more reliable than System 1.
System blind spots
Blind spots can lead to incorrect decisions. They can stem from a cognitive bias, or from an environmental influence that affects our emotional state and thus our decision. In this article you will discover cognitive biases and environmental effects that influence your decisions in daily life.
Cognitive biases
Cognitive biases can arise from low availability of information, incorrect perception, or the mental shortcuts we use to reach a conclusion. The mind can construct a story that is illogical, inaccurate, incomplete or unreliable, but as long as it sees that story as coherent, it accepts it. Daniel Kahneman, in his book Thinking Fast and Slow, calls this "What You See Is All There Is" (WYSIATI). The mind only takes in and evaluates the things it knows, not the things it doesn't know. This implies that decisions are always made with some information, even though that information isn't always correct. Here are five biases that are widespread in daily life.
- The Confirmation bias; interpreting new information as confirmation of our beliefs
- Base-Rate neglect; ignoring evidence / undervaluing statistics
- The Framing effect; seeing identical choices as different choices
- The Rush effect; using the wrong system to answer a question
- Substitution; simplifying the question to create an answer
The Confirmation bias
The Confirmation bias implies that new information is made compatible with the beliefs we already hold about the world. It is selective perception of information: what we seek to see is what we already believe. To counter this bias, ask yourself:
Am I trying to reinterpret things to maintain a previous attitude or belief?
Am I overvaluing evidence because of my own experience?
Am I seeing a pattern where there isn’t one?
The Base-rate neglect
This bias implies that we forget or ignore how frequently an event actually occurs, because we ignore important data (What You See Is All There Is). Imagine Steve. He is described as mild-mannered and detail-oriented. Is Steve a librarian or a farmer? Most people answer librarian, because that answer feels right. But the mistake, as Daniel Kahneman explains, comes from the image we hold in our minds. We fail to take into account that for every librarian there are many more farmers who also fit this description. Based on the base rates alone, the chance that Steve is a farmer is actually higher than the chance that he is a librarian.
This bias is part of how our mind reacts and responds to daily life, and it happens all the time. Consider a girl sitting on the metro, reading the New York Times. Would you bet that she has a PhD, or that she has no college degree? Because she reads the New York Times, stereotyping nudges us to judge that she has a PhD. But this ignores the base rate: far more metro riders have no college degree than have a PhD, so the chance that she has a PhD is much smaller than it feels. It is this invisibility of evidence, combined with the representativeness bias, that pushes us toward wrong decisions and judgement calls. A final example is the insurance-industry statistic that most car accidents occur within a five-mile radius of home; it sounds alarming until you remember the base rate that most driving simply happens within five miles of home.
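The librarian-versus-farmer judgement can be checked with Bayes' rule. This is a minimal sketch with assumed, purely illustrative numbers: a 20-to-1 farmer-to-librarian ratio, and assumed rates for how often the "mild-mannered, detail-oriented" description fits each group.

```python
# Assumed numbers, for illustration only: 20 farmers for every librarian,
# and the description fits 40% of librarians but only 10% of farmers.
librarians, farmers = 1, 20          # assumed population ratio
p_desc_given_librarian = 0.40        # assumed fit rate (librarians)
p_desc_given_farmer = 0.10           # assumed fit rate (farmers)

# Bayes' rule: P(librarian | description)
evidence = librarians * p_desc_given_librarian + farmers * p_desc_given_farmer
p_librarian = librarians * p_desc_given_librarian / evidence

print(f"P(librarian | description) = {p_librarian:.2f}")
```

Even when the description fits librarians four times as well, the base rate dominates: the posterior probability that Steve is a librarian stays well under 50%.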
The Framing effect
Presenting the same information in different ways can evoke different emotions, so your opinion can change depending on how a question is asked or how a statement is framed. The effect is especially strong when negatives are involved. Imagine two surgical methods. Which would you prefer?
One method states that the survival rate after one month is 90%.
The other method states that the mortality rate after one month is 10%. Research shows that we are far more likely to choose the 90% survival method, even though the two statements are identical. We are drawn to the positive frame even though there is no actual difference. When you suspect the framing effect is at work, ask yourself: what if I present this situation the opposite way? How does that change my perception?
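The equivalence of the two frames is one line of arithmetic; a sketch assuming an illustrative cohort of 100 patients:

```python
# "90% survive" and "10% die" count exactly the same patients.
patients = 100  # assumed cohort size, for illustration
survivors_positive_frame = round(0.90 * patients)             # "90% survival rate"
survivors_negative_frame = patients - round(0.10 * patients)  # "10% mortality rate"

print(survivors_positive_frame, survivors_negative_frame)  # 90 90
```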
The Rush effect
System 1 is the "automatic thinker" and System 2 is the "effortful thinker". Most of the time we rely on System 1, because it requires less energy than System 2. System 2 then acts as the Lazy Controller: it monitors and checks suggestions from System 1, but it often places too much faith in intuition and lets it solve the problem. Take this example: "A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?" Most people jump to a conclusion using System 1; in a snap moment the 'good enough' answer is 10 cents. But careful consideration with System 2 shows this is wrong: if the bat costs one dollar and five cents, the ball costs five cents, for a grand total of one dollar and ten cents. The error is letting System 1 answer a question better suited to System 2.
This is the concept of cognitive ease: answering a question as soon as we think we adequately perceive the problem. Put differently, we sometimes perceive problems as simpler than they are, and we 'add up' too fast because we think System 1 can handle them. Errors of judgement and reasoning appear in many subtleties of everyday life, but due to the subconscious nature of System 1, its involvement is rarely noticed. To become more mindful of these effects, take the time to contemplate the validity and source of your decisions. When you answer a question intuitively, try to defend the answer with logical thought; if you can't, it is probably something to question. Double-checking means getting the reasoning system involved in the cognitive process.
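The System 2 check is a single line of algebra: ball + bat = $1.10 and bat = ball + $1.00, so 2 × ball + $1.00 = $1.10. A minimal sketch, working in cents to avoid floating-point noise:

```python
# bat-and-ball: ball + bat = 110¢ and bat = ball + 100¢,
# so 2 * ball + 100 = 110.
total_cents = 110
difference_cents = 100

ball = (total_cents - difference_cents) // 2   # 5¢, not the intuitive 10¢
bat = ball + difference_cents                  # 105¢

assert ball + bat == total_cents
assert bat - ball == difference_cents
print(f"ball = {ball}¢, bat = {bat}¢")
```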
Substitution
Substitution implies that when we are asked a difficult question, we respond with the answer to an easier one. In other words, we substitute the difficult target question with a related question that is much easier to answer based on our feelings. This intuitive opinion rests on a heuristic, meaning a mental shortcut used to make a quick judgement call. We turn to these shortcuts when we need a simple and fast solution.
Effects of the environment
The environment can also affect the subconscious level of the mind a great deal, and this can lead to influenced decisions.
- The Priming effect; leading to an association or situational response
- The Anchoring effect; leading to an influenced response
- The Risk effect; creating risky or safe behaviour
- The Endowment effect; increasing the perceived value of owned goods
The Priming effect
The mind can easily be primed with words and thought patterns, leading to altered decisions.
#1 In 1997, NASA landed a probe on Mars. As a result, worldwide sales of Mars bars reportedly increased dramatically. That is the power of priming.
#2 Take the following word: SO_P. A person who recently heard the word "food" will complete it as SOUP, but a person who recently heard the word "shower" will complete it as SOAP. The mind is not determined exclusively by internal thought patterns, but also by external factors. Words create associative responses at the subconscious level, and those associations influence how we make decisions.
#3
In an experiment about life satisfaction, students were asked how happy they were with their lives overall. They were divided into two groups and asked two simple questions in a different order.
Group A:
How happy are you these days?
How many dates did you have last month?
Group B:
How many dates did you have last month?
How happy are you these days?
The results are striking. In group A, there was no correlation between happiness and the number of dates the students had. In group B, however, participants with many dates reported higher levels of happiness than participants with few. The explanation is straightforward: the emotion aroused by the dating question was still on the participants' minds when the question about general happiness came up. Priming the students with a question about their dating life influenced their reported levels of happiness.
The Anchoring effect
The Anchoring effect states that people over-rely on the first piece of information they hear. The first value considered strongly influences the estimate made afterwards. Put differently, the first number 'anchors' the range of plausible values and pulls the eventual estimate towards it. This means that much of our behaviour isn't driven by the price or offer itself, but by our perception of how good a deal we are getting overall.
The Risk effect
Life is full of decisions where we have to make trade-offs. We try to form a clear, tidy picture of those decisions, but they often contain a mix of risk and return, or cost and benefit. The decision we make is whether to gamble or play it safe, and Systems 1 and 2 are heavily involved in this process. Our perception of the probabilities, gains and losses creates a fourfold pattern: in two cases we seek risk, and in two cases we prefer to avoid it.
Risk averse - High probability + Significant gains
- Win $700 for sure
- 80% chance to win $1,000
Risk seeking - High probability + Significant losses
- Lose $700 for sure
- 60% chance to lose $1,000, with a 40% chance of not having to pay anything
Risk averse - Low probability + Significant losses
- Accept a 1% chance of losing $100,000
- Pay $1,100 for insurance against that 1% chance of losing $100,000
Risk seeking - Low probability + Significant gains
- Do nothing
- Bet $10 for a 0.1% chance to win $9,000
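The fourfold pattern can be held up against plain expected values. A sketch using the numbers above; the typical preference in each quadrant often ignores the comparison:

```python
def expected_value(prob, amount):
    """EV of winning/losing `amount` with probability `prob` (else nothing)."""
    return prob * amount

# (quadrant, sure option, gamble EV)
quadrants = [
    ("high-prob gains", 700, expected_value(0.80, 1000)),       # EV +800, yet most take the sure $700
    ("high-prob losses", -700, expected_value(0.60, -1000)),    # EV -600 (the gamble also has the better EV here)
    ("low-prob losses", -1100, expected_value(0.01, -100000)),  # EV -1000, yet most pay the $1,100 premium
    ("low-prob gains", 0, -10 + expected_value(0.001, 9000)),   # EV -1, yet most buy the $10 ticket
]

for name, sure, ev in quadrants:
    print(f"{name}: sure option {sure:+}, gamble EV {ev:+.0f}")
```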
When weighing such a trade-off, ask yourself:
What are the objective upsides and downsides here?
Am I overweighting the downside, or the fear of loss?
By becoming aware of how you frame opportunities, you can see clearly why you seek risk or avoid it.
The Endowment Effect
The endowment effect states that we naturally assign more value to things simply because we own them. A money example shows how this works when we can either win money or possibly lose it. Imagine these scenarios:
You receive $1,000.
Then you have the choice to receive an additional $500, or to take a 50% gamble to win another $1,000. What would you do? Most people choose the sure extra $500. They play it risk averse and end up with $1,500. Now turn it around. You receive $2,000. Then you have the choice to lose a fixed $500, or to take a 50% chance of losing $1,000. A larger percentage of people opts for the risk-seeking strategy here, hoping to keep the whole $2,000. The strange thing is the discrepancy between the two scenarios, because the sure option leaves us with the same amount in both cases ($1,500). The punch line is that humans pay attention to how the choices are expressed: winning something or losing something. We are risk-averse when outcomes are framed as potential gains, but risk-seeking when outcomes are framed as potential losses; "a bird in the hand is worth two in the bush". This example shows that the endowment effect is powerful, but it depends on which side you stand on: are you trying to win something, or are you determined not to lose something? Essentially, find ways to put yourself on the $2,000 side; imagine feeling grateful for having it, and then imagine losing it. From that side you will be psychologically invested to seek risk and sustain what you have received.
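The arithmetic behind the two scenarios shows they are identical in expectation; only the gain/loss framing differs:

```python
# Scenario A: start with $1,000 and frame the choice as a gain.
start_a = 1_000
sure_a = start_a + 500            # take the sure extra $500  -> $1,500
gamble_a = start_a + 0.5 * 1_000  # EV of the 50% win gamble  -> $1,500

# Scenario B: start with $2,000 and frame the choice as a loss.
start_b = 2_000
sure_b = start_b - 500            # accept the sure $500 loss -> $1,500
gamble_b = start_b - 0.5 * 1_000  # EV of the 50% loss gamble -> $1,500

print(sure_a, gamble_a, sure_b, gamble_b)
```

Every option lands at $1,500 in expectation, yet people tend to pick the sure thing in scenario A and the gamble in scenario B.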
Is the mind not to be trusted?
Can the mind (specifically System 1) be trusted for its instincts? The answer is yes, but it requires awareness and vigilance. With awareness of the soundness of our decisions, those decisions become smarter and more grounded in objective reality. Ask yourself:
Am I examining this situation rationally, or using intuition?
Am I being critical of myself, and does this cloud my judgement?
Am I trying to shape this into a story?
It is time to become aware of the blind spots in your own decisions. Slow down and ask yourself whether you can defend your opinion with logical reasoning. Build the ability to question your own thought and belief patterns. Because better decisions lead to better actions, and a better life.