Cognitive Biases to Watch Out For — Thinking, Fast and Slow

Eesha Ulhaq
7 min read · Aug 25, 2019


Nobel Prize winner in economics Daniel Kahneman gives insight into how the human mind works. The book emphasizes how easily we can be influenced without even knowing it. He introduces two characters: System 1, our fast and intuitive thought process, and System 2, our slower, more logical reasoning. Recognizing the reasoning behind our choices helps us avoid our natural human biases, which is crucial in a world where we are constantly making decisions.

Framing

The same statistical outcome, conveyed differently:

  1. A vaccine has a 2% failure rate
  2. A vaccine has a 98% success rate

Though the vaccine is technically equally effective in both statements, the majority of people respond more positively to option 2.

Availability and Probability

You take the train to work every day; recently, a train crashed. Would you be more inclined to take the car to work?

If yes, why? In reality, the chances of the train crashing did not change after the incident. Ironically, probability-wise you would be more likely to die in a car than on a train.

People who watch the news often think the events broadcast are more likely and more dangerous than they are in reality. Many people are actively afraid of terrorism. In 2017, 6,476 people died from terrorism, yet roughly 300,000 people die each year from obesity and 1.25 million from road accidents. But many don’t think twice before sitting in their cars or biting into a Big Mac. Heart disease kills 17.9 million a year, but these things aren’t on a lot of our minds.

The more we’re exposed to an event, the easier it is to come up with examples; and the more examples we can think of, the higher we estimate the frequency of the event.

We tend to overestimate the probability of unlikely events occurring, resulting in miscalculations when we make decisions by overweighting these events.

Anchoring

When a given value shifts our estimate of an unknown value.

  1. How much would you pay for this ring?
  2. Would you pay $7,000 for this ring?

Those asked question 2, despite being shown the same ring, would pay a higher price than those asked question 1.

This is also why a jacket on sale at $300, down from $1,000, looks more attractive than the same jacket at a regular price of $300.

Loss Aversion

Say you’ve just lost $50 gambling but have a 50% chance of winning it back; you would be more inclined to try again, because you’re already $50 deep. This is how many people get sucked into gambling spirals.

Similarly, people who have made bad decisions in the past that cost them time or money are less likely to take action to get out of those bad decisions, because they’re too invested in them. For example, you buy a pair of jeans that you never wear; they’re ugly and taking up space in your closet. But you don’t want to throw them out because you invested money in them.

This is also why we see people stay in toxic relationships or jobs they don’t like. They fear starting over because it means everything they did in the past was for nothing; this fear is usually more destructive than letting go.

Decisions from the past should not influence the utility we get from new decisions; each should be analyzed individually.

Tying into prospect theory: when the chances of winning and losing are similar, we are risk-averse. Whereas if there is a guaranteed loss, we become more inclined to take risks.
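Kahneman and Tversky later made this asymmetry concrete with a value function. As a sketch, here it is with the parameters they estimated in their 1992 paper (α ≈ 0.88, λ ≈ 2.25) — illustrative numbers from the research, not figures quoted in this book summary:

```python
# Prospect theory's value function, using the parameters Tversky &
# Kahneman estimated in 1992 (illustrative, not from the book's text).
ALPHA = 0.88   # diminishing sensitivity to both gains and losses
LAMBDA = 2.25  # loss-aversion coefficient: losses loom ~2.25x larger

def subjective_value(x):
    """Perceived value of gaining (x > 0) or losing (x < 0) an amount."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

gain = subjective_value(50)   # ~31.3: the pleasure of winning $50
loss = subjective_value(-50)  # ~-70.4: the pain of losing $50
print(abs(loss) > gain)       # True: the loss hurts more than the gain feels good
```

The same curve explains the risk-seeking flip: once you are sitting on a guaranteed loss, a gamble that might erase it looks subjectively cheap.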

We also see the endowment effect here. We often overestimate the actual value of the things we own simply because they’re in our possession.

“What you see is all there is.”

The human mind loves stories. When we get limited information, our brains try to fill in gaps and jump to conclusions to perfect a story, often disregarding the lack of evidence to support our story.

This can lead to confirmation bias: we look for information that supports our current beliefs and perspectives, while tending to overlook information that contradicts them.

For example, let’s say we don’t believe in climate change. We’re told that the Amazon rainforest is on fire due to increases in temperature. We are quick to disregard this fact and look for any evidence that can reinforce our view that climate change does not cause forest fires. We think up any alternative causes for the fire. (*climate change is a serious problem)

Hindsight Bias

After an event has occurred, people recall that they “knew it all along.” But if the event doesn’t play out, they often brush it off. We exaggerate the times things went wrong and never acknowledge all the times they didn’t, because we never see their cause and effect.

We are quick to blame decision-makers when the outcome of an event is negative, questioning how they didn’t see the “obvious” signs that it was going to play out. In foresight, these signs might have been extremely subtle, but in hindsight, they are exaggerated.

Base Rates

We overweight our evidence and underweight the base rate of an event.

We tend to use System 1 in single evaluations but System 2 in comparison evaluations. Comparison evaluations are more accurate and underlie Bayes’ theorem.

The baseline is the average, or the predicted average when nothing about the case is explicitly given. The given information about the case should be assigned weight factors in the calculation. If that’s too much work, aiming for somewhere between the base rate and our intuitive response typically works well.

Bayes’ Theorem
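As a sketch of how the base rate combines with evidence, here is the classic taxicab-style example worked through Bayes’ theorem, P(H|E) = P(E|H)·P(H) / P(E). The specific numbers (15% blue cabs, an 80%-reliable witness) are the standard illustrative ones, not a claim about real data:

```python
# A witness says a cab in an accident was blue. Only 15% of the city's
# cabs are blue (the base rate), and the witness is right 80% of the time.
prior_blue = 0.15   # P(blue): the base rate we tend to underweight
hit_rate = 0.80     # P(witness says "blue" | cab is blue)
false_alarm = 0.20  # P(witness says "blue" | cab is green)

# Bayes' theorem: P(blue | "blue") = P("blue" | blue) * P(blue) / P("blue")
evidence = prior_blue * hit_rate + (1 - prior_blue) * false_alarm
posterior = prior_blue * hit_rate / evidence

print(round(posterior, 2))  # 0.41 -- far below the 0.80 our intuition jumps to
```

Intuition anchors on the witness’s 80% reliability; the base rate drags the real probability down to about 41%.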

Small numbers

Visualizing large statistics is difficult for a lot of us; we react better to smaller data sets, say a study with 100 people. The problem is that these small data sets are often not representative of the whole population.

Though this seems obvious, we are often guilty of jumping to conclusions: we experience something and conclude that it must hold true for the broader population.

Say you hear four people talking badly about a movie, and you conclude that it must be a bad movie. The problem is that you’ve come to a conclusion while disregarding the thousands of other people who watched it. The same can be said for experiences, people, places, ideas, etc.

Small samples don’t account for the random chance involved in sampling.
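A quick simulation (a hypothetical fair-coin setup, not from the book) shows the law of small numbers at work: small samples drift much further from the true rate purely by chance:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def heads_rate(n):
    """Fraction of heads in n fair coin flips."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# Run many small "studies" (10 flips each) and many large ones (1,000 flips).
small_studies = [heads_rate(10) for _ in range(2000)]
large_studies = [heads_rate(1000) for _ in range(2000)]

def avg_error(rates):
    """Average distance of each study's result from the true 50% rate."""
    return sum(abs(r - 0.5) for r in rates) / len(rates)

print(avg_error(small_studies) > avg_error(large_studies))  # True
```

A 10-flip study routinely reports 70% or 30% heads; judging a coin — or a movie — from a sample that small mistakes noise for signal.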

Humans prefer individual stories or experiences over facts or statistics because we can relate to and visualize them better. Marketers and influencers have long used this tactic: you respond more strongly to a dying person in an anti-smoking ad than to a big number.

Correlation ≠ Causation

Because we love a coherent story and anything that supports our beliefs, we look for correlations even when they aren’t tied to causes.

For example, many old people move to Florida, and many old people get Alzheimer’s, therefore, moving to Florida causes Alzheimer’s.

This example is obviously silly, but the error isn’t always so easy to spot.

Substitution

We respond to hard questions by answering easier ones:

What do I think about this? → How do I feel about this?

This is System 1 at work. It loves making automatic connections and coming to an answer immediately.

“Who is the best candidate?” this is a complicated question, and numerous factors must be weighed in. However, we often substitute this for “Who is the most likeable?”

Substituting isn’t always accurate, because the answer depends on the question you’re asked just before. For example:

If I ask you how you feel about your love life and then ask how happy you are with life overall, the first answer will influence, if not substitute for, the second. If I’m unhappy with my love life, I’ll say I’m unhappy with my life. Whereas if the second question were asked on its own, this factor wouldn’t be weighted as heavily.



Eesha Ulhaq

an archive of blogs from when i was 17 - was very often wrong