Guide to the Most Common Cognitive Biases and Heuristics
Our brains have to process a staggering amount of stimuli and make an endless stream of decisions. So to save time and mental energy, our brains rely on heuristics, or short-cuts. Think of heuristics as guidelines, or rules of thumb: they're good enough most of the time, but they can result in errors.
Cognitive biases are systematic errors in thinking that interfere with how we reason, process information, and perceive reality. Basically, biases steer our thinking away from objective reality and lead us to draw incorrect conclusions.
Biases and heuristics are part of our automatic or intuitive system of thinking, so they occur without our awareness. But because they impact nearly all of our thinking and decision making, familiarity with the most common errors is a great way to become a better critical thinker.
[Before reading about individual biases and heuristics, the following post is strongly recommended: Should you trust your intuition? The elephant and rider inside your mind]
A note on how to use this post: This page is a resource of the most common cognitive biases and heuristics, and is not intended to be read from top to bottom. Feel free to share the graphics to help others learn more about how to be better thinkers.
A very brief background
Cognitive biases and heuristics were first described by Amos Tversky and Daniel Kahneman in the 1970s. Tversky died in 1996; in 2002, Kahneman won the Nobel Prize in Economics for their joint work, which he later summarized in his best-selling (and must-read) book, Thinking, Fast and Slow.
Confirmation Bias
Definition and explanation: Confirmation bias refers to the tendency to search for, interpret, and remember information that confirms our beliefs. In short, we prefer information that tells us we’re right…and we’re more likely to remember the hits and forget the misses.
In a world full of too much information, our brains need to take short-cuts. Unfortunately, some of these short-cuts can lead us astray. In the case of confirmation bias, the short-cut is: Does this piece of information support what I already think is true? If so, we assume there's no need to question it.
Of all the biases, confirmation bias is the most powerful and pervasive, constantly filtering reality without our awareness to support our existing beliefs. It's also self-reinforcing: because confirmation bias makes it seem like our beliefs are supported by evidence, we grow even more confident we're right, and in turn we filter out and ignore even more of the information that would change our minds.
[Learn more: The person who lies to you the most…is you]
A prime example of confirmation bias plays out in our modern media environment, where we're able to select news organizations and even the types of stories that validate our worldview. With the help of algorithms that learn our preferences, we can get trapped in filter bubbles, or personal information ecosystems, where we're served more and more content that reaffirms our existing beliefs and protected from evidence that we're wrong. (We really don't like being wrong.) In essence, we assume our news feed is telling us about reality, when the reality is it's telling us about us.
Confirmation bias is also one of the biggest reasons we fall for “fake news.” Why bother spending time and energy fact checking that viral video or news story or meme when it already fits with what you believe? It feels true, so it must be!
Another example of confirmation bias is the common (but mistaken) belief that the full moon affects behavior. Indeed, nurses and doctors often blame the full moon for an increase in hospital admissions, and police for an increase in crime. (It's fun to note that the words lunacy and lunatic come from the Latin luna, for moon.) To be clear, there is no good evidence that the moon has any of these effects.
So why then does this belief persist? Imagine a teacher who believes in the full moon effect. He notices his students seem to be a little rambunctious and thinks, “It must be a full moon.” If it is a full moon, he confirms – and probably becomes even more confident in – his belief. If it’s not a full moon, he quickly interprets their behavior differently or blames it on something else. And in the future, he’s much more likely to remember the examples that supported his belief and forget the others.
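One way to defeat this selective memory is to count the misses along with the hits. Below is a minimal sketch in Python (with tallies invented purely for illustration) of the fair test: compare the rate of rambunctious behavior on full-moon days against all other days, using all four possible outcomes rather than just the memorable ones.

```python
# Hypothetical classroom tallies, invented for illustration.
# Confirmation bias remembers only the "hits" (full moon AND rowdy);
# a fair test compares rates across all four cells of the table.
tally = {
    ("full moon", "rowdy"): 6,      # the memorable "hits"
    ("full moon", "calm"): 24,      # the forgotten misses
    ("no full moon", "rowdy"): 54,
    ("no full moon", "calm"): 216,
}

def rowdy_rate(moon: str) -> float:
    """Fraction of days with rowdy behavior, given the moon phase."""
    rowdy = tally[(moon, "rowdy")]
    calm = tally[(moon, "calm")]
    return rowdy / (rowdy + calm)

print(f"P(rowdy | full moon)    = {rowdy_rate('full moon'):.0%}")     # 20%
print(f"P(rowdy | no full moon) = {rowdy_rate('no full moon'):.0%}")  # 20%
```

In this made-up data, the six hits feel like compelling evidence in memory, yet once the forgotten misses are counted, rowdy days are exactly as common with or without a full moon.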
How to overcome confirmation bias: Confirmation bias is amongst the most prevalent and influential of all the biases, so it’s important that critical thinkers try their best to reduce its impact.
Here are a few tips:
Learn to recognize when you're prone to confirmation bias, such as when a belief is tied to strong emotions and/or you're confident you're right. In those moments, slow down and don't let your emotions guide your reasoning. And avoid overconfidence! The more certain you are that a belief is true, the less likely you are to question it.
Be open to being wrong! The more tightly we hold our beliefs, the more threatening contradictory evidence feels. But if you can't change your mind with new evidence, you'll never be able to learn. Instead, separate your beliefs from your identity: you aren't wrong, the belief is.
Go one step further and search for evidence that would prove you wrong. In today's information-saturated environment, if you're looking for evidence that you're right, you will find it. So instead, search for disconfirming evidence! If the belief is true, it will withstand scrutiny.
Dunning-Kruger Effect
Definition and explanation: Have you ever noticed that those who know the least are often the most confident? It has a name: the Dunning-Kruger effect.
The Dunning-Kruger effect is a cognitive bias in which people overestimate their knowledge or abilities, resulting in undeserved overconfidence. Essentially, poor performers are unable to recognize their own mistakes and limits, so they assume they’re awesome. The reason is that (paradoxically) the skills and knowledge required to be competent at a task are the same skills needed to evaluate one’s own competence. Basically, ignorant people don’t know enough to recognize how ignorant they are.
But why are the incompetent so overconfident? Contrary to what many people believe, an ignorant mind is not a clean slate. Our brains are constantly trying to make sense of the world, using prior knowledge and experiences. Once we form beliefs and narratives, confirmation bias kicks in, and we unconsciously seek out information that supports our ideas. The result is a mind cluttered with misleading experiences, random facts, and intuitions that feels a lot like knowledge. And this powerful sense of false knowledge is what leads to overconfidence.
You’ve almost certainly witnessed the Dunning-Kruger effect. From dreadful American Idol contestants who can’t fathom why the judges laughed, to ignorant social media commentators LOUDLY proclaiming their opinions, it can be both hilarious and frustrating.
That said, it's important to keep in mind that all of us are prone to this bias. Think for a moment about something you're really good at. It might be fixing cars, breeding Basset Hounds, baking bread, or playing Call of Duty… anything you're an expert in. Now consider what the average person knows (or doesn't) about that subject. It's probably not much, and some of it is probably wrong. They probably don't even realize how much there is to know.
Now consider that you’re that ignorant in basically every other area. I hope you’re humbled by that realization. We are all blind to our ignorance. We are all overconfident idiots.
There are real consequences to overconfidence. Not only do the incompetent overestimate their skills, they’re unable to recognize true expertise. And if you already know everything, why would you learn? Or change your mind?
[Learn more: Overconfident Idiots: Why Incompetence Breeds Certainty]
How to overcome the Dunning-Kruger effect: At the root of the Dunning-Kruger effect is a lack of self-awareness: we are unable to objectively evaluate our own knowledge and competence. Therefore, one key solution is metacognition, or being aware of our own thought processes, with the goal of assessing and improving our understanding and performance.
Also important is intellectual humility, or the recognition that you might be wrong. Be curious about what you don’t know. Ask for feedback from experts, and be open to incorporating their suggestions. Most issues are more complicated than we think, and understanding their complexity and nuance requires deep knowledge and expertise. If the answer seems simple and obvious to you, and yet somehow experts have “missed” it, consider it might be you that’s wrong.
Availability Heuristic
Definition and explanation: The availability heuristic is a mental short-cut in which we estimate how likely or important something is based on how easily we can think of examples. However, because we are more likely to remember events that are recent, vivid, or emotional, we overestimate the likelihood of certain events and may make poor decisions.
Consider the following examples:
You’re at the beach, thinking about going into the water, and images of shark attacks pop into your head. You sit and read a book instead.
You recently saw a plane crash on the news, and you were already scared of flying, so you decide to drive on your next trip.
You just watched a documentary about someone who won big on the slot machines, so you plan a trip to the casino. Someone has to win…it might as well be you!
You’re worried about someone kidnapping your child because you saw news coverage of an attempted abduction. Thankfully, the child wasn’t harmed, but you don’t want to risk it. Today’s world is so much more dangerous than it was when you were young.
In all of these cases, you assumed something was likely because you could easily think of examples. Yet shark attacks are exceedingly rare, flying is orders of magnitude safer than driving, the chances of winning at the slots are minuscule, and there has never been a safer time to be a kid. By confusing ease of recall with truth, your brain misled you, and as a result you made poor decisions.
One of the biggest influences on our perception of risk is news coverage. By definition, the news covers events that are new and noteworthy, and not necessarily things that are common. News reports of murders and horrible crimes (or shark attacks and plane crashes) can result in us thinking these events are more common than they really are.
How to overcome the availability heuristic: The first step in overcoming any heuristic is awareness. Remember, the goal is to determine how likely something is in order to make better decisions. Short-cuts help us think fast, but they aren’t always reliable.
So slow down your thinking and don't assume the first thing that pops into your head is representative of reality. Try to identify the stories your brain is using as evidence, and notice any emotions attached to them. Then, if possible, use statistics instead!
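To see concretely why ease of recall is a biased estimator, here's a minimal Python simulation (all numbers are invented for illustration, not real accident statistics): if vivid events are more likely to stick in memory, then any frequency estimate based on recalled examples will systematically overweight them.

```python
import random

random.seed(0)  # reproducible illustration

# Invented, illustrative numbers -- not real statistics.
TRUE_SHARE = {"car crash": 0.9, "plane crash": 0.1}    # how often each actually occurs
RECALL_PROB = {"car crash": 0.1, "plane crash": 0.8}   # vivid events stick in memory

# Simulate 10,000 events, then "remember" each one with its recall probability.
events = random.choices(list(TRUE_SHARE), weights=list(TRUE_SHARE.values()), k=10_000)
recalled = [e for e in events if random.random() < RECALL_PROB[e]]

for kind in TRUE_SHARE:
    true_pct = events.count(kind) / len(events)
    recalled_pct = recalled.count(kind) / len(recalled)
    print(f"{kind}: true share {true_pct:.0%}, share among recalled examples {recalled_pct:.0%}")
# Under these made-up numbers, plane crashes are ~10% of events
# but roughly half of what "comes to mind."
```

The true mix of events hasn't changed; only the sampling has. Judging frequency by what comes to mind means sampling from memory, and memory's sample is skewed toward the vivid.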
Anchoring Effect
Written by Jon Guy
Other names: Anchoring heuristic
Definition and explanation: The anchoring effect refers to our tendency to "anchor" to the first piece of information we learn about something and to form our beliefs about that thing based on the anchor. Newer information isn't evaluated objectively, but rather through the lens of the anchor. The anchoring effect is an extremely common cognitive bias, and one that can interfere with our ability to make good decisions and objectively understand reality. Understanding it can therefore save us time and money, and improve the quality of our thinking.
If we're not careful, anchoring can result in poor decisions that we may regret. For example, suppose you discover that the new car you'd like to purchase costs an average of $25,500 (the anchor). You take a trip to a local dealership, and the salesperson offers to sell you the vehicle for $24,000. "What an amazing deal," you think, as you drive off the lot in your new car. Later you learn that several other dealerships around town are selling the same vehicle for $23,000! Since you were anchored to the original $25,500, anything less sounded like a good deal, and that anchor kept you from checking prices at other local dealerships.
Anchoring can also be tricky: not only does it affect many of our own decisions, it can even affect decisions that are made for us. For instance, if your doctor anchors to the first symptoms you report, she might misdiagnose you without pursuing other possible explanations. Or, if a jury is deciding how much to award you in an insurance settlement case, a strategically placed anchor can influence the amount.
Most people agree that taking care of our health is one of the most important goals we pursue throughout our lives. But let's say your grandparents and great-grandparents were all very long-lived. You might anchor to their longevity as an expectation of how long you will live, without considering that they might have led much healthier and more active lifestyles than you do. By anchoring to one piece of information (how long they lived) and ignoring other, more important pieces of information (how they took care of themselves), you could wind up neglecting your own health by eating poorly or exercising infrequently.
The anchoring effect is so powerful that the anchor doesn't even have to be relevant to the thing we're making a decision about! For example, researchers have shown that putting an expensive car on a restaurant menu actually resulted in people spending more money while dining there. Other researchers have shown that simply asking people for the last two digits of their Social Security number before showing them a list of products influenced their bids: those whose last two digits were higher were willing to pay more for the products than those whose digits were lower.
How to overcome the anchoring effect: The anchoring effect is an extremely pervasive bias, and even contributes to other biases. Moreover, since anchoring happens outside of our conscious awareness, interrupting the process can be rather challenging. Therefore, it is important to understand the effects of anchoring, so that we might stand a chance of overcoming them.
For example, thinking long and hard about an important decision always sounds like a good idea, right? Intuitively, this makes sense. However, if we’re merely thinking deeply about the anchor, we’re just amplifying its effects, and probably digging ourselves even deeper into our biases.
Fortunately, there are some strategies we can use to combat, if not completely overcome, the anchoring effect, such as practicing metacognition. We are cognitive misers, which means overcoming our biases requires us to maximize metacognition: an awareness and understanding of our own thought processes. Or, as I like to call it, thinking about thinking.
Another strategy we can employ is to try to consider alternative options. If you see t-shirts on sale for 3 for $10, consider that you may only need one, or that you might not need any! Anything you can do to interrupt the decision-making process can help to slow down your thinking and give you the time you need to make a better decision.
Representativeness Heuristic
Written by Jon Guy
Definition and explanation: "The representativeness heuristic" doesn't exactly roll off the tongue. Nonetheless, this heuristic is well worth our attention. Like all heuristics, the representativeness heuristic is a mental shortcut our brains take to preserve their limited resources, in this case by making quick judgments about the likelihood of something based on how similar it is to our existing mental categories.
The representativeness heuristic leads to errors of reasoning when we make generalizations based on our mental models of reality. To use a classic example, let's say I told you that Mary is a quiet, shy introvert who's not very interested in getting to know people and is also very detail-oriented. Based on this description, is it more likely that Mary is a librarian or a mechanic? Our gut tells us that Mary is much more likely to be a librarian, because her characteristics sound more representative of our mental model of librarians. But that gut feeling ignores base rates: there are far more mechanics than librarians, which (as the sketch below shows) can easily flip the answer.
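To make the base-rate point concrete, here's a minimal Bayes' theorem sketch in Python (every probability is invented for illustration, not taken from real labor statistics):

```python
# Invented, illustrative probabilities -- not real labor statistics.
p_librarian = 0.02             # assumed share of librarians among the two jobs
p_mechanic = 0.98              # assumed share of mechanics
p_desc_given_librarian = 0.80  # chance a librarian matches Mary's description
p_desc_given_mechanic = 0.10   # chance a mechanic matches it

# Bayes' theorem: P(librarian | description)
evidence = (p_desc_given_librarian * p_librarian
            + p_desc_given_mechanic * p_mechanic)
p_librarian_given_desc = p_desc_given_librarian * p_librarian / evidence

print(f"P(librarian | description) = {p_librarian_given_desc:.2f}")  # ~0.14
# Even though the description fits librarians eight times better, Mary is
# still far more likely to be a mechanic, because (under these numbers)
# mechanics vastly outnumber librarians.
```

Representativeness gets the "fit" roughly right but silently drops the base rates, and that is exactly where our gut goes wrong.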
But the representativeness heuristic doesn’t just apply to our perception of people. For example, if you saw a frog with bright colors, you might assume that it’s poisonous based on your mental model of what poisonous frogs look like. Or, imagine shopping for a new phone case. If you find one that looks thick and durable, and you assume that it will provide good protection because of your prior mental model about what durability looks like.
The reason we might assume the frog is poisonous is because most of us have a mental construct of what poisonous animals look like, and a colorful frog fits the bill. Likewise, our mental constructs about what constitutes durability might lead us to believe that a thick case means that it’s durable. In either situation, our default assumption is based on our prior mental models, and questioning these models requires deeper thinking. Without further evidence, we cannot know whether the frog is poisonous or the phone case is durable.
Perhaps the best example of the representativeness heuristic is stereotyping. Stereotyping occurs when we have a mental model of a specific class of people and make judgments about all people within that class based on our mental construct. Just as we assumed that "colorful amphibian" means "poisonous animal," we make similar assumptions about classes of people. For instance, we might assume that all Indians enjoy spicy foods, or that all Native Americans are highly spiritual, or that all rich people are snobs, or that all homeless people are drug addicts and alcoholics. However, in each of these instances, our mental models will be wrong much of the time, and so it is up to us to think critically about how our brains misinterpret reality.
Heuristics are short-cuts our brains take to save energy. Importantly, they aren't necessarily bad and don't always lead us astray. Our brains categorize things (people, places, events, objects, etc.) based on our previous experiences so we know what to expect and how to react when we encounter something new in that category. If we didn't categorize things, every new thing we encountered would overwhelm us! So, while the representativeness heuristic can cause us to misconstrue reality, it's also a valuable tool that helps us navigate our daily lives. Rather than trying to determine whether every moving car is potentially dangerous, we draw on our mental models of moving vehicles and treat them all with caution. Likewise, if a grizzly bear were running our way, we'd know we were in trouble, but if a tumbleweed were speeding toward us, we'd know we have nothing to fear.
Even though the representativeness heuristic can be a useful tool, it's still important to maintain our skepticism and practice our critical thinking skills. While categorizing things helps our brains reduce their cognitive load, we need to remember that even our mental constructs might be wrong, and understand how easily stereotypes can harden into prejudices. Moreover, categorizing can blind us to similarities and differences that cross category lines; when we pay too much attention to the boundaries, we limit our ability to see the whole.
As the foregoing examples show, the representativeness heuristic can affect our judgment in both trivial and nontrivial ways. It’s not that big of a deal to incorrectly assume that the guy wearing a suit is a businessman, when he’s actually a plumber headed to a costume party. However, judging one individual based on a stereotypical example is a major driver of racism, sexism, classism, and even speciesism. Therefore, becoming better educated about our mental shortcomings goes a long way toward making the world a better place.
How to avoid the representativeness heuristic: Biases are part of human nature and often serve valuable purposes. As such, completely avoiding the representativeness heuristic is both unlikely and undesirable. However, there are several things we can do to minimize the number of times we fall for it, as well as to reduce its effects when we inevitably slip up.
Awareness of our biases is bias kryptonite, so understanding when you're vulnerable, sharing what you've learned with others, and developing your critical thinking skills are excellent ways to avoid this bias.
Finally, education can help us combat cognitive biases. Learning formal logic, becoming a better statistical thinker, or studying other errors of human cognition are all great ways to minimize our biases, become better thinkers, and perceive reality on reality’s terms.
Barnum Effect (aka Forer Effect)
Definition and explanation: If you’ve ever read your horoscope, visited a psychic, or even taken a personality test and thought, “Wow, that was so accurate! How did they know?!?”, you’ve likely fallen for the Barnum effect. The Barnum effect describes the tendency for people to believe that vague personality descriptions apply uniquely to them, even though they apply to nearly everyone.
The Barnum effect is named after the famous showman P.T. Barnum, but it was first described by psychologist Bertram Forer in 1948 (hence its other name). In an experiment on psychology’s favorite “lab rats” (i.e., undergraduates), Forer gave his students a personality test and told them he would analyze the results and provide each student with individualized feedback. Overall, the class rated their results as very accurate (an average of 4.3 out of 5).
The kicker, of course, was that all students received the exact same results, which included statements such as:
You have a great need for people to like and admire you.
You have a tendency to be critical of yourself.
Disciplined and self-controlled outside, you tend to be worrisome and insecure inside.
At times you have serious doubts as to whether you have made the right decision or done the right thing.
It’s not hard to understand why the students thought these statements applied to them…they apply to nearly everyone. But the students interpreted the generic statements as applying specifically to them. Thus, they fell for the Barnum effect.
To understand why, let’s take a closer look at the kinds of statements that produce the Barnum effect. Barnum statements are general and applicable to nearly everyone, but their vagueness is their “strength,” as each individual fills them in with their own meaning. They’re also mostly positive, since we prefer being flattered to hearing negative things about ourselves. And they often include qualifiers, like “at times,” or attribute opposing characteristics simultaneously, so that they’re almost never wrong.
Forer rightly attributed the results of his experiment to his students’ gullibility. But the truth is, we’re all gullible to some degree, which is why the Barnum effect is frequently exploited by those seeking to convince us they have deep insight into our personal psychology.
How to avoid falling for the Barnum effect: Your best line of defense against the Barnum effect is awareness and skepticism.
Be on guard for situations where vague information might give the impression that results are specifically tailored to you, such as fortune tellers, horoscopes, psychics, personality tests (e.g., the MBTI), online quizzes, and even Netflix watchlists.
Remember to insist on sufficient evidence to accept any claim. Ask yourself: Could this “personalized” feedback apply to others? Am I falling for flattery? And importantly, do I want to believe? Skepticism is the best way to protect yourself against being fooled…but no one can fool us like we can.
Final note: While many people associate P.T. Barnum with the saying “there’s a sucker born every minute,” there’s actually no good evidence he said those words. For a phrase that describes our tendency to be gullible, isn’t it ironic?