Monday, August 31, 2020

Confirmation Bias

Though there are many contenders, perhaps the king of all cognitive biases is confirmation bias. In these days of political polarization and partisan news networks, most people have probably heard the term. As commonly used, it means the tendency to seek out and trust information that confirms our pre-existing beliefs, and to avoid and distrust information that conflicts with those beliefs. Many conservatives prefer to watch Fox News and have their beliefs validated, while many liberals do the same with MSNBC. Social media has revealed that confirmation bias can be so strong that people will share articles they agree with without even reading them first. They just read the title, say, “Damn right!” and hit the share button. If they decide to look further into a controversial issue, they generally just find the first thing they agree with and accept it as the truth. We like having our beliefs confirmed, and we hate having them questioned. That’s confirmation bias.

Actually, though, the term confirmation bias is used in two slightly different ways, and that can cause some confusion. The term was originally coined by the psychologist Peter Wason, whose most famous study used a logic puzzle now called the Wason Selection Task. Want to see if you can solve it? Take a look at the four cards below. Which cards would you need to turn over to test whether the following hypothesis is true: “If a card has a vowel on one side, it has an even number on the other.”


If you said you need to turn over the “A” card, congratulations--you’re half right! But to truly test the hypothesis, you also need to turn over the 5 card. Why? Because if you turn over the 5 and find a vowel on the back, you’ve disproved the hypothesis. Turning over the A card can confirm the hypothesis (or falsify it, if there’s an odd number on the back), but turning over the 5 card can only falsify it, and that’s just as important. The majority of people (including me) find this puzzle incredibly hard, and most get it wrong. The reasons for that are complex and controversial, but what seems clear is that people test hypotheses by looking for confirmation, not falsification.
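The logic of which cards matter can be sketched in a few lines of Python. The text names only the “A” and “5” cards; the “K” and “4” below are the usual remaining cards in standard versions of the task and are an assumption here.

```python
# Wason Selection Task sketch.
# Hypothesis: "If a card has a vowel on one side, it has an even number
# on the other." A card is worth flipping only if its hidden face could
# reveal a vowel paired with an odd number -- the one combination that
# falsifies the rule.

def can_falsify(visible_face):
    """Return True if flipping this card could disprove the hypothesis."""
    if visible_face.isalpha():
        # A letter card matters only if the letter is a vowel,
        # because the hidden number might be odd.
        return visible_face.upper() in "AEIOU"
    else:
        # A number card matters only if the number is odd,
        # because the hidden letter might be a vowel.
        return int(visible_face) % 2 == 1

cards = ["A", "K", "4", "5"]
must_flip = [c for c in cards if can_falsify(c)]
print(must_flip)  # ['A', '5'] -- only these flips could disprove the rule
```

Note that the popular answer, the 4 card, drops out: whatever letter is behind the 4, the rule survives, so flipping it can only feel confirming.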

Wason came up with this puzzle because he was interested in the idea of falsification in science. Briefly, this is the idea that science advances by falsifying incorrect hypotheses, not confirming correct ones. Wason found that people are biased toward confirming hypotheses, not falsifying them, so he coined the term “confirmation bias”.

But notice this is slightly different from the idea that people cling to their cherished pre-existing beliefs. Wason suggested that people are biased toward testing any hypothesis by trying to confirm it, not just the ones they want to believe. And that’s a real bias--that is what most people do. In recent years, though, confirmation bias has come to mean seeking confirmation for what you want to believe. And that’s also a real bias. To avoid confusion, some researchers call this second sense of confirmation bias “myside bias”. But that term hasn’t caught on, and these days most people use “confirmation bias” in the second sense--as seeking confirmation for what you already believe. That’s probably fine, but since it’s such an important, pervasive bias, it’s good to remember that the term can mean two slightly different things.

What’s clear is that confirmation bias in the second sense can explain a lot about human belief and behavior. It explains why people seek information online that they agree with, and avoid information that says they might be wrong. It explains other cognitive biases as well, such as wishful thinking (“astrology must be true, because I like believing in it”), and denialism (“the climate can’t be changing, because that conflicts with my ideology”). It explains cherry picking, where we only look at data that supports our view, while ignoring data that conflicts with it.

The ancient Greek thinker Diagoras knew how to spot unconscious cherry picking. Once an acquaintance tried to convince him that the gods were benevolent by pointing to paintings of people who had survived storms at sea. When the storm hit, they had prayed, and then they survived (notice the post hoc, ergo propter hoc thinking?). Diagoras replied, “Yes, but where are all the paintings of those who were shipwrecked?” To really know if prayer is correlated with surviving storms, you would have to know how many people prayed and were then saved, and how many people prayed and were shipwrecked anyway. Cherry picking is great for reaching the conclusion you want to reach. It’s just not great for reaching the conclusion that’s true.
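Diagoras’s point can be made concrete with a full 2x2 table of outcomes. The counts below are made up purely for illustration:

```python
# Survivorship bias as a 2x2 contingency table. The paintings show only
# the ("prayed", "survived") cell; judging the correlation requires all
# four cells. These counts are invented for illustration.
counts = {
    ("prayed", "survived"):       90,
    ("prayed", "shipwrecked"):   910,
    ("no prayer", "survived"):     9,
    ("no prayer", "shipwrecked"): 91,
}

def survival_rate(group):
    survived = counts[(group, "survived")]
    total = survived + counts[(group, "shipwrecked")]
    return survived / total

print(survival_rate("prayed"))     # 0.09
print(survival_rate("no prayer"))  # 0.09
```

In this made-up data the survival rate is identical whether or not the sailors prayed, yet there would still be ninety grateful paintings on the temple wall. Looking only at that one cell of the table is exactly the cherry picking Diagoras was calling out.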

