Monday, August 31, 2020

Confirmation Bias

Though there are many contenders, perhaps the king of all cognitive biases is confirmation bias. In these days of political polarization and partisan news networks, most people have probably heard the term. As it's used these days, it means the tendency to seek out and trust information that confirms our pre-existing beliefs, and to avoid and distrust information that conflicts with those beliefs. Many conservatives prefer to watch Fox News and have their beliefs validated, while many liberals do the same with MSNBC. Social media has revealed that confirmation bias can be so strong that people will share articles they agree with without even reading them first. They just read the title, say, “Damn right!” and hit the share button. If they decide to look further into a controversial issue, they generally just find the first thing they agree with and accept it as the truth. We like having our beliefs confirmed, and we hate having them questioned. That’s confirmation bias.

Actually, though, the term confirmation bias is used in two slightly different ways, and that can cause some confusion. The term was originally coined by the psychologist Peter Wason, whose most famous study used a logic puzzle now called the Wason Selection Task. Want to see if you can solve it? Take a look at the four cards below. Which cards would you need to turn over to test whether the following hypothesis is true: “If a card has a vowel on one side, it has an even number on the other.”


If you said you need to turn over the “A” card, congratulations--you’re half right! But to truly test the hypothesis, you also need to turn over the 5 card. Why? Because if you turn over the 5 and find a vowel on the back, you’ve disproved the hypothesis. Turning over the A card looks for confirming evidence, but turning over the 5 card can reveal falsifying evidence, and that’s just as important. The majority of people (including me) find this puzzle incredibly hard, and most get it wrong. The reasons for that are complex and controversial, but what seems clear is that people test hypotheses by looking for confirmation, not falsification.
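
If you’d like to see the logic laid bare, here’s a little Python sketch (my own illustration, not part of Wason’s study). It assumes the four cards show A, K, 4, and 5 (only the A and the 5 are named above; the K and the 4 are stand-ins), and asks of each card whether turning it over could possibly reveal a violation of the rule.

```python
# A brute-force look at the selection task. Assumed visible faces: A, K, 4, 5
# (the post names only the A and the 5; K and 4 are stand-ins for the other two cards).
VOWELS = set("AEIOU")

def violates(letter, number):
    # The rule "vowel -> even number" is broken only by a vowel paired with an odd number.
    return letter in VOWELS and number % 2 == 1

def worth_turning(visible):
    # Could turning this card over possibly reveal a violation of the rule?
    if visible.isalpha():
        # Hidden side is a number; a violation requires the visible letter to be a vowel.
        return any(violates(visible, n) for n in range(10))
    # Hidden side is a letter; a violation requires it to be a vowel and the visible number odd.
    return any(violates(letter, int(visible)) for letter in "ABCDEFGHIJKLMNOPQRSTUVWXYZ")

for face in ["A", "K", "4", "5"]:
    print(face, "->", "turn it over" if worth_turning(face) else "leave it")
# Only A and 5 can possibly reveal a violation; those are the two cards worth turning over.
```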

Wason came up with this puzzle because he was interested in the idea of falsification in science. Briefly, this is the idea that science advances by falsifying incorrect hypotheses, not confirming correct ones. Wason found that people are biased toward confirming hypotheses, not falsifying them, so he coined the term “confirmation bias”.

But notice this is slightly different from the idea that people cling to their cherished pre-existing beliefs. Wason suggested that people are biased toward testing any hypothesis by trying to confirm it, not just the ones they want to believe. And that’s a real bias--that is what most people do. In recent years, though, confirmation bias has come to mean seeking confirmation for what you want to believe. And that’s also a real bias. To avoid confusion, some researchers call this second sense of confirmation bias “myside bias”. But that term hasn’t caught on, and these days most people use “confirmation bias” in the second sense--as seeking confirmation for what you already believe. That’s probably fine, but since it’s such an important, pervasive bias, it’s good to remember that the term can mean two slightly different things.

What’s clear is that confirmation bias in the second sense can explain a lot about human belief and behavior. It explains why people seek information online that they agree with, and avoid information that says they might be wrong. It explains other cognitive biases as well, such as wishful thinking (“astrology must be true, because I like believing in it”), and denialism (“the climate can’t be changing, because that conflicts with my ideology”). It explains cherry picking, where we only look at data that supports our view, while ignoring data that conflicts with it.

The ancient Greek thinker Diagoras knew how to spot unconscious cherry picking. Once an acquaintance tried to convince him that the gods were benevolent by pointing to paintings of people who had survived storms at sea. When the storm hit, they had prayed, and then they survived (notice the post hoc, ergo propter hoc thinking?). Diagoras replied, “Yes, but where are all the paintings of those who were shipwrecked?” To really know if prayer is correlated with surviving storms, you would have to know how many people prayed and were then saved, and how many people prayed and were shipwrecked anyway. Cherry picking is great for reaching the conclusion you want to reach. It’s just not great for reaching the conclusion that’s true.
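
Here’s a toy version of Diagoras’ point in code, using numbers I made up purely for illustration. The paintings show only one cell of a two-by-two table; once you fill in the other three cells, the miracle can evaporate.

```python
# All four counts are invented for illustration; the only point is that the
# paintings show just one cell of the table.
prayed_and_survived       = 90
prayed_and_shipwrecked    = 910
no_prayer_and_survived    = 9
no_prayer_and_shipwrecked = 91

rate_with_prayer    = prayed_and_survived / (prayed_and_survived + prayed_and_shipwrecked)
rate_without_prayer = no_prayer_and_survived / (no_prayer_and_survived + no_prayer_and_shipwrecked)

print(f"survival rate with prayer:    {rate_with_prayer:.0%}")
print(f"survival rate without prayer: {rate_without_prayer:.0%}")
# Both print 9%. The 90 grateful survivors in the paintings, by themselves,
# tell us nothing about whether prayer made any difference.
```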

Sunday, August 30, 2020

Faces in the Clouds: Seeing Patterns That Aren't There

“...nothing is so alien to the human mind as the idea of randomness.” --John Cohen


The human mind is a pattern-seeking machine. We’re constantly trying to spot patterns and regularities in the world around us, and for very good reason. Our ancestors who looked into the bushes and saw a pattern of spots as a leopard lived long enough to become our ancestors. That’s true even if it turned out there wasn’t a leopard there. It’s better to accept a false positive than be eaten by a false negative. Our evolutionary history has given us all a case of apophenia: an overactive tendency to see patterns and meaning even in random, meaningless things. That’s why we see faces in the clouds, a man in the moon, and rams and scorpions in random arrangements of stars as they happen to look from Earth. It’s why people hear Satanic messages in music played backwards. It’s why some people see Jesus in a piece of toast, and others see Buddha in a tree trunk. Sometimes we go beyond seeing patterns, and see meaning, and even intention, where there is none. Humans are prone to see omens in the flight of birds, and angry gods in natural disasters.

The problem with our constant pattern-seeking is that we aren’t very good at recognizing a lack of pattern. We aren’t good at seeing randomness as randomness. In fact, real randomness doesn’t look the way we think it should. Consider the following sequences of digits. Which one was generated by a random process?

01100101100110100110

10001001000100011111

Most people would say the first one is random. But it isn’t. I made that sequence deliberately to look the way I expect a random sequence to look. Then I made the second sequence with a true, time-honored randomness generator. I flipped a coin: heads = 1, tails = 0. So the second sequence is the random one. But it doesn’t look like it, does it? It has too many long stretches of ones and zeros to seem random. It even looks too regular to me, and I’m the one who flipped a coin twenty times to make it!
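
If you don’t believe me, you don’t have to flip coins by hand. Here’s a quick Python sketch (just an illustration of the idea) that flips twenty virtual coins over and over and records the longest streak in each sequence.

```python
import random
from collections import Counter

def longest_run(bits):
    # Length of the longest streak of identical values in the sequence.
    best = run = 1
    for prev, cur in zip(bits, bits[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

random.seed(0)  # fixed seed so the sketch is reproducible
N = 100_000
counts = Counter(longest_run([random.randint(0, 1) for _ in range(20)]) for _ in range(N))

for length in sorted(counts):
    print(f"longest streak of {length:2d}: {counts[length] / N:.1%} of sequences")
# Streaks of four or more identical flips show up in roughly three out of four
# 20-flip sequences. Sequences with no long streaks, the ones that "look random"
# to most of us, are actually the unusual ones.
```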

Our inability to see randomness for what it is leads us to several errors in judgment. For example, I like to shoot pool, but I’m not very consistent. Sometimes I get on a streak, and hit several shots in a row. I start thinking I’m on fire, and then the streak ends. Did I temporarily become a better pool player, or was it just an apparent pattern in randomness, like the string of five 1’s in the random coin flips above? Most likely, it was just random, and I’m the victim of the hot hand illusion. That’s what researchers have found when they studied professional basketball players (who are much better at shooting baskets than I am at shooting pool). Most players swear they get the “hot hand”--they get on a streak and can hardly miss the basket. But if you run statistical tests of their hits and misses, it turns out that the hot and cold streaks are random. Of course, good players have more hits than misses, but the “hot hand” streaks of hits still seem to be random occurrences.

A sort of reverse of the hot hand illusion, called the gambler’s fallacy, occurs when people think a random streak is “due” to end. Take parents, for example. The odds of a couple having a female baby are 50% (actually not quite, but close enough that we’ll call it 50%). So, here’s a probability problem: if a couple has had 5 male babies in a row, what are the odds of the next baby being a girl?

Answer: 50%. A single, random event has no memory of what’s happened before. A coin doesn’t realize that it’s landed on heads 10 times in a row and decide to land on tails because it’s overdue. But many couples keep having babies, thinking they’re “due” to have one sex or the other. Casinos are full of gamblers thinking their losing streaks are “due” to end. Many of them lose huge amounts of money that way. It’s another clear case where cognitive biases are doing real harm.
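
Here’s a quick simulation of the babies example. It’s my own sketch, assuming each birth is an independent 50/50 event; it looks only at families that have already had five boys in a row and asks how often the sixth baby is a girl.

```python
import random

random.seed(1)
families_with_five_boys = girls_next = 0

while families_with_five_boys < 10_000:
    babies = [random.choice("BG") for _ in range(6)]
    if babies[:5] == list("BBBBB"):          # keep only families that already had five boys
        families_with_five_boys += 1
        girls_next += babies[5] == "G"

print(f"P(girl after five boys) ~= {girls_next / families_with_five_boys:.3f}")
# Prints roughly 0.5. The sixth birth has no memory of the first five.
```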

A close cousin of the hot hand illusion happened when the Germans were bombing London in World War II. People looked at maps of bomb strikes and noticed suspicious clusters and empty areas. Some Londoners started thinking the relatively empty areas on the map were places where German spies were living. But bombing back then wasn’t very accurate, and the empty spots and the clusters were actually random. This phenomenon is known as the clustering illusion.

Another form of mistaken pattern recognition happens when two things occur together (or one after another) purely by coincidence, and people decide they must be related. Let’s say you’re a baseball player, and you tighten your shoelaces just before hitting a home run. If you’re like a lot of baseball players (a famously superstitious bunch) you may decide that tightening your shoelaces caused you to hit that homer, so you start tightening them every time you go to bat. This is called illusory correlation, and this particular version is also a logical fallacy called post hoc, ergo propter hoc (Latin for “after this, therefore because of this”). This effect also causes people to think home remedies and quack cures are more effective than they really are. If you have a sore throat, it will almost certainly go away by itself if you wait. If you take a shot of whiskey--like your Uncle Bob recommended--it will also go away. Just don’t conclude that it went away because of the whiskey. 

Why don’t people think about all the times a sore throat went away without a shot of whiskey, and realize their mistake? Why don’t baseball players stop tightening their shoelaces when they notice all the times they do it and then don’t hit a homer? Like most humans, they notice the instances that “confirm” their theories, and don’t notice the ones that don’t. And that brings us to what might be the king of all cognitive biases.

Saturday, August 29, 2020

Mental Shortcuts and their Pitfalls

In the last post, I talked about how we all have two systems for thinking. System 2 is the conscious, executive system which I compared to the captain of a ship, and System 1 is the mostly unconscious, automatic one I compared to the ship's crew. One big difference between System 1 thinking and System 2 thinking is that System 1 is faster. It’s good at making quick calculations. One way it does this is by using cognitive shortcuts called heuristics, which are basically rules of thumb for making quick judgments. Psychologists have identified many heuristics that we use all the time. Oftentimes, they get us through our days just fine, but sometimes they can go wrong. That’s the thing about rules of thumb--they only work most of the time.

Representativeness

One of the most important tasks our minds perform is categorizing things. We need to know the difference between “food” and “not food”, for example, and to eat one and not the other. We also use categories to make predictions: “I don’t recognize that animal, but it’s big and has long canines, so it might be the kind that eats people”. One way we make these categorizations is by using the representativeness heuristic. We decide how to classify things based on how well they match our mental representation of other things in that category. That’s awfully abstract, so here’s an example: if I see a large, muscle-bound man in athletic gear walking down the street while vigorously chewing gum, I figure he’s more likely to be a football coach than an accountant, because he matches my mental representation of a coach much better.

The problem is, he might be an accountant. Making judgments based on representativeness is useful, but it can lead us astray. Imagine that you see a small, neatly dressed man in glasses, reading a book. Take a second to think: is that guy more likely to be a librarian or an electrician? Librarian, right? Not so fast. There are actually 16 times as many male electricians as male librarians in the United States. So he’s probably more likely to be an electrician, even if he doesn’t fit the stereotype. This kind of error is called the base rate fallacy, and the representativeness heuristic is usually to blame for it. We ignore the base rate--the frequency of occurrence--of things (librarians and electricians, in this case) and make judgments based on representativeness.
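
For the numerically inclined, here’s a rough Bayes-style sketch. The 16-to-1 base rate is the figure cited above; the two “how bookish does each group look” numbers are guesses I invented just to show the arithmetic.

```python
# The 16-to-1 base rate comes from the post. The two "likelihoods" are pure
# guesses, chosen only to illustrate how the calculation works.
electricians_per_librarian = 16
librarians = 1

p_bookish_if_librarian   = 0.50   # hypothetical: half of male librarians fit the stereotype
p_bookish_if_electrician = 0.05   # hypothetical: one in twenty male electricians does

# Unnormalized weight for each job, given that the man looks bookish (Bayes' rule)
weight_librarian   = librarians * p_bookish_if_librarian                     # 0.50
weight_electrician = electricians_per_librarian * p_bookish_if_electrician   # 0.80

p_librarian = weight_librarian / (weight_librarian + weight_electrician)
print(f"P(librarian | looks bookish) ~= {p_librarian:.2f}")
# ~0.38: even with a strong stereotype match, the lopsided base rate
# still makes "electrician" the better bet in this made-up example.
```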

The word “bias” usually means something different when we’re talking about cognitive biases than when we’re talking about biased perceptions of groups of people. But in this case, the two kinds of biases are linked. After all, representativeness is really just another word for “stereotype”. We probably can’t avoid stereotypes altogether, but we can let our System 2 minds step in and say, “Wait a minute, let’s not jump to conclusions about people.” Not everybody matches a stereotype and not all stereotypes are accurate.

Availability

Let’s look at another common heuristic. Consider this question: are there more words in English with the letter “k” as the first letter, or the third letter? Most people guess that more words start with a “k”, but in fact words with “k” as the third letter are three times as common. People get it wrong because words that start with “k” are easier to call to mind. This is called the availability heuristic. We estimate how common things and events are by how easily they come to mind--by how available they are to consciousness.

But this can be a pretty unreliable method, especially in the days of mass media. For example, if you live in the United States, take a couple of seconds to answer this question: Which worries you more, that you or a family member could be a victim of:

1. A mass shooting

2. A fatal accident

It’s no secret that the United States has a problem with mass shootings. Every few weeks some deranged person kills multiple people, and the tragedy dominates the national news for a few days. Not surprisingly, many people are far more worried about mass shootings than fatal accidents. But consider this: according to the Congressional Research Service*, between 1999 (the year of the Columbine shooting) and 2013, 1,554 people were killed in mass shootings. In the same time period, 1.73 million people were killed in accidents.** That means the average American was over a thousand times as likely to die in an accident as in a mass shooting. The top accidental causes of death were car crashes, accidental poisonings (including overdoses--these have risen in recent years), and falls. All three are hundreds of times as likely to kill the average American as mass shootings. Indeed, twice as many people are killed in bicycle crashes as in mass shootings. Yet people worry far more about mass shootings than these other causes, because mass shootings are more memorable. The more common causes of death don’t make the news as much, precisely because they’re so common.
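
In case that “over a thousand times” figure sounds too dramatic, it’s nothing more than division of the two numbers cited above:

```python
# Nothing fancy here: just the two figures cited above (CRS mass-shooting deaths
# and total accidental deaths, 1999-2013) divided by each other.
mass_shooting_deaths = 1_554
accidental_deaths    = 1_730_000

print(f"{accidental_deaths / mass_shooting_deaths:,.0f} accidental deaths per mass-shooting death")
# About 1,113 to 1.
```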

To be clear, I’m not saying that mass shootings aren’t a terrible problem, and I’m not making any argument about gun policy in the US. I’m just saying we shouldn’t let the availability heuristic distort our thinking about what’s most dangerous. Why? Because misunderstanding what’s dangerous is itself dangerous. For example, after the 9/11 terrorist attacks, many Americans were afraid to fly, so they drove instead, even though driving is much more dangerous than flying. It’s been estimated that this caused about 1,600 extra traffic fatalities in the year after 9/11. This is another case where a lack of critical thinking actually killed people.

__________________________________________________

* The CRS report defines a “public mass shooting” as an incident where 4 or more people are killed. This may be a little too restrictive. If you use the definition of a mass shooting by the Gun Violence Archive, which defines a mass shooting as “four or more people shot/killed”, and add up deaths for 2014-2016 (the only years that overlap between the Gun Violence Archive and Centers for Disease Control data), there were 1,084 deaths in those shootings. Even with this much more expansive definition, the average American in these years was over 100 times as likely to die of poisoning or in a car crash as in a mass shooting. Slightly more people were killed in mass shootings in these three years than in bicycle accidents (928 people). None of this is to say that gun homicide in general isn’t a big problem. In 2014-2016, there were 38,402 gun homicides--the fifth-highest cause of death by injury. https://fas.org/sgp/crs/misc/R44126.pdf

** https://www.cdc.gov/injury/wisqars/index.html



Friday, August 21, 2020

Being of Two Minds

Many of our cognitive distortions can be better understood by considering one of psychology’s big insights: we all have two minds. Well, not exactly two minds, but we do have two systems for processing information, which psychologists call System 1 and System 2. System 2 is what we think of as “our” mind, because it’s the one we’re most aware of, and most able to control. System 2 thinking is the conscious, effortful thinking we use when we’re doing something unfamiliar or difficult--something we can’t do while thinking about something else. It’s what we use when we solve math problems, try to make a clear point in a conversation, or drive through an unfamiliar city. 

System 1 thinking, on the other hand, is mostly unconscious. It’s what we use when we recognize a familiar face, make snap judgments about people and situations, or do things we learned to do long ago. Have you ever driven to work and realized you have no memory of the drive? That’s because your System 1 mind took the wheel, while your System 2 mind was thinking about other things. System 1 handles tasks we’re able to do automatically, so it’s not so limited by our attentional capacity. If you drive to work while singing along with the radio, it’s not hard to do, even though both those things are incredible feats of information processing that took years for us to learn.

One way to think of System 1 and System 2 is to imagine a captain standing on the bridge of an old sailing ship. Most of the crew is out of sight, all working at the same time to keep the ship sailing along. They can do far more work than the captain, and many crew members have skills he doesn’t. The crew is System 1, and the captain is System 2. During fair weather, the captain can plan ahead, charting the ship’s future course, or just relax and sing along with the radio. But if something unusual happens--if there’s a tricky strait to sail through--then he needs to take over and start making decisions. He pulls out the charts he needs and spreads them on his desk (this is analogous to what our conscious minds do when we pull information from our memory in order to deal with a task at hand). Then he starts giving orders. Like the ship captain, our System 2 mind plays an executive role. It goes to work when there’s a new or difficult situation that requires leadership.

But System 2 isn’t running the show as much as it thinks. Most of the time, the System 1 crew knows what it’s doing. They may steer around an island while the captain’s asleep, but he wakes up and writes in the ship’s log that he ordered them to do that. The same thing happens in our minds. Our System 1 mind causes us to do things without thinking, and then our System 2 mind takes credit, making up reasons why we did it. This is called confabulation. Sometimes I get up off the couch, and then think “I’m going to make coffee.” But is that really why I got up, or is that my System 2 mind confabulating about why I just got up off the couch?

Sometimes our System 2 minds lose control entirely. When we’re tired, angry, or fearful, the crew--our System 1 mind--can rise up. And for all their competence, they’re an unruly, superstitious, knee-jerky bunch, capable of overpowering the captain if emotions are running high enough. That’s when our worst cognitive biases come into play, and our most regrettable decisions get made. 

Still, System 1 can do a lot of things better than System 2. As I type this, my fingers are jumping around the keyboard automatically, even though my conscious mind barely remembers what the keyboard looks like. If I look down at my fingers, System 2 tries to micromanage System 1 and I start typing....very...slowly. It’s as though the captain is trying to do the ship carpenter’s job, and making a mess of things. A good critical thinker, then, is like a ship with a good captain. The captain keeps tabs on the crew’s decisions, making sure they’re good ones. Occasionally he takes over, showing he’s still in charge when emotions run high. And sometimes he sits back, lets the crew do its thing, and marvels at their skills.


Thursday, August 20, 2020

Chapter Two. How Our Minds Fool Us: The Doors of Perception

Most of us--at least in our less thoughtful moments--are what philosophers call naive realists. We go through our days thinking we see the world more or less as it really is. And that really is naive. The truth is, what we experience is only a thin slice of the world’s full complexity. Yes, we can see a beautiful spectrum of colors, but the rainbow is only a tiny slice of an electromagnetic spectrum that’s mostly invisible to us. Microwaves, radio waves, and x-rays are part of the same spectrum, but we can’t see them. We can’t see the ultraviolet patterns on flowers that guide bees to nectar. We can’t see what’s too far away, or too small, or around corners. Our ears are no better. They pick up just a thin slice of the spectrum of soundwaves. We’re deaf to sounds too high, too low, or too faint to hear. 

These sensory deficits aren’t trivial. If our eyes had been sharp enough to spot deadly viruses and bacteria before microscopes were invented, we might have learned to avoid them, thus saving countless lives. The same is true of dangerous sources of radiation, like radon. But we never knew those things were there. That’s why one of the great triumphs of science is that it’s given us sensory prosthetics like microscopes, telescopes, and radiation sensors that help us better understand--and survive in--the world around us.

Our weak senses are just the first filter that information from the outside world has to pass through. The second is attention. What we are conscious of is mostly just what we’re paying attention to, and attention is very selective. Several times a day, for example, a fire truck pulls out of the station down the street from my house, lights flashing and siren blaring. All this is literally designed to make me notice the truck, but I rarely do. Why? Because I’m used to it. My brain has learned it doesn’t need to focus on it, so it focuses on other things.

There’s a good reason we talk about “paying” attention. Attention, like money, is a limited resource. We can only pay attention to a few things at a time, so we have to focus on what’s most immediately important to us, and filter out the rest. That’s why, even though I like listening to audio books when I take road trips, I turn them off when I come to a city. I can’t follow the story and navigate heavy traffic at the same time, and if I tried, I would probably wreck my car.

What’s in our consciousness, then, is not a mirror-image of the world around us, but a pale reconstruction, heavily filtered by our senses, and heavily selected and edited by our brains. That shouldn’t really be surprising, because our brains didn’t evolve to show us a perfect, true reflection of the world (or, for that matter, to be perfectly logical). They evolved to selectively focus on the most important information, so we can navigate and survive in a complex, often dangerous world. Our brains were shaped for (and by) survival, not truth. And it shows in the many ways our thoughts and perceptions differ from reality. We don't think with full information and perfect rationality, but with mental shortcuts that let us make judgments quickly. These shortcuts can work amazingly well, but they can also cause serious distortions in the way we see ourselves, others, and the world around us.

I know people who hear about the distortions, and say, “But it’s so negative to focus on that. Why dwell on it?” My answer is that, while it may be depressing to see how error-prone our minds are, identifying the errors is actually the first step in a very positive direction. If we want to get better at seeing the world clearly, then we have to know the ways our minds distort it, so we can compensate for the distortions. It’s like going to the eye doctor. You have to know the defects in your vision to get glasses that make you see more clearly.

Wednesday, August 19, 2020

Complexity, Uncertainty, and Skeptical Open-Mindedness

Tolerance of Complexity and Uncertainty

Another crucial intellectual virtue is tolerance for complexity and uncertainty. We live in a complex world, and if we want to understand it accurately, our theories need to be complex enough to do it justice. To put it another way, we live in a high-resolution world, so we need high-resolution minds. But we don’t have them, or not enough of us do. A quick glance at the news or social media reveals how common low-resolution worldviews are. Too many people see the world in simplistic, black and white, us vs. them terms.


I like to call this outlook “cartoon thinking”, and it’s an easy habit to fall into. It’s all too easy to turn complex realities into exaggerated, simplistic caricatures. We like to see people as either heroes or mustache-twirling villains, when in fact most people are neither. We like to see social problems as having one simple, knee-jerk answer, when most of them don’t. If we’re not careful, we see whole groups of people in terms of stereotyped caricatures. All these tendencies have caused a lot of pain and misunderstanding in this world. So, if we want to overcome cartoon thinking and the problems it causes, we have to cultivate a higher-resolution outlook. We have to get more comfortable with complexity, because the world is, in fact, complex.


Because it’s complex, sometimes we simply don’t yet know the answers to certain questions. That’s why we need a tolerance for uncertainty. If we don’t know the answer to a question, then it doesn’t do any good to pretend we do. Consider the question of whether life exists elsewhere in the universe. Some people say, “Yes, absolutely!” Others say, “No, that’s impossible.” But the fact is, we simply don’t know. We don’t yet know the answer to this or many other questions, and until we do, the most honest thing to say is, “We don’t know.”

Skeptical Open-Mindedness


Skeptical open-mindedness might sound like a contradiction in terms. But I don’t think it is, because critical thinking requires both skepticism and open-mindedness. By “skepticism” I don’t mean philosophical skepticism, the school of thought that holds that we can know nothing. I also don’t mean denialism, which is a refusal to believe things that are well-supported by the evidence. Skepticism in the sense I’m using it just means holding off on believing a claim until you see good evidence that it’s true. In that sense, skepticism is the complement of open-mindedness, which is being open enough to new ideas to give them a fair hearing. Open-mindedness and skepticism can really be seen as two sides of the same coin. Neither is sufficient by itself. As the old saying goes, “Have an open mind. But not so open that your brains fall out.”


Monday, August 17, 2020

Curiosity and Fair-Mindedness

Curiosity and Commitment to Learning

A lack of curiosity is a common trait among ignorant people. In fact, that’s why they’re ignorant. They aren’t interested in ways of thinking different from their own, or in learning about new realms of knowledge. The result is that they don’t learn much. And that’s a problem, because if you want to think clearly about most topics, you have to have a certain level of background knowledge. If I want to have an informed opinion on monetary policy, for example, it isn’t enough for me to know the general skills of good reasoning. I also need to know some actual facts about monetary policy. You can’t have an informed opinion if you aren’t informed. 


Unfortunately, I’m not curious about monetary policy, which shows that curiosity can’t get you everywhere. Some fields are both important AND boring, and that means that if you want to think clearly about them, you have to commit yourself to learning about them. That can be about as much fun as eating sawdust, but such is life.

Fair-Mindedness


Have you ever seen a social media debate that goes something like this?:


Ed: “You gun nuts must think it’s just fine for school kids to be gunned down!!”


Fred: “You idiot! You gun grabbers just want to take away everyone’s guns to pave the way for communism!”


Neither Ed nor Fred seems to be at his most reasonable there, but the fact is, neither is likely to believe what the other accuses him of believing. Fred is probably not a psychopath who thinks school shootings are OK, and Ed is most likely not interested in outlawing all guns, or in living in a communist regime. Both are committing the straw man fallacy. They ignore whatever real arguments the other might have, and replace them with ridiculous caricatures that are easier to attack. It’s called the “straw man” fallacy because it’s easier to knock over a straw man than a real one. And it might even impress the audience, who may not see the difference. But nothing is really gained in this exchange. It’s like watching the Three Stooges hooting and slapping each other. It can admittedly be entertaining, but it’s definitely not educational.


If Ed and Fred listened to each other’s actual points of view, instead of attacking the straw man versions, they might have a more productive discussion. And if one of them really does have a better point than the other, he won’t be able to demonstrate it by knocking over straw men. He’ll need to knock over the other guy’s real arguments, and that’s a lot harder. That’s one reason critical thinkers need to be fair-minded (another is that most people really aren’t evil and crazy). Philosophers talk about using the principle of charity, which means that if someone makes a claim, you should interpret it in the most charitable light. Don’t assume they have bad intentions, or that their argument is weaker than it really is. If you aren’t sure what they mean, ask for clarification. Restate it in the strongest form you can. Then, if you can refute the strong version of their argument (sometimes called a steel man), you’ve really done something. And if you can’t, you’ve encountered a strong argument that needs to be taken seriously.


Sunday, August 16, 2020

Intellectual Courage, Independence, and Modesty

In my last post I talked about how critical thinking is as much about personal dispositions or intellectual virtues as it is about the process of thinking itself. The first of these attitudes I mentioned was valuing truth--critical thinking ability is useless or even harmful if you don't care what's true. Now I want to discuss a couple of other crucial dispositions.

Courage and Independence

Critical thinking requires intellectual courage, which is closely related to intellectual independence. Real thinking can be a lonely pursuit. While it’s good to consider the opinions of others, you can’t let others do your reasoning for you. If you do, then you’re not reasoning at all. You may be cogitating hard in order to rationalize beliefs you borrowed from others, but you won’t actually be reasoning, because reasoning is about deciding for yourself what conclusions to draw.

And here's the hard part: if you do draw your own conclusions, chances are some of them will differ from what you want to believe, and what your friends and peers believe. Neither of those things is comfortable or easy, and that’s why they take courage. It takes courage to choose your beliefs based on evidence and reason instead of desire and comfort. It also takes courage to disagree with your peers, especially if they’re deeply invested in a particular ideology, because they aren’t going to like it if you question that ideology. For example, if you and your friends are all pro-choice, try telling them--just as an experiment--that you agree with pro-lifers about something. If you’re pro-life, try doing the opposite. Just fasten your seatbelt first.

Intellectual Modesty
Not to be absolutely certain is, I think, one of the essential things in rationality.

- Bertrand Russell
Smug people aren’t good thinkers, and good thinkers aren’t smug. You can’t do any real reasoning if you’re already sure you’re right. As the philosopher Anthony Weston said, "True thinking is an open-ended process. The whole point is that you don’t know when you start where you’ll find yourself in the end."

Good thinkers also can’t be too impressed with their own knowledge. None of us knows everything, and most of us are downright ignorant about a lot of things. Take me, for example. I’ve got a decent understanding of science, and I try to stay up on politics, but I know diddly-squat about international monetary policy. Ditto for accounting, poker, and silkworm husbandry. I know next to nothing about these things, and until I learn more, I don’t have any business pontificating about them.

Interestingly (and depressingly), the people who know the least, and therefore have the lowest chance of being right, are commonly the most smug about their beliefs. This is a recently discovered cognitive bias called the Dunning-Kruger Effect. Generally speaking, the less competent and knowledgeable people are, the more competent and knowledgeable they think they are. One of the things they don’t know is how much there is to know, and thus how little of it they do know. People with a lot of knowledge, on the other hand, have a better idea of the vastness of human knowledge, and therefore the vastness of their own ignorance. The ignorant are ignorant of their own ignorance, which is why so many of them think they have everything figured out.

Saturday, August 15, 2020

Intellectual Virtues: Why Critical Thinking isn’t Just About Thinking

Since I spent most of the last post talking about what critical thinking isn't, maybe I should talk about what it actually is. Edward Glaser, a psychologist who designed the first psychological test of critical thinking, gave a good overview of the characteristics of a critical thinker:

The ability to think critically...involves three things: (1) an attitude of being disposed to consider in a thoughtful way the problems and subjects that come within the range of one's experiences, (2) knowledge of the methods of logical inquiry and reasoning, and (3) some skill in applying those methods. (italics added)


Notice the first thing Glaser mentions isn't knowledge or skill, but attitude. It's no good having the ability to think critically if you don't have the motivation to use it. A person who knows all about logic and reasoning, but is too lazy or apathetic to apply it to real-world issues, isn’t a critical thinker. Even worse is intellectual dishonesty. Someone who knows logic and rhetoric, but uses them deceptively to “win” arguments, instead of seeking what’s actually true, isn't a good critical thinker either (though, sadly, he might be an effective debater or politician).

Caring What’s True and other Intellectual Virtues


As I’ve already mentioned, critical thinking isn’t just about cold logic. Among other things, it's also about values and attitudes. For example, you have to value truth. You have to care what’s actually true more than what you want to be true. You have to be willing to abandon old, comforting beliefs when the evidence is against them. That’s easy enough to say, but very, very hard to do. For example, if you believe in heaven and hell, you likely have a deep emotional investment in believing those places exist. If you’re an atheist, you may be emotionally invested in believing they don’t. Either way, let’s say you were shown undeniable evidence that you were wrong. Would you trade the comforting fiction for a hard truth? It’s easy to talk about, and very hard to do.


Truth seeking is just one of several values required for good critical thinking. Or perhaps we should think of them as virtues, in the sense moral philosophers use the word: as personal traits one can cultivate to become a better person. Of course, the word "virtues" has a bit of a puritanical, holier-than-thou ring to it in everyday usage, and many critical thinking scholars use the word “dispositions” instead. Whatever you call them, different scholars have different lists, but the ones below show up on most of them.

  • Caring What is True
  • Intellectual Courage and Independence
  • Intellectual Modesty
  • Curiosity and Commitment to Learning
  • Fair-Mindedness
  • Tolerance of Complexity and Uncertainty
  • Skeptical Open-Mindedness
I've already talked about the first virtue in this list. In the next few posts, I'll explore the others in a little more depth.

Friday, August 14, 2020

What is Critical Thinking?

A Brief Definition

The first thing to make clear is that the word “critical” in critical thinking doesn’t mean “negative”, as in “My wife says I’m a terrible singer--why is she so critical?” Rather, it means something more like “evaluative”. To think critically is to evaluate beliefs, claims and decisions carefully--not in order to bash them, but to make sure they’re supported by logic and evidence. And here's the hard part: it's not just about evaluating other people's beliefs and decisions. It’s even more crucial that we evaluate our own. As the physicist Richard Feynman said, "The first principle is that you must not fool yourself – and you are the easiest person to fool."

The education scholar Robert H. Ennis has a nice, short definition of critical thinking:
Critical thinking is reasonable and reflective thinking focused on deciding what to believe or do.
Let’s take a closer look at that definition. First of all, critical thinking is “reasonable”. Have you ever noticed that the word “reason” can mean “thinking logically or rationally” as well as “a statement given as justification for a claim”? That’s no accident. When we engage in reasoning, we try to make sure we have good reasons for what we believe. We don’t just say, “I believe it because it’s true!” Well, some people do, but not people who are being reasonable.

Not only is critical thinking reasonable, in Ennis’ definition, but it’s also reflective. Good critical thinking requires us to constantly reflect on our own reasoning, to make sure we aren’t jumping to conclusions based on insufficient reasons. We have to think about our own thinking. 

Finally, the goal of critical thinking is “deciding what to believe or do”. Critical thinking can be devoted to deciding what’s reasonable to believe, but another important goal is deciding what’s reasonable to do. You might use critical thinking to decide whether to believe in God, but you might also use it to decide which house to buy, or who to vote for. None of these are trivial decisions. What we believe, and what we do, has important consequences--not just for us, but for other people, like our children or fellow citizens. That’s why critical thinking is so important.

Notes on Rationality

What to Believe vs. What to Do

The distinction between decisions about beliefs and actions brings us to the word “rationality”. “Rationality” is often used as a synonym for “reasonableness”, but it can be used in different ways in different fields, and the differences are worth mentioning. Many scholars distinguish between epistemic rationality, or rationally deciding what to believe, and instrumental rationality, or rationally deciding what to do. When economists and political scientists talk about rationality, they’re usually talking about instrumental rationality. They tend to see rationality in terms of maximizing utility. Why did I choose to go on vacation instead of spending the money on a new car? To an economist, it’s because that decision maximizes my utility.

I could get into defining exactly what utility means, but I won’t, because this blog isn’t about that kind of rationality. Rationality in the sense that economists are talking about isn’t really about deciding what’s true, or what’s morally right, but what’s advantageous. Some people might maximize their utility by seeking truth and goodness, but others might maximize theirs by being a selfish liar. If you’re a little uncomfortable with this amoral idea of rationality, well, so am I. But most scholars of instrumental rationality are perfectly moral people, and fields like decision theory, behavioral economics, and game theory are full of deep insights that really can help you make better decisions. In this blog, I'm more interested in what is true and right than in what maximizes utility.

Rationality ≠ Rationalizing

It’s easy to confuse the word “rationality” with “rationalizing”, but those are two very different things. In fact, they’re in direct opposition to each other. Rationality--at least in the sense of epistemic rationality--is about choosing what to believe based on good reasons. It moves from reasons to a conclusion. Rationalization goes in the opposite direction--it starts with a claim about what’s true or right, and then seeks out reasons to justify that claim (to yourself or others). Rationalization is actually one of the great enemies of rationality.

Logic, Emotion, and Compassion

While a lot of people think of reason as the opposite of emotion, that’s not necessarily true. Strong emotions can get in the way of reasoning, of course, but emotions like fear and empathy can sometimes help us make better decisions. The neuroscientist Antonio Damasio has found that people with brain injuries that keep them from experiencing emotions normally can be prone to very bad decisions. Fear, for example, can help us give the appropriate motivational weight to options we’re considering. If I come to a dark alley in a high-crime area and think, “I should go that way and get home faster”, a little bit of fear might help me make a better decision.

Another good thing to remember is that you can be an expert on logic or decision theory and still make awful decisions. Kurt Gödel, probably the greatest logician of the 20th century, was so paranoid about being poisoned that he starved himself to death. John von Neumann, perhaps the single smartest person of the 20th century, was instrumental in designing modern computer architecture and in creating game theory, which is essential to strategic decision-making. But von Neumann’s game theory calculations made him certain that the Soviet Union would destroy the United States with nuclear weapons, so he advocated a first strike, which would have killed tens of millions of people. This made him one of the models for Dr. Strangelove, the madly “rational” scientist in the movie of the same name. Genius though he was, von Neumann was wrong. The Soviet first strike never came, and if the US had struck first, we would have committed the greatest atrocity in history.

So, while logic and other methods of formal reasoning are great, they aren’t infallible. They can be wrong. When making decisions that might cause great harm, erring on the side of caution and compassion can be the rational thing to do. As I'll mention many times in this blog, being reasonable is about more than cold, formal logic.

Thursday, August 13, 2020

Who Needs Critical Thinking? All of Us

 “Everyone complains about his memory, and no one complains about his judgment.”

   - François de La Rochefoucauld


Most of us readily agree that there’s a lot of bad thinking out there, but we’re reluctant to admit that our own thoughts and perceptions might be flawed. The truth is that we’re all susceptible to illusions, wishful thinking, and biases. I am, you are...we all are. All of our brains are error prone. If you don’t believe me, consider the image below. Which of those two tabletops is longer and thinner than the other?



Did you say the one on the left? Me too. And we’re both wrong. The surface of the table on the left is exactly the same size and shape as the one on the right. You can take out a ruler and check (I did). It’s an optical illusion that makes our brain see a difference that isn’t really there.


Now, you may object that this is an artificial image created by psychologists to fool us, and doesn’t apply to everyday life. So, here’s an image you might see in the real world--a few sunbeams at the end of the day. Here’s a question: Are those sunbeams diverging from each other, or are they parallel? 



They’re obviously diverging, right? Wrong. Sunbeams (or crepuscular rays as scientists call them) are as parallel as a set of train tracks, because the sun is large and far away. They only seem to diverge, for the same reason train tracks do--perspective. If you look at crepuscular rays from space, you’ll see that they’re quite parallel.



Just as there are optical illusions, there are “cognitive illusions” which can derail thinking the way optical illusions derail vision. Consider this simple math problem: If a ball and a bat cost $1.10 together, and the bat costs $1.00 more than the ball, how much is the ball? Take a second to think about it.


Did you say ten cents? I did, and I was wrong. If the ball cost 10 cents, and the bat costs a dollar more than the ball, then the bat would cost $1.10. But the bat and ball together only cost $1.10. The correct answer is that the bat costs $1.05, and the ball costs 5 cents (there’s been some inflation since this puzzle was invented). Most people try to solve this problem by taking the mental shortcut of subtracting $1.00 from the total price to get the price of the ball, without thinking the problem through. We all use such shortcuts (called heuristics) all the time. Sometimes they work just fine, but other times they don’t. 
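
If it helps, the puzzle comes down to one line of algebra, which a few lines of Python can check:

```python
# ball + bat = 1.10 and bat = ball + 1.00, so:
#   ball + (ball + 1.00) = 1.10  ->  2 * ball = 0.10  ->  ball = 0.05
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}, total = ${ball + bat:.2f}")
# ball = $0.05, bat = $1.05, total = $1.10
```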


The point is that our minds can deceive us when it comes to reasoning as well as perception. That’s why we need to be taught critical thinking skills. Unfortunately, most of us weren’t. I wasn’t--not in any systematic way--so I didn’t know much about this topic until I started researching it. How about you? Do you know the basic terminology of reasoning? Do you, for example, know the difference between an inductive argument and a deductive argument? Can you spot a valid deductive argument? Consider this silly little syllogism:


If lizards play fiddles, then frogs dance jigs. 

Lizards play fiddles.

__________________________________

Therefore, frogs dance jigs. 


Is that a valid argument? Yes, it is. It just isn’t true. There’s a difference between validity and truth which most people don’t know about. I certainly never learned it in high school. That’s really my point here. I’m guessing that most readers don’t know exactly what “inductive”, “deductive”, and “valid” mean, because I didn’t know what they meant either until recently. If I’ve successfully pointed out flaws in your perception, reasoning, or knowledge, it’s because I want to convince you that everyone has them, including you and me. Luckily, most people can also learn to overcome them, or at least push back against them, by learning better critical thinking skills.
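
For the curious, “valid” means the conclusion is true in every possible case where all the premises are true, whether or not the premises actually are true. Here’s a small Python sketch (my own illustration) that checks every truth assignment for the lizards-and-frogs argument, whose form logicians call modus ponens.

```python
from itertools import product

def implies(p, q):
    # Material conditional: "if p then q" is false only when p is true and q is false.
    return (not p) or q

# The argument's form (modus ponens), with the silly content stripped away:
#   Premise 1: if lizards_fiddle then frogs_jig
#   Premise 2: lizards_fiddle
#   Conclusion: frogs_jig
valid = all(
    frogs_jig                                  # the conclusion must hold...
    for lizards_fiddle, frogs_jig in product([True, False], repeat=2)
    if implies(lizards_fiddle, frogs_jig) and lizards_fiddle   # ...whenever both premises hold
)
print("Valid form?", valid)   # True, even though lizards don't actually play fiddles
```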


But what does “critical thinking” even mean? Isn’t that one of those buzzy catchphrases that politicians and education gurus like to throw around because it sounds good? Unfortunately it can be. And that’s a shame, because critical thinking is a real set of solid, useful skills that’s been studied for years by philosophers, psychologists, and educators. In fact, one of the main skills you need to be a good critical thinker is to define terms precisely. So let’s take a closer look at what critical thinking actually is, and why it’s about more than just thinking.


Wednesday, August 12, 2020

The Problem: Bad Arguments and Alternative Facts

In November 2016, Oxford Dictionaries announced that their Word of the Year was “post-truth”. They defined this unsettling term as a state of affairs where “objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.” Early the next year the term “alternative facts” raised millions of eyebrows and caused sales of George Orwell’s 1984 to increase by 9,500%. Later in 2017, both the American Dialect Society and the Collins Dictionary announced that their word of the year was a phrase: fake news. There had indeed been an enormous number of phony news stories and news sites, but politicians immediately co-opted and distorted “fake news” to mean “any news I don’t like”. Which is a very post-truth thing to do.

Clearly, truth was having a rough couple of years. Many pundits quoted Mark Twain: “A lie can go around the world while the truth is putting its boots on.” Except Twain never actually said that. Which, paradoxically, shows how true the statement is. The misquotation was going around the world at the speed of light, promoted by the very people worried that truth no longer had any boots at all.

But--to paraphrase something Twain really did say--the reports of the sudden death of truth were exaggerated. First, the fact that so many people are outraged by post-truth politics, alternative facts, and fake news shows that many of us still do care what’s true and what’s false. Second, while the current assault on truth is very real, truth has had enemies for a very long time. Fake news, propaganda, logical fallacies, cognitive biases, and outright lies are nothing new. 

But truth really is under attack today in new and alarming ways. The internet has allowed misinformation and propaganda to spread and multiply at the speed of light. It’s also created a world where anyone, no matter how harebrained their beliefs, can find websites and like-minded people to tell them they’re right. After a stint in this echo chamber, they decide that their opinions are just as good as “elitist experts” like scientists and policy analysts, and dismiss them as out-of-touch liars. The results are pretty shocking. Do you believe the earth is flat, or that many powerful people are secretly sinister reptiles? You can find plenty of people on the internet who think so too, thus reassuring you that these are perfectly non-nutty things for a grownup to believe. 

While post-truth politics is commonly (and justifiably) associated with Donald Trump and the populist right, problems with truth exist across the political spectrum. A generation before Trump was elected, it was a Democratic president who said, “I did not have sexual relations with that woman”. But he did. And it was academics far to the left of Bill Clinton who popularized the idea that there’s no such thing as objective truth; only social constructs manipulated by those with the most power.  

Another trend that affects both sides of the political divide is increasing partisanship and ideological groupthink, which makes both sides more extreme in their views and more prone to confirmation bias and tunnel vision. People consult their ideology to decide what is true, when it should be the other way around. 

This increasing polarization, along with social media and its disinhibiting effects, has turned our public discourse into an absolute farce. Have you ever tried to reason with someone who thinks personal attacks and ALL CAPS make devastating arguments, in front of an audience who thinks so too? I know I have. Social media has revealed how many people don't know the basic rules of logic and evidence. Many debates today are like playing basketball against an opponent who doesn’t know the rules, throws punches, counts personal insults as baskets, and refuses to acknowledge any points you make. Worse, the people watching the debate may think that’s how you win basketball games, too. And that makes the whole exercise useless. Debates serve little purpose if the debaters--as well as the audience--don’t recognize some basic framework for determining what’s fair, what’s factual, and what’s logical. And millions of people have no such framework.

That's a huge problem, because democracies depend on citizens being capable of reasonable, logical discussions. And that requires that people be able to think clearly and logically. Good debate and good thinking are closely related, because in both cases, conclusions need to be based on logic and evidence. In other words, the abysmal quality of public debate in our country reflects a deeper problem: a widespread lack of critical thinking ability. 

This doesn’t just hurt society as a whole. It hurts citizens themselves. We need good thinking skills in our personal and professional lives as well as our civic lives, so we can make good decisions about what to believe and what to do. Should I worry about vaccinating my child? Should I break up with someone I like because our astrological signs aren’t compatible? Should I believe the preacher telling me that wives should obey their husbands? Should I trust the car salesman saying this is the perfect car for me? How about the politician telling me that violent crime rates are going up, and that they would go down if we all had more guns? 

In all these cases, what we believe matters, because it affects the decisions we make, and those decisions have real effects on us and the people around us. That will be a recurring theme of this book: truth and logic matter, because falsehood and nonsense are toxic and sometimes deadly. The stakes are too high to believe whatever feels good.

So, if the well-being of democracies and their citizens depend on good thinking skills, and citizens lack those skills, then what's the solution? Abandon democracy? Let a central authority tell us what to believe, and how the country should be run? I don’t think so, and neither did one of the founders of our democracy, Thomas Jefferson, who said:

I know no safe depository of the ultimate powers of the society, but the people themselves: and if we think them not enlightened enough to exercise their control with a wholesome discretion, the remedy is not to take it from them, but to inform their discretion by education.

"Inform their discretion by education." This mirrors Margaret Mead's dictum that people should "be taught how to think, not what to think." That’s where critical thinking comes in. People aren’t born knowing how to be good critical thinkers. All of us--not just kids in school but all of us--have to be taught how logic and reasoning work. In my next post, I'll talk about why that's true, and just how hard it is to see the world clearly and logically.

