Sunday, August 30, 2020

Faces in the Clouds: Seeing Patterns That Aren't There

“...nothing is so alien to the human mind as the idea of randomness.” --John Cohen


The human mind is a pattern-seeking machine. We’re constantly trying to spot patterns and regularities in the world around us, and for very good reason. Our ancestors who looked in the bushes and saw a pattern of spots as a leopard lived long enough to become our ancestors. That’s true even if it turned out there wasn’t a leopard there. It’s better to accept a false positive than be eaten by a false negative. Our evolutionary history has given us all a case of apophenia: an overactive tendency to see patterns and meaning even in random, meaningless things. That’s why we see faces in the clouds, a man in the moon, and rams and scorpions in random arrangements of stars as they happen to look from Earth. It’s why people hear Satanic messages in music played backwards. It’s why some people see Jesus in a piece of toast, and others see Buddha in a tree trunk. Sometimes we go beyond seeing patterns, and see meaning, and even intention, where there is none. Humans are prone to see omens in the flight of birds, and angry gods in natural disasters.

The problem with our constant pattern-seeking is that we aren’t very good at recognizing a lack of pattern. We aren’t good at seeing randomness as randomness. In fact, real randomness doesn’t look the way we think it should. Consider the following sequences of digits. Which one was generated by a random process?

01100101100110100110

10001001000100011111

Most people would say the first one is random. But it isn’t. I made that sequence deliberately to look the way I expect a random sequence to look. Then I made the second sequence with a true, time-honored randomness generator. I flipped a coin: heads = 1, tails = 0. So the second sequence is the random one. But it doesn’t look like it, does it? It has too many long stretches of ones and zeros to seem random. It even looks too regular to me, and I’m the one who flipped a coin twenty times to make it!
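
If you’d rather not spend an afternoon flipping coins to check this for yourself, a quick simulation makes the same point. The Python sketch below is just my illustration, not anything from the research: it generates many 20-flip sequences and measures the longest streak of identical results in each one.

    import random

    def longest_run(bits):
        """Length of the longest stretch of identical values in a 0/1 sequence."""
        best = run = 1
        for prev, cur in zip(bits, bits[1:]):
            run = run + 1 if cur == prev else 1
            best = max(best, run)
        return best

    trials = 100_000
    flips = 20
    longest = [longest_run([random.randint(0, 1) for _ in range(flips)])
               for _ in range(trials)]

    print("average longest streak:", sum(longest) / trials)
    print("fraction with a streak of 4 or more:", sum(r >= 4 for r in longest) / trials)
    print("fraction with a streak of 5 or more:", sum(r >= 5 for r in longest) / trials)

Long streaks turn out to be far more common than intuition expects, which is exactly why my hand-made sequence, with its short, tidy alternations, doesn’t look like real coin flips.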

Our inability to see randomness for what it is leads us to several errors in judgment. For example, I like to shoot pool, but I’m not very consistent. Sometimes I get on a streak, and hit several shots in a row. I start thinking I’m on fire, and then the streak ends. Did I temporarily become a better pool player, or was it just an apparent pattern in randomness, like the string of five 1’s in the random coin flips above? Most likely, it was just random, and I’m the victim of the hot hand illusion. That’s what researchers have found when they studied professional basketball players (who are much better at shooting baskets than I am at shooting pool). Most players swear they get the “hot hand”--they get on a streak and can hardly miss the basket. But if you run statistical tests of their hits and misses, it turns out that the hot and cold streaks are random. Of course, good players have more hits than misses, but the “hot hand” streaks of hits still seem to be random occurrences.
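
For the curious, here’s one simple way to run that kind of check. It’s only a sketch with made-up shot data, not the researchers’ actual procedure: take a sequence of hits and misses, shuffle it many times, and see whether the real streaks are any longer than the streaks chance produces from the same number of hits and misses.

    import random

    def longest_hit_streak(shots):
        """Longest consecutive run of made shots (1 = hit, 0 = miss)."""
        best = run = 0
        for s in shots:
            run = run + 1 if s == 1 else 0
            best = max(best, run)
        return best

    # Hypothetical record of one player's shots: 1 = hit, 0 = miss.
    shots = [1, 1, 0, 1, 1, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 1, 0, 0, 1,
             0, 1, 1, 1, 0, 1, 0, 0, 1, 1, 1, 1, 1, 0, 1, 0, 1, 1, 0, 1]
    observed = longest_hit_streak(shots)

    # Shuffle the same hits and misses and count how often pure chance
    # produces a streak at least as long as the one we observed.
    shuffles = 10_000
    as_long = 0
    for _ in range(shuffles):
        shuffled = shots[:]
        random.shuffle(shuffled)
        if longest_hit_streak(shuffled) >= observed:
            as_long += 1

    print("observed longest streak:", observed)
    print("fraction of shuffles matching or beating it:", as_long / shuffles)

If that last fraction is large, the player’s “hot” streaks are no longer than chance alone would produce with the same shooting percentage, which is essentially what the basketball studies concluded.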

A sort of reverse of the hot hand illusion, called the gambler’s fallacy, occurs when people think a random streak is “due” to end. Take parents, for example. The odds of a couple having a female baby are 50% (actually not quite, but close enough that we’ll call it 50%). So, here’s a probability problem: if a couple has had 5 male babies in a row, what are the odds of the next baby being a girl?

Answer: 50%. A single, random event has no memory of what’s happened before. A coin doesn’t realize that it’s landed on heads 10 times in a row and decide to land on tails because it’s overdue. But many couples keep having babies, thinking they’re “due” to have one sex or the other. Casinos are full of gamblers thinking their losing streaks are “due” to end. Many of them lose huge amounts of money that way. It’s another clear case where cognitive biases are doing real harm.
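
If that answer still feels wrong, a quick simulation (again, just an illustrative sketch) can make it concrete: generate a huge number of six-child families with a 50/50 chance at each birth, keep only the families whose first five children were boys, and check how often the sixth child is a girl.

    import random

    families = 1_000_000
    five_boys = 0
    then_a_girl = 0

    for _ in range(families):
        children = [random.choice("BG") for _ in range(6)]
        if children[:5] == ["B"] * 5:      # first five children were all boys
            five_boys += 1
            if children[5] == "G":         # ...and the sixth was a girl
                then_a_girl += 1

    print("families with five boys in a row:", five_boys)
    print("fraction whose next baby was a girl:", then_a_girl / five_boys)

The fraction comes out at about 0.5 every run. In this model the streak of boys has no effect on the next birth, just as a coin has no memory of its last ten flips.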

A close cousin of the hot hand illusion showed up when the Germans were bombing London in World War II. People looked at maps of bomb strikes and noticed suspicious clusters and empty areas. Some Londoners started thinking the relatively empty areas on the map were places where German spies were living. But bombing back then wasn’t very accurate, and the empty spots and the clusters were actually random. This phenomenon is known as the clustering illusion.
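
You can watch the clustering illusion appear on demand with one more little sketch (toy numbers of my own, not the actual London data): scatter strikes completely at random across a grid of cells and print how many land in each cell.

    import random

    grid = 6        # divide the "map" into a 6 x 6 grid of cells
    strikes = 100   # drop 100 strikes at random

    counts = [[0] * grid for _ in range(grid)]
    for _ in range(strikes):
        counts[random.randrange(grid)][random.randrange(grid)] += 1

    for row in counts:
        print(" ".join(f"{cell:2d}" for cell in row))

Run it a few times and you’ll reliably see crowded cells sitting next to nearly empty ones, produced by nothing but uniform chance. Our eyes insist on reading safe zones and targets into what is really just noise.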

Another form of mistaken pattern recognition happens when two things occur together (or one after another) purely by coincidence, and people decide they must be related. Let’s say you’re a baseball player, and you tighten your shoelaces just before hitting a home run. If you’re like a lot of baseball players (a famously superstitious bunch), you may decide that tightening your shoelaces caused you to hit that homer, so you start tightening them every time you go to bat. This is called illusory correlation, and this particular version is also a logical fallacy called post hoc, ergo propter hoc (Latin for “after this, therefore because of this”). This effect also causes people to think home remedies and quack cures are more effective than they really are. If you have a sore throat, it will almost certainly go away by itself if you wait. If you take a shot of whiskey--like your Uncle Bob recommended--it will also go away. Just don’t conclude that it went away because of the whiskey.

Why don’t people think about all the times a sore throat went away without a shot of whiskey, and realize their mistake? Why don’t baseball players stop tightening their shoelaces when they notice all the times they do it and then don’t hit a homer? Like most humans, they notice the instances that “confirm” their theories, and don’t notice the ones that don’t. And that brings us to what might be the king of all cognitive biases.
