We need a few trusted naysayers in our lives, critics who are willing to puncture our protective bubble of self-justifications and yank us back to reality if we veer too far off. This is especially important for people in positions of power. - Carol Tavris
When a grocer named Sylvan Goldman first invented the shopping cart in 1937, he found that people were embarrassed to use them. Not to be deterred, he hired several models to push them around the store and pretend to be shopping. Before long everybody was using them. Sylvan Goldman was a man who understood human nature.
He had tapped into a phenomenon known as social proof. People decide what to do by looking to see what everybody else is doing. If someone is sprawled on a sidewalk in a city, people will walk around them until one person stops and checks to see if they’re OK. Then others start stopping, too. In some cases, social proof is perfectly rational. If you’re walking to a baseball game in an unfamiliar city, and you don’t know how to find the ballpark, you can probably get there by following the crowd. If you’re a caveman and see your friends looking terrified and climbing trees, it might be a good idea to climb one, too. Back then, nonconformists got eaten by cave bears.
Perhaps it’s not surprising, then, that we look to others--especially others in our own tribe--for cues about what to do, and even how to think. The problem, of course, is that the crowd isn’t always right...even when it’s our crowd. The psychologist Solomon Asch demonstrated the power of conformity in a series of experiments in the 1950s. He simply showed a set of lines to a group of subjects and asked them to say which lines were the same length. The answer was obvious--it was easy to see which two lines matched. Or it should have been easy. But everyone in the group except one person was an actor, instructed to give the same wrong answer on some of the trials. The real subjects in the experiment were bewildered. They looked around at the other people nervously, squinted at the lines, and then nearly 40% of the time went along with the crowd. Social conformity made them see--or at least claim to see--what wasn’t there.
Asch showed how individuals can surrender their critical faculties to a group; sometimes entire groups start to think alike. This is called groupthink. Groupthink was first described by the social psychologist Irving Janis, who wanted to know what causes intelligent policymakers to do stupid things like the disastrous Bay of Pigs invasion of Cuba. He found that groupthink happens when groups of people are too set on agreeing with each other. They “go along to get along.” This can happen when there’s a leader who expects people to agree with him, or when there’s a culture of agreement, where dissent is frowned upon. Over time, the group develops a skewed view of reality, which sets them up for bad decisions and ugly surprises.
While most of us crave group cohesiveness and mutual validation, too much of it can be a bad thing. As the saying goes, “If everybody’s thinking alike, then somebody isn’t thinking.” That’s why it’s important to speak up if you think everybody’s thinking too much alike and settling into comfortable groupthink. Just don’t expect it to be easy. Devil’s advocates play an important role, but they’re rarely popular.
The strange thing about groupthink and related processes is that, while they lead to intellectual uniformity within groups, they can also lead to polarization between groups. Where I live on the Colorado Front Range, there’s a steep political gradient between conservative Colorado Springs, to the south, and ultra-liberal Boulder, to the north. In a recent experiment, researchers brought together groups of people from each of these towns to discuss political issues. People from Boulder talked to other people from Boulder, and people from Colorado Springs talked to other people from Colorado Springs. Can you guess what happened? In each group, opinions grew more homogeneous, and more extreme. The Boulder group grew more homogeneously liberal and moved further left, while the Colorado Springs group did the same thing in the other direction. Conformity at one level led to growing division at a higher level.
This experiment may help explain what’s happening across the United States right now. More and more, people surround themselves with others who think like they do. In fact, they commonly move to other places for that very reason. They also associate with like-minded people online, and go to websites that confirm their pre-existing beliefs. To make things worse, search engines learn to show people exactly the kinds of sites they already agree with. All this leads to filter bubbles, where people are insulated from diverse points of view. While this effect isn’t as strong as some have claimed--many people online do see multiple points of view--there’s certainly been a trend in recent years for the left and the right to grow more insular and uniform internally as they grow apart externally. This feeds each side’s tendency to see the other as homogeneous and extreme, and in this case they’re partly right--both sides really have become more homogeneous and extreme! That’s why each side needs more critical thinkers, with the intellectual virtues I discussed in previous posts. Both sides need more of their members to be brave, independent, and intellectually modest enough to say, “Wait a minute. What makes us so sure we’re right?”