I guess we no longer need the “big lie.” We Americans–for that matter, people everywhere–are perfectly comfortable simply rejecting facts that make us uncomfortable, or otherwise conflict with our preferred realities.
I’ve previously blogged about the emerging academic literature on confirmation bias. A reader sent me an article from the Boston Globe summarizing much of that literature.
Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
Needless to say, this is a real problem for democratic theory, which places a high value on an informed populace.
This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
As the author notes, we humans tend to base our opinions on our beliefs–and those beliefs can have what he delicately calls “an uneasy relationship” with facts. Although we like to believe that we base our beliefs on evidence and fact, research suggests that our beliefs all too often dictate the facts we’re willing to accept.
Sometimes we just twist facts to make them fit with our preferred beliefs; at other times our preconceptions lead us to uncritically accept rumor, misinformation and outright propaganda if those reinforce our worldviews or confirm our resentments and/or suspicions.
The phenomenon is certainly not limited to the political right, but the most recent glaring examples do come from the GOP “clown car.” Donald Trump insists that he saw “thousands of Muslims” cheering when the World Trade Center came down, even though everyone in a position to know says that never happened. Ben Carson “quotes” America’s founders making statements they never made (in some cases, statements expressing sentiments diametrically opposed to what they actually said). Carly Fiorina insists that she viewed a video that doesn’t exist. And people who want to believe them, do.
As the Globe article put it, thanks to the internet, “it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.”
Identifying the problem and solving it are two different issues. To date, there has been more progress on identifying the phenomenon than on figuring out how to counter it. That said, researchers are working on it.
One avenue may involve self-esteem. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.
No wonder those of us advocating for evidence-based public policies are having such a bad time…