Remember when nutritionists admonished us with the phrase “you are what you eat”? A recent report from Harvard’s Kennedy School has modernized it, warning that–in our era of pervasive propaganda and misinformation–we are what we read (or otherwise access).
The study explored the media consumption of participants, and the degree to which the unreliability of that media left them with inaccurate beliefs about COVID-19 and vaccination. The researchers found that “the average bias and reliability of participants’ media consumption are significant predictors of their perceptions of false claims about COVID-19 and vaccination.”
I know–your first thought was “duh.” Did we really need a study showing that people who depend on garbage media believe ridiculous things? Wouldn’t logic tell us that?
Still, what seems self-evident can often prove less than conclusive, so confirmation of that logic in a rigorous study is important. In addition, the study confirmed politically relevant differences in media consumption and credulity between Republicans and Democrats.
Here’s their summary of the study:
- We surveyed 3,276 U.S. adults, applying Ad Fontes Media’s (2023) ratings of media bias and reliability to measure these facets of participants’ preferred news sources. We also probed their perceptions of inaccurate claims about COVID-19 and vaccination.
- We found participants who tend to vote for Democrats—on average—consume less biased and more reliable media than those who tend to vote for Republicans. We found these (left-leaning) participants’ media reliability moderates the relationship between their media’s bias and their degree of holding false beliefs about COVID-19 and vaccination.
- Unlike left-leaning media consumers, right-leaning media consumers’ misinformed beliefs seem largely unaffected by their news sources’ degree of (un)reliability.
- This study introduces and investigates a novel means of measuring participants’ selected news sources: employing Ad Fontes’s (2023) media bias and media reliability ratings. It also suggests the topic of COVID-19, among many other scientific fields of recent decades, has fallen prey to the twin risks of a politicized science communication environment and accompanying group-identity-aligned stances so often operating in the polarized present.
The researchers found that the news-seeking and news-avoiding behaviors of the participants confirmed “the longstanding concern that those who embrace—and subsequently seek out—misinformation, even if inadvertently, constitute a group at risk of endangering their own and others’ health.”
In a country sharply divided along partisan lines, the implications rather obviously go further.
As any student of history–especially the history of journalism–can attest, America has always produced biased sources of information. What is different now, thanks to the Internet and social media, is their ubiquity–and the greatly increased political motivation to seek out confirmatory “information.”
Other studies tell us that people who want to believe X do not necessarily change their belief in X when confronted with evidence that X is inaccurate. The Harvard study found that anti-vaccine attitudes were “tenacious and challenging to counter, unyielding to evidence, and bolstered by persuasive anti-vaccine messaging—which is not difficult to find and immerse oneself in. In the COVID-19 context, several identity groups appear to have engaged in this immersion.”
Some research has suggested that confrontation with contrary facts can lead to what is called a “backfire effect,” causing people to double down and become even more stubborn in their original beliefs. (Facebook found, for example, that warning users that an article was false caused people to share that article even more.) Other research has suggested that fact-checking, if done properly, can often successfully correct misperceptions. But…
First, facts and scientific evidence are not the most powerful or reliable way to encourage people to abandon false or inaccurate beliefs and perspectives. Second, people embrace fake news, misinformation and disinformation because of their beliefs, even when those beliefs can be proven wrong–in many cases as a demonstration of tribal loyalty. Third, engaging in dialogue in a non-threatening manner, using personal stories to avoid activating defense mechanisms, has a greater likelihood of success.
Even when encounters with the facts might actually cause a reconsideration, it turns out that the algorithms used by social media platforms increasingly shield users from information they might find uncongenial. Those “likes” we register act as guidelines used to feed us more of the posts we’ll “like,” and shield us from contrary perspectives or facts that might debunk our preferred prejudices.
And now, the deepfakes are coming.
On the one hand, several sites are available that evaluate the credibility of the sources we consult. On the other hand, no one can force people to visit those sites or believe their ratings.
It has never been easier to avoid uncongenial realities and evade critical thinking…