Friday, June 26, 2020

Coronavirus responses highlight how humans are hardwired to dismiss facts that don’t fit their worldview

June 25, 2020 8.18am EDT



Bemoaning uneven individual and state compliance with public health recommendations, top U.S. COVID-19 adviser Anthony Fauci recently blamed the country’s ineffective pandemic response on an American “anti-science bias.” He called this bias “inconceivable,” because “science is truth.” Fauci compared those discounting the importance of masks and social distancing to “anti-vaxxers” in their “amazing” refusal to listen to science.

It is Fauci’s profession of amazement that amazes me. As well-versed as he is in the science of the coronavirus, he’s overlooking the well-established science of “anti-science bias,” or science denial.

Americans increasingly exist in highly polarized, informationally insulated ideological communities occupying their own information universes.

Within segments of the political blogosphere, global warming is dismissed as either a hoax or so uncertain as to be unworthy of response. Within other geographic or online communities, the science of vaccine safety, fluoridated drinking water and genetically modified foods is distorted or ignored. There is a marked gap in expressed concern over the coronavirus depending on political party affiliation, apparently based in part on partisan disagreements over factual issues like the effectiveness of social distancing or the actual COVID-19 death rate.

In theory, resolving factual disputes should be relatively easy: Just present strong evidence, or evidence of a strong expert consensus. This approach succeeds most of the time, when the issue is, say, the atomic weight of hydrogen.

But things don’t work that way when scientific advice presents a picture that threatens someone’s perceived interests or ideological worldview. In practice, it turns out that one’s political, religious or ethnic identity quite effectively predicts one’s willingness to accept expertise on any given politicized issue.

“Motivated reasoning” is what social scientists call the process of deciding what evidence to accept based on the conclusion one prefers. As I explain in my book, “The Truth About Denial,” this very human tendency applies to all kinds of facts about the physical world, economic history and current events.

The same facts will sound different to people depending on what they already believe. AP Photo/John Raoux


Denial doesn’t stem from ignorance

The interdisciplinary study of this phenomenon has made one thing clear: The failure of various groups to acknowledge the truth about, say, climate change, is not explained by a lack of information about the scientific consensus on the subject.

Instead, what strongly predicts denial of expertise on many controversial topics is simply one’s political persuasion.

A 2015 metastudy showed that ideological polarization over the reality of climate change actually increases with respondents’ knowledge of politics, science and/or energy policy. The chances that a conservative is a climate science denier are significantly higher if he or she is college educated. Conservatives scoring highest on tests for cognitive sophistication or quantitative reasoning skills are the most susceptible to motivated reasoning about climate science.

Denialism is not just a problem for conservatives. Studies have found liberals are less likely to accept a hypothetical expert consensus on the possibility of safe storage of nuclear waste, or on the effects of concealed-carry gun laws.

Denial is natural

The human talent for rationalization is a product of many hundreds of thousands of years of adaptation. Our ancestors evolved in small groups, where cooperation and persuasion had at least as much to do with reproductive success as holding accurate factual beliefs about the world. Assimilation into one’s tribe required assimilation into the group’s ideological belief system – regardless of whether it was grounded in science or superstition. An instinctive bias in favor of one’s “in-group” and its worldview is deeply ingrained in human psychology.

A human being’s very sense of self is intimately tied up with his or her identity group’s status and beliefs. Unsurprisingly, then, people respond automatically and defensively to information that threatens the worldview of groups with which they identify. We respond with rationalization and selective assessment of evidence – that is, we engage in “confirmation bias,” giving credit to expert testimony we like while finding reasons to reject the rest.

Unwelcome information can also threaten in other ways. “System justification” theorists like psychologist John Jost have shown how situations that represent a perceived threat to established systems trigger inflexible thinking. For example, populations experiencing economic distress or an external threat have often turned to authoritarian leaders who promise security and stability.

In ideologically charged situations, one’s prejudices end up affecting one’s factual beliefs. Insofar as you define yourself in terms of your cultural affiliations, your attachment to the social or economic status quo, or a combination, information that threatens your belief system – say, about the negative effects of industrial production on the environment – can threaten your sense of identity itself. If trusted political leaders or partisan media are telling you that the COVID-19 crisis is overblown, factual information about a scientific consensus to the contrary can feel like a personal attack.

Everyone sees the world through one partisan lens or another, based on their identity and beliefs. Vladyslav Starozhylov/Shutterstock.com

Denial is everywhere

This kind of affect-laden, motivated thinking explains a wide range of examples of an extreme, evidence-resistant rejection of historical fact and scientific consensus.

Have tax cuts been shown to pay for themselves in terms of economic growth? Do communities with high numbers of immigrants have higher rates of violent crime? Did Russia interfere in the 2016 U.S. presidential election? Predictably, partisan media treat expert opinion on such matters as though the evidence were itself inherently partisan.

Denialist phenomena are many and varied, but the story behind them is, ultimately, quite simple. Human cognition is inseparable from the unconscious emotional responses that go with it. Under the right conditions, universal human traits like in-group favoritism, existential anxiety and a desire for stability and control combine into a toxic, system-justifying identity politics.

Science denial is notoriously resistant to facts because it isn’t about facts in the first place. Science denial is an expression of identity – usually in the face of perceived threats to the social and economic status quo – and it typically manifests in response to elite messaging.

I’d be very surprised if Anthony Fauci were actually unaware of the significant impact of politics on COVID-19 attitudes, or of the signals being sent by Republican state government officials’ statements, partisan mask refusal in Congress, or the recent Trump rally in Tulsa. Effective science communication is critically important because of the profound effects partisan messaging can have on public attitudes. Vaccination, resource depletion, climate and COVID-19 are life-and-death matters. To successfully tackle them, we must not ignore what the science tells us about science denial.

This is an updated version of an article originally published on Jan. 31, 2020.

Author
Adrian Bardon
Professor of Philosophy, Wake Forest University
Disclosure statement
Adrian Bardon received funding from the Humility and Conviction in Public Life project at the University of Connecticut.



