Wednesday, July 8, 2020

On the Anti-Science Bias




Context
Variations of this discussion have been posed here before, but in my opinion, the topic of flawed reasoning and anti-science bias has become critically important and urgent in view of the pandemic and the upcoming election.


Anti-science bias
In an article posted by Live Science, “Humans are hardwired to dismiss (coronavirus) facts that don't fit their worldview,” philosopher Adrian Bardon responds to comments by Anthony Fauci, who described an ‘inconceivable’ anti-science bias among Americans. Fauci sees ‘science as truth’ and thus cannot understand why many people reject that knowledge. Bardon writes:
“It is Fauci's profession of amazement that amazes me. As well-versed as he is in the science of the coronavirus, he's overlooking the well-established science of ‘anti-science bias,’ or science denial. Americans increasingly exist in highly polarized, informationally insulated ideological communities occupying their own information universes. Within segments of the political blogosphere, global warming is dismissed as either a hoax or so uncertain as to be unworthy of response. Within other geographic or online communities, the science of vaccine safety, fluoridated drinking water and genetically modified foods is distorted or ignored.”


Motivated reasoning
In the 1950s, the prominent psychologist Leon Festinger commented on the human condition, asserting: “A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.”

Festinger’s observation was prompted in part by how a cult responded when its predicted day of apocalypse failed to arrive. The cult truly believed that on December 21, 1954, the Earth would end and aliens would save its members; in preparation, some quit their jobs and readied themselves to leave on the aliens’ spaceship. That day came and went without the end of the Earth, and without any aliens.

They rationalized the failed apocalypse and turned it into a success. Instead of admitting that they were wrong, the cult responded by saying that the aliens told them their belief had shed light and saved the world. Festinger commented: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction.” 

That episode shows the remarkable power of the human mind to defend strongly held beliefs. Psychologists call that kind of thinking motivated reasoning, and the label fits.

An article in Mother Jones, “The Science of Why We Don’t Believe Science,” commented on motivated reasoning:
“Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a ‘basic human survival skill’, explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.”

When people consciously think about something, they are thinking emotionally, logically, intuitively, morally, and in biased ways all at the same time.[1] When we believe we are reasoning about inconvenient truths, we are usually rationalizing a defense of prior beliefs, even false ones. When someone hears about a new discovery that contradicts a strongly held belief, the mind typically mounts an instant, negative, defensive subconscious response. That response primes memories and associations in the conscious mind that support the prior belief, and that mental process leads people to build an argument against the new knowledge and to challenge or reject it.


Footnote:
1. Psychologist Jonathan Haidt commented on motivated reasoning and related biases, e.g., confirmation bias (acceptance of evidence we want to believe) and disconfirmation bias (skepticism toward evidence we want to disbelieve):
“The reasoning process is more like a lawyer defending a client than a judge or scientist seeking truth. Kuhn (1991) found that most people have difficulty understanding what evidence is, and when pressed to give evidence in support of their theories they generally give anecdotes or illustrative examples instead. Furthermore, people show a strong tendency to search for anecdotes and other “evidence” exclusively on their preferred side of an issue, a pattern that has been called the “myside bias” (Baron, 1995; Perkins, Farady, & Bushey, 1991). Once people find supporting evidence, even a single piece of bad evidence, they often stop the search, since they have a “makes-sense epistemology” (Perkins, Allen, & Hafner, 1983) in which the goal of thinking is not to reach the most accurate conclusion; it is to find the first conclusion that hangs together well and that fits with one’s important prior beliefs.”
