Neuroscientists at the University of Southern California have published a
paper, "Neural correlates of maintaining one's political beliefs in the
face of counterevidence" (Scientific Reports, 6, No. 39589, December 2016;
http://www.nature.com/articles/srep39589), describing brain responses
to evidence that contradicts personal political beliefs. Areas of the
brain that are activated by contrary evidence include the amygdala and
insular cortex. Those areas are associated with emotion,
decision-making, threat perception and feelings of anxiety.
When self-described political liberals were presented with evidence
contradicting eight strongly held political beliefs, the amygdala and
insular cortex were more activated than when they were presented with
evidence contradicting eight strongly held but non-political beliefs.
When asked to rate their beliefs again after seeing the contrary
evidence, participants' confidence in the non-political beliefs
decreased, but their confidence in their political beliefs did not
significantly change. The contrary evidence consisted of five statements
of fact contradicting each of the political and non-political beliefs.
According to the paper: “People often discount evidence that
contradicts their firmly held beliefs. However, little is known about
the neural mechanisms that govern this behavior. We used neuroimaging to
investigate the neural systems involved in maintaining belief in the
face of counterevidence, presenting 40 liberals with arguments that
contradicted their strongly held political and non-political views.
Challenges to political beliefs produced increased activity in the
default mode network—a set of interconnected structures associated with
self-representation and disengagement from the external world. . . . We
also found that participants who changed their minds more showed less
BOLD* signal [detectable brain activity] in the insula and the amygdala
when evaluating counterevidence. These results highlight the role of
emotion in belief-change resistance and offer insight into the neural
systems involved in belief maintenance, motivated reasoning, and related
phenomena.”
* BOLD: blood oxygen level dependent, the signal measured in functional MRI
(Image caption: the amygdalae are the green areas in the brain scan.)
The default mode network mentioned in the quote above comprises brain
areas associated with thinking about personal identity and with abstract,
inward-focused thought that disengages from present reality.
The paper puts the research into context:
“Few things are as fundamental to human progress as our ability to
arrive at a shared understanding of the world. The advancement of
science depends on this, as does the accumulation of cultural knowledge
in general. Every collaboration, whether in the solitude of a marriage
or in a formal alliance between nations, requires that the beliefs of
those involved remain open to mutual influence through conversation.
Data on any topic—from climate science to epidemiology—must first be
successfully communicated and believed before it
can inform personal behavior or public policy. Viewed in this light, the
inability to change another person’s mind through evidence and
argument, or to have one’s own mind changed in turn, stands out as a
problem of great societal importance. Both human knowledge and human
cooperation depend upon such feats of cognitive and emotional
flexibility.”
Other observations from the paper: “It is well
known that people often resist changing their beliefs when directly
challenged, especially when these beliefs are central to their identity.
In some cases, exposure to counterevidence may even increase a person’s
confidence that his or her cherished beliefs are true. . . . One model
of belief maintenance holds that when confronted with counterevidence,
people experience negative emotions borne of conflict between the
perceived importance of their existing beliefs and the uncertainty
created by the new information.”
The human mind strongly dislikes uncertainty. It is remarkably adept at
removing it quickly and unconsciously, rationalizing or simply inventing
whatever it needs until the uncertainty goes away.
The paper raises some obvious questions. Is the inability to change another person's mind through evidence and
argument, or to have one's own mind changed, a significant social problem? Is it more ethical or moral to
retain one's core beliefs even when faced with evidence that those
beliefs are factually wrong? In other words, is it better to stand on
ideological or moral principle, or is cognitive and emotional
flexibility (pragmatism) the more ethical or moral mindset?
ScienceDaily also discusses this paper: https://www.sciencedaily.com/releases/2016/12/161223115757.htm