DP Etiquette

First rule: Don't be a jackass.

Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, provide sources for the facts and truths you rely on if asked. If emotion is getting out of hand, get it back in hand. To avoid dehumanizing people, don't call individuals or whole groups disrespectful names, e.g., stupid, dumb or liar. Insults are counterproductive to rational discussion; they make people angry and defensive. All points of view are welcome: right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.

Tuesday, August 6, 2019

Identity Protective Cognition

Dan Kahan of Yale University's Cultural Cognition Project studies identity-protective cognition. He lays out the concept in a working paper, Misconceptions, Misinformation, and the Logic of Identity-protective Cognition. The abstract and introduction follow.

ABSTRACT: This paper supplies a compact synthesis of the empirical literature on misconceptions of and misinformation about decision-relevant science. The incidence and impact of misconceptions and misperceptions, the article argues, are highly conditional on identity protective cognition. Identity protective cognition refers to the tendency of culturally diverse individuals to selectively credit and dismiss evidence in patterns that reflect the beliefs that predominate in their group. On issues that provoke identity-protective cognition, the members of the public most adept at avoiding misconceptions of science are nevertheless the most culturally polarized. Individuals are also more likely to accept misinformation and resist the correction of it when that misinformation is identity-affirming rather than identity-threatening. Effectively counteracting these dynamics, the paper argues, requires more than simply supplying citizens with correct information. It demands in addition the protection of the science communication environment from toxic social meanings that fuse competing understandings of fact with diverse citizens’ cultural identities.

INTRODUCTION: This paper investigates the role that “misinformation” and “misconceptions of science” play in political controversies over decision-relevant science (DRS). The surmise that their contribution is large is eminently plausible. Ordinary members of the public, we are regularly reminded (e.g., National Science Foundation 2014, 2016), display only modest familiarity with fundamental scientific findings, and lack proficiency in the forms of critical reasoning essential to science comprehension (Marx et al. 2007; Weber 2006). As a result, they are easily misled by special interest groups, who flood public discourse with scientifically unfounded claims on global warming, genetically modified foods, and other issues (e.g., Hmielowski et al. 2013). I will call this perspective the “public irrationality thesis” (PIT).

The unifying theme of this paper is that PIT itself reflects a misconception of a particular form of science: namely, the science of science communication. One of the major tenets of this emerging body of work is that public controversy over DRS typically originates in identity-protective cognition—a tendency to selectively credit and discredit evidence in patterns that reflect people’s commitments to competing cultural groups (Sherman & Cohen 2002, 2006). Far from evincing irrationality, this pattern of reasoning promotes the interests of individual members of the public, who have a bigger personal stake in fitting in with important affinity groups than in forming correct perceptions of scientific evidence. Indeed, the members of the public who are most polarized over DRS are the ones who have the highest degree of science comprehension, a capacity that they actively employ to form and persist in identity-protective beliefs (Kahan 2015a).

The problem, in short, is not a gullible, manipulated public; it is a polluted science communication environment. The pollution consists of antagonistic social meanings that put individuals in the position of having to choose between using their reason to discern what science knows or using it instead to express their group commitments. Safeguarding the science communication environment from such meanings, and repairing it where protective measures fail, should be the principal aim of those committed to assuring that society makes full use of the vast stock of DRS at its disposal (Kahan 2015b).




Is Kahan right to argue that the problem is a polluted science communication environment and not a gullible or manipulated public? For example, the president met yesterday with propagandists of the radical right and praised their efforts at deceiving and manipulating the public, commenting on their propaganda tactics: "The crap you think of is unbelievable. I mean it's genius — but it's bad." That evinces manipulation that reaches well beyond science-related content. Is there a meaningful difference between a polluted science communication environment and a manipulated public?



B&B orig: 7/12/19
