Pragmatic politics focused on the public interest for those uncomfortable with America's two-party system and its way of doing politics. Considering the interface of politics with psychology, cognitive science, social behavior, morality and history.
DP Etiquette
First rule: Don't be a jackass.
Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, please provide some sources for the facts and truths you rely on if you are asked for that. If emotion is getting out of hand, get it back in hand. To limit dehumanizing people, don't call people or whole groups of people disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion. Insult makes people angry and defensive. All points of view are welcome, right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.
Saturday, August 10, 2019
An emotion self-control method
Given the increasing heat and reason-killing emotion in recent discussions, a suggestion about a way to maintain self-control seems to be in order.
A person's emotional state affects unconscious and conscious reason in perceptions of reality, discourse and thinking. Emotion is now believed to be a necessary part of cognition, conscious reasoning and moral decision-making. Despite that, out-of-control emotion tends to degrade the quality of reasoning, leading to beliefs or decisions that are objectively less rational and/or detrimental to the individual. Reasonable control of emotion should be generally helpful to people in their everyday lives.
One scientist observes (pdf) that “the neuronal channels going up from the emotional centers of the brain to the more cognitive centers are denser and more robust than the cognitive centers going down to inhibit and control the emotional structures. Self-conscious efforts to avoid prejudice, fear, hatred, and depression are often rendered unsuccessful by this imbalance.”
In other words, emotional self-control often isn’t easy because our brains are wired that way. That’s just normal human biology.
Research psychologists recently published a paper showing that thinking or talking to yourself in the third person helps maintain emotional control in the face of events or information that provoke emotion and a potential loss of self-control. For politics, that means disagreement over political issues, most of which are highly emotionally charged.
Writing in Scientific Reports (vol. 7, Article 4519, published online July 3, 2017), lead scientist Jason Moser reported: “We hypothesized that it does under the premise that third-person self-talk leads people to think about the self similar to how they think about others, which provides them with the psychological distance needed to facilitate self-control. We tested this prediction by asking participants to reflect on feelings elicited by viewing aversive images (Study 1) and recalling negative autobiographical memories (Study 2) using either “I” or their name while measuring neural activity via ERPs (Study 1)[1] and fMRI (Study 2). . . . . Together, these results suggest that third-person self-talk may constitute a relatively effortless form of self-control. . . . . Specifically, using one’s own name to refer to the self during introspection, rather than the first-person pronoun ‘I’, increases peoples’ ability to control their thoughts, feelings, and behavior under stress.”
Commenting on the study, Government Executive writes: “‘Essentially, we think referring to yourself in the third person leads people to think about themselves more similar to how they think about others, and you can see evidence for this in the brain,’ says Jason Moser, associate professor of psychology at Michigan State University. ‘That helps people gain a tiny bit of psychological distance from their experiences, which can often be useful for regulating emotions.’ . . . . ‘What’s really exciting here,’ says [senior researcher Ethan] Kross, who directs the Emotion and Self-Control Lab, ‘is that the brain data from these two complimentary experiments suggest that third-person self-talk may constitute a relatively effortless form of emotion regulation. If this ends up being true—we won’t know until more research is done—there are lots of important implications these findings have for our basic understanding of how self-control works, and for how to help people control their emotions in daily life.’”
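As a purely illustrative sketch (not anything from the study itself), the mechanical part of third-person self-talk, swapping first-person pronouns for one's own name, can be mimicked in a few lines of code. The name "Jason" and the small set of handled pronoun forms are assumptions made only for this example:

```python
import re

def third_person_reframe(text: str, name: str) -> str:
    """Rewrite simple first-person statements as third-person self-talk.
    Illustrative only: handles just 'I am', 'my', and a standalone 'I'."""
    out = re.sub(r"\bI am\b", f"{name} is", text)
    out = re.sub(r"\bmy\b", f"{name}'s", out, flags=re.IGNORECASE)
    out = re.sub(r"\bI\b", name, out)
    return out

print(third_person_reframe("I am anxious about my argument", "Jason"))
# "Jason is anxious about Jason's argument"
```

The point of the exercise is the rephrasing itself: saying "Jason is anxious" rather than "I am anxious" is what the researchers suggest creates psychological distance.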
Footnote:
1. ERPs: event-related brain potentials are electrical brain responses caused by sensory or cognitive stimuli such as photos or verbal information. ERPs are small but accurately measurable electrical responses that occur over about half a second after a stimulus.
fMRI: functional magnetic resonance imaging is a noninvasive method used to visualize parts of the human brain as they respond to various stimuli such as unpleasant photos, moral dilemmas or information that contradicts personal beliefs. fMRI visualizes areas of brain response in near real time, with localized brain activity becoming visible a few seconds after a brain area begins responding to what is seen or heard.
B&B orig: 10/21/17
Cognition And Emotion Interplay: Current Thinking
Cognition: the mental action or process of acquiring knowledge and understanding through experience, the senses, and thinking
In a chapter for a forthcoming book, The Nature of Emotion: Fundamental Questions (2nd edition, New York: Oxford University Press), Hadas Okon‐Singer (HOS) and colleagues address the question of how emotion and cognition interact. Their paper, The Interplay of Emotion and Cognition (pdf), describes current thinking about emotion’s role in perceiving the world and information, thinking about it and understanding it. The implications of current research for the clinical medicine of psychological disorders are profound. Although HOS does not focus on it, the same implications hold for politics.
This short paper illustrates how quickly understanding of matters that until recently were the domain of philosophers is expanding. Modern neuroscience and psychological and clinical research are making rapid inroads into understanding emotion. HOS comments: “Until the 20th century, the study of emotion and cognition was largely a philosophical matter. Although contemporary theoretical perspectives on the mind and its disorders remain heavily influenced by the introspective measures that defined this earlier era of scholarship, the last several decades have witnessed the emergence of powerful new tools for objectively assaying emotion and brain function, which have yielded new insights into the interplay of emotion and cognition.”
The basic interpretation that HOS draws from existing data is simple but profound: “Emotion—including emotional cues, emotional states, and emotional traits—can profoundly influence key elements of cognition in both adaptive and maladaptive ways.” Until recently, the dominant scientific belief was that emotion was a reality- and logic-distorting influence, and thus generally maladaptive or detrimental for rational cognition (seeing and thinking). HOS makes clear that emotion can be helpful. Other researchers have come to the same conclusion. For example, Philip Tetlock, a researcher who analyzes the quality of expert judgment in politics and related topics such as national security and economics, believes that, among other things, consciously controlled emotion is an essential part of accurate expert judgment. (see discussion here)
HOS comments that since the world is far more complex than the human mind can deal with, emotion is a mechanism the mind relies on to help focus attention on what’s important. Citing other researchers, HOS observes that “attention is necessary because . . . . the environment presents far more perceptual information than can be effectively processed, one’s memory contains more competing traces than can be recalled, and the available choices, tasks, or motor responses are far greater than one can handle.” Things like angry faces, erotica (sex!) and snakes are far more attention-grabbing than non-emotional inputs. HOS summarizes this point: “Emotional stimuli are associated with enhanced processing in sensory regions of the brain and amplified processing is associated with faster and more accurate performance.” Clearly, emotion can be adaptive or helpful.
Anxiety: Regarding anxiety disorders, HOS observes that “Individuals show marked differences in the amount of attention they allocate to emotionally salient information. Such attentional biases are intimately related to emotional traits and disorders. Hypervigilance for threat is a core component of both dispositional and pathological anxiety. . . . Anxious individuals are more likely to initially orient their gaze towards threat in free‐viewing tasks; they are quicker to fixate threat‐related targets in visual search tasks; and they show difficulty disengaging from threat‐related distractors . . . . There is compelling evidence that attentional biases to threat causally contribute to the development and maintenance of extreme anxiety.”
Working memory, the mind’s blackboard: Working memory actively recalls, maintains and manipulates (thinks about) information for short periods of time when one is consciously focused on something or a mental task. The amount of such information is very limited. HOS comments that “information transiently held in working memory is a key determinant of our momentary thoughts, feelings, and behavior. Recent work by our group indicates that emotionally salient information enjoys privileged access to working memory. . . . anxious individuals allocate excess storage capacity to threat, even when it is completely irrelevant to the task at hand and no longer present in the external world.”
In other words, emotion is a powerful influence on perceptions of reality and thinking about what is perceived.
Emotional control strategies: Research now shows that some emotion control techniques can effectively tamp down emotional responses. The basis for this is increasingly well understood. HOS points out that “. . . the neurobiological underpinnings of this core human capacity [to control emotion] indicates that circuits involved in attention and working memory play a crucial role in the regulation of emotion and other, closely related aspects of motivated behavior, such as temptation and craving.” The biology of such traits is coming into focus.
One effective strategy is to simply divert attention from emotional or distressing sources or inputs such as disturbing videos, photos or speech. Effects of doing this are observable in brain structures, e.g., the amygdala, that regulate emotional states or feelings. Another emotion-damping technique is to consciously reframe[1] or reassess emotional inputs. As discussed previously, another emotion control mechanism is to think in third person terms instead of first person terms.
HOS concludes by observing that “the last decade has witnessed an explosion of interest in the interplay of emotion and cognition and greater attention to key methodological and inferential pitfalls.” Intrusion of philosophers into neuroscience has no doubt raised concern for pitfalls in both experimental methods and in how the resulting data can be interpreted.
Footnote:
1. Framing effects refer to a powerful innate cognitive bias ( https://en.wikipedia.org/wiki/Framing_effect_(psychology) ). It leads the mind to perceive, think about and then make judgements about a situation or an issue that comes to one’s attention depending on how information is framed. Careful framing leads to judgments that vary in often or usually predictable ways. People thus tend to make judgments based on the framework in which information or a situation is presented. In politics, framing ideas, issues and people is also called spinning.
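The classic framing demonstration can be sketched as a toy simulation. The 72% and 22% choice rates below are hypothetical numbers chosen only to illustrate how the same option can fare differently under gain versus loss framing:

```python
import random

def simulate_framing(frame: str, n: int = 10_000, seed: int = 1) -> float:
    """Toy simulation of a framing effect. The choice rates are invented:
    the same sure option is assumed to be picked 72% of the time under a
    gain frame but only 22% of the time under a loss frame."""
    rng = random.Random(seed)
    p_sure = 0.72 if frame == "gain" else 0.22
    # Fraction of simulated respondents who pick the sure option
    return sum(rng.random() < p_sure for _ in range(n)) / n

print(simulate_framing("gain"), simulate_framing("loss"))
```

The substance (the outcome) is identical in both conditions; only the presentation differs, which is the defining feature of a framing effect.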
B&B orig: 11/5/17
Cognitive Science: Reason as a Secular Moral Value
A 2016 peer-reviewed paper by psychologist Tomas Ståhl and colleagues at the University of Illinois at Chicago and the University of Exeter suggests that some people see reason and evidence as a secular moral issue. Those people tend to consider the rationality of another's beliefs as evidence of their morality or lack thereof.
According to the paper’s abstract: “In the present article we demonstrate stable individual differences in the extent to which a reliance on logic and evidence in the formation and evaluation of beliefs is perceived as a moral virtue, and a reliance on less rational processes is perceived as a vice. We refer to this individual difference variable as moralized rationality. . . . Results show that the Moralized Rationality Scale (MRS) is internally consistent, and captures something distinct from the personal importance people attach to being rational (Studies 1–3). Furthermore, the MRS has high test-retest reliability (Study 4), is conceptually distinct from frequently used measures of individual differences in moral values, and it is negatively related to common beliefs that are not supported by scientific evidence (Study 5).” Ståhl T, Zaal MP, Skitka LJ (2016) Moralized Rationality: Relying on Logic and Evidence in the Formation and Evaluation of Belief Can Be Seen as a Moral Issue. PLoS ONE 11(11): e0166332.doi:10.1371/journal.pone.0166332.
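The internal consistency the abstract reports for the MRS is typically measured with Cronbach’s alpha. A minimal sketch of that statistic, using made-up responses to a hypothetical 3-item scale (the numbers are not from the paper):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency; `items` is a list of
    per-item score lists (rows = items, columns = respondents)."""
    k = len(items)
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(col) for col in zip(*items)]     # each respondent's total score
    item_var = sum(var(it) for it in items)        # sum of per-item variances
    return k / (k - 1) * (1 - item_var / var(totals))

# Hypothetical responses to a 3-item scale from 4 respondents
scale = [[4, 5, 2, 4], [4, 4, 2, 5], [5, 5, 1, 4]]
print(round(cronbach_alpha(scale), 2))  # 0.92
```

Values near 1 mean the items tend to move together across respondents, i.e., the scale is internally consistent.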
According to Ståhl’s paper, “People who moralize rationality should not only respond more strongly to irrational (vs. rational) acts, but also towards the actors themselves. . . . . a central finding in the moral psychology literature is that differences in moral values and attitudes lead to intolerance. For example, the more morally convicted people are on a particular issue (i.e., the more their stance is grounded in their fundamental beliefs about what is right or wrong), the more they prefer to distance themselves socially from those who are attitudinally dissimilar.”
ScienceDaily commented on the paper: moral rationalists see less rational individuals as “less moral; prefer to distance themselves from them; and under some circumstances, even prefer them to be punished for their irrational behavior . . . . By contrast, individuals who moralized rationality judged others who were perceived as rational as more moral and worthy of praise. . . . While morality is commonly linked to religiosity and a belief in God, the current research identifies a secular moral value and how it may affect individuals' interpersonal relations and societal engagement.”
ScienceDaily also noted that “in the wake of a presidential election that often kept fact-checkers busy, Ståhl (the paper’s lead researcher) says the findings would suggest a possible avenue to more productive political discourse that would encourage a culture in which it is viewed as a virtue to evaluate beliefs based on logical reasoning and the available evidence. . . . . ‘In such a climate, politicians would get credit for engaging in a rational intellectually honest argument . . . . They would also think twice before making unfounded claims, because it would be perceived as immoral.’”
Since most people believe they are mostly or always quite rational, it seems reasonable to argue that rationality is a moral issue. The finding that people place personal value on evidence-based rational thinking about political issues suggests it could serve as a basis for a political principle or moral value in political ideology.
B&B orig: 8/1/18
Cognitive Science: Halo Error Is Inevitable Like Death & Taxes
“The attractiveness stereotype is a specific instance of a more general psychological principle known as the halo effect, in which individuals ascribe characteristics to others based on the presence of another observable characteristic. Such errors are stunningly prevalent in data derived from ratings of others to such an extent that one scholar described the problem thusly: ‘halo error, like death and taxes, seems inevitable.’” Carl Palmer and Rolfe Peterson, American Politics Research, 44(2):353–382, 2016
Halo effect: “The halo effect is a type of immediate judgement discrepancy, or cognitive bias, where a person making an initial assessment of another person, place, or thing will assume ambiguous information based upon concrete information. A simplified example of the halo effect is when an individual noticing that the person in the photograph is attractive, well groomed, and properly attired, assumes, using a mental heuristic, that the person in the photograph is a good person based upon the rules of that individual's social concept.” (presumably, ‘judgement discrepancy’ means a demonstrable deviation of personal belief from objective truth)
In their paper, Halo Effects and the Attractiveness Premium in Perceptions of Political Expertise, Palmer and Peterson observe that “halo errors are thought to be a reflection of a rater’s inability to differentiate between characteristics being evaluated, although in many circumstances, these errors occur automatically, below the level of conscious information processing.”
To some extent, inputs such as a speaker’s personal attractiveness are unconsciously translated into a belief that the speaker is more knowledgeable, competent and/or trustworthy than might be warranted by other inputs such as the content of the speech.
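A toy model of that translation, assuming (purely for illustration, not a form from the paper) that an observed rating is a weighted mix of actual competence and the irrelevant attractiveness cue:

```python
def rated_competence(true_competence: float, attractiveness: float,
                     halo_weight: float = 0.4) -> float:
    """Toy halo-effect model (assumed form, not from the paper): the
    observed rating mixes actual competence with the irrelevant
    attractiveness cue, weighted by `halo_weight`."""
    return (1 - halo_weight) * true_competence + halo_weight * attractiveness

# Two speakers with identical competence but different attractiveness:
print(rated_competence(0.5, 0.9))  # attractive speaker is rated higher
print(rated_competence(0.5, 0.2))
```

With a nonzero `halo_weight`, two equally competent speakers receive different ratings, which is exactly the error pattern the halo literature describes.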
In their study, Palmer and Peterson conducted surveys to assess the halo effect, which earlier studies had reported. They found that, as previously observed, subjective assessments were that attractive people were more knowledgeable and persuasive than others. They also found that attractive people, even if uninformed, were more likely to report attempting to persuade others. In addition, people surveyed “were more willing to turn to more attractive individuals as potential sources for political information.” Those results were observed even after controlling for possibly confounding factors such as partisanship and gender.
The authors pointed out that perceptions of attractiveness seem to be consistent within cultural groups, and may reflect a universal human cognitive trait that is relatively stable over time. They also point out that attractiveness counts in elections: “Beyond competence, there is also a clear preference for more attractive candidates, with those rated as more attractive enjoying greater electoral success . . . . . Under conditions of limited information, citizens appear to vote with their eyes, rather than their minds. It is important to note that these attractiveness biases in expressed preferences not only emerge automatically but also appear to persist in light of additional information about the candidates.”
The latter statement refers to the well-known stickiness of at least some kinds of misinformation, e.g., climate change is a hoax.
The authors speculate about the fundamental basis of democracy: “It stands to reason that we should expect these biases to creep into political discussions as well, influencing individuals’ political perceptions, orientations, and, most importantly, with whom they choose to discuss politics. This tendency to engage in biased information processing raises questions not only about the ability of citizens to make suitable evaluations of the quality of candidates but also the expertise of political discussants.”
Social science has raised similar concerns before this. Evidence of a collapse in respect for expertise seems to be solid and stable, if not increasing. Given the complexity of most political issues and the rise of relentless propaganda and disrespect for both fact and logic, there is legitimate cause for concern.
The researchers asked where political influence was coming from. They observed: “But who is trying to influence whom? Is it simply the uninformed, attractive respondents who are influencing their social contacts, or are the politically active attractive individuals well informed, as well as politically active? Given our findings from Table 1, we believe it is the former, rather than the latter. . . . . The end result is that the less informed have their perceptions of the political world shaped and their voting decisions influenced by those they perceive to be credible others. If those perceptions of expertise are mistaken beliefs influenced by an individuals’ physical appearance, many poorly informed individuals might simply be being led astray as they seek to upgrade their political knowledge. The body of evidence we present would seem to confirm these normative concerns.”
B&B orig: 8/19/18
Science: The Earliest Known Example of ‘Modern Cognition’
What? This doesn’t look like much of anything
Nature, probably the world’s top science journal, has published a paper believed to reveal evidence of the earliest known human abstract drawing. The drawing dates to the Middle Stone Age, about 73,000 years ago. One researcher commented that this find is interpreted as “a prime indicator of modern cognition.” The rock fragment has cross-hatch lines sketched onto stone with red ochre pigment.
The paper’s abstract commented: “This notable discovery pre-dates the earliest previously known abstract and figurative drawings by at least 30,000 years. This drawing demonstrates the ability of early Homo sapiens in southern Africa to produce graphic designs on various media using different techniques.” Although scientists have found older drawings, this research indicates the lines on this stone mark the first abstract drawing, an indicator of abstract thinking.
The lines on the rock fragment, extrapolated beyond its edges, are interpreted as part of an abstract drawing. The fragment was determined to be coarse-grained silcrete (length 38.6 mm, width 12.8 mm, height 15.4 mm). One inch equals 25.4 mm, so the fragment is small. According to one researcher, “the abrupt termination of all lines on the fragment edges indicates that the pattern originally extended over a larger surface.” Sometimes, that is how tenuous human knowledge or belief can be.
B&B orig: 9/12/18
Cognitive Impairment Associated with Radical Political Beliefs
From the paper’s figure legend: (A) Using factor analysis, we investigated the underlying factor structure of multiple questionnaires about political issues. Three latent factors were identified and labeled “political orientation,” “dogmatic intolerance,” and “authoritarianism” according to the pattern of individual item loadings. Item loadings for each question (questionnaires indicated by different colors) are presented.
(B–D) To investigate the relation between these constructs, scores on the three factors were extracted for each individual. (B) We observed a quadratic relationship between political orientation and dogmatic intolerance, revealing that people on the extremes of the political spectrum are more rigid and dogmatic in their world views. (C) A linear relationship between political orientation and authoritarianism was observed, with people from the far right of the political spectrum showing more obedience to authorities and conventions. (D) Dogmatic intolerance and authoritarianism were positively correlated, indicating commonality between these two sub-components of radicalism.
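The U-shaped (quadratic) relationship in panel B can be illustrated by fitting a second-degree polynomial to synthetic data. The coefficients, noise level and sample size below are invented for the sketch:

```python
import numpy as np

# Hypothetical data: orientation runs from -1 (far left) to +1 (far right);
# dogmatic intolerance rises at both extremes (a U shape), plus noise.
rng = np.random.default_rng(0)
orientation = rng.uniform(-1, 1, 500)
dogmatism = 2.0 * orientation**2 + 0.1 * rng.normal(size=500)

# A positive coefficient on the squared term is the quadratic relationship
# reported in panel B: extremes on either side score higher.
quad, lin, const = np.polyfit(orientation, dogmatism, deg=2)
print(quad > 0)
```

A purely linear fit would miss this pattern entirely, since the left and right extremes pull in opposite directions and cancel out.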
From the paper’s abstract: “Widening polarization about political, religious, and scientific issues threatens open societies, leading to entrenchment of beliefs, reduced mutual understanding, and a pervasive negativity surrounding the very idea of consensus. Such radicalization has been linked to systematic differences in the certainty with which people adhere to particular beliefs. However, the drivers of unjustified certainty in radicals are rarely considered from the perspective of models of metacognition, and it remains unknown whether radicals show alterations in confidence bias (a tendency to publicly espouse higher confidence), metacognitive sensitivity (insight into the correctness of one’s beliefs), or both.” (Max Rollwage et al., Current Biology, Vol. 28, Iss. 24, pp. 4014-4021, Dec. 17, 2018)
Metacognition: awareness and understanding of one's own thought processes, roughly, self-awareness.
Confidence bias (overconfidence effect): a bias observed as a person’s subjective confidence in his or her judgements being greater than the objective accuracy of those judgements, especially when confidence is relatively high; overconfidence is one example of a miscalibration of subjective probabilities.
Motivated reasoning: a powerful emotion-biased decision-making phenomenon; the term refers to the role of motivation in cognitive processes such as decision-making and attitude change in a number of situations, including cognitive dissonance reduction, e.g., in the face of discomforting information or logic.
The journal Current Biology published a paper, Metacognitive Failure as a Feature of Those Holding Radical Beliefs, with some evidence that individuals who hold radical beliefs tend to lack self-awareness relative to others. This mindset was associated with higher confidence in correct and incorrect choices, and a reduced tendency to change levels of confidence in the face of new but contrary information.
The researchers pointed out that multiple cognitive effects could be going on that would account for the observed opinionation and resistance to change among radicals. Influences the researchers tried to dissect included motivated reasoning, confidence bias, and metacognition:
An unjustified certainty in one’s beliefs is a characteristic common to those espousing radical beliefs, and such overconfidence is observed for both political and non-political issues, implying a general cognitive bias in radicals. However, the underpinnings of radicals’ distorted confidence estimates remain unknown. In particular, one-shot measures of the discrepancy between performance and confidence are unable to disentangle the contributions of confidence bias (changes in an overall belief about performance, which may be affected by optimism and mood) from changes in metacognitive sensitivity (an ability to distinguish accurate from inaccurate performance). This distinction may be particularly important as changes in metacognitive sensitivity may account for radicals’ reluctance to change their mind in the face of new evidence.
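The authors’ point that a one-shot confidence-versus-accuracy gap cannot disentangle confidence bias from metacognitive sensitivity can be illustrated with a toy simulation (my own construction, not from the paper; the numbers are arbitrary assumptions). Two simulated responders show the same overall overconfidence, but only one’s confidence tracks whether an answer was actually correct:

```python
# Toy illustration: two agents with identical overconfidence (mean confidence
# minus accuracy), but very different metacognitive sensitivity (here crudely
# measured as mean confidence on correct trials minus mean confidence on
# incorrect trials). A single confidence-vs-accuracy gap cannot tell them apart.
import random

random.seed(1)

def simulate(n_trials, p_correct, base_conf, boost_when_correct):
    """Return (accuracy, mean_confidence, sensitivity) for a simulated agent."""
    correct_confs, wrong_confs = [], []
    for _ in range(n_trials):
        correct = random.random() < p_correct
        conf = base_conf + (boost_when_correct if correct else 0.0)
        conf += random.uniform(-0.05, 0.05)        # report noise
        conf = min(max(conf, 0.0), 1.0)            # keep confidence in [0, 1]
        (correct_confs if correct else wrong_confs).append(conf)
    accuracy = len(correct_confs) / n_trials
    mean_conf = (sum(correct_confs) + sum(wrong_confs)) / n_trials
    sensitivity = (sum(correct_confs) / len(correct_confs)
                   - sum(wrong_confs) / len(wrong_confs))
    return accuracy, mean_conf, sensitivity

# Agent A: overconfident but insightful -- confidence rises when it is right.
acc_a, conf_a, sens_a = simulate(10000, 0.7, 0.75, 0.15)
# Agent B: equally overconfident on average, but blind -- same confidence
# whether right or wrong (impaired metacognitive sensitivity).
acc_b, conf_b, sens_b = simulate(10000, 0.7, 0.855, 0.0)

print(f"A: overconfidence gap = {conf_a - acc_a:+.2f}, sensitivity = {sens_a:.2f}")
print(f"B: overconfidence gap = {conf_b - acc_b:+.2f}, sensitivity = {sens_b:.2f}")
```

Both agents show roughly the same ~0.15 overconfidence gap, so a single performance-versus-confidence measurement looks identical for them; only trial-by-trial data reveals that agent B’s confidence carries no information about whether it is right, which is the kind of impairment the researchers associate with radical beliefs.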
This research does not shed light on the direction of cause and effect. Commenting on the paper, Steven Novella writes:
What this study cannot tell us about is the arrow of cause and effect. One possibility is that those who lack the metacognitive ability to properly assess and correct their own confidence levels will tend to fall into more extreme views. Their confidence will allow them to more easily brush off dissenting opinions and information, more nuanced and moderate narratives, and the consensus of opinion.
At the same time I find it plausible that those who become radicalized into extreme political views may adopt overconfidence and stubbornness as a motivated reasoning strategy, in order to maintain their views, which they hold for emotional and identity reasons. This may become more of a general cognitive style that they employ, rather than being limited to just their radical views.
The results described here are asserted to come from the first attempt to tease cognitive processes apart to determine the cognitive and social sources of radicalism. Because of that, the research needs to be replicated and expanded to generate confidence in the results and conclusions. It is reasonable to think that multiple influences lead to radicalization including life experiences, personality, self and social identity, etc.
On replication of this research, it may turn out that a major source of radicalization, maybe the most important source, is impaired metacognition, as the researchers propose. In that case, there is a large body of research and real world experience with methods to teach enhanced metacognitive skill. The education community is fully aware of the usefulness of metacognition in education.
One reader of the 2009 Handbook of Metacognition in Education wrote this in his foreword to the handbook: “This handbook goes a long way toward capturing the state of the science and the art of the study of metacognition. It reveals great strides in the sophistication and precision with which metacognition can be conceptualized, assessed, and developed [and] covers the gamut, including research and development on metacognition across a wide variety of subject-matter areas, as well as in more abstract issues of theory and measurement . . . . It is truly a landmark work.”
Maybe there is some hope for deradicalization of radicals and prevention of radicalization in minds susceptible to it. Of course, that raises the question of whether radical beliefs are usually more harmful than beneficial. There appears to be at least some research on that point.[1] For at least some uninformed people, ‘common sense’ might suggest radicalism is generally not a good thing.
Footnote:
1. From the article, Radical Beliefs and Violent Actions Are Not Synonymous: How to Place the Key Disjuncture Between Attitudes and Behaviors at the Heart of Our Research into Political Violence: This article develops and elaborates on three core points. First, as with research into other social science themes, it is argued that it is necessary to apply the logic of correlation and causality to the study of political violence. Second, it highlights the critical disjuncture between attitudes and behaviors. Many or most individuals who support the use of political violence remain on the sidelines, including those who sympathize with insurgents in Afghanistan (reportedly 29 percent in 2011), and those supportive of “suicide attacks” in the Palestinian Territories (reportedly reaching 66 percent in 2005). Conversely, those responsible for such behaviors are not necessarily supportive of the ostensible political aims. Third, it is argued that the motives that drive these attitudes and behaviors are often (or, some would argue, always) distinct. While the former are motivated by collective grievances, there is substantial case study evidence that the latter are commonly driven by economic (e.g., payments for the emplacement of improvised explosive devices), security-based (i.e., coercion) and sociopsychological (e.g., adventure, status, and vengeance) incentives. Thus, it is necessary for the research community to treat attitudes and behaviors as two separate, albeit interrelated, lines of inquiry.
B&B orig: 12/21/18