Pragmatic politics focused on the public interest for those uncomfortable with America's two-party system and its way of doing politics. Considering the interface of politics with psychology, cognitive science, social behavior, morality and history.
Etiquette
DP Etiquette
First rule: Don't be a jackass.
Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, please provide some sources for the facts and truths you rely on if you are asked for that. If emotion is getting out of hand, get it back in hand. To limit dehumanizing people, don't call people or whole groups of people disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion. Insult makes people angry and defensive. All points of view are welcome, right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.
Saturday, August 10, 2019
Cognition And Emotion Interplay: Current Thinking
Cognition: the mental action or process of acquiring knowledge and understanding through experience, the senses, and thinking
In a chapter for a book to be published in the coming weeks, The Nature of Emotion: Fundamental Questions (2nd edition, New York: Oxford University Press), Hadas Okon‐Singer (HOS) and colleagues address the question of how emotion and cognition interact. Their paper, The Interplay of Emotion and Cognition (pdf), describes current thinking about emotion’s role in perceiving the world and information, thinking about it and understanding it. The implications of current research for the clinical medicine of psychological disorders are profound. Although HOS does not focus on it, the same implications hold for politics.
This short paper illustrates how quickly understanding of matters that until recently were the domain of philosophers is expanding. Modern neuroscience, and psychological and clinical research is making rapid inroads into understanding emotion. HOS comments: “Until the 20th century, the study of emotion and cognition was largely a philosophical matter. Although contemporary theoretical perspectives on the mind and its disorders remain heavily influenced by the introspective measures that defined this earlier era of scholarship, the last several decades have witnessed the emergence of powerful new tools for objectively assaying emotion and brain function, which have yielded new insights into the interplay of emotion and cognition.”
The basic interpretation that HOS draws from existing data is simple but profound: “Emotion—including emotional cues, emotional states, and emotional traits—can profoundly influence key elements of cognition in both adaptive and maladaptive ways.” Until recently, the dominant scientific belief was that emotion was a reality- and logic-distorting influence, and thus generally maladaptive or detrimental to rational cognition (seeing and thinking). HOS makes clear that emotion can be helpful. Other researchers have come to the same conclusion. For example, Philip Tetlock, a researcher who analyzes the quality of expert judgment in politics and related topics such as national security and economics, believes that, among other things, consciously controlled emotion is an essential part of accurate expert judgment. (see discussion here)
HOS comments that since the world is far more complex than the human mind can deal with, emotion is a mechanism the mind relies on to help focus attention on what’s important. Citing other researchers, HOS observes that “attention is necessary because . . . . the environment presents far more perceptual information than can be effectively processed, one’s memory contains more competing traces than can be recalled, and the available choices, tasks, or motor responses are far greater than one can handle.” Things like angry faces, erotica (sex!) and snakes are far more attention-grabbing than non-emotional inputs. HOS summarizes this point: “Emotional stimuli are associated with enhanced processing in sensory regions of the brain and amplified processing is associated with faster and more accurate performance.” Clearly, emotion can be adaptive or helpful.
Anxiety: Regarding anxiety disorders, HOS observes that “Individuals show marked differences in the amount of attention they allocate to emotionally salient information. Such attentional biases are intimately related to emotional traits and disorders. Hypervigilance for threat is a core component of both dispositional and pathological anxiety. . . . Anxious individuals are more likely to initially orient their gaze towards threat in free‐viewing tasks; they are quicker to fixate threat‐related targets in visual search tasks; and they show difficulty disengaging from threat‐related distractors . . . . There is compelling evidence that attentional biases to threat causally contribute to the development and maintenance of extreme anxiety.”
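The attentional biases HOS describes are typically quantified with reaction-time tasks such as the dot-probe, where the bias score is the difference in mean response time between probes appearing where a neutral stimulus was versus where a threat stimulus was. A minimal sketch of that arithmetic, on invented reaction times (the numbers and the task itself are illustrative assumptions, not data from HOS):

```python
# Hypothetical dot-probe data: a probe replaces either a threat-related
# or a neutral stimulus; faster responses to probes at the threat
# location indicate attention was already oriented toward the threat.
from statistics import mean

rt_probe_at_threat = [412, 398, 405, 420, 401]   # ms; probe replaced threat image
rt_probe_at_neutral = [455, 447, 462, 450, 458]  # ms; probe replaced neutral image

# Positive score = vigilance toward threat, the pattern HOS reports
# as characteristic of dispositional and pathological anxiety.
bias_score = mean(rt_probe_at_neutral) - mean(rt_probe_at_threat)
print(f"Attentional bias toward threat: {bias_score:.1f} ms")
```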
Working memory, the mind’s blackboard: Working memory actively recalls, maintains and manipulates (thinks about) information for short periods of time when one is consciously focused on something or a mental task. The amount of such information is very limited. HOS comments that “information transiently held in working memory is a key determinant of our momentary thoughts, feelings, and behavior. Recent work by our group indicates that emotionally salient information enjoys privileged access to working memory. . . . anxious individuals allocate excess storage capacity to threat, even when it is completely irrelevant to the task at hand and no longer present in the external world.”
In other words, emotion is a powerful influence on perceptions of reality and thinking about what is perceived.
Emotional control strategies: Research now shows that some emotion control techniques can effectively tamp down emotional responses. The basis for this is increasingly well understood. HOS points out that “. . . the neurobiological underpinnings of this core human capacity [to control emotion] indicates that circuits involved in attention and working memory play a crucial role in the regulation of emotion and other, closely related aspects of motivated behavior, such as temptation and craving.” The biology of such traits is coming into focus.
One effective strategy is to simply divert attention from emotional or distressing sources or inputs such as disturbing videos, photos or speech. Effects of doing this are observable in brain structures, e.g., the amygdala, that regulate emotional states or feelings. Another emotion-damping technique is to consciously reframe[1] or reassess emotional inputs. As discussed previously, another emotion control mechanism is to think in third person terms instead of first person terms.
HOS concludes by observing that “the last decade has witnessed an explosion of interest in the interplay of emotion and cognition and greater attention to key methodological and inferential pitfalls.” Intrusion of philosophers into neuroscience has no doubt raised concern for pitfalls in both experimental methods and in how the resulting data can be interpreted.
Footnote:
1. Framing effects refer to a powerful innate cognitive bias ( https://en.wikipedia.org/wiki/Framing_effect_(psychology) ). It leads the mind to perceive, think about and then make judgements about a situation or an issue that comes to one’s attention depending on how information is framed. Careful framing leads to judgments that vary in often or usually predictable ways. People thus tend to make judgments based on the framework in which information or a situation is presented. In politics, framing ideas, issues and people is also called spinning.
Cognitive Science: Reason as a Secular Moral
A 2016 peer-reviewed paper by psychologist Tomas Ståhl and colleagues at the University of Illinois at Chicago and the University of Exeter suggests that some people see reason and evidence as a secular moral issue. Those people tend to consider the rationality of another's beliefs as evidence of their morality or lack thereof.
According to the paper’s abstract: “In the present article we demonstrate stable individual differences in the extent to which a reliance on logic and evidence in the formation and evaluation of beliefs is perceived as a moral virtue, and a reliance on less rational processes is perceived as a vice. We refer to this individual difference variable as moralized rationality. . . . Results show that the Moralized Rationality Scale (MRS) is internally consistent, and captures something distinct from the personal importance people attach to being rational (Studies 1–3). Furthermore, the MRS has high test-retest reliability (Study 4), is conceptually distinct from frequently used measures of individual differences in moral values, and it is negatively related to common beliefs that are not supported by scientific evidence (Study 5).” Ståhl T, Zaal MP, Skitka LJ (2016) Moralized Rationality: Relying on Logic and Evidence in the Formation and Evaluation of Belief Can Be Seen as a Moral Issue. PLoS ONE 11(11): e0166332.doi:10.1371/journal.pone.0166332.
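The abstract’s claim that the MRS is “internally consistent” refers to a standard psychometric statistic, usually Cronbach’s alpha. As a rough illustration of what that statistic measures, here is a minimal sketch run on invented Likert-style responses; the items, scores and scale length are hypothetical, not the paper’s:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = items.shape[1]                          # number of scale items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-7 agreement ratings: 6 respondents x 4 MRS-style items.
scores = np.array([
    [6, 7, 6, 5],
    [3, 2, 3, 3],
    [5, 5, 6, 5],
    [2, 1, 2, 2],
    [7, 6, 7, 6],
    [4, 4, 3, 4],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")  # values near 1 = consistent scale
```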
According to Ståhl’s paper, “People who moralize rationality should not only respond more strongly to irrational (vs. rational) acts, but also towards the actors themselves. . . . . a central finding in the moral psychology literature is that differences in moral values and attitudes lead to intolerance. For example, the more morally convicted people are on a particular issue (i.e., the more their stance is grounded in their fundamental beliefs about what is right or wrong), the more they prefer to distance themselves socially from those who are attitudinally dissimilar.”
ScienceDaily commented on the paper: moral rationalists see less rational individuals as “less moral; prefer to distance themselves from them; and under some circumstances, even prefer them to be punished for their irrational behavior . . . . By contrast, individuals who moralized rationality judged others who were perceived as rational as more moral and worthy of praise. . . . While morality is commonly linked to religiosity and a belief in God, the current research identifies a secular moral value and how it may affect individuals' interpersonal relations and societal engagement.”
ScienceDaily also noted that “in the wake of a presidential election that often kept fact-checkers busy, Ståhl (the paper’s lead researcher) says the findings would suggest a possible avenue to more productive political discourse that would encourage a culture in which it is viewed as a virtue to evaluate beliefs based on logical reasoning and the available evidence. . . . . ‘In such a climate, politicians would get credit for engaging in a rational intellectually honest argument . . . . They would also think twice before making unfounded claims, because it would be perceived as immoral.’”
Since most people believe they are mostly or always quite rational, it seems reasonable to argue that rationality is a moral issue. The finding that people can attach personal moral value to evidence-based rational thinking about political issues suggests it could serve as a basis for a political principle or moral value in a political ideology.
B&B orig: 8/1/18
Cognitive Science: Halo Error Is Inevitable Like Death & Taxes
“The attractiveness stereotype is a specific instance of a more general psychological principle known as the halo effect, in which individuals ascribe characteristics to others based on the presence of another observable characteristic. Such errors are stunningly prevalent in data derived from ratings of others to such an extent that one scholar described the problem thusly: ‘halo error, like death and taxes, seems inevitable.’” Carl Palmer and Rolfe Peterson, American Politics Research, 44(2):353–382, 2016
Halo effect: “The halo effect is a type of immediate judgement discrepancy, or cognitive bias, where a person making an initial assessment of another person, place, or thing will assume ambiguous information based upon concrete information. A simplified example of the halo effect is when an individual noticing that the person in the photograph is attractive, well groomed, and properly attired, assumes, using a mental heuristic, that the person in the photograph is a good person based upon the rules of that individual's social concept.” (presumably, ‘judgement discrepancy’ means a demonstrable deviation of personal belief from objective truth)
In their paper, Halo Effects and the Attractiveness Premium in Perceptions of Political Expertise, Palmer and Peterson observe that “halo errors are thought to be a reflection of a rater’s inability to differentiate between characteristics being evaluated, although in many circumstances, these errors occur automatically, below the level of conscious information processing.”
To some extent, inputs such as a speaker’s personal attractiveness are unconsciously translated into a belief that the speaker is more knowledgeable, competent and/or trustworthy than might be warranted by other inputs such as the content of the speech.
In their study, Palmer and Peterson conducted surveys to assess the halo effect, which earlier studies had reported. They found that, as previously observed, subjective assessments rated attractive people as more knowledgeable and persuasive than others. They also found that attractive people, even if uninformed, were more likely to report attempting to persuade others. In addition, people surveyed “were more willing to turn to more attractive individuals as potential sources for political information.” Those results were observed even after controlling for possibly confounding factors such as partisanship and gender.
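“Controlling for” partisanship and gender ordinarily means entering them as covariates in a regression, so the attractiveness coefficient reflects its association with perceived expertise net of those factors. A hedged sketch of that logic on fabricated data; the variable names, coefficients and model are illustrative assumptions, not Palmer and Peterson’s actual specification:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Fabricated survey data.
attractiveness = rng.normal(0, 1, n)      # standardized attractiveness rating
partisanship = rng.integers(0, 2, n)      # 0/1 shared-party indicator
gender = rng.integers(0, 2, n)            # 0/1 gender indicator
expertise = 0.4 * attractiveness + 0.2 * partisanship + rng.normal(0, 1, n)

# Design matrix: intercept, predictor of interest, then the controls.
X = np.column_stack([np.ones(n), attractiveness, partisanship, gender])
beta, *_ = np.linalg.lstsq(X, expertise, rcond=None)

# beta[1] is the attractiveness effect net of the control variables.
print(f"attractiveness coefficient (controlled): {beta[1]:.2f}")
```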
The authors pointed out that perceptions of attractiveness seem to be consistent across cultural groups, and attractiveness bias may be a universal human cognitive trait that is relatively stable over time. They also point out that attractiveness counts in elections: “Beyond competence, there is also a clear preference for more attractive candidates, with those rated as more attractive enjoying greater electoral success . . . . . Under conditions of limited information, citizens appear to vote with their eyes, rather than their minds. It is important to note that these attractiveness biases in expressed preferences not only emerge automatically but also appear to persist in light of additional information about the candidates.”
The latter statement refers to the well-known stickiness of at least some kinds of misinformation, e.g., climate change is a hoax.
The authors speculate about the fundamental basis of democracy: “It stands to reason that we should expect these biases to creep into political discussions as well, influencing individuals’ political perceptions, orientations, and, most importantly, with whom they choose to discuss politics. This tendency to engage in biased information processing raises questions not only about the ability of citizens to make suitable evaluations of the quality of candidates but also the expertise of political discussants.”
Social science has raised similar concerns before this. Evidence of a collapse in respect for expertise seems to be solid and stable, if not increasing. Given the complexity of most political issues and the rise of relentless propaganda and disrespect for both fact and logic, there is legitimate cause for concern.
The researchers asked where political influence was coming from. They observed: “But who is trying to influence whom? Is it simply the uninformed, attractive respondents who are influencing their social contacts, or are the politically active attractive individuals well informed, as well as politically active? Given our findings from Table 1, we believe it is the former, rather than the latter. . . . . The end result is that the less informed have their perceptions of the political world shaped and their voting decisions influenced by those they perceive to be credible others. If those perceptions of expertise are mistaken beliefs influenced by an individuals’ physical appearance, many poorly informed individuals might simply be being led astray as they seek to upgrade their political knowledge. The body of evidence we present would seem to confirm these normative concerns.”
B&B orig: 8/19/18
Science: The Earliest Known Example of ‘Modern Cognition’
What? This doesn’t look like much of anything
Nature, probably the world’s top science journal, has published a paper believed to reveal evidence of the earliest known human abstract drawing. The drawing dates to the Middle Stone Age, about 73,000 years ago. One researcher commented that this find is interpreted as “a prime indicator of modern cognition.” The rock fragment has cross-hatch lines sketched onto stone with red ochre pigment.
The paper’s abstract commented: “This notable discovery pre-dates the earliest previously known abstract and figurative drawings by at least 30,000 years. This drawing demonstrates the ability of early Homo sapiens in southern Africa to produce graphic designs on various media using different techniques.” Although scientists have found older engravings, this research indicates the lines on this stone mark the earliest known abstract drawing, an indicator of abstract thinking.
The pattern of lines on the rock fragment, extrapolated beyond its edges, is interpreted as part of an abstract drawing. The fragment was identified as coarse-grained silcrete (length 38.6 mm, width 12.8 mm, height 15.4 mm); at 25.4 mm to the inch, it is only about an inch and a half long. According to one researcher, “the abrupt termination of all lines on the fragment edges indicates that the pattern originally extended over a larger surface.” Sometimes, that is how tenuous human knowledge or belief can be.
B&B orig: 9/12/18
Cognitive Impairment Associated with Radical Political Beliefs
(A) Using factor analysis, we investigated the underlying factor structure of multiple questionnaires about political issues. Three latent factors were identified and labeled “political orientation,” “dogmatic intolerance,” and “authoritarianism” according to the pattern of individual item loadings. Item loadings for each question (questionnaires indicated by different colors) are presented.
(B–D) To investigate the relation between these constructs, scores on the three factors were extracted for each individual. (B) We observed a quadratic relationship between political orientation and dogmatic intolerance, revealing that people on the extremes of the political spectrum are more rigid and dogmatic in their world views. (C) A linear relationship between political orientation and authoritarianism was observed, with people from the far right of the political spectrum showing more obedience to authorities and conventions. (D) Dogmatic intolerance and authoritarianism were positively correlated, indicating commonality between these two sub-components of radicalism.
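Panel B’s “quadratic relationship” is the statistical signature of a U-shape: regress dogmatic intolerance on political orientation plus its square, and extremity at both ends shows up as a positive coefficient on the squared term while the linear term stays near zero. A minimal sketch on simulated scores (not the study’s data):

```python
import numpy as np

rng = np.random.default_rng(1)
orientation = rng.uniform(-2, 2, 300)   # latent left (-) to right (+) scores

# Simulate a U-shape: intolerance rises at both political extremes.
intolerance = 0.5 * orientation**2 + rng.normal(0, 0.3, 300)

# Fit linear and quadratic polynomials (coefficients: highest degree first).
lin = np.polyfit(orientation, intolerance, 1)
quad = np.polyfit(orientation, intolerance, 2)
print(f"linear slope:         {lin[0]:+.2f}")   # near zero for a U-shape
print(f"quadratic (x^2) term: {quad[0]:+.2f}")  # positive => extremes higher
```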
Widening polarization about political, religious, and scientific issues threatens open societies, leading to entrenchment of beliefs, reduced mutual understanding, and a pervasive negativity surrounding the very idea of consensus. Such radicalization has been linked to systematic differences in the certainty with which people adhere to particular beliefs. However, the drivers of unjustified certainty in radicals are rarely considered from the perspective of models of metacognition, and it remains unknown whether radicals show alterations in confidence bias (a tendency to publicly espouse higher confidence), metacognitive sensitivity (insight into the correctness of one’s beliefs), or both. Max Rollwage et al., Current Biology, Vol. 28, Iss. 24, Pgs. 4014-4021, Dec. 17, 2018
Metacognition: awareness and understanding of one's own thought processes, roughly, self-awareness.
Confidence bias (overconfidence effect): a bias observed as a person’s subjective confidence in his or her judgements being greater than the objective accuracy of those judgements, especially when confidence is relatively high; overconfidence is one example of a miscalibration of subjective probabilities.
Motivated reasoning: a powerful emotion-biased decision-making phenomenon; the term refers to the role of motivation in cognitive processes such as decision-making and attitude change in a number of situations, including cognitive dissonance reduction, e.g., in the face of discomforting information or logic.
The journal Current Biology published a paper, Metacognitive Failure as a Feature of Those Holding Radical Beliefs, with some evidence that individuals who hold radical beliefs tend to lack self-awareness relative to others. This mindset was associated with higher confidence in correct and incorrect choices, and a reduced tendency to change levels of confidence in the face of new but contrary information.
The researchers pointed out that multiple cognitive effects could be going on that would account for the observed opinionation and resistance to change among radicals. Influences the researchers tried to dissect included motivated reasoning, confidence bias, and metacognition:

An unjustified certainty in one’s beliefs is a characteristic common to those espousing radical beliefs, and such overconfidence is observed for both political and non-political issues, implying a general cognitive bias in radicals. However, the underpinnings of radicals’ distorted confidence estimates remain unknown. In particular, one-shot measures of the discrepancy between performance and confidence are unable to disentangle the contributions of confidence bias (changes in an overall belief about performance, which may be affected by optimism and mood) from changes in metacognitive sensitivity (an ability to distinguish accurate from inaccurate performance). This distinction may be particularly important as changes in metacognitive sensitivity may account for radicals’ reluctance to change their mind in the face of new evidence.

This research does not shed light on the direction of cause and effect. Commenting on the paper, Steven Novella writes:

What this study cannot tell us about is the arrow of cause and effect. One possibility is that those who lack the metacognitive ability to properly assess and correct their own confidence levels will tend to fall into more extreme views. Their confidence will allow them to more easily brush off dissenting opinions and information, more nuanced and moderate narratives, and the consensus of opinion.

At the same time I find it plausible that those who become radicalized into extreme political views may adopt overconfidence and stubbornness as a motivated reasoning strategy, in order to maintain their views, which they hold for emotional and identity reasons. This may become more of a general cognitive style that they employ, rather than being limited to just their radical views.
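The distinction the authors draw can be made concrete. Confidence bias can be summarized as mean confidence minus actual accuracy, while metacognitive sensitivity asks whether confidence discriminates correct from incorrect answers; a simple stand-in for the signal-detection measures used in this literature is the type-2 AUROC. A minimal sketch on invented ratings (the data and the simplified measures are illustrative, not the paper’s method):

```python
import numpy as np

def confidence_bias(conf: np.ndarray, correct: np.ndarray) -> float:
    """Overconfidence: mean confidence minus proportion correct."""
    return conf.mean() - correct.mean()

def type2_auroc(conf: np.ndarray, correct: np.ndarray) -> float:
    """Chance that a random correct answer drew higher confidence than a
    random incorrect one; 0.5 means no metacognitive insight at all."""
    hit, miss = conf[correct == 1], conf[correct == 0]
    greater = (hit[:, None] > miss[None, :]).mean()
    ties = (hit[:, None] == miss[None, :]).mean()
    return greater + 0.5 * ties

# Invented data: confidence on a 0-1 scale, accuracy as 0/1.
conf = np.array([0.9, 0.8, 0.95, 0.7, 0.85, 0.9, 0.8, 0.75])
correct = np.array([1, 0, 1, 1, 0, 1, 0, 1])
print(f"confidence bias: {confidence_bias(conf, correct):+.2f}")
print(f"type-2 AUROC:    {type2_auroc(conf, correct):.2f}")
```

On toy data like this, a person can be overconfident overall yet still show decent sensitivity, or vice versa, which is exactly why one-shot confidence measures cannot separate the two.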
The results described here are asserted to come from the first attempt to tease cognitive processes apart to determine the cognitive and social sources of radicalism. Because of that, the research needs to be replicated and expanded to generate confidence in the results and conclusions. It is reasonable to think that multiple influences lead to radicalization including life experiences, personality, self and social identity, etc.
On replication of this research, it may turn out that a major source of radicalization, maybe the most important source, is impaired metacognition, as the researchers propose. In that case, there is a large body of research and real world experience with methods to teach enhanced metacognitive skill. The education community is fully aware of the usefulness of metacognition in education.
One reader of the 2009 Handbook of Metacognition in Education wrote this in his foreword to the handbook: “This handbook goes a long way toward capturing the state of the science and the art of the study of metacognition. It reveals great strides in the sophistication and precision with which metacognition can be conceptualized, assessed, and developed [and] covers the gamut, including research and development on metacognition across a wide variety of subject-matter areas, as well as in more abstract issues of theory and measurement . . . . It is truly a landmark work.”
Maybe there is some hope for deradicalization of radicals and prevention of radicalization in minds susceptible to it. Of course, that begs the question of whether radical beliefs are usually more harmful than beneficial. There appears to be at least some research on that point.[1] For at least some uninformed people, ‘common sense’ might suggest radicalism is generally not a good thing.
Footnote:
1. From the article, Radical Beliefs and Violent Actions Are Not Synonymous: How to Place the Key Disjuncture Between Attitudes and Behaviors at the Heart of Our Research into Political Violence: This article develops and elaborates on three core points. First, as with research into other social science themes, it is argued that it is necessary to apply the logic of correlation and causality to the study of political violence. Second, it highlights the critical disjuncture between attitudes and behaviors. Many or most individuals who support the use of political violence remain on the sidelines, including those who sympathize with insurgents in Afghanistan (reportedly 29 percent in 2011), and those supportive of “suicide attacks” in the Palestinian Territories (reportedly reaching 66 percent in 2005). Conversely, those responsible for such behaviors are not necessarily supportive of the ostensible political aims. Third, it is argued that the motives that drive these attitudes and behaviors are often (or, some would argue, always) distinct. While the former are motivated by collective grievances, there is substantial case study evidence that the latter are commonly driven by economic (e.g., payments for the emplacement of improvised explosive devices), security-based (i.e., coercion) and sociopsychological (e.g., adventure, status, and vengeance) incentives. Thus, it is necessary for the research community to treat attitudes and behaviors as two separate, albeit interrelated, lines of inquiry.
B&B orig: 12/21/18
Does The Human Mind Create Illusions?
Image from a Mike Huckabee political ad: the unconscious mind treats the cross in the background as a symbol of Christianity, and that instantly (within about 50-100 milliseconds) primes the unconscious mind to think positively or negatively about the accompanying message Huckabee is trying to convey
CONTEXT: In a recent discussion here, one commenter rejected the idea that the human mind can and does create illusions. The idea was dismissed as woo, whatever that is. The logic went something like this: After all, a person can see an object, e.g., my dog snoring on the floor, and know that it is real and it is where it is, i.e., on the floor in my living room next to the boxes of dynamite I keep for self-defense. Other people can see it too, so there is no illusion. Therefore, what is this illusion nonsense? Reasonable informed people cannot possibly believe in illusions.
Things like dogs (and boxes of dynamite) lying on the floor are not illusions, at least in the important respects that humans rely on to go about their lives. Obviously the human cannot perceive everything about the dog because we are limited to interpreting information our senses can detect with the level of sensitivity our senses operate on. But for everyday life, information beyond human senses can be considered irrelevant and the dog is not an illusion in any significant sense.
So, no illusion, right? And, please, don't throw optical illusions in my face. I'm talking about meaningful illusions, not parlor tricks.
Parlor trick
The sources of illusion: Actually, optical illusions such as illusory rabbit and invisible rabbit are more than parlor tricks. They show how the human mind creates illusions in real time. But that's not the focus here. This channel is focused on politics and what people think they see and how they think about what they think they saw. The illusions that smart political manipulators and propagandists routinely create to win hearts and minds are absolutely central to politics.
In his 1991 book, The User Illusion, science writer Tor Norretranders summarized some of the neuroscience and cognitive science knowledge on how the human mind works and its data processing capacity. Norretranders describes the experiments of Benjamin Libet from the mid-1980s showing that the mind creates a time shift illusion about when we think we make a decision and when our minds actually made the decision. It turns out that our unconscious mind usually (> 99% of the time ?) makes a decision (1) about 0.5 second before we are consciously aware of the decision, but (2) we consciously believe we made the decision about 0.5 seconds after it was actually made. In other words, the mind creates the illusion that the conscious mind is in control and making decisions when that simply is not true. Later research pushed the time shift illusion out to about 7-10 seconds for at least certain kinds of decisions.
The importance of recognizing the time shift illusion is that the human mind can create illusions all the time and easily, and it happens fast. Our unconscious minds do most of the perceiving, thinking and deciding before we are consciously aware.
Norretranders and later researchers describe the vast difference in data processing power between the unconscious and conscious minds. Current estimates for the unconscious mind put bandwidth at about 11.1 million bits of information per second, with most of that being discarded as trivial. This happens in real time all of the time. The unconscious mind does parallel processing and draws on unknown thousands of memories available to it, but not to consciousness. By contrast the conscious mind can only do serial processing at a maximum of about 500 bits per second and it works from a maximum of 9 memories in working memory at any given time. Unconsciousness is fast and effortless. Consciousness is slow, easily distracted, easily tired and is largely trapped by the unconscious mind. The consciousness trap is this: For politics-related information that contradicts or undermines existing political beliefs, morals and/or ideology, the conscious mind looks to defend beliefs and decisions the unconscious mind has made, even when the beliefs and decisions are objectively false or bad.
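The scale of the asymmetry Norretranders describes is worth making concrete; a quick back-of-the-envelope calculation using his own figures:

```python
# Norretranders' estimates: massively parallel unconscious processing
# versus the conscious mind's slow serial bottleneck.
unconscious_bps = 11_100_000   # ~11.1 million bits per second
conscious_bps = 500            # ~500 bits per second, at best

ratio = unconscious_bps / conscious_bps
print(f"unconscious/conscious bandwidth ratio: ~{ratio:,.0f}x")
# ~22,200x: consciousness works with a tiny, pre-filtered slice of
# what the mind actually processes.
```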
In his 2012 book, The Righteous Mind, social psychologist Jonathan Haidt describes the situation: “The mind is divided into parts, like a rider (consciousness) on an elephant (unconsciousness). The rider evolved to serve the elephant. . . . . intuitions come first, strategic reasoning second. Therefore, if you want to change someone’s mind about a moral or political issue, talk to the elephant first. . . . . moral intuitions (i.e., judgments) arise automatically [unconsciously] and almost instantaneously, long before moral reasoning [conscious reasoning] has a chance to get started, and those first intuitions tend to drive our later reasoning. . . . . The rider is our conscious reasoning—the stream of words and images of which we are fully aware. The elephant is the other 99 percent of mental processes—the ones that occur outside of awareness but that actually govern most of our behavior. . . . . We do moral reasoning not to reconstruct why we ourselves came to a judgment; we reason to find the best possible reasons why somebody else ought to join us in our judgment. . . . . The rider is skilled at fabricating post hoc explanations for whatever the elephant has just done, and it is good at finding reasons to justify whatever the elephant wants to do next. . . . . We make our first judgments rapidly, and we are dreadful at seeking out evidence that might disconfirm those initial judgments.”
In addition to the time shift illusion is the intuitive-emotional nature of cognition or perception. This applies to what we see and hear about politics-related information. In their 2016 book, Democracy for Realists: Why Elections Do Not Produce Responsive Government, social scientists Christopher Achen and Larry Bartels argue that existing evidence points to the inherent limitations of mental data processing and a process of perceiving and thinking about politics-related messaging. They describe the process like this: “. . . . the typical citizen drops down to a lower level of mental performance as soon as he enters the political field. He argues and analyzes in a way which he would readily recognize as infantile within the sphere of his real interests. . . . cherished ideas and judgments we bring to politics are stereotypes and simplifications with little room for adjustment as the facts change. . . . . the real environment is altogether too big, too complex, and too fleeting for direct acquaintance. We are not equipped to deal with so much subtlety, so much variety, so many permutations and combinations. Although we have to act in that environment, we have to reconstruct it on a simpler model before we can manage it.”
In their 2013 book, The Rationalizing Voter, social scientists Milton Lodge and Charles Taber discuss leading hypotheses about how the human mind deals with politics-related information: “We are witnessing a revolution in thinking about thinking. Three decades of research in the cognitive sciences, backed by hundreds of well-crafted behavioral studies in social psychology and new evidence from the neurosciences, posit affect-driven dual process models of thinking and reasoning that directly challenge the way we political scientists interpret and measure the content, structure, and relationships among political beliefs and attitudes. Central to such models is the distinction between conscious and unconscious thinking, with hundreds of experiments documenting pervasive effects of unconscious thoughts and feelings on judgment, preferences, attitude change, and decision-making.”
Another leading researcher commenting on Lodge and Taber’s book wrote this: “The central question in the study of political psychology and public opinion is whether citizens can form and update sensible beliefs and attitudes about politics. Though previous research was skeptical about the capacities of the mass public, many studies in the 1980s and early 1990s emphasized the potential merits of simple heuristics in helping citizens to make reasonable choices. In subsequent years, however, motivated reasoning has been impossible to avoid for anyone who follows either contemporary politics or the latest developments in psychology and political science. . . . . it is increasingly difficult for observers to defend micro-level attitude formation and information processing as rational or even consistently reasonable. Evidence continues to mount that people are often biased toward their prior beliefs and prone to reject counter-attitudinal information in the domains of both opinions and politically controversial facts.”
It is important to point out that when objective fact, truth and sound logic accord with existing beliefs, morals and ideologies, there is no need for the mind to significantly distort fact and logic, and it does not. In a real sense, the mind is a stubborn, easily self-deluded beast that wants the world to be what it wants it to be, even when that isn't the case. There is now a basis in neuroscience that explains this at least partly in terms of the strength of neural pathways and how neural pathways become strong: repeat a political lie or a piece of bogus reasoning (logic) often enough, and the mind often comes to accept it as true.
In summary, all of that research points to a mind that, for reasons related to evolution, routinely distorts politics-related information as it is perceived, and then unconsciously distorts that perception further so that it better fits with existing beliefs, morals and political and religious ideologies. To the extent perceptions, thinking and decisions conflict with objective fact and unbiased logic, the mind produces illusion. All of that distortion and illusion is also powerfully influenced, unconsciously once again, by society and social institutions that people are, for better or worse, trapped in. In his 1963 book, Invitation to Sociology, sociologist Peter Berger wrote: “Society not only controls our movements, but shapes our identity, our thought, and our emotions.” Social institutions are therefore, to a significant extent, “structures of our own consciousness.”
If one accepts the foregoing science and its description as more true and valid than not, then one can see a basis to argue that, at least for politics, there is some or even a great deal of illusion that people of all political persuasions are subject to. That is not a criticism of human intelligence. It is a statement of biological, psychological and sociological fact and illusion is grounded in human evolutionary heritage.
Although these comments are focused on politics, the evidence shows the same applies to other areas of human activity, especially religion, but also to varying degrees to science and commerce. If one comes to accept that description of the human mind, one can easily come to see humans, their societies, and all their limitations in a very different light.
B&B orig: 1/17/19