DP Etiquette

First rule: Don't be a jackass. Most people are good.

Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, please provide sources for the facts and truths you rely on when asked. If emotion is getting out of hand, get it back in hand. To limit dehumanizing people, don't call people or whole groups of people disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion. Insults make people angry and defensive. All points of view are welcome, right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.

Tuesday, October 8, 2024

Chapter review: Morality


The 2021 book, An Introduction to the Cognitive Science of Religion: Connecting Evolution, Brain, Cognition, and Culture, is a college undergraduate textbook by Claire White. Chapter 8 is entitled Morality. As the book’s title indicates, research on morality and religion is multidisciplinary. My main interest is the science of morality related to politics and what progress the field has made in the ~25 years since I became aware of the high importance of morality.

From what I can tell, the field has not progressed much. What progress there is seems to be a modestly increased understanding by some researchers of how little they know compared to what they used to think they knew. But that is not all bad. Dispelling misunderstandings is necessary for researchers to progress by reassessing their research and data.

Some points White makes about current beliefs among experts:
  • Morality is believed to have arisen from the human need to cooperate for survival, a human cognitive trait that is believed to predate religion. Children show signs of moral understanding and behavior at an early age, indicating that moral impulses are at least partly hardwired early on.
  • White and experts are unable to define morality, which arguably (in my opinion) makes it an essentially contested concept: “Although scholars differ in how they conceptualize the term morality, it is used here in a broad sense to refer to standards or principles about right or wrong conduct. .... Scholars seem to circumvent the definitional problems of studying morality by investigating ‘prosociality’ to mean behavior that furthers the interests of a particular group. Yet scholars are seldom explicit about what, precisely, their use of the term prosociality designates. The lack of upfront conceptualizations is especially problematic because, depending on which definition is used, the same behavior can be labeled as prosocial or not. For instance, murder and even genocide can be viewed as prosocial according to the evolutionary conceptualization of the term because they [at least sometimes] facilitate success in intergroup competition.”
  • Theists and non-theists show some overlap in moral beliefs, but religious social and moral influences sometimes make theist morality manifest differently than non-theist morality. Specifically, theists tend to be less trustful of members of other religions, and even more distrustful of non-theists. Theists tend to direct their moral impulses and behaviors to members of their own group. That sometimes leads to social division. By contrast, non-theists tend to apply their moral values universally to all people. Presumably the non-theist moral mindset and behavior tends to be less socially divisive, but White does not comment specifically on that point in Chapter 8.

White discusses one bit of possible progress, namely a view by some experts that morality is not a unitary cognitive whole, but instead moral judgments and beliefs are fragmented into different moral domains or values. Responses to those values are triggered by different things and weighed differently by different individuals. This line of thinking goes back to 2004, when Moral Foundations Theory (MFT) was initially proposed. MFT hypothesized the existence of several basic moral domains or values: care/harm, fairness/cheating, loyalty/betrayal, authority/subversion, sanctity/degradation, and later liberty/oppression. Values of equality and proportionality were added to MFT in 2023. Different people display different sensitivities to, or even degrees of rejection of, those moral values in their reasoning, beliefs and behaviors.

An overlapping variant of MFT that White calls Morality Fractionation has been proposed to broaden the MFT concept to include many more moral values or domains. MFT constitutes one specific implementation or example of the broader concept of morality fractionation. Both convey the general idea that distinct moral values, domains or mental modules exist.

My suspicion is that some or many individual moral values or domains at least partly overlap with other moral values or domains. If so, sensitivity to, or rejection of, one moral value may be able to influence a person's reaction to the triggering of a different value. That is a very messy state of affairs if it is real.


Germaine’s commentary & hubris
The cognitive science of morality still struggles with simply describing and understanding what morality is. As best I can tell, cognitive science is nowhere close to having any authoritative, or at least near-universal, prescriptive theory of how people ought to make moral judgments, much less what those judgments should be under various circumstances. My suspicion is that this goal will turn out to be something that humans can never achieve. If nothing else, (1) essentially contested concepts, (2) differing personal and cultural norms, and (3) dark free speech all stand squarely in the way of a universal moral theory.

There are a few semi-universal beliefs that White mentions, e.g., don’t murder or steal, but in my opinion that is infantile in its shallowness. It is inadequate, to put it mildly. My sense of White, maybe significantly reflecting the mindsets of experts generally, is that her mind is trapped by her academic circumstances and the cramped, constraining sociology and history of the cognitive science of morality. Assuming most experts want to improve and/or sustain a happy and peaceful human condition, I suspect that the human history of morality blinds experts to what the most important moral values actually are in modern societies. The experts keep looking back to the apes and young children. Instead, they need to look at least as hard, probably a lot harder, at modern adults as individuals and in groups, tribes, nations and political mindsets. Yes, political mindsets. (I'll circle back to this assertion shortly.)

Despite science’s limited knowledge and excruciatingly slow progress, I find that White’s overview of morality in chapter 8 is rather comforting. I strongly suspect that morality fractionation is on the right track and MFT is a part of the story, maybe a big part. But I admit to having serious bias in favor of the more recent morality fractionation hypothesis. What bias?

Two kinds of bias. First, the same bias that I asserted (without evidence) most experts have, i.e., a general desire to improve and/or sustain a happy and peaceful human condition. I bet if a survey was done, at least 95% of experts would agree with that assertion, larded with essentially contested concepts as it is. If that is true, and I bet it is, then there is no choice but to consider fundamentally different political mindsets as a central focus of the cognitive science of morality. What different political-moral mindsets? Pro-democracy and pro-authoritarian. Politics and morality cannot be separated. That is an inherent part of the human condition.

My second bias is that the moral foundations of pro-democracy and pro-authoritarianism are fundamentally different. Generally speaking, and as supported by human history, including the modern American MAGA movement, democracy exists and sustains itself only on the basis of reasonable acceptance of facts, true truths, sound reasoning and reasonable compromise (polluted with biases and moral judgments as those factors may be). All major forms of authoritarianism I am aware of, autocracy, plutocracy and theocracy, oppose or reject reasonable compromise, facts, true truths and sound reasoning, especially when they are inconvenient.

From that point of view, and in view of the morality fractionation hypothesis, there is good reason to think that (i) support for reasonable compromise, and (ii) fidelity to facts, true truths and sound reasoning (i.e., being less irrational) are core moral political values, separately or overlapping.

Yeah, that is hubris, but I think it is basically correct. My pragmatic rationalism anti-ideology ideology easily fits into the morality fractionation hypothesis. Fitting it into MFT is messier, but probably doable. It is possible that, in time, data from new research will collapse MFT and the morality fractionation hypothesis back into the earlier unitary morality hypothesis. But at present, that strikes me as unlikely.

For better or worse, we’re still awfully ignorant. But we ought to have a better grasp of the cognitive science of morality in another 20-30 years. Of course, I said that in 2017 in my original review of S.M. Liao’s 2016 book, Moral Brains: The Neuroscience of Morality (review reposted here in 2019).

Is it just me, or is the science of morality in a slowed time warp? . . . . Is morality science something humans can even coherently study? . . . . . grumble, grumble . . . . . . 🤨

Monday, October 7, 2024

Like Snowy, I need help…

In furtherance of promoting my grouch-bucket OP this morning, I’m forming a new group. I’m looking for the perfect name for our new society. Any ideas? I was thinking: 


“Grouch-Bucketers Unite!” 💪


“Foundation for the United States of Grouch-Bucketers”


“GB’rs Rule!” 🙏

(Motto: “There is literally NOTHING we won’t complain about.”)


(by PrimalSoup)


Go ahead…let it all hang out! 😁😮


(Oops, I meant this for Snowy’s Forum. But, on second thought, maybe this is the right place. 😁)

War Game

Might as well get it out of the way - I thought it was BS. I won't say why, everyone will have to decide for themselves.

THAT doesn't mean it isn't worth a look. It definitely is.

https://www.imdb.com/title/tt26681810/

At least check out the trailer in that link.

Simply, it prophesies what would happen if another insurrection occurred, and explores when a President should or should not activate the military under the Insurrection Act.

Why I thought it was BS is kinda irrelevant and I don't want to taint your enjoyment or opinion of the film.

Let's just say, WOW, the timing is perfect, with another deeply contested election coming up and the dangers of another attempted insurrection. The candidates featured in this film are not Harris or Trump, but fictional candidates. BUT the scenario of how an insurrection would play out and be handled comes across with a great deal of realism.

Give it a viewing. And if you are inclined, come back to this thread and give us your take on it. 





Biology & morality: Asymmetry in political lying; The mindset of some trolls

Researchers at MIT and Oxford published an interesting paper in Nature, Differences in misinformation sharing can lead to politically asymmetric sanctions. That research indicates that conservatives tend to share misinformation more than liberals. This arguably also applies to the spread of disinformation (see the 2nd part of this post below about trolls). What the boffins postulate is that asymmetric misinformation sharing, not political bias, is the main reason conservatives get canceled by social media sites more often: they violate policies against spreading misinformation more often, not because the sites are biased against them.

I believe that is probably true, but as usual, the results need to be repeated and confirmed. True or not, one can apply the moral logic of Sissela Bok and believe that liberal or conservative people who knowingly share misinformation online are more immoral (and sometimes evil) than people who do not knowingly spread misinformation. Bok's moral logic is simple:

Misinformation, lies, slanders, crackpottery and the like lead some people to base their beliefs and behaviors on false information and/or flawed thinking. That takes from deceived people their power to believe and act for themselves based on facts and sound reason. Sometimes, behavior grounded in deceit physically or financially harms people. Sometimes the harm amounts to literal death, e.g., from false belief in anti-vax lies and crackpottery.

How to assess people who unknowingly spread misinformation presents a somewhat different moral analysis. 

Also, authoritarians who believe that the ends, e.g., their side winning and gaining power and/or wealth, justify the means, e.g., spreading misinformation, lies, slanders and crackpottery, will say and/or sometimes actually believe that their tactics are at least as moral as those of people who are constrained by facts, true truths and sound reasoning.

The moral reasoning that those people assert is wrong. Flat out wrong, for the reasons that Bok laid out decades ago. And therein lies the most important moral distinction between authoritarianism (kleptocratic autocracy, plutocracy and/or theocracy) and democracy. A corollary is that the mindsets of most or nearly all chronic liars, slanderers and crackpotters are significantly more authoritarian (and kleptocratic) than democratic. People like this are not values voters in the context of democracy. They are the opposite, because their moral values are rotted.


In response to intense pressure, technology companies have enacted policies to combat misinformation. The enforcement of these policies has, however, led to technology companies being regularly accused of political bias. We argue that differential sharing of misinformation by people identifying with different political groups could lead to political asymmetries in enforcement, even by unbiased policies. We first analysed 9,000 politically active Twitter users during the US 2020 presidential election. Although users estimated to be pro-Trump/conservative were indeed substantially more likely to be suspended than those estimated to be pro-Biden/liberal, users who were pro-Trump/conservative also shared far more links to various sets of low-quality news sites—even when news quality was determined by politically balanced groups of laypeople, or groups of only Republican laypeople—and had higher estimated likelihoods of being bots. We find similar associations between stated or inferred conservatism and low-quality news sharing (on the basis of both expert and politically balanced layperson ratings) in 7 other datasets of sharing from Twitter, Facebook and survey experiments, spanning 2016 to 2023 and including data from 16 different countries. Thus, even under politically neutral anti-misinformation policies, political asymmetries in enforcement should be expected. (emphasis added)

_____________________________________________________________________
_____________________________________________________________________

The Conversation published an interesting article about the mindset that some trolls have when they are doing their troll thing:
Some online conspiracy-spreaders don’t even believe the lies they’re spewing

There has been a lot of research on the types of people who believe conspiracy theories, and their reasons for doing so. But there’s a wrinkle: My colleagues and I have found that there are a number of people sharing conspiracies online who don’t believe their own content.

They are opportunists. These people share conspiracy theories to promote conflict, cause chaos, recruit and radicalize potential followers, make money, harass, or even just to get attention.

In our chapter of a new book on extremism and conspiracies, my colleagues and I discuss evidence that certain extremist groups intentionally use conspiracy theories to entice adherents. They are looking for a so-called “gateway conspiracy” that will lure someone into talking to them, and then be vulnerable to radicalization. They try out multiple conspiracies to see what sticks.

When the Boogaloo Bois militia group showed up at the Jan. 6, 2021, insurrection, for example, members stated they didn’t actually endorse the stolen election conspiracy, but were there to “mess with the federal government.” Aron McKillips, a Boogaloo member arrested in 2022 as part of an FBI sting, is another example of an opportunistic conspiracist. In his own words: “I don’t believe in anything. I’m only here for the violence.”

In general, research has found that individuals with what scholars call a high “need for chaos” are more likely to indiscriminately share conspiracies, regardless of belief. These are the everyday trolls who share false content for a variety of reasons, none of which are benevolent. Dark personalities and dark motives are prevalent.

Plenty of regular people share content where they doubt the veracity, or know it is false.

These posts are common: Friends, family and acquaintances share the latest conspiracy theory with “could this be true?” queries or “seems close enough to the truth” taglines. Their accompanying comments show that sharers are, at minimum, unsure about the truthfulness of the content, but they share nonetheless. Many share without even reading past a headline. Still others, approximately 7% to 20% of social media users, share despite knowing the content is false. Why?

Often, folks are just looking for attention or other personal benefit. They don’t want to miss out on a hot-topic conversation. They want the likes and shares. They want to “stir the pot.” Or they just like the message and want to signal to others that they share a common belief system.

One can apply the same moral logic to trolls who knowingly spread false information for whatever reason. They are morally rotted. If people get hurt, they are evil. But I bet that most trolls like this don't care how society sees them or their moral character. After all, if they don't care about what they spew online, why would they care about their moral standing? These folks are not values voters. They are toxic parasites on democracy and society.