
DP Etiquette

First rule: Don't be a jackass.

Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, please provide sources for the facts and truths you rely on if asked. If emotion is getting out of hand, get it back in hand. To limit dehumanizing people, don't call people or whole groups of people disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion; insult makes people angry and defensive. All points of view are welcome: right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.

Sunday, August 11, 2019

Dissecting False Information: Some Know Truth Better Than They Admit

Disinformation: intentionally false or inaccurate information that is spread deliberately, with the intent to convince someone that untruths or lies are true

Misinformation: false information that is spread, regardless of whether there is intent to mislead, but research strongly suggests that essentially all politics-related information is intended to convince listeners of its truth

False information: information that is objectively false; it includes all disinformation and all misinformation

Expressive responding: intentionally reporting false assertions of trust in a partisan information source to signal support for a party or tribe, rather than to signal belief in the information the source reports; a person signals party or tribe loyalty by claiming that false information is true even when the person knows it is false; people who respond expressively choose to misreport their beliefs to show support for their political group

Motivated reasoning: applying little or no critical assessment to information that confirms or reinforces existing beliefs, ideology and tribal or social identity, while critically assessing information that contradicts or undermines them, making belief in such information difficult or impossible even if it is true; people tend to evaluate information that aligns with their views as more trustworthy and truthful, while tending to see misaligned information as untrustworthy and thus not truthful

An initial study, based on data from 400 participants, suggests two interesting findings about what influences how people respond to politics-related information.[1] The research was based on headlines about true political news stories that were asserted to have come from either the New York Times or Fox News; none of the stories actually came from either outlet. Research participants were asked whether they believed the headlines. The stories, all true, were selected based on prior research showing that people had a hard time telling whether they were true or false.

People were assigned to one of two groups. The control group reported whether they believed each of the 16 stories they were shown was true or false. The treatment group was paid a bonus of $1.60 for correctly stating whether 12 of the 16 stories were true or false.


The research was designed to determine the relative contributions of three different factors that could affect people’s trust in political information: (1) perceived institutional trustworthiness, e.g., NYT vs. Fox News, (2) motivated reasoning, and (3) expressive responding. The researchers write in their article: “While these mechanisms are not exclusive, it is important to estimate their separate impact to not conflate a crisis in trust in the media with a rise in political expressive behavior.”

The first finding is that the source of the news, i.e., institutional trustworthiness (NYT or Fox), was not an influential factor (p > 0.05), which was unexpected based on prior research. That data is summarized in the top and bottom right panels of the paper's figure 2. The institutional trustworthiness data from the control and paid groups indicated a lower level of influence than prior research had suggested: “The figure shows that participants from the left and right rated New York Times articles and Fox News articles as true at a similar rate (right panel).”



What mattered more was whether information tended to confirm or contradict existing ideology, beliefs and tribal identity (p < 0.0001). The $1.60 incentive to correctly assess stories as true or false increased accuracy, but the effect failed to reach statistical significance among left-leaning participants. That outcome puzzled the researchers, who expected similar expressive responding results from left- and right-leaning participants.

The authors conclude with this:
There is some good news in our study: we show that the bias that is introduced by evaluating a politically aligned source may not be as severe as has been widely believed. We offer some bad news as well: there is a large gap in evaluating headline claims, depending on whether they align with a person’s politics. Worse, this gap is not significantly reduced even when the claims are made by a publisher that aligns with participant’s political views.

CAVEATS: This research must be taken with a grain of salt and must be confirmed in a larger, follow-on replication study. The small influence of institutional trustworthiness and the failure to see expressive responding in left-leaning participants contradict prior results. The point of this discussion is not to assert the validity of this study. Instead, the point is to show (1) that researchers are highly focused on trying to understand the deadly serious problem of false information's influence on American politics, and (2) how tricky it is to tease apart the different cognitive and social factors that lead to false and irrational political beliefs and behaviors.[2]

Footnotes:
1. This research is preliminary. The published manuscript (free download here) has not been peer-reviewed. This research needs to be replicated and expanded on to confirm the results.

2. The researchers acknowledge possible sources of error in their study.
Our study is not without limitations. It is possible that the participant responses in our incentive treatment group do not present respondents’ truthful evaluations of the headlines, as we propose, but instead are their best guess of what the researchers might label as ‘true’ or ‘false.’ . . . . However, our post-experiment questionnaire and open-ended responses by participants did not provide any indication that such activity had taken place.

Second, our study was limited to a specific set of publishers and our choice may have affected the results’ generalizability. Of related, but lesser concern, is the potential effect of the specific articles we selected as our stimuli. We believe, however, that our selections were robust, as we relied on previous literature and pre-tests to arrive at balanced samples.

B&B orig: 2/20/19

The Tribalism of Truth



Morally objective or relative?: A religious pregnant woman seeking an abortion argues in court that her deeply held religious belief is that “a nonviable fetus is not a separate human being but is part of her body and that abortion of a nonviable fetus does not terminate the life of a separate, unique, living human being.” The woman argues that the establishment clause is violated by a state law forcing her to (i) wait 72 hours to have her abortion, and (ii) read a pamphlet stating that life begins at conception, which she argues is a nonmedical religious viewpoint she rejects as false.

Writing for Scientific American, cognitive scientist Matthew Fisher and colleagues raise the questions of whether and how polarized American political discourse affects perceptions of truth. Fisher is not asking whether being an objectivist shapes behaviors. For example, some research evidence shows that objectivists tend to shy away from relativists, or from objectivists with opposing beliefs, the hypothesis being that it's not worth listening to anyone who disagrees with the objectivist's personal beliefs. That point requires new research to answer.

Instead, Fisher asks this: Is it possible that when objectivists interact with people who disagree with their beliefs, they experience subtle mindset shifts that alter the degree to which they are objectivist about the challenged beliefs? Existing research is clear that people vary in their degree of relativism and objectivism. What is not yet known is whether or how mindsets change in response to belief challenges under various social circumstances.

The winning vs learning experiment: Fisher describes one experiment that he and his colleagues ran to begin answering the ‘mindset shift’ question. (Mindset shift is my term for the phenomenon; Fisher didn't label it.) In the win vs learn experiment, Fisher paired people with opposite views on abortion, gun control and other issues. The pairs engaged in an online conversation under one of two sets of instructions. The first group was instructed that the conversation was competitive and a winner would be assessed. The second group was instructed that the conversation was informational, intended to assess how well each participant came to understand the other's beliefs and the basis for them.

Not surprisingly, the online conversations in the first group sounded exactly like current, emotionally charged and polarized political rhetoric. They were mostly useless. By contrast, the second group's conversations had a civilized tone and generally revealed the reasons why people believed as they did.

The participants were then assessed for what effects, if any, could be detected in mindsets. Fisher asked: “But would these exchanges in turn lead to different views about the very nature of the question being discussed? After the conversation was over, we asked participants whether they thought there was an objective truth about the topics they had just debated.”

The tentative answer is yes: “Strikingly, these 15-minute exchanges actually shifted people's views [i.e., caused mindset shift]. People were more objectivist after arguing to win than they were after arguing to learn.”

Given that result, ‘arguing’ in the learning mode seems like a misnomer. When one is learning without the fact- and logic-destroying motivation to win, maybe it's better to call it conversing. In terms of brain biology, debating to win doesn't have the same biological effect as conversing to learn.

If the results here hold up to additional research and are found to be influential, there could be important implications for politics. First, Americans would do well to reject the winner-take-all attitude that increasingly characterizes polarized political debate and rhetoric. Second, one should acknowledge that the objectivist mindset has been actively fostered for decades by the two-party system, especially Republicans and their no-compromise ideology. That no-compromise mindset is now growing on the left, presumably in reaction to its rise on the right. That rejection of civility in favor of moral absolutes constitutes a profound betrayal of the American people and democratic norms. Unless one is an intractable moral objectivist,[1] it may also constitute a threat to American democracy and values.

Footnote:
1. To test whether you tend toward moral relativism or objectivism, here's a self-assessment test. “This short word problem has proven remarkably successful in assessing people's tendency to look at multiple possibilities, an indication of a relativist moral sensibility. Try the test and see in which camp you belong.”

The green blocks problem There are five blocks in a stack. In this stack, the second block from the top is green, and the fourth is not green. Is a green block definitely on top of a non-green block?
A. Yes
B. No
C. Cannot be determined
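For readers who want to check their answer after trying the problem, the logic can be verified by brute force: enumerate every coloring of the five blocks that satisfies the puzzle's constraints and test whether a green block always sits directly on top of a non-green one. (This sketch is illustrative only; it is not part of the original assessment.)

```python
from itertools import product

# Blocks indexed 0 (top) through 4 (bottom); True = green.
# Puzzle constraints: the second block (index 1) is green,
# the fourth block (index 3) is not green.
answers = []
for colors in product([True, False], repeat=5):
    if not (colors[1] and not colors[3]):
        continue  # skip colorings that violate the constraints
    # Is some green block directly on top of a non-green block?
    on_top = any(colors[i] and not colors[i + 1] for i in range(4))
    answers.append(on_top)

print(all(answers))  # True: the answer holds in every consistent stack
```

The key case is the unknown third block: if it is green, it sits on the non-green fourth block; if it is not green, the green second block sits on it. Either way, a green block is definitely on top of a non-green block.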

B&B orig: 2/4/18; DP 8/11/19

Trump's play on our cognitive biology



According to one observer, Scott Adams, the creator of the Dilbert comic strip, Donald Trump's rhetorical style is a masterpiece of persuasion. Consciously or not, Trump has mastered the art of speaking to our intuitive-emotional unconscious minds to persuade people to his side.

On his blog at dilbert.com, Adams describes his take on politics like this: “For new readers of this blog, my starting point is the understanding that human brains did not evolve to show us reality. We aren’t that smart. Instead, our brains create little movies in our heads, and yours can be completely different from mine.”

Adams is an aficionado of hypnosis and the art of persuasion via rhetorical tricks. He knows more than a little about human cognitive biology. Rhetorical tricks can fool our cognitive biological processes to create realities the speaker wants to create, regardless of how well or poorly tethered to objective reality they may be. Those tricks are persuasive to our unconscious minds. For the most part, the tricks bypass conscious reasoning.

After hearing Trump in the first primary debate, most people thought the performance was the death knell of his candidacy. By contrast, Adams saw in Trump's rhetorical style the makings of an election victory based on mastery of the art of persuasion. In an interview with Caroline Winter for Bloomberg Businessweek (Mar. 27 - Apr. 2, 2017 issue, pages 58-61), Winter writes of the debate: “In August 2015 viewers of the first Republican primary debate could be forgiven for thinking that Donald Trump was finished. “You’ve called women you don’t like fat pigs, dogs, slobs, and disgusting animals,” the moderator, Fox News anchor Megyn Kelly, said to him. “You once told a ‘Celebrity Apprentice’ it would be a pretty picture to see her on her knees. Does that sound to you like the temperament of a man we should elect as president?” Trump didn’t act contrite, or statesmanlike, as conventional candidates might have done. Instead, he interrupted Kelly with another nasty dig, about Rosie O’Donnell, and volunteered that he’d probably insulted others, too. Many pundits proclaimed that the response cemented Trump’s unelectability.”

Winter writes that Adams saw in Trump's performance “something different. In that moment, he realized that Trump might be a kindred spirit—a fellow “Master Wizard,” Adams’s term for experts in hypnosis and persuasion. Watching the debate alone at home, he grew excited. “I really got out of my chair and said, ‘Whoa, there’s something happening here that’s not like regular politics,’ ” Adams recalled. As he saw it, Trump had deftly defanged Kelly’s accusations by replacing them with a powerful visual: the iconic O’Donnell, “who is very unpopular among his base,” Adams said. “It was the most brilliant thing I’ve ever seen.” A week later, Adams published a blog post titled ‘Clown Genius,’ in which he wrote: “In the 3D world of emotion, where Trump exclusively plays, he has set the world up for the most clever persuasion you will ever see.”

The persuasive techniques that Trump uses include deft application of the powerful unconscious bias called anchoring[1] in a game of 3-dimensional emotional chess.

Finally, Winter observes in her article: “Of Trump, he [Adams] wrote: “There is an eerie consistency to his success so far. Is there a method to it? ... Probably yes. Allow me to describe some of the hypnosis and persuasion methods Mr. Trump has employed on you.” At a time when virtually the entire professional political class was convinced Trump would self-immolate, Adams’s essay reframed his actions as the deliberate work of a political savant. Trump, he wrote, was using such “Persuasion 101” tricks as “anchors,” “intentional exaggeration,” and “thinking past the sale” to wage “three-dimensional chess” against his opponents and the media, including Kelly and Fox News. “Now that Trump owns Fox, and I see how well his anchor trick works with the public,” Adams concluded, “I’m going to predict he will be our next president.” . . . . “My predictions are based on my unique view into Trump’s toolbox of persuasion. . . . I believe those tools are invisible to almost everyone but trained hypnotists and people that study the science of persuasion.””

Adams is a Trump supporter who sees his blog as performing a public service. He just might be right about that, given how well he describes why Trump's rhetorical style is so powerful.

Questions: Assuming that Adams is right and Trump is a master of persuasion, does that mean Trump will always use his powerful talent to work toward the right thing? In other words, must a Master Wizard always act in the public interest, or can there be White Hat, Black Hat and various shades of Grey Hat Master Wizards?

Footnote:
1. According to Wikipedia, “anchoring is a cognitive bias that describes the common human tendency to rely too heavily on the first piece of information offered (the "anchor") when making decisions. During decision making, anchoring occurs when individuals use an initial piece of information to make subsequent [unconscious] judgments. Once an anchor is set, other [unconscious] judgments are made by adjusting away from that anchor, and there is a bias toward interpreting other information around the anchor.”



B&B orig: 4/1/17; DP: 8/11/19

The 2016 cognitive bias codex



Cognitive science and bias research is fairly new. Most of the good stuff and fundamental observations arose from research since the 1960s. As the field continues to mature and knowledge becomes more nuanced, a more detailed, more accurate picture of human cognition is coming into view.

The list (codex) is an attempt to convey the richness of human biases and the biological and social bases they are grounded in. An excellent layman's overview of biases can be found at the site Better Humans. The author of the overview, Buster Benson, argues that human biases are designed to deal with one or more of four basic problems: “Information overload, lack of meaning, the need to act fast, and how to know what needs to be remembered for later.”

Followers of B&B might recognize some of those problems. The issue of lack of meaning in political rhetoric has been raised here several times; misunderstanding is often grounded in subjective or personal meanings or definitions for the core concepts people use in politics. The other big problem is information overload, which is made worse by fake news, alt-facts, and other tactics of deceit, all of which are aspects of core constitutionally protected political free speech. For sophisticated spinners and liars, there is a whole menu of biases to play on to create false facts and realities using bogus logic or reasoning. The possibilities for deceit are sobering, to say the least.

B&B orig: 4/14/17

Media Reporting and the Cognitive Biology of Terrorism

CONTEXT: The human mind evolved to assign greater importance, and stronger biological responses, to real and perceived threats. The survival benefit is obvious. The human mind also evolved to perceive reality through personal biasing lenses or mental processes. Important sources of reality biasing and simplification include personal morals, political ideology, universal innate biases such as confirmation bias and framing effects, innate and learned mental rules of thumb (heuristics) such as the anchoring and availability heuristics, and our social and group identities, e.g., race, political party affiliation, gender, religion, etc. These biasing lenses usually operate unconsciously (> 98% of the time?), so perceptions of reality and beliefs are often mistaken as arising from conscious reason. The degree of reality and logic distortion resulting from normal biasing is high, but it appears to be necessary for the human mind to make sense or coherence of a complex world from information that is usually far too limited for any rational basis for coherence.

FRAMING EFFECTS: After a terror attack resulting in murder and a claim of responsibility by a terrorist and/or terror group, US mainstream media sources routinely report the claim of responsibility. Characterizing an attack as a "claim of responsibility" frames it in a way that glamorizes the attack in the minds of individuals susceptible to terrorism's appeal. That is a serious, avoidable error that the mainstream media, and politicians, routinely make.

Instead, "claims of responsibility" for terror attacks should always be framed as something such as "an admission of guilt", "admission of murder", or a "confession to the slaughter of innocents." This mode of framing is emotional and it deprives terrorists and murderous publicity seekers of the glory that standard framing of an incident permits. As discussed previously, some (or all) cognitive scientists now argue that appeal to emotion is often or usually necessary for persuasion. Proper framing saps at least some terrorist recruiting power from an organization when societies universally think of these incidents in this negative frame.

Unconscionable, immoral, harmless error or no error (just free speech)?: Framing effects are innate (hard-wired), unconscious and powerful. Ignoring this well-known biological reality in public discourse about terrorism constitutes an error so unconscionable that one can reasonably argue it rises to the level of being immoral; it is beyond mere incompetence. Of course, even for those motivated to do so, proper framing would be difficult and would take time. Old habits are hard to break, and mental thought habits are no exception.

THE AVAILABILITY HEURISTIC: The availability heuristic is an unconscious, reality-simplifying bias[1] that gives undue cognitive weight or importance to the events or ideas most easily recalled, i.e., those most readily available to conscious thought. What is most easily recalled is usually what was most frequent and/or most recent; repeated recent exposures reinforce the bias.

The availability heuristic leads people to believe that an easily recalled event is more likely to happen again, and to apply to them personally, even when the statistical odds are low.

Although some, e.g., President Trump, have criticized the mainstream media for insufficient coverage of at least some terrorist attacks, empirical data suggest the opposite is true. Analysis of New York Times coverage shows far more coverage of terrorism than of other events that cause far more deaths. For example, from January 2015 through August 2016, about half of the Times' homicide coverage on its first three pages focused on terror attacks, despite the fact that over a 15-year period that included the 9/11 attacks, terrorist murders in America accounted for less than about 2% of all homicides.
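The scale of that disparity is easy to quantify using the approximate figures above. The exact shares vary by analysis, so treat these numbers as rough assumptions rather than precise values:

```python
# Rough figures from the analysis cited in the text: terrorism's share of
# page 1-3 homicide coverage vs. its share of actual homicides.
coverage_share = 0.50   # ~half of homicide coverage focused on terrorism
homicide_share = 0.02   # terrorism was ~2% of all homicides

overrepresentation = coverage_share / homicide_share
print(f"Terrorism gets roughly {overrepresentation:.0f}x the coverage "
      "its share of homicides would predict")
```

By that rough measure, terrorism receives about 25 times the coverage its share of homicides would predict.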

In the scheme of things, the risk of death from a terrorist attack on US soil is minuscule. Despite the low personal risk, a significant number of Americans nonetheless grossly overestimate it and frequently change their behavior to avoid what is an essentially nonexistent risk.[2] This grossly flawed thinking about risk spills over into politics and policy, and directly reflects the bias-induced, reality-disconnected error the availability heuristic unconsciously gives rise to.

If one accepts those facts and that logic, one can again argue that mainstream media coverage of terrorism and the flawed logic it induces in both American citizens and their elected leaders reflects incompetence by both the media and our leaders. Of course, that argument should be set in the context of a mainstream media that is under constant, severe economic pressures to simply survive. Survival means selling news content for profit.

For better or worse, humans are powerfully attracted to, and/or entertained by, violence, fear and anger. The media (and politician?) imperative that "if it bleeds, it leads" is firmly grounded in economic (and political?) reality. But even with that factor in mind, both the press and politicians usually do a dismal job of conveying the overall context, including consistently restating relative risk in reporting and in political discourse. If economics requires appeal to emotion and overreporting of attacks, one can argue there is an even higher obligation to report relevant context so that the availability and/or other biases don't distort reality more than is reasonable to expect.

Questions: Is the risk of an American civilian being killed by terrorism anywhere on Earth high (more than 1% per year), medium (0.1 to 0.99% per year) or low (less than 0.1% per year)?* Do politicians have a moral obligation to take statistical reality into account when talking about terrorism, or is politics a matter of the ends justifying any means (usually preferably legal, but sometimes illegal is OK too)? Should the mainstream media reframe terrorist attacks to the extent it makes cognitive sense to do so?

* A: Low, less than 1 in 100,000/year (< 0.001%).
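The conversion behind that parenthetical is simple arithmetic: a risk of 1 in 100,000 per year, expressed as a percentage.

```python
# 1 in 100,000 per year, expressed as a percentage
annual_risk = 1 / 100_000
print(f"{annual_risk:.3%}")  # prints 0.001%
```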

Information sources:
WNYC, On the Media, May 25, 2017 broadcast
Nemil Dalal, Priceonomics

Footnotes:
1. In essence, a reality simplifying bias is a way the human brain reduces the cognitive load needed to make coherence out of what a person sees, hears or otherwise experiences. Relative to the complexity of the real world, humans have astonishingly limited information processing bandwidth. Humans have no biological choice but to mentally simplify reality, even though errors (reality disconnects) frequently arise in the simplification process.

2. For example, after terrorist attacks in 2015, including an attack in Paris (130 murders) and in San Bernardino (14 murders), 53% of Americans changed their travel plans even though the risk of attack was nil.

B&B orig: 5/29/17

Cognitive Science: Conspiracy Theory Belief & Teleology Bias



A conspiracy theory

Teleology: the explanation of phenomena by the purpose they serve rather than by postulated causes; a reason or explanation for something in terms of its end, purpose, or goal. For example, teleological: hands were made (by God) for grasping things; non-teleological: hands evolved through evolution and came to grasp things. A teleological explanation includes a final cause or end goal to explain how some system or thing came into being. Teleological thinking is an aspect of thinking related to belief in creationism. It is also known as the argument from design, which argues for God's existence or for an intelligent creator. Teleological thinking is a powerful cognitive bias for people who tend to apply this form of thinking to the real world, and it has influenced religious thinking for millennia.

A team of European researchers recently published evidence that people who tend to accept conspiracy theories often employ teleological thinking as a basis for belief in conspiracies. It is important to note that the evidence amounts to a correlation, not an always-present cause and effect relationship. In other words, the evidence is that believing in final causes (teleological thinking) correlates with conspiratorial thinking.

In their article, the researchers write: “Teleological thinking — the attribution of purpose and a final cause to natural events and entities — has long been identified as a cognitive hindrance to the acceptance of evolution, yet its association to beliefs other than creationism has not been investigated. Here, we show that conspiracism — the proneness to explain socio-historical events in terms of secret and malevolent conspiracies — is also associated to a teleological bias. Across three correlational studies (N > 2000), we found robust evidence of a teleological link between conspiracism and creationism, which was partly independent from religion, politics, age, education, agency detection, analytical thinking and perception of randomness. As a resilient ‘default’ component of early cognition, teleological thinking is thus associated with creationist as well as conspiracist beliefs, which both entail the distant and hidden involvement of a purposeful and final cause to explain complex worldly events.”

In an article misleadingly entitled, ‘Scientists discover the reason people believe in conspiracy theories’, one mainstream media source discussed this research. Referring to the research, the article comments: “They found that conspiracy theorists are more likely to think ‘everything happens for a reason’ and things are ‘meant to be’, an approach they share with another group often considered extreme in their beliefs: creationists.

‘We find a previously unnoticed common thread between believing in creationism and believing in conspiracy theories,’ said Dr Sebastian Dieguez of the University of Fribourg, one of the researchers behind the study.

‘Although very different at first glance, both these belief systems are associated with a single and powerful cognitive bias named teleological thinking, which entails the perception of final causes and overriding purpose in naturally occurring events and entities.’”

In other comments on their data, the researchers observe: “Although teleological thinking has long been banned from scientific reasoning, it persists in childhood cognition, as well as in adult intuitions and beliefs. . . . . the ‘everything happens for a reason’ or ‘it was meant to be’ intuition at the heart of teleological thinking not only remains an obstacle to the acceptance of evolutionary theory, but could also be a more general gateway to the acceptance of antiscientific views and conspiracy theories.”

Prior research has shown other cognitive characteristics of people who tend to believe in conspiracy theories. For example, individuals who are intolerant of uncertainty and who seek cognitive closure share a trait called the need for cognitive closure. Evidence indicates that this trait seems to foster, or at least correlate with, conspiracy beliefs about events that have no clear official explanation.

This research represents another step in the incremental evolution of our understanding of the biology of how people perceive and think about issues in politics and other aspects of life. The mental processes that underpin our perceptions and thoughts are often heavily influenced by our innate biology. In turn, that biology is shaped by both nature and, probably more importantly, nurture. Our culture, families, social identities, personal morals and other factors are all at play in shaping the world we perceive, whether the perception is accurate or not.

Most of this thinking and bias influence arises unconsciously. We are simply not aware of these things, unless we are told about them. And, even when told, many or most people cannot effectively internalize the knowledge. Mindsets are very hard to change. Is that what God intended or is it what arose naturally from evolution?

B&B orig: 8/24/18