DP Etiquette

First rule: Don't be a jackass.

Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, please provide sources for the facts and truths you rely on if asked. If emotion is getting out of hand, get it back in hand. To limit dehumanizing people, don't call people or whole groups of people disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion. Insults make people angry and defensive. All points of view are welcome: right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.

Friday, November 15, 2019

Chapter Review: Arguments and Logical Fallacies

This is a review of Chapter 10, Arguments and Logical Fallacies, of Steven Novella's 2018 book, The Skeptic's Guide to the Universe: How to Know What's Really Real in a World Increasingly Full of Fake. In this chapter, Novella marches through basic logic and the kinds of logical fallacies that people tend to rely on to support their beliefs. The flaws are usually committed unconsciously. The content of this chapter seems timely in view of the completely contradictory facts and arguments the two sides in the impeachment inquiry are hurling at each other. Novella points out that, in situations like this, one or both sides can be mostly wrong, but both cannot be mostly right.

The point of chapter 10 is to try to lay out the skills needed for critical thinking, something that humans are usually not good at unless they try to be good at it. Novella asserts: “Arguing is something that everyone does but few understand. Yet arguing is an essential skill of critical thinking.” Fortunately, the understanding needed is easy to grasp. Unfortunately, it takes time and sustained effort to learn to apply it.

Basic terminology
Logical fallacy: A logical fallacy is a reasoning mistake or error that makes an argument invalid. All logical fallacies are non-sequiturs, which are arguments where the conclusion doesn't follow logically from what preceded it. Novella describes it like this: “A logical fallacy is an invalid connection between a premise and a conclusion, where the conclusion does not necessarily flow from the premise(s) but is argued as if it does.” The human mind did not evolve to do precise logic, and people make various kinds of mistakes unless they are aware of the errors and consciously try to avoid them. Instead of using formal logic, humans usually rely on informal logic.

Argument: An argument is what connects premises (facts) with conclusions (beliefs). Although people often see arguments as something to be won and beliefs as something to be defended, that isn't how Novella sees it. Instead, an argument is an effort to find reasoned truth, not win points. People engaged in argument can best find truth by establishing as much common ground as possible and then carefully proceeding to engage with belief differences.

Assertion: An assertion is a stated or argued premise or conclusion without supporting evidence.

Premise: A premise is an asserted fact or facts that an argument is based on. These days, many, arguably most, political disagreements are pointless because the people involved do not agree on the facts. There needs to be a logical connection showing the premises necessarily lead to the conclusion. If the premises are sufficient and true and the logic is valid (and thus the argument is “sound”), then the conclusion must be true. For completeness, a conclusion based on an unsound argument can be true or false, e.g., all spheres are pretty, therefore the sun is a sphere.

Novella makes an important point: “There is no way to objectively resolve a difference of opinion regarding aesthetics, for example.” Thus, to avoid bickering endlessly over inherently unresolvable differences, people can simply agree that the disagreement is unresolvable as a matter of aesthetics, moral choice and so forth. Inherent irresolvability appears to apply to many (most?) political disagreements where moral judgments are involved, e.g., what constitutes an impeachable act by a sitting president.[1]

Checking premises
When beginning to engage in argument, people would do well to first check their premises or facts. Four problems can occur: (1) the asserted facts or premises are simply wrong; (2) the asserted facts or premises are possibly wrong and not sufficiently verified; (3) a premise is hidden, e.g., evolution is false because there are not ‘enough’ transitional fossils, but the definition of transitional is different from the standard science definition, which makes the disagreement unresolvable; and (4) a premise is based on a subjective judgment, e.g., an information source is ‘reliable’ without an independent assessment, or a person asserts a premise that ‘feels’ correct.

Logical fallacies
1. Non-sequitur: All logical fallacies are non-sequiturs. The conclusion doesn't necessarily follow from the premises. In giving his version of economic conditions in the US a few weeks ago, the president tweeted: “Nobody has ever heard of numbers like that, so people want to find out: Why was it so corrupt during that election? And I want to find out more than anybody else.” Here, the non-sequitur was a false connection between the economy in October of 2019 and the 2016 election.

2. Argument from authority: Appeal to authority can be probative, but it needs to be used carefully. Some non-experts in climate science, like me, tend to point to expert consensus about global warming, the human role in it and options to reduce it. Consensus expert opinion does carry some legitimate weight, but sometimes consensus is wrong. Sometimes, the appealed-to authority really isn't an expert. Sometimes the appealed-to expert is an expert in one field but not the one at issue. Both undermine the persuasive power of the appeal.

3. Post hoc fallacy: This is among the most common fallacies. The fallacy goes like this: Since event Y followed event X, event Y must have been caused by event X. This argument is common in defenses of alternative medicines: “I took the pills and then felt better, therefore the pills worked.” The erroneous assumption is that because of their different positions on a timeline, the first event caused the second event.

The president used a post hoc fallacy when he asserted: “Since my election, Ford, Fiat Chrysler, General Motors, Sprint, SoftBank, Lockheed, Intel, Walmart and many others have announced that they will invest billions of dollars in the United States and will create tens of thousands of new American jobs.” Fact checkers found that those business decisions were made before the president was elected and were not due to his role as president.

4. Whataboutism (tu quoque): This fallacy argues that since someone or some group did something in the past, doing it now is justified. The president and his supporters sometimes defend actions the president takes as justified because Democrats did the same thing. From my point of view, the whataboutism tactic appears to lead to a downward spiral in civility and social norms. For example, the president asserted: “I will release my tax returns — against my lawyer’s wishes — when [Hillary Clinton] releases her 33,000 emails that have been deleted.”

5. Ad hominem fallacy: This is an argument that attacks the opponent or their motivations instead of their arguments or conclusions. Asserting that an opponent is closed-minded is a common form of this attack. Novella asserts that people accusing their opponent of being closed-minded tend to be “closed to the possibility that they are wrong.” That said, there are times when a person one is engaged with is in fact closed-minded.

6. Appeal to ignorance (proving a negative, ad ignorantiam): This is a fairly common fallacy based on a belief that something is true because it has not been shown to be false. Proving a negative can be difficult, and thus this fallacy can be difficult to deal with. For example, the president asserted the following to CNN about his election in 2016: “What PROOF do you have Donald Trump did not suffer from millions of FRAUD votes? Journalists? Do your job!” and “Pathetic – you have no sufficient evidence that Donald Trump did not suffer from voter fraud, shame! Bad reporter.”

7. False analogy: Two things that are similar in one way are falsely claimed to be similar in a different way. An example is the president's complaint about how he sees his treatment by Democrats: “All Republicans must remember what they are witnessing here — a lynching. But we will WIN!” The president is being investigated and criticized, but that is simply not the same as being lynched. The president's claim ignores the difference.

8. Slippery slope: This fallacy assumes that one action or policy will necessarily lead to other, worse outcomes. The mistake here is the belief that one action, e.g., a law that requires universal background checks for gun purchases, will lead inevitably to an extreme ultimate position, e.g., all guns in private hands will be taken away by force.

9. Straw man fallacy: Here, a person uses a weak version or caricature of an opponent's argument and then attacks that. The opponent may not even hold the asserted straw man position. Novella argues that critical thinking demands that the strongest version of an opponent’s argument should be assumed and addressed. Examples include assertions by the president that (1) Democrats “don’t mind executing babies AFTER birth” and (2) Democrats “have become the party of crime. [They] want to open our borders to a flood of deadly drugs and ruthless gangs [and] turn America into a giant sanctuary for criminal aliens and MS-13 thugs.”

The red herring fallacy is similar to the straw man, but it asserts a fact or premise that looks true but is either false or irrelevant. An example is the president’s Tweet two days after Attorney General Sessions recused himself from Justice Department investigations of Russian attacks on the 2016 election: “Terrible. Just found out that Obama had my ‘wires tapped’ in Trump Tower just before the victory.”

10. Tautology (begging the question): This fallacy relies on circular reasoning where the premise assumes the conclusion. Thus the argument is that since A = B, therefore A = B. The two instances of A = B tend to be worded differently, making them sometimes hard to spot. One example is the president’s argument that the impeachment inquiry is illegitimate because he did nothing wrong. Another example is expressed in a legal memo the president relies on in his own defense: “The President’s actions here, by virtue of his position as the chief law enforcement officer, could neither constitutionally nor legally constitute obstruction because that would amount to him obstructing himself.” That falsely argues the president cannot obstruct justice because the justice department works for him. Since the president tells the DOJ what to do, the memo argues, any action he takes is leading justice, not obstructing it.

There are other fallacies, but these account for most of the common ones.

Footnote:
1. Pragmatic rationalism compared to arguments & logical fallacies: For people familiar with the pragmatic rationalism anti-ideology ideology argued here from time to time, its moral basis will probably jump right out as being in full accord with logic and what critical thinking requires. Specifically, the first two moral values are (i) conscious effort to try to see facts with less bias or distortion, and (ii) conscious effort to try to apply less biased conscious reason (arguments) to the facts that people think they see. The broad scope of disagreements that are not logically or objectively resolvable accords with the idea asserted here many times that the best that people in civil, rational political disagreement can do is try to reach stasis, the point at which each understands why they disagree. In my experience, about 85% of disagreements arise from disagreements over facts.



Sunday, July 31, 2022

The science of propaganda, spin and doubt: A short summary

At the least, the information in this post should be mandatory knowledge both for a high school degree and for any post-high school credential. If a person does not know this, they are more susceptible to the dark arts than is justifiable in American democracy. -- Germaine, 2022


Context
Lots of books and thousands of research articles have been written on propaganda and why and how it works so well. Propaganda became sophisticated in America around the time of World War 1. To build public support for the war, President Woodrow Wilson created the Committee on Public Information. The CPI was a gigantic US government deceit and emotional manipulation machine. Tens of thousands of spinning con artists worked for it. Wilson's goal was to con the American people into supporting the war and feeling emotionally justified, e.g., making the world safe for democracy. Some of the greatest propagandists of the 20th century, maybe of all time, worked on that effort. It was a smashing success.

Wilson's massive public disinformation effort jump-started modern propaganda ("public relations") in support of businesses and commerce (discussed here). Business leaders, watching how effective propaganda could be at getting people to walk into a brutal war, quickly realized that good propaganda wasn't just for governments to use to deceive people into making the ultimate self-sacrifice. It could be used by businesses to deceive both customers and governments. It was, and still is, a freaking super rich gold mine chock full of diamonds, platinum, lithium and all the hot, juicy cheeseburgers that T**** could ever eat.


A short summary of propaganda tactics
In 2021, two researchers, Rebecca Goldberg and Laura Vandenberg, at the University of Massachusetts, Department of Environmental Health Sciences, School of Public Health and Health Sciences, published a very nice summary of spin or propaganda tactics from 5 major sources.[1] Their paper is entitled The science of spin: targeted strategies to manufacture doubt with detrimental effects on environmental and public health.

The paper's abstract includes these comments:
Results: We recognized 28 unique tactics used to manufacture doubt. Five of these tactics were used by all five organizations, suggesting that they are key features of manufactured doubt. The intended audience influences the strategy used to misinform, and logical fallacies contribute to their efficacy.

Conclusions: This list of tactics can be used by others to build a case that an industry or group is deliberately manipulating information associated with their actions or products. Improved scientific and rhetorical literacy could be used to render them less effective, depending on the audience targeted, and ultimately allow for the protection of both environmental health and public health more generally.

The list of tactics and the special interests that used them is shown below in Table 1 from the article. Table 2 lists the logic fallacies the propagandists tend to rely on.





Tactics or strategies 1, 2, 3, 8 and 21 were used by all five sources of deceit and doubt.
  • 1. Attack Study Design: Emphasize study design flaws in A** that have only minimal effects on outcomes; flaws include issues related to bias, confounding, or sample size
  • 2. Gain Support from Reputable Individuals: Recruit experts or influential people in certain fields (politicians, industry, journals, doctors, scientists, health officials) to defend B** in order to gain broader support
  • 3. Misrepresent data: Cherry-pick data, design studies to fail, or conduct meta-analyses to dilute the work of A
  • 8. Employ Hyperbolic or Absolutist Language: Discuss scientific findings in absolutist terms or with hyperbole; use buzzwords to differentiate between “strong” and “poor” science (i.e., sound science, junk science, etc.)
  • 21. Influence Government/Laws: Gain inappropriate proximity to regulatory bodies and encourage pro-B policy
** “A” refers to information generated to combat scientific evidence and facts
“B” refers to information generated to promote narratives that are favorable to the industry




Acknowledgement: Thanks to Larry Motuz for bringing the work of these two researchers to my attention.


Footnote: 
1. The researchers describe the five sources of propaganda like this:
The first, Big Tobacco, is widely considered to have “written the playbook” on manufactured doubt [1]. The tobacco industry has managed to maintain its clientele for many decades in part due to manufactured scientific controversy about the health effects of active and secondhand smoking [1, 2, 4, 6, 10,11,12,13].

The other industries we examined include the coal industry, whose employees often suffer from black lung disease [14], yet the industry has avoided awarding compensation to many affected miners by wielding disproportionate influence in the courtroom [15,16,17,18,19]; the sugar industry, which distracted from its role contributing to metabolic and cardiovascular diseases [20] by deflecting blame toward dietary fat as a plausible alternative cause for rising population-level chronic disease rates [21,22,23,24,25]; the agrochemical business, Syngenta, manufacturer of the herbicide atrazine [26,27,28], which conducted personal attacks against a vocal critic of atrazine whose research revealed disruptive effects on the endocrine systems of aquatic animals [29, 30]; and the Marshall Institute, a conservative think tank comprised of Cold War physicists eager to maintain their proximity to government, and associated scientists who deliberately misrepresented information to the government to both minimize and normalize the effects of fossil fuels on global temperatures [1, 4, 31].

Friday, August 9, 2019

Chapter Review: Reason

“So convenient a thing it is to be a reasonable creature, since it enables one to find or make a reason for everything one has a mind to do.” Benjamin Franklin, polymath, diplomat, humorist, Founding Father, 1706-1790

“Those who are governed by reason desire nothing for themselves which they do not also desire for the rest of mankind.” Baruch Spinoza, Dutch philosopher, 1632-1677



This discussion reviews Reason, which is chapter 21 of Steven Pinker's 2018 book, Enlightenment Now: The Case for Reason, Science, Humanism and Progress. Pinker is a professor of psychology at Harvard. In this chapter, he defends the use of reason as a necessary component of, among other things, any hope that politics can be more rational than it is now.

Pinker paints a picture of reason, defined roughly as evidence- and logic-based thinking, as (i) having been critical to the development of modern civilization, and (ii) in need of defense despite its obvious role in improving nearly all aspects of life for centuries. He goes on to address one area of life where reason has been significantly derailed in a “flaming exception” to progress, tribal politics, and how the problem might be ameliorated. He argues that it is possible to significantly defuse the irrationality of tribal politics in part by greater reliance on modern teaching methods in debiasing and critical thinking, and in part by having modern media and social institutions deal with politics in a less political and thus more rational way.

Digression - correcting an error: Pinker contradicts an assertion made here on several occasions, specifically that human minds can predict events only about 18-30 months into the future. That is wrong. Some people can predict events up to about 5 years into the future before their predictions fade into statistical insignificance. Accuracy seems to be highest for predictions up to one year out, and then it declines.[1]

Attacking anti-rationalism: Pinker goes directly after people who argue that rationalism is “a pretext to exert power”, that “all statements are trapped in a web of self-reference and collapse into paradox”, or that “it is futile to even try to make the world a more rational place.” He logically points out that “all these positions have a fatal flaw: they refute themselves. They deny that there can be a reason for those very positions.” His logic is simple: all of those arguments depend on a rational reason to believe any of their arguments, and as soon as “their defenders open their mouths to begin their defense, they have lost the argument, because in that very act they are tacitly committed to persuasion – to adducing reasons for which they are about to argue, which, they insist, ought to be accepted by their listeners according to standards of rationality that both accept.”

On this point, Pinker cites philosopher Thomas Nagel as arguing that “subjectivity and relativism regarding logic and reality are incoherent, because ‘one can’t criticize something with nothing’.” Pinker goes on to point out that even unhinged conspiracy theorists and spewers of alternative facts defend their indefensible beliefs and falsehoods with “Why should I believe you?” or “Prove it.” They never respond to reasonable questions or disbelief with “That’s right, there’s no reason to believe me.” or “Yes, I’m lying right now.” Everyone relies on reason, not as a matter of faith in reason, but as a matter of its unavoidable use.

We’re not always rational: Pinker next turns to an apparent reason to distrust reason, namely cognitive psychology, which is subject matter that informs and drives much of B&B’s ideology and advocacy. Among others, he cites Daniel Kahneman’s 2011 book, Thinking, Fast and Slow, with its now famous description of System 1, our fast, powerful and tireless unconsciousness, and System 2, our slow, weak and easily fatigued consciousness. Throughout his book, Pinker raises concerns about various biases, such as the availability heuristic, stereotyping, non-Bayesian thinking, and motivated reasoning, that sometimes lead to (1) false perceptions of reality, facts and truth, and/or (2) flawed reasoning that is applied to our perceptions, true or false:
But as important as these discoveries are, it is a mistake to see them as refuting some Enlightenment tenet that humans are rational actors, or as licensing the fatalistic conclusion that we might as well give up on reasoned persuasion and fight demagoguery with demagoguery. To begin with, no Enlightenment thinker ever claimed that humans were consistently rational. . . . . . What they argued was that we ought to be rational, by learning to repress the fallacies and dogmas that so readily seduce us, and that we can be rational, collectively if not individually, by implementing institutions and adhering to norms that constrain our faculties, including free speech, logical analysis, and empirical testing.

Here, Pinker seems to come close to giving up on the individual, and at least sees a possibility of greater rationality in collectivism and social institutions dedicated to rationality and norms that foster it. The norms he mentions are crucial. The rebuttal from the alternative fact and bogus logic populist and conservative crowd is obvious. It goes something like this: ‘I’ve got my free speech rights, you can’t touch them, and besides, I’m the truth teller and you are the evil liar and corrupter of all that is good, civilized and American.’ Pinker may very well be right that rationality will probably require social institutions and collective actions that render this irrational social- and self-identity no longer worth defending. That is a task for social institutions to play a major role in.

Pinker points to an aspect of evolutionary biology we have to accept and deal with, namely the deep human craving for reasons and explanations of the world. But there’s a catch:
Since the world is the way it is regardless of what people believe about it, there is a strong selection pressure for an ability to develop explanations that are true. Reasoning thus has deep evolutionary roots. . . . . But reality is a mighty selection pressure, so a species that lives by ideas must have evolved with an ability to prefer correct ones. The challenge for us today is to design an informational environment in which that ability prevails over the ones that lead us into folly.

He makes good points here. All the available research points to a powerful innate need to explain things, even when there isn’t enough information to do so. Unfortunately, we form beliefs without enough evidence all the time. If this explanation is basically true, it clearly reveals the origin of many false beliefs and the flawed logic that generates them. This phenomenon is rampant in politics.



Symbols of cultural allegiance – social identity is a tough nut to crack: Pinker turns to researcher Dan Kahan (Yale, legal scholar), who generated evidence that people who hold false beliefs often do so as a signal of cultural allegiance and of who they are. The fact that their beliefs are false is often not important enough to abandon them, even when they know a belief is false. This is a matter of a person showing social identity and, in this case, liberal, conservative or libertarian tribal affiliation:

A given belief, depending on how it is framed and who endorses it, can become a touchstone . . . . . sacred value, or oath of allegiance to one of these tribes. As Kahan and his collaborators explain: “The principal reason people disagree about climate change science is not that it has been communicated to them in forms they cannot understand. Rather, it is that positions on climate change convey values – communal concerns versus individual self-reliance; prudent self-abnegation versus the heroic pursuit of reward; humility versus ingenuity; harmony with nature versus mastery over it – that divide them along cultural lines.”

Pinker points out that in one sense, belief in obviously false ideas is rational in a person’s social context. People intuitively know that their opinions, e.g., on climate change, are not going to affect anything, but they do know that changing from climate science denial to acceptance (or vice versa) can make an enormous difference in their social standing with the tribe. Someone who flips on a sacred value can be seen as odd at best and, at worst, as a traitor to the tribe, and may be ostracized. Pinker observes that “Given these payoffs, endorsing a belief that hasn’t passed muster with science and fact-checking isn’t so irrational after all – at least by the criterion of the immediate effects on the believer.”

It’s hard to argue with that logic. Kahan sees this mess as people playing a role in a giant “Tragedy of the Belief Commons: what’s rational for every individual to believe (based on [self-]esteem) can be irrational for the society as a whole to act upon (based on reality).” Researchers call this self-defense phenomenon “expressive rationality” or “identity-protective cognition.” From this observer’s point of view, this explains a great deal about why it is so very hard, usually impossible, to change minds with facts that run counter to self-identity, self-esteem, and political beliefs, morals and values. For climate change deniers, it is a matter of those human factors, not science, and not science-based logic. Instead, it is a matter of self-preserving logic, with or without bad science to back it up. Of course, that ignores people and businesses in climate science denial mode for the money, and self-aggrandizers and blowhards in it for themselves in one way or another.

One quick note for completeness. Having education and expertise and employing conscious reason does not guarantee rationality. It can lead to more ingenious rationalizations instead of a search for truth. That bias is called motivated reasoning. Hence the quote by Ben Franklin above.



Predicting the future: Pinker discusses at length research that is the empirical basis of this observer’s favorite source of hope for mankind, Philip Tetlock (U. Pennsylvania, Annenberg Professor, https://www.sas.upenn.edu/tetlock/bio , book review 1 - https://disqus.com/home/discussion/channel-biopoliticsandbionews/book_review_superforecasting/ , book review 2 - https://disqus.com/home/discussion/channel-biopoliticsandbionews/book_review_expert_political_judgment_30/ ). Tetlock’s research is revolutionary. It revealed that people can learn to become more rational if they want to. Among other things, his finding of superforecasters who have an outstanding intuitive ability to predict future events (1) opened new fronts in warfare among nations, (2) intensified competition among businesses, and, at least for this observer, (3) revealed a major part of a pathway to partially rationalize politics. Pinker sets his discussion up and then leverages it like this:

Though examining data from history and social science is a better way of evaluating our ideas than arguing from the imagination, the acid test of empirical rationality is prediction. Science proceeds by testing the predictions of hypotheses . . . . Unfortunately the epistemological standards of common sense – we should credit the people and ideas that make correct predictions, and discount the ones that don’t – are rarely applied to the intelligentsia and the commentariat [the blithering class?]. Always wrong prognosticators [blowhards?] like Paul Ehrlich continue to be canvassed by the press, . . . . . The consequences can be dire: many military and political debacles arose from misplaced confidence in the predictions of experts . . . . . A track record of prediction also ought to inform our appraisal of intellectual systems, including political ideologies. . . . . A rational society should seek the answers by consulting the world rather than assuming the omniscience of a bloc of opinionators who have coalesced around a creed.

Pinker notes that we continue to hear from blowhards with dismal prediction track records because (1) no one is keeping score, and (2) blowhards are expert at couching their predictions in vagueness and hard-to-pin-down generalities. And on the rare occasions that a blowhard is shown to have been wrong and called out for it, they are superb at rationalizing their failure into insignificance, e.g., “I was almost right”, “I was wrong but for the right reasons”, “I would have been right but for that unexpected incident”, “I will be proven right next year”, “That wasn’t what I predicted, you got it wrong”, etc. Tetlock’s first book, Expert Political Judgment: How Good Is It? How Can We Know?, dives deep into the amazing ability of experts to deflect their dismal track records into nothingness. Pinker is absolutely right to pound on the dismal failures that experts have been, and mostly still are. The people and nations who have learned from Tetlock and take his research seriously are building competitive advantages over those who ignore him.

Trump is incompetent at picking personnel, unless you like incompetence: It may be of some interest to readers who made it this far (thanks for that) that Tetlock’s books cite two well-known people as examples of stunning, above and beyond the normal standard of expert failure: (a) Larry Kudlow, now President Trump’s Director of the National Economic Council, and (b) Michael Flynn, former high-ranking US intelligence officer and, briefly, Trump’s National Security Advisor. Before they came to power under Trump, Tetlock ripped Kudlow and Flynn to pieces as prime examples of America’s unfettered modern blowhardoisie. Trump certainly knows how to ‘pick the best people’, if by that he means the most incompetent – they are among the best at being the worst, especially Kudlow.

The superforecaster mindset: One final consideration deserves to be mentioned. Exactly who or what are these superforecaster people compared to non-superforecasters? Pinker describes it like this, and quoting Tetlock:
The forecasters who did the worst were the ones with Big Ideas – left-wing or right-wing, optimistic or pessimistic – which they held with an inspiring (but misguided) confidence: “As ideologically diverse as they were, they were united by the fact that their thinking was so ideological. They sought to squeeze complex problems into their preferred cause-effect templates and treated what did not fit as irrelevant distractions. . . . . As a result, they were unusually confident and likelier to declare things ‘impossible’ or ‘certain’. Committed to their conclusions, they were reluctant to change their minds even when their predictions clearly failed. They would tell us, ‘Just wait.’” Indeed, the very traits that put these experts in the public eye made them the worst at prediction. . . . . Tetlock’s superforecasters were: “pragmatic experts who drew on many analytical tools, with the choice hinging on the particular problem they faced. . . . . When thinking, they often shifted mental gears, sprinkling their speech with transition markers such as ‘however’, ‘but’, ‘although’, and ‘on the other hand’. They talked about probabilities, not certainties. And while no one likes to say ‘I was wrong’, these experts more readily admitted it and changed their minds.” . . . . . they are humble about particular beliefs, treating them as ‘hypotheses to be tested, not treasures to be guarded’. . . . . .They display what the psychologist Jonathan Baron calls “active open-mindedness”. . . . . Even more important than their temperament is their manner of reasoning. Superforecasters are Bayesian, tacitly using the rule . . . . . on how to update one’s degree of credence [confidence] in a proposition in light of new evidence. . . . . Two other traits distinguish superforecasters from pundits and chimpanzees. The superforecasters believe in the wisdom of crowds, laying their hypotheses on the table for others to criticize or amend and pooling their estimates with those of others. And they have strong opinions on chance and contingency in human history as opposed to necessity and fate. . . . . with the most accurate superforecasters expressing the most vehement rejection of fate and acceptance of chance.
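For readers unfamiliar with the Bayesian updating Pinker mentions, here is a minimal sketch of the idea (my own illustration with made-up numbers, not from Pinker or Tetlock): credence in a hypothesis is revised in proportion to how much more likely the new evidence is if the hypothesis is true than if it is false.

# Minimal sketch of a Bayesian credence update (illustrative numbers only)
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a hypothesis after seeing one piece of evidence."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Example: a forecaster starts at 30% credence that an event will happen, then sees
# a development that is twice as likely if the event is on track as if it is not.
print(bayes_update(prior=0.30, p_evidence_if_true=0.8, p_evidence_if_false=0.4))  # ~0.46

Repeating that update as evidence accumulates is, in effect, what the best forecasters are described as doing tacitly.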

This review will be followed by another discussion that focuses mostly on how Pinker sees a way forward toward more rational politics.

Footnote:
1. A personal misconception dispelled and a point about ideology: Pinker contradicts a belief that this reviewer held and expressed several times here at B&B. Specifically, I believed that researcher Philip Tetlock’s data showed the human ability to predict future events fades into statistical insignificance at about 18-30 months into the future. Pinker flatly contradicts that. He asserts that the best minds can predict future events with statistical significance out to about 5 years into the future, at which point accuracy fades:
Once again, there was plenty of dart throwing [referring to the essentially random guessing that characterizes the spew from nearly all experts, talking heads, politicians, partisans, special interests, pundits, and ideological blowhards], but in both tournaments [rigidly controlled tests of people’s ability to predict the future] the couple [Tetlock and his colleague] could pick out “superforecasters” who performed not just better than chimps and pundits, but better than professional intelligence officers with access to classified information, better than prediction markets, and not too far from the theoretical maximum. How can we explain this apparent clairvoyance? (For a year that is – accuracy declines with distance into the future, and it declines to the level of chance around five years out.)(emphasis added)

B&B orig: 2/8/19

Thursday, March 26, 2020

Thinking About the Morality of Less Biased Conscious Reasoning


Ethics: rules provided by an external source, e.g., written codes of conduct in workplaces, or professions, or principles or rules in religions

Free will: (i) the power of acting without the constraint of necessity, fate or uncontrolled biological imperative; (ii) the ability to act at one's own discretion; (iii) actions or behaviors that are not pre-determined by genetic, environmental or automatic unconscious responses to stimuli or information

Morals: an individual’s own beliefs regarding good and bad or right and wrong; morality is subjective; people do not always act in ways that accord with their morals

Virtue: (i) a characteristic of our true, natural self; (ii) sometimes, the quality of being morally good; (iii) properties of people who habitually act rightly, whether or not they are following a moral or ethical rule; some believe that virtues are subjective, while others believe that virtues are universal, and thus arguably more objective than subjective

Acknowledgment: This discussion was inspired by an excellent discussion that PD posted on his Books & Ideas blog,  Is Reflective Reason A Virtue?







Free will
Most experts believe humans have no free will based on a lot of empirical data that shows our behavior is dictated by the unconscious mind deciding what to do before we are consciously aware of the decision. Others believe we have at least some free will. It operates as a conscious decision to accept or reject automatic unconscious responses and resulting pre-determined behaviors. One researcher commented: “An unfree will may not be so hard to swallow if we have at least a free unwill.” In other words, human free will amounts to (1) conscious partial or complete veto power over what our unconscious mind wants to believe and/or do, and (2) conscious acceptance of what our unconscious mind wants to believe and do.

For this discussion to make sense, one has to assume that humans have some free will, at least when matters of ethics, morals or virtues are implicated. If we have no free will, as PD points out, then all virtuous behaviors, e.g., conscious reasoning, honesty, fairness or bravery in defense of others, are automatic. In that case, such behaviors cannot be said to be good or bad, or praiseworthy or blameworthy. Absent free will, human behavior just is what it is, leaving the conscious mind with no role in any of it. Speaking of good or bad in that scenario doesn't make much sense. One might like or dislike a certain uncontrolled behavior, but one cannot rationally assign goodness or badness to it.


Less biased conscious reasoning (LBCR)
LBCR is the second core moral value (virtue?) of the pragmatic rationalism anti-ideology ideology. From what I understand, it refers to about the same thing that PD and philosopher Nick Byrd call reflective reason. When one engages in LBCR, e.g., to consider an argument, a hypothesis or a proposed political policy, one is consciously reasoning in a more rational way than when one allows unconscious thinking to control. The unconscious mind is intuitive, emotional, moral, biased and usually tinged with some degree of intolerance, judgmentalism and tribalism.

That is the solution that evolution came up with as a means for the human brain-mind to deal with the world in the Pleistocene epoch, about 2.5 million to 11,700 years ago. That worked to keep humans alive in those times. In modern times, it arguably presents an existential threat to modern civilization and possibly even to the human species itself. Although human minds are probably about the same as they were in the Pleistocene, modern threats aren't the same. Most humans alive today do not worry about being attacked by lions or irate hippos.

Can LBCR be considered to be a moral or a virtue? Yes, if one accepts the following logic or reasoning. No, if one doesn’t.

1. The point of elevating it to the status of a moral value is that LBCR can counteract bad decisions the unconscious mind makes, based on how modern science understands what goes on when we deal with politics. The unconscious mind is susceptible to emotional manipulation, irrational appeals to personal morals, logical fallacies, biases and a host of other reality- and reason[1]-distorting human traits.

2. Personal experience indicates that most people (~99%) believe that (1) they base their politics on facts, valid truths, and LBCR, and (2) the political opposition does not. Evidence from empirical research shows that, for the most part, that is not true. But the near-universal belief that one should be fact-based and rational about politics is evidence that LBCR is better than the flawed thinking the opposition allegedly relies on.

3. If there is a widespread belief in a nation or society that X is better than not-X, then that could constitute at least one source of authority for considering LBCR to be a moral value.


Questions: Is it reasonable to believe that LBCR is a good moral value? Or, is it something else, e.g., a ‘desirable trait’?


Footnote:
1. Applying logic and applying reasoning to an issue are quite different modes of operation. The human mind did not evolve to use logic or be strictly rational in most situations. It evolved to reason about things and apply a soft or fuzzy rationality, usually based mostly (~99%?) on what the unconscious mind thinks, believes and decides. The unconscious mind gets things right most of the time and there's no problem. It still works great for most things. But when dealing with politics, with all of its complexity, opacity, deceit, appeals to logic fallacies, manipulation, misinformation and disinformation, the unconscious mind is mostly out of its depth. We did not evolve minds that can deal rationally with the underlying complexity and subjectivity of things in politics, including objective facts.



Nuclear submarine and tugboat

Sunday, October 11, 2020

Climate Science Denial: The Motte-and-Bailey Logic Fallacy

The motte is the structure on the high ground and the bailey is below and inside the fenced area; the bailey is easier to attack than the motte (10th century technology).


Wikipedia: The motte-and-bailey fallacy (named after the motte-and-bailey castle) is a form of argument and an informal fallacy where an arguer conflates two positions which share similarities, one modest and easy to defend (the "motte") and one much more controversial (the "bailey").[1] The arguer advances the controversial position, but when challenged, they insist that they are only advancing the more modest position.[2][3] Upon retreating to the motte, the arguer can claim that the bailey has not been refuted (because the critic refused to attack the motte)[1] or that the critic is unreasonable (by equating an attack on the bailey with an attack on the motte).


Employing logic fallacies to deceive, distract, disinform and so forth is a common tactic among purveyors of dark free speech or epistemic terrorism. In the vice presidential debate, Mike Pence used the motte-and-bailey fallacy to deceive and confuse people about climate change. At the Neurologica blog, Steve Novella explains it nicely:
“Pence represented the typical denial strategy. He started by saying that the climate is changing, we just don’t know why or what to do about it. This is the motte and bailey fallacy in action – pull back from the position that is untenable to defend an easier position, but don’t completely surrender the outer position. Pence was not about to deny that global warming is happening at all in that forum because he would be too easily eviscerated, so he just tried to muddy the waters on what may seem like an easier point.

But of course, he is completely wrong on both counts. We do know what is causing climate change, it is industrial release of CO2 and other greenhouse gases. At least there is a strong consensus of scientists who are 95% confident or more this is the major driver, and there is no tenable competing theory. That is what a scientific fact looks like. We also know what to do about it – decrease global emissions of CO2 and other greenhouse gases. And we know how to do that – change our energy infrastructure to contain more carbon neutral sources with the goal of decarbonizing energy. Change our transportation industry as much as possible over to electric (or perhaps hydrogen) vehicles. Advance other industrial processes that release significant amounts of CO2. And look for ways to improve energy efficiency and sequester carbon efficiently. It’s not like there aren’t actual detailed published plans for exactly what to do about it.

Pence, however, will rush from his perceived motte into the bailey of total denial when he feels he has an opening. So he also said that the “climate change alarmists” are warning about hurricanes, but we are having the same number of hurricanes today as we did 100 years ago. This is not literally true (there were six hurricanes so far this year in the North Atlantic, and four in 1920), and it looks from the graph like there is a small uptick, but let’s say it’s true enough that statistically there isn’t a significant change in the number of hurricanes. This is called lying with facts – give a fact out of context that creates a deliberately false impression. In this case the false impression is also a straw man, because climate scientists don’t claim that global warming increases the number of hurricanes. They claim (their models predict) that warming increases the power and negative effects from the hurricanes that do occur.

Pence next tried to take credit for dropping CO2 release from the US, as if this is tied to pulling out of the Paris Accord. It is true that CO2 emissions are decreasing, but this is a trend that has been fairly linear since 2005. Between 2005 and 2018 US CO2 emissions dropped 12%. This is largely due to shifting energy production to less CO2 producing methods, including rising renewables. But also, I will acknowledge, this is partly due to a shift from coal to natural gas. There has been a huge drop in coal as a percentage of US energy. Pence selectively used this fact to defend natural gas, glossing over the fact that this is a greater knock against coal, which he does not want to criticize.

Admittedly a live debate is not the place to get into all these details, but pretty much everything Pence said on the climate was misleading and tracked with fossil fuel industry talking points rather than the scientific consensus.”

A few things merit comment.

First, Trump, Pence and the GOP generally have been ruthlessly using logic flaws, lies and deceptive rhetoric for decades to confuse people and sow doubt in the face of contrary climate science evidence they cannot refute using either evidence (facts) or sound reasoning (~logic). Since they do that with climate science, it seems reasonable to believe that they would do that for all other things they dislike or want to deny, science-related or not.

Second, special interests with threatened economic interests have been doing the same thing for decades. 

Third, conservative politicians and special interests who distort or deny realities based on science or anything else are deeply immoral in their unwarranted distortions and denials. In this regard, they are moral cowards.

Monday, August 3, 2020

The Human Mind and the Hot-Cold Empathy Gap

Prior research has shown that people mispredict their own behavior and preferences across affective states. When people are in an affectively “cold” state, they fail to fully appreciate how “hot” states will affect their own preferences and behavior. When in hot states, they underestimate the influence of those states and, as a result, overestimate the stability of their current preferences. The same biases apply interpersonally; for example, people who are not affectively aroused underappreciate the impact of hot states on other people’s behavior. After reviewing research documenting such intrapersonal and interpersonal hot–cold empathy gaps, this article examines their consequences for medical, and specifically cancer-related, decision making, showing, for example, that hot–cold empathy gaps can lead healthy persons to expose themselves excessively to health risks and can cause health care providers to undertreat patients for pain. -- George Loewenstein, Carnegie Mellon University, Health Psychology, Vol. 24, No. 4 (Suppl.), S49–S56, 2005 [1]


The Hot-Cold Empathy Gap
An NPR broadcast of Hidden Brain discussed research on strong physiological states (hunger, sexual arousal, pain) and emotional states (fear, anger, disgust) that can move people's minds from cold states to hot states. In hot states, physiology and/or emotions control, and at the same time memory of cold state knowledge and logic or reasoning is unavailable to shape behavior. In hot states, things just happen, and sometimes (usually?) they are bad or dumb things.

The comments below are mostly based on the broadcast from the start to about 20:40 and ~50:00 to 53:00. Maybe most people here will already understand all of this. Nonetheless, it should help to keep this important aspect of the human mind in easily accessed memory.


People in a cold state tend to misjudge what their behavior would be when they are in a hot state. Men's behavior when sexually aroused changes compared to when not aroused. When arousal passes, people appear to forget and downplay the intensity of the hot state. Studies show that after experiencing a hot state and returning to a cold state, people are generally worse at predicting what their behavior would be if they returned to the hot state.

The data indicates that the hot-cold empathy gap works two ways across time, prospectively and retrospectively. The prospective gap leads people to misjudge their future behavior if they re-experience a hot state they have experienced before, such as sexual arousal. The hypothesis here is that the memory people have of their own hot state experience is softened or distorted, leading them to misjudge both their past selves and their future hot state behavior.

The retrospective empathy gap is also hypothesized to involve the same memory tricks, which can happen literally within a minute or two of a hot state situation such as feeling pain. People who experienced pain and then had the pain source withdrawn immediately misjudge and overestimate their ability to handle the same pain again. The same phenomenon applies to hunger, addiction and depression. The cold state mind and what it knows is unable to access the hot state mind, making the hot state version of a person incomprehensible. The hot state mind cannot access the cold state logic. One woman, Irene, in a cold state said this about her own hot state sexual arousal experiences: "I don't know that girl."

That was cold Irene talking about hot Irene.

This phenomenon also applies to other people. The empathy gap can literally blind us to how other people feel and why they do some of the things they do.


The Empathy Gap and Politics
Maybe this restates the obvious, but it still is worth saying. When politicians, special interests, ideologues and others (collectively, 'bad people') use dark free speech (lies, deceit, emotional manipulation) to create false realities, leverage flawed reasoning and win support, they are generally trying to push listeners into a hot state. Fear is probably the most powerful emotion that bad people have in their dark free speech arsenal. Anger, bigotry, disgust, distrust and intolerance are other powerful emotions that bad people play on to try to foment hot states and irrationality.

People in hot states are more susceptible to lies, deceit and flawed reasoning, including logic fallacies. That is why it is important to at least try to maintain emotional control when engaging in politics. And when control is lost, it is usually best to walk away until control is regained. The cooling off period can be very useful to help maintain rationality, even if it requires backing away overnight.


Footnote:
1. Loewenstein also writes:
"Affect has the capacity to transform us, as human beings, profoundly; in different affective states, it is almost as if we are different people. Affect influences virtually every aspect of human functioning: perception, attention, inference, learning, memory, goal choice, physiology, reflexes, self-concept, and so on. Indeed, it has been argued that the very function of affect is to orchestrate a comprehensive response to critical situations that were faced repeatedly in the evolutionary past (Cosmides & Tooby, 2000)."


Friday, August 9, 2019

Some Thoughts on Political Reasoning and the Rationality and Morality of Politics

Stuff just keeps falling on the trail

Political reasoning (Germaine's definition, v. 1.0): Unconscious and conscious thinking about political issues and policies in view of cognitive and social psychological factors, including perceptions of relevant reality, truths and facts, personal ideology, personal morals, ethics or values, self-identity, social identity, and social institutions and norms the individual identifies with; it can be mostly rational by being reasonably based on significantly or mostly true objective reality, truths and facts and thinking or logic that reasonably flows from objective reality, truths and facts; it can be mostly irrational by being based on significantly or mostly false perceptions of truths and facts and/or significantly or mostly flawed thinking or logic, wherein what is reasonable or not is assessed from the point of view of service to the public interest (as I tried to 'objectively' define the concept)



In his 2018 book, Enlightenment Now: The Case for Reason, Science, Humanism and Progress, psychologist Steven Pinker discusses some context and suggests some tactics that might help rationalize politics to some extent relative to what it is now. This discussion is based on chapter 21, Reason.

He argues that although humans operate with cognitive and emotional biases that sometimes lead to error, that does not mean that (i) humans are completely irrational, or (ii) there is no point in trying to be more rational in our thinking and discourse. He argues that both ideas are false. Bias and error happen, but not all the time; if they did, it would be impossible for anyone to even say we are subject to bias and error. He argues: “The human brain is capable of reason, given the right circumstances; the problem is to identify those circumstances and put them more firmly in place.”

Fact checking: Pinker asserts that despite a common perception of America being in a ‘post-truth era’, that is false because societies have always been subject to lies, deceit, unsupported conspiracy theories, mass delusions and so forth. He points to the rise of fact checking in response to the rise of Trump as evidence of social progress. Poll data indicates that about 80% of the public is open to the idea of journalists questioning politicians, pundits and special interests about fact accuracy in live interviews. Fact-checking is increasingly popular with the public, and complaints are increasing in cases where fact checking is not made available.

In that regard, Dissident Politics is at or near the leading edge in advocating public refusal to listen to sources with an undeniable track record of chronic lying unless real-time or near real-time fact checking is provided. The cognitive power of lies is too great to allow them to go unchallenged for any significant period of time. It makes sense to prefer a linguistic tactic called the truth sandwich to blunt at least some of the cognitive power of lies and deceit.

Moral irrationality: Pinker points to steady social progress, citing the court case Loving v. Virginia, 388 U.S. 1 (1967), a Supreme Court civil rights decision that struck down all state laws banning interracial marriage. He asserts that “moral irrationality” can be outgrown. By casting bans on interracial marriage as morally irrational, he incorporates conceptions of what is morally rational and what isn't into his conception of social progress. That is an important point because it correctly sees politics as a matter of not just ice-cold facts and logic, but also hot moral values.

The affective (emotional-moral?) tipping point: Pinker argues about rationality and mindset change:

Whenever we get upset about the looniness of public discourse today, we should remind ourselves that people weren't so rational in the past, either.

Persuasion by facts and logic, the most direct strategy, is not always futile. . . . . Feeling their identity threatened, belief holders double down and muster more ammunition to fend off the challenge. But since another part of the human mind keeps a person in touch with reality, as the counterevidence piles up the dissonance can mount until it becomes too much to bear and the opinion topples over, a phenomenon called the affective tipping point.

Pinker goes on to point out that once something becomes ‘public knowledge’, disbelievers begin to hit their personal affective tipping points and change their minds. That is in accord with evidence that Americans who disbelieve human-caused climate change are slowly changing their minds, one at a time. But that sort of mindset change also depends on each person's subjective cost-benefit assessment of the social damage they will incur for changing their minds. As one can see, assigning rationality and irrationality to political thinking is very complicated and fraught with ambiguity. That complexity and ambiguity is the fertile soil in which tyrants, liars, kleptocrats, oligarchs, deceivers, mass murderers and other brands of bad leaders take root and grow. Therein lies the main source of the unnecessary misery, poverty, racism, bigotry, hate and bloodshed that litters human history. That is an inescapable aspect of what it is to be biologically, psychologically and sociologically human.

Debiasing thinking and fostering critical thinking: Pinker observes that the “wheels of reason turn slowly” and it makes sense to apply torque to two sources of influence, public education and the professional media. He observes that although some or many people have been arguing for better teaching of critical thinking for decades, that job is tough:

People understand concepts only when they are [forced] to think them through, to discuss them with others, and to use them to solve problems. A second impediment to effective teaching is that pupils don't spontaneously transfer [what they learn] from one concrete example to others in the same abstract category. . . . . With these lessons about lessons under their belts, psychologists have recently devised debiasing programs that fortify logical and critical thinking curricula. They encourage students to spot, name and correct fallacies across a broad range of contexts. . . . . Tetlock has compiled the practices of successful forecasters into a set of guidelines for good judgment . . . . These and other programs are provably effective: students' newfound wisdom outlasts the training session and transfers to new subjects.

This is extremely encouraging because it says that at least some people can learn to be more rational if they want to, and that mental traits which facilitate rational, critical thinking have been identified and can thus be directly addressed in teaching. There is no data showing that only some people can become more politically rational. If Peter Berger, in his brilliant little 1963 book Invitation to Sociology, is right, there is nothing this observer can see that prevents the building of powerful social institutions that hold objective facts, less biased political reasoning and critical thinking as the highest moral or ethical values.

Some such institutions may exist now, probably mostly scattered, fragmented academic groups, but they are not yet powerful influences on mainstream American politics and society. That needs to change. Those institutions need to be built ASAP.

Along those lines, there is reason for encouragement. Pinker argues:
Many psychologists have called on their field to “give debiasing away” as one of its greatest potential contributions to human welfare. . . . . As one writer noted, scientists often treat the public the way Englishmen treat foreigners: they speak more slowly and more loudly. Making the world more rational, then, is not just a matter of training people to be better reasoners and setting them loose. . . . . Experiments have shown that the right rules can avert the Tragedy of the Belief Commons [what’s rational for every individual to believe can be irrational for the society as a whole to act upon] and force people to dissociate their reasoning from their identities. . . . . Scientists themselves have hit on a new strategy called adversarial collaboration, in which mortal enemies work together to get to the bottom of an issue, setting up empirical tests that they agree beforehand will settle it (citing Psychological Science, 12, 269-275, 2001).

From this observer’s point of view, Pinker is right that if psychologists can teach debiasing, it would be one of the field's greatest potential contributions to human welfare. Things are not as bleak as the news would have it. Humans still have a chance to outgrow their self-destructive tendencies, even if the toll along the way is in the hundreds of millions or billions of lives.

So, is that assessment too optimistic? Or, are humans doomed to an ultimate fate of enslavement, misery and maybe even self-annihilation with complete species extinction?



B&B orig: 2/15/19

Monday, August 8, 2022

Good news from science

This is a really big deal. The NIH is now funding efforts to enhance scientific rigor. This should be a game changer. I hope it's not too little or too late. Steve Novella at Neurologica writes:
This is a great idea, and in fact is long overdue. The NIH is awarding various grants to establish educational materials and centers to teach principles of scientific rigor to researchers. This may seem redundant, but it absolutely isn’t.

At present principles of research are taught in basic form during scientific courses, but advanced principles are largely left to individual mentorship. This creates a great deal of variability in how well researchers really understand the principles of scientific rigor. As a result, a lot of research falls short of scientific ideals. This creates a great deal of waste in the system. NIH, as a funding institution, has a great deal of incentive to reduce this waste.

The primary mechanism will be to create teaching modules that then can be made freely available to educational and research institutions. These modules would cover: 

“biases in research; logical fallacies around causality; how to develop hypotheses; designing literature searches; identifying experimental variables; and reducing confounding variables in research.”

Sounds like a good start. The “biases in research” is a broad category, so I’m not sure how thorough coverage will be. I would explicitly include as an area of education – how to avoid p-hacking. Perhaps this could be part of a broader category on how to properly use statistics in research, the limits of the p-value, and the importance of using other statistical methods like effect sizes and Bayesian analysis.
Prior research has shown that when asked about their research behavior, about a third of researchers admit (anonymously) to bad behavior that amounts to p-hacking. This is likely mostly innocent and naive. I lecture about this topic all the time myself, and I find that many researchers are unfamiliar with the more nuanced aspects of scientific rigor.  
And of course, once the NIH requires certification, this will almost certainly make it uniform within academia, at least on the biomedical side. Then we need other research granting institutions to replicate this, also requiring certification. It basically should become impossible to have a career as a researcher in any field without some basic certification in the principles of research rigor.
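Novella's p-hacking point is easy to see with a quick simulation. The sketch below is mine, not Novella's and not part of the NIH modules; it uses only the Python standard library, and the numbers are illustrative assumptions. When there is no real effect, testing a single pre-specified outcome yields a "significant" p < 0.05 result about 5% of the time, but testing several outcomes and reporting whichever one happens to cross the threshold inflates that false-positive rate several-fold.

# A minimal sketch (my illustration, not from Novella or the NIH) of how
# checking many outcome measures inflates false positives when no real
# effect exists. Python standard library only.

import math
import random
import statistics

def one_null_p_value(n=30):
    # Simulate one two-group comparison where both groups come from the same
    # distribution (no real effect) and return an approximate two-sided
    # p-value from a normal (z) approximation.
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    se = math.sqrt(statistics.variance(a) / n + statistics.variance(b) / n)
    z = (statistics.mean(a) - statistics.mean(b)) / se
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def false_positive_rate(outcomes_per_study, studies=2000, alpha=0.05):
    # Fraction of null studies that produce at least one "significant" result
    # when a researcher checks several outcomes and keeps whichever one works.
    hits = sum(
        1 for _ in range(studies)
        if any(one_null_p_value() < alpha for _ in range(outcomes_per_study))
    )
    return hits / studies

random.seed(1)
print("1 outcome checked:", false_positive_rate(1))    # roughly 0.05
print("5 outcomes checked:", false_positive_rate(5))   # roughly 0.20-0.25
print("10 outcomes checked:", false_positive_rate(10)) # roughly 0.40

That inflation is the quiet, often innocent behavior Novella describes, and pre-registering a single primary outcome, or correcting for multiple comparisons, is the standard antidote a certification module could teach.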
OMG, someone outside Dissident Politics is actually taking logic fallacies seriously? I must have died and got reluctantly shoved up to heaven. Next after science, politics needs to tackle this same plague on democracy, humanity and civilization.

No, it is not the case that science and politics can be dealt with in the same way. They are different. But the data and reasoning behind politics can be subjected to the same kind of rigor, if politics is to be based more on fact and sound reasoning than it is now. Opinions will still differ, but differences rooted in irrationally disputed facts, e.g., stolen election vs. not stolen, ought to be significantly reduced. Everyone doing politics firmly believes their politics is based on real facts and sound reasoning. A lot of research indicates that just is not true for most people, most of the time.

Politics is mostly sloppy, not rigorous.

Tuesday, July 19, 2022

An observer’s comments on ineffective Democratic messaging

“. . . . the typical citizen drops down to a lower level of mental performance as soon as he enters the political field. He argues and analyzes in a way which he would readily recognize as infantile within the sphere of his real interests. . . . cherished ideas and judgments we bring to politics are stereotypes and simplifications with little room for adjustment as the facts change. . . . . the real environment is altogether too big, too complex, and too fleeting for direct acquaintance. We are not equipped to deal with so much subtlety, so much variety, so many permutations and combinations. Although we have to act in that environment, we have to reconstruct it on a simpler model before we can manage it.” -- social scientists Christopher Achen and Larry Bartels, Democracy for Realists: Why Elections Do Not Produce Responsive Government, 2016

Demagoguery (official definition): political activity or practices that seek support by appealing to the desires and prejudices of ordinary people rather than by using rational argument

Demagoguery (Germaine definition): any political, religious, commercial or other activity or practices that seek support by playing on and/or appealing to the ignorance, desires and/or prejudices of people rather than by using rational argument; demagoguery usually relies significantly or mostly on lies, slanders, irrational emotional manipulation, flawed motivated reasoning, logic fallacies, etc.; relevant inconvenient facts, truths and sound reasoning are usually ignored, denied or distorted into an appearance of false insignificance or false irrelevance


A Washington Post opinion piece by Paul Waldman says it better than I can:
Faced with demands to do something about the right-wing revolution the Supreme Court is inflicting on the country, congressional Democrats will hold votes on bills guaranteeing marriage equality and the right to contraception. These are protected at the moment, but many fear the court and Republicans will move to attack them sometime in the near future.

Since these bills will fall to Republican filibusters in the Senate, they are demonstration votes, meant not to become law (at least not yet), but in large part to force Republicans to vote against them and thereby reveal themselves to be out of step with public opinion. As many a Democrat has said, “Let’s get them on the record.” But “getting them on the record” doesn’t accomplish much if you don’t have a strategy to turn that unpopular vote into a weapon that can be used to actually punish those Republicans. And there’s little evidence Democrats have such a strategy.

Sure, they’ll issue some news releases and talk about it on cable news. And here or there the vote might find its way into a campaign mailer (“Congressman Klunk voted against contraception! Can the women of the Fifth District really trust Congressman Klunk?”). But I fear that too many Democrats think getting them on the record is enough by itself.

The reason is that unlike their Republican counterparts, Democrats tend to have far too much faith in the American voter.

People in Washington, especially Democrats, suffer from an ailment that is not confined to the nation’s capital. It plays out in all kinds of places and in politics at all levels. It’s the inability to see politics from the perspective of ordinary people.

This blindness isn’t a matter of elitism. The problem is that it’s hard to put yourself in the mind of someone whose worldview is profoundly different from your own. If you care about politics, it’s almost impossible to understand how the average person — even the average voter — thinks about the work you do and the world you inhabit.

Here’s the problem: Most Americans have only a fraction of the understanding you do about these things — not because they’re dumb or ignorant but mainly because they just don’t care. They worry about other things, especially their jobs and their families. When they have free time they’d rather watch a ballgame or gossip with a friend than read about whether certain provisions of Build Back Better might survive in some process called “reconciliation.”

In fact, the very idea of “issues” — where a thing happening in the world is translated into something the government might implement policies to address — was somewhat foreign to them. Because I was young and enthusiastic but not schooled in subtle communication strategies, I couldn’t get beyond my own perspective and persuade them of anything.

.... most Democrats I know are still captive to the hope that politics can be rational and deliberative, ultimately producing reasonable outcomes.

Republicans have no such illusions. They usually start from the assumption that voters don’t pay attention and should be reached by the simplest, most emotionally laden appeals they can devise. So Republicans don’t bother with 10-point policy plans; they just hit voters with, “Democrats want illegals to take your job, kill your wife, and pervert your kids,” and watch the votes pour in.
If Waldman is right, how can one craft messages with the emotional impact of Republican messaging without demagoguing or lying?

I think it is now possible for Dems to do gut-wrenching messaging without much or any demagoguery or lies. Just be blunt and relentless about reality. Be candid about the thoroughly morally rotted, fascist Republican Party, its cruel Christian nationalist dogma, its rapacious laissez-faire capitalist dogma and the radical right propaganda Leviathan, e.g., Faux News, that the stinking anti-democratic threat significantly or mostly rests on. Just say it straight without lies or slanders. There is plenty of evidence in the public record to support harsh, emotional but truthful messaging.


Qs: 
1. Is Waldman right? 
2. Is there such a thing as gut-wrenching messaging without much or any demagoguery or lies, or does wrenching guts always require demagoguery and/or lies?
3. Is demagoguery still demagoguery even if it is based on truth and sound reasoning? (I think not)

Sunday, September 6, 2020

Asymmetric Warfare: Propaganda Has a Huge Advantage




“.... what Gilbert demonstrated is that if the brain is overloaded, it will accept lies as truth. The reason is that when the brain becomes taxed, it essentially shuts down. .... As Gilbert explains it, “when resource-depleted persons are exposed to .… propositions they would normally disbelieve, their ability to reject those propositions is markedly reduced. .... We wear helmets to protect our brains from physical injury but no such device exists to prevent us from mental entanglements [lies and manipulation]. Until then, the best we can do is to avoid shallow forms of information or anything that is likely to contain a lie.”

This general concept of an imbalance of power has been on my mind for at least 3-4 years. It seems timely and urgent now. Liars, emotional manipulators and purveyors of flawed reasoning are out in force on both the hard core left and right. We are absolutely awash in lies, deceit, manipulation and logic fallacies. People who support Trump claim this is true of people who oppose the president and essentially all of the mainstream media. They believe that extreme crackpot liar sources such as Breitbart, Rush Limbaugh and Fox News are telling the truth. People on the hard core left rely on extreme crackpot liar sources such as Sputnik News and RT News because they are perceived as telling the truth.

Both sides absolutely rip mainstream sources such as the New York Times, Washington Post, CNN, MSNBC, CBS and NPR as extremist liars and deceivers.[1] In my lifetime, I've never seen anything close to this kind of extreme polarization and bitter disagreement. Facts are subjective and personal, not objective. At present, there is probably no way to bridge the different perceptions of reality, possibly absent a massive shock accompanied by mass destruction and/or death. But maybe even a disaster won't help much. The liars are probably never going to go away. There is little or no penalty for lying and destroying civil society.


What does it all mean?
Obviously opinions will differ. In my opinion, it means we are in very serious trouble. Authoritarianism on the left and right is pressing hard to fill the power vacuum left as democracy recedes. Power flows from the government tasked with protecting the people and their liberties to powerful authoritarian ideologues, special interests and the politicians they can buy or coerce. The relentless attacks from the left and right, with their lies and false realities, are clearly, if slowly, pushing democracy, truth and the rule of law aside.

Authoritarians the world over, including Putin, Trump and Xi, are rejoicing over the fall of truth for so many people. Their tyrant power increases in step with our ebbing democratic power.

What does it mean for political and broader discourse? It means that people who still believe in values like truth, democracy and the rule of law need to up their messaging game.[2] They need to do that because they are at a major disadvantage in messaging wars. Why? Because the messaging available to people who rely on facts, truths and sound reasoning is far more constrained and significantly less persuasive than lies, emotional manipulation and bogus reasoning.

The human mind evolved to respond strongly to irrational emotional manipulation, especially fear, but also to other negative emotions such as anger, disgust, distrust, intolerance and bigotry. The best way to foment those things is to lie. Truth tends to have less emotional impact because reality is usually less awful and threatening than lies and irrational manipulation can easily make it seem. If truth and sound reason have X persuasive power, my estimate is that lies, deceit, manipulation and irrationality have about 3-5X persuasive power.

Why might that be? Because if the world of rhetoric grounded in truth and sound reason is Y big, the world of lies, deceit, manipulation and irrationality is at least about 30-50Y, and the most useful stuff in it amounts to at least 3-5Y. Just consider how freeing it is to simply blow off facts, true truths and sound reasoning. Even I can make up some whoppers if I just blow off facts and reason.[3]

In other words, the good guys are fighting with one hand tied. The fight isn't fair. Maybe that just reflects the fact that politics isn't usually fair. Neither is life.


Footnotes:
1. The left and right attack and reject even the fact checkers as a pack of lying liars. No source is respectable any more except those that convey their own versions of reality, truth and reason. That's a huge win for things like actual and aspiring demagogues, tyrants, kleptocrats and liars. It is a huge loss for things like American democracy, the rule of law, truth, reason and civil society.

2. That assumes we are not going to engage in a full-blown civil war with tens of millions of deaths and destruction of infrastructure running in the tens of trillions of dollars.

3. For example, by blowing off facts I could argue this: The president demands that pro-military personnel interests buy at least $400 million per year from his commercial properties, even if the price is inflated two-fold for such special guests. In return for such "honest" business and the enhanced profits that would flow to the president, he has agreed to not cut military and veterans' salaries and benefits by 50%.

Or, I could argue that the president secretly promised his supporters at least 25% lower taxes for voting for him, while taxes for Biden voters will be increased by 50%.

Well, at least I hope those are whoppers.

Sunday, March 8, 2020

Election Tactics 2020: Infiltrate and Smear


Erik Prince - sleazeball and 007 wannabe

The New York Times reports that the Trump campaign is hiring professional spies to infiltrate Democratic congressional campaigns, labor organizations and other anti-Trump groups. Presumably, the campaign of whoever the Dems nominate for president will be infiltrated too. Given the president’s unfettered reliance on dark free speech (lies, deceit, unwarranted opacity, unwarranted emotional manipulation, logic fallacies, etc.), it is reasonable to expect that this tactic will lead to words being taken out of context and/or misconstrued in various ways to damage Democratic candidates.

The war of dark free speech and sleaze is intensifying. We are seeing new levels of extreme sleaze and lies that the Trump Party and the president are willing to engage in to stay in power.

“WASHINGTON — Erik Prince, the security contractor with close ties to the Trump administration, has in recent years helped recruit former American and British spies for secretive intelligence-gathering operations that included infiltrating Democratic congressional campaigns, labor organizations and other groups considered hostile to the Trump agenda, according to interviews and documents. 
One of the former spies, an ex-MI6 officer named Richard Seddon, helped run a 2017 operation to copy files and record conversations in a Michigan office of the American Federation of Teachers, one of the largest teachers’ unions in the nation. Mr. Seddon directed an undercover operative to secretly tape the union’s local leaders and try to gather information that could be made public to damage the organization, documents show. 
Using a different alias the next year, the same undercover operative infiltrated the congressional campaign of Abigail Spanberger, then a former C.I.A. officer who went on to win an important House seat in Virginia as a Democrat. The campaign discovered the operative and fired her. 
Both operations were run by Project Veritas, a conservative group that has gained attention using hidden cameras and microphones for sting operations on news organizations, Democratic politicians and liberal advocacy groups. Mr. Seddon’s role in the teachers’ union operation — detailed in internal Project Veritas emails that have emerged from the discovery process of a court battle between the group and the union — has not previously been reported, nor has Mr. Prince’s role in recruiting Mr. Seddon for the group’s activities.”
Everything that people in a Democratic campaign say will now be misconstrued and ruthlessly used against them. The real power of dark free speech to destroy democracies and the rule of law will become clearer in the coming months. We live in interesting times.