DP Etiquette

First rule: Don't be a jackass.

Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, please provide some sources for the facts and truths you rely on if you are asked for that. If emotion is getting out of hand, get it back in hand. To limit dehumanizing people, don't call people or whole groups of people disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion. Insult makes people angry and defensive. All points of view are welcome, right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.

Friday, April 11, 2025

InfoWars and messaging wars: Do we deserve the politicians and government we get?

I think some of us do. The ones who voted for djt and MAGA Republicans, and the ones who didn't vote for Harris arguably deserve what they are getting. But what about the rest of us?

As usual, there are complexities.[1] For example, (i) gerrymandering gives political parties the power to choose their voters, rather than voters choosing their representatives, (ii) the electoral college means that for president some votes have more power than others, (iii) primary elections tend to produce more radical candidates, which on the political right tend to be corrupt, anti-democratic and pro-authoritarian, and (iv) voter power depends on accurate information, which means that mass media has a lot of power to either empower or disempower voters.

Things like lying, slandering (or insulting), distorting information, hiding information, irrational emotional manipulation and asserting flawed reasoning shift power from deceived people to the deceivers. Those tactics strip deceived and manipulated voters of the power to use their votes to get who and what they want. That shields politicians from accountability.

In theory, the federal government could ban gerrymandering because it is fundamentally anti-democratic. But that's not going to happen. We need to get rid of the electoral college, but that's also a pipe dream. Also impossible are (i) imposing ethics laws with real teeth on the Supreme Court, (ii) disincentivizing (taxing?) lies and crackpottery in mass and social media, and (iii) relying on some sort of morality check to independently rank political candidates on the basis of pro-morality traits like honesty, reliance on sound reasoning, and sufficient relevant experience needed for competence in the job (inexperienced = unqualified or at least underqualified = a possible indicator of bad moral character for trying to get the job).



Upping the messaging game
What a mess. There are lots of deceived and manipulated Americans. Many of them, probably a large majority, cannot be coaxed into reality by fact and sound reasoning alone. That seems to be more a fact than an opinion. Since major reform efforts are doomed for the foreseeable future, all that's left that is non-violent appears to be better messaging against the rising tide of MAGA demagoguery, authoritarianism and corruption.

What's a better messaging strategy that does not veer into dark free speech such as demagoguery, lies, slanders, etc.? The only thing available seems to be strong appeals to emotion packaged with just enough fact and sound reasoning to shift the message from one that is mostly an appeal to evidence and rationality to one that is mostly an appeal to emotion.

Unless I already do mostly appeal to emotion, I will try to consciously shift my content from appeals to evidence and reason to mostly appeals to emotion. 


Yabut whadabout logic fallacy?
According to Wikipedia, appealing to emotion, or argumentum ad passiones, is an informal fallacy characterized by the manipulation of the recipient's emotions in order to win an argument, especially in the absence of factual evidence. This kind of appeal to emotion is irrelevant to or distracting from the facts of the argument (a so-called "red herring") and encompasses several logical fallacies, including appeal to consequences, appeal to fear, flattery, pity, ridicule, or spite, and wishful thinking. .... It is only fallacious when the emotions that are elicited are irrelevant to evaluating the truth of the conclusion and serve to distract from rational consideration of relevant premises or information. .... The power of emotions to influence judgment, including political attitudes, has been recognized since classical antiquity. Aristotle, in his treatise Rhetoric, described emotional arousal as critical to persuasion. .... Disagreeing with Seneca the Younger that emotion destroys reason, the 18th century Scottish philosopher George Campbell argued, instead, that emotions were allies of reason, and that they aid in the assimilation of knowledge. However, Campbell warned of the malleability of emotion and the consequent risk in terms of suggestibility:

[Emotions] are not supplanters of reason, or even rivals in her sway; they are her handmaids, by whose ministry she is enabled to usher truth into the heart, and procure it a favorable reception. As handmaids, they are liable to be seduced by sophistry in the garb of reason, and sometimes are made ignorantly to lend their aid in the introduction of falsehood.


Qs: 1) Under current political, social, religious and commercial circumstances, is appealing to emotion necessarily a logic fallacy, or can emotional appeals be mostly honest and mostly moral because they are strongly or completely aligned with facts, true truths and sound reasoning, i.e., not unreasonably distracting from rational consideration? 2) Are current political, social, religious and commercial circumstances relevant here to honesty and morality? 3) Do I already mostly appeal to emotion?


But what if that child really is starving?


Footnote:
1. Pxy:
Q: In view of the human condition, including human cognitive biology and social behavior, and current political circumstances, how true is the following assertion?: “we get the politicians (or government) we deserve." Take into account the relative persuasive power of honest speech (facts, truths and sound reasoning) compared to dishonest speech (demagoguery, lies, slanders, crackpot conspiracies, etc.) under modern circumstances, e.g., social media, giant corporations with vast power, political radicalization, etc.

A: .... long analysis ....

Cognitive constraints 

Confirmation Bias and Political Reasoning
Almost every political actor, from ordinary citizens to syndicated pundits, regularly falls victim to confirmation bias—the tendency to seek information that confirms existing beliefs while disregarding contradictory evidence. This cognitive shortcut allows people to maintain confidence in their political positions despite opposing views, creating echo chambers that reinforce polarization.

The Focusing Illusion in Political Attention
Daniel Kahneman's "focusing illusion" demonstrates that what we focus on becomes disproportionately important in our minds. As described by politicwise.org, "what you focus on becomes important to you. What is important to you becomes important to your politics." This explains why media coverage shapes political priorities, often elevating sensational issues over substantive policy matters.

Conclusion
The statement "we get the politicians we deserve" contains an important kernel of truth about collective responsibility in democratic systems. Citizens do shape their political environment through their choices and non-choices. However, this perspective must be balanced against recognition of the systemic constraints and cognitive limitations that influence political outcomes.

True political agency requires not just moral virtue from citizens but also information environments that enable meaningful deliberation. Without addressing the structural advantages that dishonest speech currently enjoys in our media ecosystem, we will continue to see a disconnect between the government we deserve and the government we actually get.

Thursday, January 2, 2025

An important rhetoric lesson: Criticism vs other forms of negative rhetoric

A comment here and a post at Law Dork by Chris Geidner raise the issue of criticism vs other forms of negative rhetoric such as mockery, insults, irrationality, assertions of logic fallacies (crackpottery), lying, slandering and disrespect, including violation of the Principle of Charity in rhetoric.
 


Geidner raises this issue in the context of Chief Justice Roberts' treatment of legitimate vs illegitimate criticism of judges:

John Roberts attacks court criticism that he decides lacks a credible basis as illegitimate
Conflating violence against judges with broad criticism the court faces for its extremism, the chief justice ultimately sends a chilling end-of-year report
 
Chief Justice John Roberts decided to take on critics of the U.S. Supreme Court in his annual end-of-year report on Tuesday with a disingenuous half-response that is nonetheless instructive — and disturbing — for what he does say.

On the last day of the year, the chief justice of the United States traditionally releases his end-of-year report. It generally addresses a topic of the year in a vague and uninspiring way, leading to little coverage and even less change. This year, however, the nine pages from Roberts come across as more of a lashing out than a reasoned report.

While acknowledging that “the courts are no more infallible than any other branch,” Roberts spent the second half of the report conflating violence and lies with legitimate criticism. He does so, moreover, while completely ignoring the ethical questions that have swirled around the court and Roberts’s leadership of it over the past two years, as well as substantive opposition to the court’s rulings.

The end result is a chilling, if vague, condemnation by Roberts of the widespread opposition to the extremism exhibited by the high court in its decisions and the ethical failings of justices responsible for those decisions.

Roberts writes:

I feel compelled to address four areas of illegitimate activity that, in my view, do threaten the independence of judges on which the rule of law depends: (1) violence, (2) intimidation, (3) disinformation, and (4) threats to defy lawfully entered judgments.

This is Roberts’s point in this once-a-year moment he is given — to highlight what he views as “illegitimate” criticism of the court.

Then, in the low-water mark of Roberts’s report, he made what I think is an extremely concerning comment, coming from the head of the federal judiciary:

Public officials, too, regrettably have engaged in recent attempts to intimidate judges—for example, suggesting political bias in the judge’s adverse rulings without a credible basis for such allegations.

Putting aside the questionable, subjective nature of assessing whether there is a “credible basis” for such claims, by providing no examples, Roberts was damning all manner of utterly legitimate, appropriate, and even necessary speech from public officials as illegitimate intimidation.

So, what I take from Chief Justice John Roberts’s report to the nation is that judges are supposed to be able to handle criticism, but not too much and not in a way that Roberts doesn’t like, and he will only vaguely tell us what that means, but if criticism crosses that invisible line it is illegitimate.

Got it.  
The words “ethics” or “ethical” do not appear even once in Roberts’s report.
After reading the year-end report (here), I basically agree with Geidner’s analysis. Roberts’ report is immoral, partisan, authoritarian demagoguery. The intentional vagueness that permeates his report amounts to a logic fallacy called the Fallacy of Vagueness. This fallacy occurs when an argument depends upon the vagueness of its terms, leading to confusion or misinterpretation. Confusion arises when an argument’s validity or persuasiveness relies on terms that are not clearly defined or have borderline cases where it’s unclear whether they apply or not.

Alternatively, Roberts may be primarily using the Ambiguity Fallacy, e.g., in referring to disinformation and intimidation[1] as examples of “illegitimate” activity that threatens the independence of judges. Assertions of truth by one source can turn out to be disinformation. 

Criticism vs other forms of negative rhetoric is an issue I’ve thought about for years. When does legitimate criticism cross the line into irrational or "unprincipled" negative rhetoric? 

In my own writing, I criticize a lot but try not to cross the line into irrational or unprincipled negative rhetoric such as slanders, mockery, insults or disrespect toward the targets of my criticisms. Keeping the Principle of Charity in mind helps limit disrespect.

But no matter how principled I or anyone else tries to be, at least some targets will see unprincipled negative rhetoric or rhetorical or logic fallacies. They call foul where none was intended. Sometimes they will be right because a person failed to stay on the side of principled criticism. In the case of honest mistakes, all a person can do is accept and correct them. But sometimes, probably usually, it is not possible to come to agreement. Minds and perceptions of reality rarely change.


Both the “fallacy of vagueness” and the “fallacy of ambiguity” involve unclear language in an argument. The key distinction is that vagueness refers to a term with unclear boundaries or borderline cases, where the meaning is not precisely defined, while ambiguity means a term has multiple, distinct meanings that could be interpreted differently in the same context; essentially, vagueness is about “how much” of something, while ambiguity is about “which thing” is being referred to.


Q: Does my rhetoric too often stray from rational or principled criticisms into some form of irrational or unprincipled negative rhetoric?



Footnote:
1. The concept of disinformation is contested. Although there are common elements in the definitions of disinformation—such as the intent to deceive or cause harm—the term's application and interpretation are subject to significant debate and variation. This reflects not only the complexity of the issue but also the diverse contexts in which disinformation arises, from political campaigns to public health crises. 

The concept of intimidation is similarly contested.

Monday, August 8, 2022

Good news from science

This is a really big deal. The NIH is now funding research into ways to enhance scientific rigor. This should be a game changer. I hope it's not too little or too late. Steve Novella at Neurologica writes:
This is a great idea, and in fact is long overdue. The NIH is awarding various grants to establish educational materials and centers to teach principles of scientific rigor to researchers. This may seem redundant, but it absolutely isn’t.

At present principles of research are taught in basic form during scientific courses, but advanced principles are largely left to individual mentorship. This creates a great deal of variability in how well researchers really understand the principles of scientific rigor. As a result, a lot of research falls short of scientific ideals. This creates a great deal of waste in the system. NIH, as a funding institution, has a great deal of incentive to reduce this waste.

The primary mechanism will be to create teaching modules that then can be made freely available to educational and research institutions. These modules would cover: 

“biases in research; logical fallacies around causality; how to develop hypotheses; designing literature searches; identifying experimental variables; and reducing confounding variables in research.”

Sounds like a good start. The “biases in research” is a broad category, so I’m not sure how thorough coverage will be. I would explicitly include as an area of education – how to avoid p-hacking. Perhaps this could be part of a broader category on how to properly use statistics in research, the limits of the p-value, and the importance of using other statistical methods like effect sizes and Bayesian analysis.
Prior research has shown that when asked about their research behavior, about a third of researchers admit (anonymously) to bad behavior that amounts to p-hacking. This is likely mostly innocent and naive. I lecture about this topic all the time myself, and I find that many researchers are unfamiliar with the more nuanced aspects of scientific rigor.  
And of course, once the NIH requires certification, this will almost certainly make it uniform within academia, at least on the biomedical side. Then we need other research granting institutions to replicate this, also requiring certification. It basically should become impossible to have a career as a researcher in any field without some basic certification in the principles of research rigor.
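
To make concrete why "how to avoid p-hacking" deserves its own teaching module, here is a minimal Python sketch (my own illustration, not part of Novella's post or the NIH materials; all names and numbers are made up for the demo). It simulates a study that measures 20 unrelated outcomes on pure noise and reports anything that clears p < 0.05; most such studies "find" something even though no real effect exists.

import math
import random
import statistics

def two_sample_p_value(a, b):
    # Approximate two-sided p-value for a difference in means, using a normal
    # tail on a Welch-style z statistic; crude, but adequate for this demo.
    se = math.sqrt(statistics.variance(a) / len(a) + statistics.variance(b) / len(b))
    z = abs(statistics.mean(a) - statistics.mean(b)) / se
    return math.erfc(z / math.sqrt(2))

random.seed(0)
trials, outcomes_per_study, n = 1000, 20, 30
studies_with_false_positive = 0
for _ in range(trials):
    found_significant = False
    for _ in range(outcomes_per_study):
        control = [random.gauss(0, 1) for _ in range(n)]
        treated = [random.gauss(0, 1) for _ in range(n)]  # no real effect exists
        if two_sample_p_value(control, treated) < 0.05:
            found_significant = True
    studies_with_false_positive += found_significant

# With 20 looks at pure noise, roughly 1 - 0.95**20, or about 64%, of studies
# can report at least one "significant" finding.
print(f"Studies with at least one p < 0.05 result: {studies_with_false_positive / trials:.0%}")

Correcting for the number of comparisons, or pre-registering a single primary outcome, pulls that rate back toward the advertised 5%, which is exactly the kind of nuance the proposed modules would presumably cover.
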
OMG, someone outside Dissident Politics is actually taking logic fallacies seriously? I must have died and got reluctantly shoved up to heaven. Next after science, politics needs to tackle this same plague on democracy, humanity and civilization.

No, it is not the case that science and politics can be dealt with the same way. They are different. But it is the case that the data and reasoning behind politics can be subject to the same kind of rigor, if politics is to be based more on fact and sound reasoning than it is now. Opinions will still differ, but the extent of difference due to irrationally disputed facts, e.g., stolen election vs. not stolen, ought to be significantly reduced. Everyone doing politics firmly believes their politics is based on real facts and sound reasoning. A lot of research indicates that just is not true for most people, most of the time.

Politics is mostly sloppy, not rigorous.

Sunday, July 31, 2022

The science of propaganda, spin and doubt: A short summary

At the least, the information in this post should be mandatory knowledge for both a high school degree and for any post high school credential. If a person does not know this, they are more susceptible to the dark arts than is justifiable in American democracy. -- Germaine, 2022


Context
Lots of books and thousands of research articles have been written on propaganda and why and how it works so well. Propaganda became sophisticated in America around the time of World War 1. To build public support for US participation in WW1, President Woodrow Wilson created the Committee on Public Information in 1917. The CPI was a gigantic US government deceit and emotional manipulation machine. Tens of thousands of spinning con artists worked for it. Wilson's goal was to con the American people into supporting the war and feeling emotionally justified in doing so, e.g., making the world safe for democracy. Some of the greatest propagandists of the 20th century, maybe of all time, worked on that effort. It was a smashing success.

Wilson's massive public disinformation effort jump-started modern propaganda ("public relations") in support of businesses and commerce (discussed here). Business leaders, watching how effective propaganda could be at getting people to walk into a brutal war, quickly realized that good propaganda wasn't just for governments to use to deceive people into making the ultimate self-sacrifice. It could be used by businesses to deceive both customers and governments. It was, and still is, a freaking super rich gold mine chock full of diamonds, platinum, lithium and all the hot, juicy cheeseburgers that T**** could ever eat.


A short summary of propaganda tactics
In 2021, two researchers at the University of Massachusetts, Department of Environmental Health Sciences, School of Public Health and Health Sciences, Rebecca Goldberg and Laura Vandenberg, published a very nice summary of spin or propaganda tactics drawn from 5 major sources.[1] Their paper is entitled The science of spin: targeted strategies to manufacture doubt with detrimental effects on environmental and public health.

The paper's abstract includes these comments:
Results: We recognized 28 unique tactics used to manufacture doubt. Five of these tactics were used by all five organizations, suggesting that they are key features of manufactured doubt. The intended audience influences the strategy used to misinform, and logical fallacies contribute to their efficacy.

Conclusions: This list of tactics can be used by others to build a case that an industry or group is deliberately manipulating information associated with their actions or products. Improved scientific and rhetorical literacy could be used to render them less effective, depending on the audience targeted, and ultimately allow for the protection of both environmental health and public health more generally.

The list of tactics and the special interests that used them is shown below in Table 1 from the article. Table 2 lists the logic fallacies the propagandists tend to rely on.





Tactics or strategies 1, 2, 3, 8 and 21 were used by all five sources of deceit and doubt.
  • 1. Attack Study Design: To emphasize study design flaws in A** that have only minimal effects on outcomes. Flaws include issues related to bias, confounding, or sample size
  • 2. Gain Support from Reputable Individuals: Recruit experts or influential people in certain fields (politicians, industry, journals, doctors, scientists, health officials) to defend B** in order to gain broader support
  • 3. Misrepresent data: Cherry-pick data, design studies to fail, or conduct meta-analyses to dilute the work of A
  • 8. Employ Hyperbolic or Absolutist Language: Discuss scientific findings in absolutist terms or with hyperbole, use buzzwords to differentiate between “strong” and “poor” science (i.e. sound science, junk science, etc.),
  • 21. Influence Government/Laws: Gain inappropriate proximity to regulatory bodies and encourage pro-B policy
** “A” refers to information generated to combat scientific evidence and facts
“B” refers to information generated to promote narratives that are favorable to the industry




Acknowledgement: Thanks to Larry Motuz for bringing the work of these two researchers to my attention.


Footnote: 
1. The researchers describe the five sources of propaganda like this:
The first, Big Tobacco, is widely considered to have “written the playbook” on manufactured doubt [1]. The tobacco industry has managed to maintain its clientele for many decades in part due to manufactured scientific controversy about the health effects of active and secondhand smoking [1, 2, 4, 6, 10,11,12,13].

The other industries we examined include the coal industry, whose employees often suffer from black lung disease [14], yet the industry has avoided awarding compensation to many affected miners by wielding disproportionate influence in the courtroom [15,16,17,18,19]; the sugar industry, which distracted from its role contributing to metabolic and cardiovascular diseases [20] by deflecting blame toward dietary fat as a plausible alternative cause for rising population-level chronic disease rates [21,22,23,24,25]; the agrochemical business, Syngenta, manufacturer of the herbicide atrazine [26,27,28], which conducted personal attacks against a vocal critic of atrazine whose research revealed disruptive effects on the endocrine systems of aquatic animals [29, 30]; and the Marshall Institute, a conservative think tank comprised of Cold War physicists eager to maintain their proximity to government, and associated scientists who deliberately misrepresented information to the government to both minimize and normalize the effects of fossil fuels on global temperatures [1, 4, 31].

Tuesday, July 19, 2022

An observer’s comments on ineffective Democratic messaging

“. . . . the typical citizen drops down to a lower level of mental performance as soon as he enters the political field. He argues and analyzes in a way which he would readily recognize as infantile within the sphere of his real interests. . . . cherished ideas and judgments we bring to politics are stereotypes and simplifications with little room for adjustment as the facts change. . . . . the real environment is altogether too big, too complex, and too fleeting for direct acquaintance. We are not equipped to deal with so much subtlety, so much variety, so many permutations and combinations. Although we have to act in that environment, we have to reconstruct it on a simpler model before we can manage it.” -- social scientists Christopher Achen and Larry Bartels, Democracy for Realists: Why Elections Do Not Produce Responsive Government, 2016

Demagoguery (official definition): political activity or practices that seek support by appealing to the desires and prejudices of ordinary people rather than by using rational argument

Demagoguery (Germaine definition): any political, religious, commercial or other activity or practices that seek support by playing on and/or appealing to the ignorance, desires and/or prejudices of people rather than by using rational argument; demagoguery usually relies significantly or mostly on lies, slanders, irrational emotional manipulation, flawed motivated reasoning, logic fallacies, etc.; relevant inconvenient facts, truths and sound reasoning are usually ignored, denied or distorted into an appearance of false insignificance or false irrelevance


A Washington Post opinion piece by Paul Waldman says it better than I can:
Faced with demands to do something about the right-wing revolution the Supreme Court is inflicting on the country, congressional Democrats will hold votes on bills guaranteeing marriage equality and the right to contraception. These are protected at the moment, but many fear the court and Republicans will move to attack them sometime in the near future.

Since these bills will fall to Republican filibusters in the Senate, they are demonstration votes, meant not to become law (at least not yet), but in large part to force Republicans to vote against them and thereby reveal themselves to be out of step with public opinion. As many a Democrat has said, “Let’s get them on the record.” But “getting them on the record” doesn’t accomplish much if you don’t have a strategy to turn that unpopular vote into a weapon that can be used to actually punish those Republicans. And there’s little evidence Democrats have such a strategy.

Sure, they’ll issue some news releases and talk about it on cable news. And here or there the vote might find its way into a campaign mailer (“Congressman Klunk voted against contraception! Can the women of the Fifth District really trust Congressman Klunk?”). But I fear that too many Democrats think getting them on the record is enough by itself.

The reason is that unlike their Republican counterparts, Democrats tend to have far too much faith in the American voter.

People in Washington, especially Democrats, suffer from an ailment that is not confined to the nation’s capital. It plays out in all kinds of places and in politics at all levels. It’s the inability to see politics from the perspective of ordinary people.

This blindness isn’t a matter of elitism. The problem is that it’s hard to put yourself in the mind of someone whose worldview is profoundly different from your own. If you care about politics, it’s almost impossible to understand how the average person — even the average voter — thinks about the work you do and the world you inhabit.

Here’s the problem: Most Americans have only a fraction of the understanding you do about these things — not because they’re dumb or ignorant but mainly because they just don’t care. They worry about other things, especially their jobs and their families. When they have free time they’d rather watch a ballgame or gossip with a friend than read about whether certain provisions of Build Back Better might survive in some process called “reconciliation.”

In fact, the very idea of “issues” — where a thing happening in the world is translated into something the government might implement policies to address — was somewhat foreign to them. Because I was young and enthusiastic but not schooled in subtle communication strategies, I couldn’t get beyond my own perspective and persuade them of anything.

.... most Democrats I know are still captive to the hope that politics can be rational and deliberative, ultimately producing reasonable outcomes.

Republicans have no such illusions. They usually start from the assumption that voters don’t pay attention and should be reached by the simplest, most emotionally laden appeals they can devise. So Republicans don’t bother with 10-point policy plans; they just hit voters with, “Democrats want illegals to take your job, kill your wife, and pervert your kids,” and watch the votes pour in.
If Waldman is right, how can one craft messages with the emotional impact of Republican messaging without demagoguing it or lying?

I think it is now possible for Dems to do gut-wrenching messaging without much or any demagoguery or lies. Just be blunt and relentless about reality. Be candid about the thoroughly morally rotted, fascist Republican Party, its cruel Christian nationalist dogma, its rapacious laissez-faire capitalist dogma and the radical right propaganda Leviathan, e.g., Faux News, that the stinking anti-democratic threat significantly or mostly rests on. Just say it straight without lies or slanders. There is plenty of evidence in the public record to support harsh, emotional but truthful messaging.


Qs: 
1. Is Waldman right? 
2. Is there such a thing as gut-wrenching messaging without much or any demagoguery or lies, or does wrenching guts always require demagoguery and/or lies?
3. Is demagoguery still demagoguery even if it is based on truth and sound reasoning? (I think not)

Monday, July 11, 2022

Personal thoughts: Is it even possible to debate demagoguery?

Demagoguery (official definition): political activity or practices that seek support by appealing to the desires and prejudices of ordinary people rather than by using rational argument


Demagoguery (Germaine definition): any political, religious, commercial or other activity or practices that seek support by playing on and/or appealing to the ignorance, desires and/or prejudices of people rather than by using rational argument; demagoguery usually relies significantly or mostly on lies, slanders, irrational emotional manipulation, flawed motivated reasoning, logic fallacies, etc.; relevant inconvenient facts, truths and sound reasoning are usually ignored, denied or distorted into an appearance of false insignificance or false irrelevance



Way back in 2014, when cowboys with six shooters were duking it out against cattle rustling T. rex lizards, Bill Nye the Science Guy publicly debated young Earth believer Ken Ham, a crackpot Christian nationalist. He is a demagogue by Germaine's definition. Ham, the founder and chief executive officer of the Young Earth creationist ministry Answers in Genesis, challenged Nye to debate the question "Is Creation A Viable Model of Origins?" The debate was held at Ham's "Creation Museum" in Petersburg, Kentucky.




Before the debate, Team R&R (reality and reason) urged Nye not to debate because there was nothing to debate. Many in the scientific community criticized Nye's decision to debate, arguing that it lent undue credibility to the creationist worldview. Ham argued crackpottery like cowboys duking it out with dinosaurs in the wild, wild West. Obviously, Team R&R had a point. But Nye debated anyway. As expected, things ended just as they started. Minds did not change. But, Ham did get some publicity for his "museum" and probably made some extra money.


Rock solid proof that cowboys and dinosaurs co-existed in the 1800s


Over the years, it slowly dawned on me that, like the Nye-Ham nonsense, debating demagoguery is pointless, but probably unavoidable in most situations. Such debates are arguably more harmful than beneficial, as Team R&R argued. But maybe not as harmful as not engaging with demagoguery at all. There is nothing to debate when demagogues deny or distort important facts, resort to flawed reasoning and so forth. But they are there, influencing public opinion, well funded, and not going away.

Much (most?) of the harm arises from false balancing (false equivalence, bothsidesism). By simply debating with a demagogue, the demagogue's false assertions (lies), flawed reasoning and whatnot are treated with seriousness and respect they do not deserve. In the hands of a skilled demagogue, false balancing can feel or seem like rational thinking, especially when it appeals to prejudices, comforting false beliefs and the like. 

We easily mistake psychological comfort for rationality, i.e., nonsense has to be rational because it feels so right. But when relevant facts and the reasoning applied to them heavily favor one side and heavily undermine the other, a basis for rationality just isn't there. But the basis for false belief is still there, i.e., people still want to feel good about themselves and their beliefs, even when there is no basis for it. That never goes away. That is the demagogue's target.

The problem is that by ignoring the demagogue and not trying to counter the lies and nonsense, Team R&R leaves the public opinion playing field uncontested for the demagogues to slime all over. Demagoguery is rampant in major issues including climate change, climate regulations, gun regulations, the scope and meaning of the Constitution, civil liberties, and abortion. 


Slimed by demagoguery & the ground gets slippery


I suppose little or none of this is new to most folks here at Dissident Politics. It's all come up multiple times. Guess it doesn't hurt to repeat some things. 

Saturday, October 9, 2021

Dark free speech tactics: Sealioning, Gish gallop and other popular deceit and manipulation tactics



“The mind is divided into parts, like a rider (controlled processes) on an elephant (automatic processes). The rider evolved to serve the elephant. . . . . intuitions come first, strategic reasoning second. Therefore, if you want to change someone’s mind about a moral or political issue, talk to the elephant first. .... Republicans understand moral psychology. Democrats don’t. Republicans have long understood that the elephant is in charge of political behavior, not the rider, and they know how elephants work. Their slogans, political commercials and speeches go straight for the gut . . . . Republicans don’t just aim to cause fear, as some Democrats charge. They trigger the full range of intuitions described by Moral Foundations Theory.” -- Psychologist Jonathan Haidt, The Righteous Mind: Why Good People are Divided by Politics and Religion, 2012


Dark free speech (DFS): Constitutionally or legally protected (1) lies and deceit to distract, misinform, confuse, polarize and/or demoralize, (2) unwarranted opacity to hide inconvenient truths, facts and corruption (lies and deceit of omission), (3) unwarranted emotional manipulation (i) to obscure the truth and blind the mind to lies and deceit, and (ii) to provoke irrational, reason-killing emotions and feelings, including fear, hate, anger, disgust, distrust, intolerance, cynicism, pessimism and all kinds of bigotry including racism, and (4) ideologically-driven motivated reasoning and other ideologically-driven biases that unreasonably distort reality and reason. Germaine, ~2016 or thereabouts


There are lots of ways to engage in debate that can feel right and principled, but in effect subvert principled, focused debate into far less rational or focused engagement. Provoking frustration, impatience and anger are common goals of subverting rhetorical tactics. Logic fallacies are a common tactic of people who have to rely on weak or non-existent fact, truth and/or reasoning positions, e.g., the 2020 election was stolen. Denying, distorting or irrationally downplaying inconvenient facts and truths is also popular and usually present in some form in nearly all DFS.

Here is how some of these things are described.

Sealioning (also spelled sea-lioning and sea lioning) is a type of trolling or harassment that consists of pursuing people with persistent requests for evidence or repeated questions, while maintaining a pretense of civility and sincerity.[1][2][3][4] It may take the form of "incessant, bad-faith invitations to engage in debate".[5] The term originated with a 2014 strip of the webcomic Wondermark by David Malki.

The troll feigns ignorance and politeness, so that if the target is provoked into making an angry response, the troll can then act as the aggrieved party.[7][8] Sealioning can be performed by a single troll or by multiple ones acting in concert.[9] The technique of sealioning has been compared to the Gish gallop and metaphorically described as a denial-of-service attack targeted at human beings.[10]

An essay in the collection Perspectives on Harmful Speech Online, published by the Berkman Klein Center for Internet & Society at Harvard, noted:

Rhetorically, sealioning fuses persistent questioning—often about basic information, information easily found elsewhere, or unrelated or tangential points—with a loudly-insisted-upon commitment to reasonable debate. It disguises itself as a sincere attempt to learn and communicate. Sealioning thus works both to exhaust a target's patience, attention, and communicative effort, and to portray the target as unreasonable. While the questions of the "sea lion" may seem innocent, they're intended maliciously and have harmful consequences. — Amy Johnson, Berkman Klein Center for Internet & Society (May 2019) (emphasis added)

The Gish gallop is a rhetorical technique in which a debater attempts to overwhelm an opponent by an excessive number of arguments, without regard for the accuracy or strength of those arguments. The term was coined by Eugenie Scott, who named it after Duane Gish. Scott argued that Gish used the technique frequently when challenging the scientific fact of evolution.[1][2] It is similar to a method used in formal debate called spreading.

During a Gish gallop, a debater confronts an opponent with a rapid series of many specious arguments, half-truths, and misrepresentations in a short space of time, which makes it impossible for the opponent to refute all of them within the format of a formal debate.[3][4] In practice, each point raised by the "Gish galloper" takes considerably more time to refute or fact-check than it did to state in the first place.[5] The technique wastes an opponent's time and may cast doubt on the opponent's debating ability for an audience unfamiliar with the technique, especially if no independent fact-checking is involved[6] or if the audience has limited knowledge of the topics.
In the case of the Gish gallop, the dark free speech proponent can play on a person's ignorance to make arguments and asserted facts or truths seem at least plausible. It shifts the burden to the principled participant to fact check, which often takes more time and effort than is reasonable and is often frustrating, which tends to degrade the quality and social usefulness of the debate.


Whataboutism: Whataboutism or whataboutery (as in "what about…?") is a variant of the tu quoque logical fallacy, which attempts to discredit an opponent's position by charging hypocrisy without directly refuting or disproving the argument (Germaine: or without showing its relevance). 

Whataboutism is usually embedded in false narratives implied through irrelevant questions. When cornered, the whataboutist typically resorts to one of two strategies. One, claim "I'm just asking questions!" Two, claim "I can't prove it, but it sounds right!"


Wikipedia on false balance or bothsidesism: False balance, also bothsidesism, is a media bias in which journalists present an issue as being more balanced between opposing viewpoints than the evidence supports. Journalists may present evidence and arguments out of proportion to the actual evidence for each side, or may omit information that would establish one side's claims as baseless. False balance has been cited as a cause of misinformation.[1]

False balance is a bias, which usually stems from an attempt to avoid bias, and gives unsupported or dubious positions an illusion of respectability. It creates a public perception that some issues are scientifically contentious, though in reality they aren't, therefore creating doubt about the scientific state of research, and can be exploited by interest groups such as corporations like the fossil fuel industry or the tobacco industry, or ideologically motivated activists such as vaccination opponents or creationists.[2]

Examples of false balance in reporting on science issues include the topics of man-made versus natural climate change, the health effects of tobacco, the alleged relation between thiomersal and autism,[3] and evolution versus intelligent design.

A fallacy is reasoning that is logically incorrect, undermines the logical validity of an argument, or is recognized as unsound. All forms of human communication can contain fallacies.

Because of their variety, fallacies are challenging to classify. They can be classified by their structure (formal fallacies) or content (informal fallacies). Informal fallacies, the larger group, may then be subdivided into categories such as improper presumption, faulty generalization, error in assigning causation and relevance, among others.

The use of fallacies is common when the speaker's goal of achieving common agreement is more important to them than utilizing sound reasoning. When fallacies are used, the premise should be recognized as not well-grounded, the conclusion as unproven (but not necessarily false), and the argument as unsound.

Informal fallacies

Informal fallacies – arguments that are logically unsound for lack of well-grounded premises.[14]
  • Argument to moderation (false compromise, middle ground, fallacy of the mean, argumentum ad temperantiam) – assuming that a compromise between two positions is always correct.[15]
  • Continuum fallacy (fallacy of the beard, line-drawing fallacy, sorites fallacy, fallacy of the heap, bald man fallacy, decision-point fallacy) – improperly rejecting a claim for being imprecise.[16]
  • Correlative-based fallacies
    • Suppressed correlative – a correlative is redefined so that one alternative is made impossible (e.g., "I'm not fat because I'm thinner than John.").[17]
  • Definist fallacy – defining a term used in an argument in a biased manner (e.g., using "loaded terms"). The person making the argument expects that the listener will accept the provided definition, making the argument difficult to refute.[18]
  • Divine fallacy (argument from incredulity) – arguing that, because something is so incredible or amazing, it must be the result of superior, divine, alien or paranormal agency.[19]
  • Double counting – counting events or occurrences more than once in probabilistic reasoning, which leads to the sum of the probabilities of all cases exceeding unity.
  • Equivocation – using a term with more than one meaning in a statement without specifying which meaning is intended.[20]
    • Ambiguous middle term – using a middle term with multiple meanings.[21]
    • Definitional retreat – changing the meaning of a word when an objection is raised.[22] Often paired with moving the goalposts (see below), as when an argument is challenged using a common definition of a term in the argument, and the arguer presents a different definition of the term and thereby demands different evidence to debunk the argument.
    • Motte-and-bailey fallacy – conflating two positions with similar properties, one modest and easy to defend (the "motte") and one more controversial (the "bailey").[23] The arguer first states the controversial position, but when challenged, states that they are advancing the modest position.[24][25]
    • Fallacy of accent – changing the meaning of a statement by not specifying on which word emphasis falls.
    • Persuasive definition – purporting to use the "true" or "commonly accepted" meaning of a term while, in reality, using an uncommon or altered definition.
    • (cf. the if-by-whiskey fallacy)
  • Ecological fallacy – inferring about the nature of an entity based solely upon aggregate statistics collected for the group to which that entity belongs.[26]
  • Etymological fallacy – assuming that the original or historical meaning of a word or phrase is necessarily similar to its actual present-day usage.[27]
  • Fallacy of composition – assuming that something true of part of a whole must also be true of the whole.[28]
  • Fallacy of division – assuming that something true of a composite thing must also be true of all or some of its parts.[29]
  • False attribution – appealing to an irrelevant, unqualified, unidentified, biased or fabricated source in support of an argument.
  • False authority (single authority) – using an expert of dubious credentials or using only one opinion to promote a product or idea. Related to the appeal to authority.
  • False dilemma (false dichotomy, fallacy of bifurcation, black-or-white fallacy) – two alternative statements are given as the only possible options when, in reality, there are more.[31]
  • False equivalence – describing two or more statements as virtually equal when they are not.
  • Slippery slope (thin edge of the wedge, camel's nose) – asserting that a proposed, relatively small, first action will inevitably lead to a chain of related events resulting in a significant and negative event and, therefore, should not be permitted.[43]
  • Special pleading – the arguer attempts to cite something as an exemption to a generally accepted rule or principle without justifying the exemption (e.g.: a defendant who murdered his parents asks for leniency because he is now an orphan).
  • Etc., etc., etc. 

Red herring fallacies

  • Ad hominem – attacking the arguer instead of the argument. (Note that "ad hominem" can also refer to the dialectical strategy of arguing on the basis of the opponent's own commitments. This type of ad hominem is not a fallacy.)
    • Circumstantial ad hominem – stating that the arguer's personal situation or perceived benefit from advancing a conclusion means that their conclusion is wrong.[70]
    • Poisoning the well – a subtype of ad hominem presenting adverse information about a target person with the intention of discrediting everything that the target person says.[71]
    • Appeal to motive – dismissing an idea by questioning the motives of its proposer.
    • Tone policing – focusing on emotion behind (or resulting from) a message rather than the message itself as a discrediting tactic.
    • Traitorous critic fallacy (ergo decedo, 'thus leave') – a critic's perceived affiliation is portrayed as the underlying reason for the criticism and the critic is asked to stay away from the issue altogether. Easily confused with the association fallacy ("guilt by association") below.
  • Appeal to authority (argument from authority, argumentum ad verecundiam) – an assertion is deemed true because of the position or authority of the person asserting it.[72][73]
  • Straw man fallacy – misrepresenting an opponent's argument by broadening or narrowing the scope of a premise and/or refuting a weaker version of their argument (e.g.: if someone says that killing animals is wrong because we are animals too, replying "It is not true that humans have no moral worth" would be a straw man, since they have not asserted that humans have no moral worth, but rather that the moral worth of animals and humans is equivalent.)[105]
  • Texas sharpshooter fallacy – improperly asserting a cause to explain a cluster of data.[106] This informal fallacy is committed when differences in data are ignored, but similarities are overemphasized; from this reasoning, a false conclusion is inferred. It is the philosophical or rhetorical application of the multiple comparisons problem (in statistics) and apophenia (in cognitive psychology), and is related to the clustering illusion, the tendency in human cognition to interpret patterns where none actually exist. The name comes from a joke about a Texan who fires some gunshots at the side of a barn, then paints a target centered on the tightest cluster of hits and claims to be a sharpshooter. (A toy simulation of this pattern-in-noise effect follows this list.)
  • Tu quoque ('you too' – appeal to hypocrisy, whataboutism) – stating that a position is false, wrong, or should be disregarded because its proponent fails to act consistently in accordance with it.[107]
  • Two wrongs make a right – assuming that, if one wrong is committed, another wrong will rectify it.
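
As promised above, here is a toy Python simulation of the Texas sharpshooter fallacy (my own sketch, not from the Wikipedia entries quoted here; the grid size and shot count are arbitrary). Purely random "shots" on a wall always contain some patch that looks like a tight cluster; drawing the target around that patch after the fact is what makes the reasoning fallacious.

import random
from collections import Counter

random.seed(42)
# 200 random "shots" scattered uniformly on a 10 x 10 wall; no aim involved.
shots = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(200)]

# Divide the wall into 1 x 1 cells and count hits per cell.
hits_per_cell = Counter((int(x), int(y)) for x, y in shots)
densest_cell, hits = hits_per_cell.most_common(1)[0]

expected = len(shots) / 100  # 2 hits per cell on average
print(f"Densest cell {densest_cell}: {hits} hits vs ~{expected:.0f} expected by chance")
# Some cell is always well above average in pure noise. Painting the target
# around that cell afterward and claiming marksmanship (or causation) is the
# sharpshooter's move - the multiple comparisons problem in miniature.

The same select-the-cluster-after-the-fact logic is what makes cherry-picked runs of data look meaningful in propaganda and motivated reasoning.
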

As one can see, there are a heck of a lot of ways to derail focused, principled debate into fluff, false beliefs, social discord, etc. Skilled trolls, professional propagandists and most hard core ideologues are familiar with these tactics. Most people and interests that use dark free speech (~97%?) do so without hesitation or moral qualm. Even people who try to stay principled can engage in logic fallacies without being aware of it.


Given the way the human mind evolved to work, existing research evidence indicates that relative to principled debate grounded in honest speech, dishonest debate grounded in DFS can be and often is more persuasive. In my opinion, reasonable sounding DFS, usually not crackpottery like the trash that QAnon spews, tends to be about 2-4-fold more effective in influencing public opinion. Being limited to facts, true truths and sound reasoning forecloses a whole lot of rhetorical territory and tactics that can be used to describe real or fake facts, truths and reality. 

Some logic fallacies were discussed here several times before, e.g., this chapter review.

One moral argument holds that deceiving people into deciding and acting based on DFS, false beliefs, misinformation, disinformation and the like deprives them of the power to decide and act based on truth and reality. A counter moral argument is that the ends justify the means, and thus lies, deceit and irrational emotional manipulation are morally justified. I consider the counter moral argument to be inherently anti-democratic and pro-authoritarian.


Questions: 
1. Is it reasonable to believe that DFS is more effective than honest speech in convincing people to believe things?

2. Since both DFS and honest speech are legal and constitutionally protected, are both morally equivalent?

Monday, February 1, 2021

The Application of Logos

Logos is a Greek term meaning "discourse" or "plea," and it's essentially argumentation.


We use it when we engage in debate. We can employ informal logic to articulate and critically examine positions through logos.


This is probably familiar to most of you.


If you're going to employ it, it helps to understand common fallacies that come up in debate. Things like burning straw men, appeals to hypocrisy, appeals to nature, appeals to tradition, appeals to emotion, appeals to authority, and even accusations of logical fallacy are often themselves fallacious.


Here's the issue with it. It usually doesn't help, as per what I call John Stuart Mill's lament. He writes in "The Subjection of Women":

The difficulty is that which exists in all cases in which there is a mass of feeling to be contended against. So long as an opinion is strongly rooted in the feelings, it gains rather than loses in stability by having a preponderating weight of argument against it. For if it were accepted as a result of argument, the refutation of the argument might shake the solidity of the conviction; but when it rests solely on feeling, the worse it fares in argumentative contest, the more persuaded its adherents are that their feeling must have some deeper ground, which the arguments do not reach; and while the feeling remains, it is always throwing up fresh intrenchments of argument to repair any breach made in the old. And there are so many causes tending to make the feelings connected with this subject the most intense and most deeply-rooted of those which gather round and protect old institutions and customs, that we need not wonder to find them as yet less undermined and loosened than any of the rest by the progress of the great modern spiritual and social transition;

 

I only disagree with him on one aspect of this, and that is that he doesn't include thinking errors in his analysis. In fact, I'd say thoughts - more specifically thinking errors - are more profound than feelings in terms of causing us to hold incorrect beliefs. Feelings are where our investment in those thoughts is grounded. They work in tandem, but they are distinct, as I'm sure most any mental health professional familiar with cognitive behavioral therapy will tell you.

Given that he wrote this in 1869, we can afford him some leeway in terms of how he conceptualizes the way we think, as he's close enough.

Untangling thinking errors is a personal thing. I've got loads of them due to a messy childhood and mental illness. The only way to untangle them is to want to. It has to start with the person themselves.

Logic isn't going to help instill the desire to change beliefs. Pain and loss due to those beliefs will, as long as the person can see the connection. Self-interest will. This makes debate almost futile except in the unfortunately rarer cases where all parties are interested in self-examination and self-correction, rather than self-preservation.

Take a page from Plato. Where logos is profoundly helpful - I'd argue most helpful - is when we debate ourselves - and do so honestly. Our ego spends much of its conscious time preserving our id. This includes defending our worldview, however flawed. We can apply critical thinking to our own internal rhetoric, and that is probably the most effective use of logos, because if you're willing to do so, you're receptive to change as a matter of course.

I'll go further and say that whether it's internal or external debate, another aspect of debating effectively is humility. If you already think you know everything you're going to defend it rather than be open to learning something new or being corrected. Humility is a foundational component - perhaps the foundational building block of wisdom, and it's central to allowing us to learn.

The question then becomes, are you capable of being humble and honest with yourself? It's not automatic. It takes work. Sometimes it even takes therapy, rather than a cathartic Internet debate. The work however, is good for you.

If you think you're immune to this, or think you've already mastered it, then that will make you more susceptible to thinking errors in your complacency. None of us have mastered it because the kind of eternal and incessant vigilance required to check every one of our beliefs simply isn't human. We don't have the mental throughput to do that. That said, we can check the important ones, and be more open to others checking them on our behalf. Ultimately they're doing you a favor.