DP Etiquette

First rule: Don't be a jackass.

Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, please provide sources for the facts and truths you rely on if asked. If emotion is getting out of hand, get it back in hand. To limit dehumanizing people, don't call people or whole groups of people disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion. Insults make people angry and defensive. All points of view are welcome, right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.

Monday, August 8, 2022

Good news from science

This is a really big deal. The NIH is now funding research into ways to enhance scientific rigor. This should be a game changer. I hope it's not too little or too late. Steve Novella at Neurologica writes:
This is a great idea, and in fact is long overdue. The NIH is awarding various grants to establish educational materials and centers to teach principles of scientific rigor to researchers. This may seem redundant, but it absolutely isn’t.

At present principles of research are taught in basic form during scientific courses, but advanced principles are largely left to individual mentorship. This creates a great deal of variability in how well researchers really understand the principles of scientific rigor. As a result, a lot of research falls short of scientific ideals. This creates a great deal of waste in the system. NIH, as a funding institution, has a great deal of incentive to reduce this waste.

The primary mechanism will be to create teaching modules that then can be made freely available to educational and research institutions. These modules would cover: 

“biases in research; logical fallacies around causality; how to develop hypotheses; designing literature searches; identifying experimental variables; and reducing confounding variables in research.”

Sounds like a good start. The “biases in research” is a broad category, so I’m not sure how thorough coverage will be. I would explicitly include as an area of education – how to avoid p-hacking. Perhaps this could be part of a broader category on how to properly use statistics in research, the limits of the p-value, and the importance of using other statistical methods like effect sizes and Bayesian analysis.
Prior research has shown that when asked about their research behavior, about a third of researchers admit (anonymously) to bad behavior that amounts to p-hacking. This is likely mostly innocent and naive. I lecture about this topic all the time myself, and I find that many researchers are unfamiliar with the more nuanced aspects of scientific rigor.  
And of course, once the NIH requires certification, this will almost certainly make it uniform within academia, at least on the biomedical side. Then we need other research granting institutions to replicate this, also requiring certification. It basically should become impossible to have a career as a researcher in any field without some basic certification in the principles of research rigor.
OMG, someone outside Dissident Politics is actually taking logic fallacies seriously? I must have died and got reluctantly shoved up to heaven. Next after science, politics needs to tackle this same plague on democracy, humanity and civilization.

No, it is not the case that science and politics can be dealt with the same way. They are different. But it is the case that the data and reasoning behind politics can be subject to the same kind of rigor, if politics is to be based more on fact and sound reasoning than it is now. Opinions will still differ, but differences rooted in irrationally disputed facts, e.g., stolen election vs. not stolen, ought to be significantly reduced. Everyone doing politics firmly believes their politics is based on real facts and sound reasoning. A lot of research indicates that just is not true for most people, most of the time.

Politics is mostly sloppy, not rigorous.
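
Novella's p-hacking point is easy to demonstrate. Below is a minimal simulation sketch (my own illustration with made-up parameters, not part of the NIH modules): it runs many "studies" that compare two groups on 20 unrelated measurements of pure noise and reports how often at least one comparison comes out "significant" at p < 0.05 anyway.

```python
# Minimal p-hacking illustration (hypothetical, not from the NIH materials):
# test 20 unrelated "outcomes" on pure noise and see how often at least one
# comparison hits p < 0.05 purely by chance.
import random
import statistics
import math

def t_test_p(x, y):
    # Crude two-sample test p-value via a normal approximation (illustration only).
    nx, ny = len(x), len(y)
    mx, my = statistics.mean(x), statistics.mean(y)
    vx, vy = statistics.variance(x), statistics.variance(y)
    t = (mx - my) / math.sqrt(vx / nx + vy / ny)
    # Two-sided p-value from the standard normal distribution.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))

random.seed(1)
runs, false_hits = 1000, 0
for _ in range(runs):
    # 20 "outcomes", each measured in two groups of 30 subjects, all pure noise.
    group_a = [[random.gauss(0, 1) for _ in range(30)] for _ in range(20)]
    group_b = [[random.gauss(0, 1) for _ in range(30)] for _ in range(20)]
    pvals = [t_test_p(a, b) for a, b in zip(group_a, group_b)]
    if min(pvals) < 0.05:   # "report only the significant result"
        false_hits += 1

print(f"At least one spurious 'significant' finding in {false_hits/runs:.0%} of studies")
```

The exact percentage depends on the assumptions in the sketch, but with 20 uncorrected comparisons roughly 1 − 0.95^20, about 64%, of these null studies hand the researcher something "significant" to report, even though nothing real is being measured.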

Sunday, July 31, 2022

The science of propaganda, spin and doubt: A short summary

At the least, the information in this post should be mandatory knowledge for both a high school degree and for any post high school credential. If a person does not know this, they are more susceptible to the dark arts than is justifiable in American democracy. -- Germaine, 2022


Context
Lots of books and thousands of research articles have been written on propaganda and why and how it works so well. Propaganda became sophisticated in America during World War I. Shortly after the US entered the war in 1917, President Woodrow Wilson created the Committee on Public Information. The CPI was a gigantic US government deceit and emotional manipulation machine. Tens of thousands of spinning con artists worked for it. Wilson's goal was to con the American people into supporting the war and feeling emotionally justified, e.g., making the world safe for democracy. Some of the greatest propagandists of the 20th century, maybe of all time, worked on that effort. It was a smashing success.

Wilson's massive public disinformation effort jump-started modern propaganda ("public relations") in support of businesses and commerce (discussed here). Business leaders, watching how effectively propaganda could get people to walk into a brutal war, quickly realized that good propaganda wasn't just for governments to use to deceive people into making the ultimate self-sacrifice. It could be used by businesses to deceive both customers and governments. It was, and still is, a freaking super rich gold mine chock full of diamonds, platinum, lithium and all the hot, juicy cheeseburgers that T**** could ever eat.


A short summary of propaganda tactics
In 2021, two researchers, Rebecca Goldberg and Laura Vandenberg, at the University of Massachusetts Department of Environmental Health Sciences, School of Public Health and Health Sciences, published a very nice summary of spin or propaganda tactics from 5 major sources.[1] Their paper is entitled The science of spin: targeted strategies to manufacture doubt with detrimental effects on environmental and public health.

The paper's abstract includes these comments:
Results: We recognized 28 unique tactics used to manufacture doubt. Five of these tactics were used by all five organizations, suggesting that they are key features of manufactured doubt. The intended audience influences the strategy used to misinform, and logical fallacies contribute to their efficacy.

Conclusions: This list of tactics can be used by others to build a case that an industry or group is deliberately manipulating information associated with their actions or products. Improved scientific and rhetorical literacy could be used to render them less effective, depending on the audience targeted, and ultimately allow for the protection of both environmental health and public health more generally.

The list of tactics and the special interests that used them is shown below in Table 1 from the article. Table 2 lists the logic fallacies the propagandists tend to rely on.





Tactics or strategies 1, 2, 3, 8 and 21 were all used by all five sources of deceit and doubt.
  • 1. Attack Study Design: To emphasize study design flaws in A** that have only minimal effects on outcomes. Flaws include issues related to bias, confounding, or sample size
  • 2. Gain Support from Reputable Individuals: Recruit experts or influential people in certain fields (politicians, industry, journals, doctors, scientists, health officials) to defend B** in order to gain broader support
  • 3. Misrepresent data: Cherry-pick data, design studies to fail, or conduct meta-analyses to dilute the work of A
  • 8. Employ Hyperbolic or Absolutist Language: Discuss scientific findings in absolutist terms or with hyperbole, use buzzwords to differentiate between “strong” and “poor” science (i.e. sound science, junk science, etc.)
  • 21. Influence Government/Laws: Gain inappropriate proximity to regulatory bodies and encourage pro-B policy
** “A” refers to information generated to combat scientific evidence and facts
“B” refers to information generated to promote narratives that are favorable to the industry




Acknowledgement: Thanks to Larry Motuz for bringing the work of these two researchers to my attention.


Footnote: 
1. The researchers describe the five sources of propaganda like this:
The first, Big Tobacco, is widely considered to have “written the playbook” on manufactured doubt [1]. The tobacco industry has managed to maintain its clientele for many decades in part due to manufactured scientific controversy about the health effects of active and secondhand smoking [1, 2, 4, 6, 10,11,12,13].

The other industries we examined include the coal industry, whose employees often suffer from black lung disease [14], yet the industry has avoided awarding compensation to many affected miners by wielding disproportionate influence in the courtroom [15,16,17,18,19]; the sugar industry, which distracted from its role contributing to metabolic and cardiovascular diseases [20] by deflecting blame toward dietary fat as a plausible alternative cause for rising population-level chronic disease rates [21,22,23,24,25]; the agrochemical business, Syngenta, manufacturer of the herbicide atrazine [26,27,28], which conducted personal attacks against a vocal critic of atrazine whose research revealed disruptive effects on the endocrine systems of aquatic animals [29, 30]; and the Marshall Institute, a conservative think tank comprised of Cold War physicists eager to maintain their proximity to government, and associated scientists who deliberately misrepresented information to the government to both minimize and normalize the effects of fossil fuels on global temperatures [1, 4, 31].

Tuesday, July 19, 2022

An observer’s comments on ineffective Democratic messaging

“. . . . the typical citizen drops down to a lower level of mental performance as soon as he enters the political field. He argues and analyzes in a way which he would readily recognize as infantile within the sphere of his real interests. . . . cherished ideas and judgments we bring to politics are stereotypes and simplifications with little room for adjustment as the facts change. . . . . the real environment is altogether too big, too complex, and too fleeting for direct acquaintance. We are not equipped to deal with so much subtlety, so much variety, so many permutations and combinations. Although we have to act in that environment, we have to reconstruct it on a simpler model before we can manage it.” -- social scientists Christopher Achen and Larry Bartels, Democracy for Realists: Why Elections Do Not Produce Responsive Government, 2016

Demagoguery (official definition): political activity or practices that seek support by appealing to the desires and prejudices of ordinary people rather than by using rational argument

Demagoguery (Germaine definition): any political, religious, commercial or other activity or practices that seek support by playing on and/or appealing to the ignorance, desires and/or prejudices of people rather than by using rational argument; demagoguery usually relies significantly or mostly on lies, slanders, irrational emotional manipulation, flawed motivated reasoning, logic fallacies, etc.; relevant inconvenient facts, truths and sound reasoning are usually ignored, denied or distorted into an appearance of false insignificance or false irrelevance


A Washington Post opinion piece by Paul Waldman says it better than I can:
Faced with demands to do something about the right-wing revolution the Supreme Court is inflicting on the country, congressional Democrats will hold votes on bills guaranteeing marriage equality and the right to contraception. These are protected at the moment, but many fear the court and Republicans will move to attack them sometime in the near future.

Since these bills will fall to Republican filibusters in the Senate, they are demonstration votes, meant not to become law (at least not yet), but in large part to force Republicans to vote against them and thereby reveal themselves to be out of step with public opinion. As many a Democrat has said, “Let’s get them on the record.” But “getting them on the record” doesn’t accomplish much if you don’t have a strategy to turn that unpopular vote into a weapon that can be used to actually punish those Republicans. And there’s little evidence Democrats have such a strategy.

Sure, they’ll issue some news releases and talk about it on cable news. And here or there the vote might find its way into a campaign mailer (“Congressman Klunk voted against contraception! Can the women of the Fifth District really trust Congressman Klunk?”). But I fear that too many Democrats think getting them on the record is enough by itself.

The reason is that unlike their Republican counterparts, Democrats tend to have far too much faith in the American voter.

People in Washington, especially Democrats, suffer from an ailment that is not confined to the nation’s capital. It plays out in all kinds of places and in politics at all levels. It’s the inability to see politics from the perspective of ordinary people.

This blindness isn’t a matter of elitism. The problem is that it’s hard to put yourself in the mind of someone whose worldview is profoundly different from your own. If you care about politics, it’s almost impossible to understand how the average person — even the average voter — thinks about the work you do and the world you inhabit.

Here’s the problem: Most Americans have only a fraction of the understanding you do about these things — not because they’re dumb or ignorant but mainly because they just don’t care. They worry about other things, especially their jobs and their families. When they have free time they’d rather watch a ballgame or gossip with a friend than read about whether certain provisions of Build Back Better might survive in some process called “reconciliation.”

In fact, the very idea of “issues” — where a thing happening in the world is translated into something the government might implement policies to address — was somewhat foreign to them. Because I was young and enthusiastic but not schooled in subtle communication strategies, I couldn’t get beyond my own perspective and persuade them of anything.

.... most Democrats I know are still captive to the hope that politics can be rational and deliberative, ultimately producing reasonable outcomes.

Republicans have no such illusions. They usually start from the assumption that voters don’t pay attention and should be reached by the simplest, most emotionally laden appeals they can devise. So Republicans don’t bother with 10-point policy plans; they just hit voters with, “Democrats want illegals to take your job, kill your wife, and pervert your kids,” and watch the votes pour in.
If Waldman is right, how can one craft messages with the emotional impact of Republican messaging without demagoguing it or lying?

I think it is now possible for Dems to do gut-wrenching messaging without much or any demagoguery or lies. Just be blunt and relentless about reality. Be candid about the thoroughly morally rotted, fascist Republican Party, its cruel Christian nationalist dogma, its rapacious laissez-faire capitalist dogma and the radical right propaganda Leviathan, e.g., Faux News, that the stinking anti-democratic threat significantly or mostly rests on. Just say it straight without lies or slanders. There is plenty of evidence in the public record to support harsh, emotional but truthful messaging.


Qs: 
1. Is Waldman right? 
2. Is there such a thing as gut-wrenching messaging without much or any demagoguery or lies, or does wrenching guts always require demagoguery and/or lies?
3. Is demagoguery still demagoguery even if it is based on truth and sound reasoning? (I think not)

Monday, July 11, 2022

Personal thoughts: Is it even possible to debate demagoguery?

Demagoguery (official definition): political activity or practices that seek support by appealing to the desires and prejudices of ordinary people rather than by using rational argument


Demagoguery (Germaine definition): any political, religious, commercial or other activity or practices that seek support by playing on and/or appealing to the ignorance, desires and/or prejudices of people rather than by using rational argument; demagoguery usually relies significantly or mostly on lies, slanders, irrational emotional manipulation, flawed motivated reasoning, logic fallacies, etc.; relevant inconvenient facts, truths and sound reasoning are usually ignored, denied or distorted into an appearance of false insignificance or false irrelevance



Way back in 2014, when cowboys with six shooters were duking it out against cattle-rustling T. rex lizards, Bill Nye the science guy publicly debated young Earth believer Ken Ham, a crackpot Christian nationalist. He is a demagogue by Germaine's definition. Ham, the founder and chief executive officer of the Young Earth creationist ministry Answers in Genesis, challenged Nye to debate the question "Is Creation A Viable Model of Origins?" The debate was held at Ham's "Creation Museum" in Petersburg, Kentucky.




Before the debate, Team R&R (reality and reason) urged Nye not to debate because there was nothing to debate. Many in the scientific community criticized Nye's decision to debate, arguing that it lent undue credibility to the creationist worldview. Ham argued crackpottery like cowboys duking it out with dinosaurs in the wild, wild West. Obviously, Team R&R had a point. But Nye debated anyway. As expected, things ended just as they started. Minds did not change. But, Ham did get some publicity for his "museum" and probably made some extra money.


Rock solid proof that cowboys and 
dinosaurs co-existed in the 1800s


Over the years, it slowly dawned on me that, like the Nye-Ham nonsense, debating demagoguery is pointless, but probably unavoidable in most situations. Such debates are arguably more harmful than beneficial, as Team R&R argued. But maybe they are not as harmful as not engaging with demagoguery at all. There is nothing to debate when demagogues deny or distort important facts, resort to flawed reasoning and so forth. But they are there, influencing public opinion, well funded, and not going away.

Much (most?) of the harm arises from false balancing (false equivalence, bothsidesism). By simply debating with a demagogue, the demagogue's false assertions (lies), flawed reasoning and whatnot are treated with seriousness and respect they do not deserve. In the hands of a skilled demagogue, false balancing can feel or seem like rational thinking, especially when it appeals to prejudices, comforting false beliefs and the like. 

We easily mistake psychological comfort for rationality, i.e., nonsense has to be rational because it feels so right. But when relevant facts and the reasoning applied to them heavily favor one side and heavily undermine the other, a basis for rationality just isn't there. But the basis for false belief is still there, i.e., people still want to feel good about themselves and their beliefs, even when there is no basis for it. That never goes away. That is the demagogue's target.

The problem is that by ignoring the demagogue and not trying to counter the lies and nonsense, Team R&R leaves the public opinion playing field uncontested for the demagogues to slime all over. Demagoguery is rampant in major issues including climate change, climate regulations, gun regulations, the scope and meaning of the Constitution, civil liberties, and abortion. 


Slimed by demagoguery &
the ground gets slippery


I suppose little or none of this is new to most folks here at Dissident Politics. It's all come up multiple times. Guess it doesn't hurt to repeat some things. 

Saturday, October 9, 2021

Dark free speech tactics: Sealioning, Gish gallop and other popular deceit and manipulation tactics



“The mind is divided into parts, like a rider (controlled processes) on an elephant (automatic processes). The rider evolved to serve the elephant. . . . . intuitions come first, strategic reasoning second. Therefore, if you want to change someone’s mind about a moral or political issue, talk to the elephant first. .... Republicans understand moral psychology. Democrats don’t. Republicans have long understood that the elephant is in charge of political behavior, not the rider, and they know how elephants work. Their slogans, political commercials and speeches go straight for the gut . . . . Republicans don’t just aim to cause fear, as some Democrats charge. They trigger the full range of intuitions described by Moral Foundations Theory.” -- Psychologist Jonathan Haidt, The Righteous Mind: Why Good People are Divided by Politics and Religion, 2012


Dark free speech (DFS): Constitutionally or legally protected (1) lies and deceit to distract, misinform, confuse, polarize and/or demoralize, (2) unwarranted opacity to hide inconvenient truths, facts and corruption (lies and deceit of omission), (3) unwarranted emotional manipulation (i) to obscure the truth and blind the mind to lies and deceit, and (ii) to provoke irrational, reason-killing emotions and feelings, including fear, hate, anger, disgust, distrust, intolerance, cynicism, pessimism and all kinds of bigotry including racism, and (4) ideologically-driven motivated reasoning and other ideologically-driven biases that unreasonably distort reality and reason. Germaine, ~2016 or thereabouts


There are lots of ways to engage in debate that can feel right and principled, but in effect subvert principled, focused debate into far less rational or focused engagements. Provoking frustration, impatience and anger are common goals of subverting rhetorical tactics. Logic fallacies are a common tactic of people who have to rely on weak or non-existent fact, truth and/or reasoning positions, e.g., the 2020 election was stolen. Denying, distorting or irrationally downplaying inconvenient facts and truths is also popular and usually present in some form in nearly all DFS.

Here is how some of these things are described.

Sealioning (also spelled sea-lioning and sea lioning) is a type of trolling or harassment that consists of pursuing people with persistent requests for evidence or repeated questions, while maintaining a pretense of civility and sincerity.[1][2][3][4] It may take the form of "incessant, bad-faith invitations to engage in debate".[5] The term originated with a 2014 strip of the webcomic Wondermark by David Malki.

The troll feigns ignorance and politeness, so that if the target is provoked into making an angry response, the troll can then act as the aggrieved party.[7][8] Sealioning can be performed by a single troll or by multiple ones acting in concert.[9] The technique of sealioning has been compared to the Gish gallop and metaphorically described as a denial-of-service attack targeted at human beings.[10]

An essay in the collection Perspectives on Harmful Speech Online, published by the Berkman Klein Center for Internet & Society at Harvard, noted:

Rhetorically, sealioning fuses persistent questioning—often about basic information, information easily found elsewhere, or unrelated or tangential points—with a loudly-insisted-upon commitment to reasonable debate. It disguises itself as a sincere attempt to learn and communicate. Sealioning thus works both to exhaust a target's patience, attention, and communicative effort, and to portray the target as unreasonable. While the questions of the "sea lion" may seem innocent, they're intended maliciously and have harmful consequences. — Amy Johnson, Berkman Klein Center for Internet & Society (May 2019) (emphasis added)

The Gish gallop is a rhetorical technique in which a debater attempts to overwhelm an opponent with an excessive number of arguments, without regard for the accuracy or strength of those arguments. The term was coined by Eugenie Scott, who named it after Duane Gish. Scott argued that Gish used the technique frequently when challenging the scientific fact of evolution.[1][2] It is similar to a method used in formal debate called spreading.

During a Gish gallop, a debater confronts an opponent with a rapid series of many specious arguments, half-truths, and misrepresentations in a short space of time, which makes it impossible for the opponent to refute all of them within the format of a formal debate.[3][4] In practice, each point raised by the "Gish galloper" takes considerably more time to refute or fact-check than it did to state in the first place.[5] The technique wastes an opponent's time and may cast doubt on the opponent's debating ability for an audience unfamiliar with the technique, especially if no independent fact-checking is involved[6] or if the audience has limited knowledge of the topics.
In the case of the Gish gallop, the dark free speech proponent can play on a person's ignorance to make arguments and asserted facts or truths seem at least plausible. It shifts the burden to the principled participant to fact check, which often takes more time and effort than is reasonable and is often frustrating, which tends to degrade the quality and social usefulness of the debate.


Whataboutism: Whataboutism or whataboutery (as in "what about…?") is a variant of the tu quoque logical fallacy, which attempts to discredit an opponent's position by charging hypocrisy without directly refuting or disproving the argument (Germaine: or without showing its relevance). 

Whataboutism is usually embedded in false narratives implied through irrelevant questions. When cornered, there are two typical strategies. One, claim "I'm just asking questions!" Two, claim "I can't prove it, but it sounds right!"


Wikipedia on false balance or bothsidesism: False balance, also bothsidesism, is a media bias in which journalists present an issue as being more balanced between opposing viewpoints than the evidence supports. Journalists may present evidence and arguments out of proportion to the actual evidence for each side, or may omit information that would establish one side's claims as baseless. False balance has been cited as a cause of misinformation.[1]

False balance is a bias, which usually stems from an attempt to avoid bias, and gives unsupported or dubious positions an illusion of respectability. It creates a public perception that some issues are scientifically contentious, though in reality they aren't, therefore creating doubt about the scientific state of research, and can be exploited by interest groups such as corporations like the fossil fuel industry or the tobacco industry, or ideologically motivated activists such as vaccination opponents or creationists.[2]

Examples of false balance in reporting on science issues include the topics of man-made versus natural climate change, the health effects of tobacco, the alleged relation between thiomersal and autism,[3] and evolution versus intelligent design.

A fallacy is reasoning that is logically incorrect, undermines the logical validity of an argument, or is recognized as unsound. All forms of human communication can contain fallacies.

Because of their variety, fallacies are challenging to classify. They can be classified by their structure (formal fallacies) or content (informal fallacies). Informal fallacies, the larger group, may then be subdivided into categories such as improper presumption, faulty generalization, error in assigning causation and relevance, among others.

The use of fallacies is common when the speaker's goal of achieving common agreement is more important to them than utilizing sound reasoning. When fallacies are used, the premise should be recognized as not well-grounded, the conclusion as unproven (but not necessarily false), and the argument as unsound.

Informal fallacies

Informal fallacies – arguments that are logically unsound for lack of well-grounded premises.[14]
  • Argument to moderation (false compromise, middle ground, fallacy of the mean, argumentum ad temperantiam) – assuming that a compromise between two positions is always correct.[15]
  • Continuum fallacy (fallacy of the beard, line-drawing fallacy, sorites fallacy, fallacy of the heap, bald man fallacy, decision-point fallacy) – improperly rejecting a claim for being imprecise.[16]
  • Correlative-based fallacies
    • Suppressed correlative – a correlative is redefined so that one alternative is made impossible (e.g., "I'm not fat because I'm thinner than John.").[17]
  • Definist fallacy – defining a term used in an argument in a biased manner (e.g., using "loaded terms"). The person making the argument expects that the listener will accept the provided definition, making the argument difficult to refute.[18]
  • Divine fallacy (argument from incredulity) – arguing that, because something is so incredible or amazing, it must be the result of superior, divine, alien or paranormal agency.[19]
  • Double counting – counting events or occurrences more than once in probabilistic reasoning, which leads to the sum of the probabilities of all cases exceeding unity (a small worked example appears after this list).
  • Equivocation – using a term with more than one meaning in a statement without specifying which meaning is intended.[20]
    • Ambiguous middle term – using a middle term with multiple meanings.[21]
    • Definitional retreat – changing the meaning of a word when an objection is raised.[22] Often paired with moving the goalposts (see below), as when an argument is challenged using a common definition of a term in the argument, and the arguer presents a different definition of the term and thereby demands different evidence to debunk the argument.
    • Motte-and-bailey fallacy – conflating two positions with similar properties, one modest and easy to defend (the "motte") and one more controversial (the "bailey").[23] The arguer first states the controversial position, but when challenged, states that they are advancing the modest position.[24][25]
    • Fallacy of accent – changing the meaning of a statement by not specifying on which word emphasis falls.
    • Persuasive definition – purporting to use the "true" or "commonly accepted" meaning of a term while, in reality, using an uncommon or altered definition (cf. the if-by-whiskey fallacy).
  • Ecological fallacy – inferring about the nature of an entity based solely upon aggregate statistics collected for the group to which that entity belongs.[26]
  • Etymological fallacy – assuming that the original or historical meaning of a word or phrase is necessarily similar to its actual present-day usage.[27]
  • Fallacy of composition – assuming that something true of part of a whole must also be true of the whole.[28]
  • Fallacy of division – assuming that something true of a composite thing must also be true of all or some of its parts.[29]
  • False attribution – appealing to an irrelevant, unqualified, unidentified, biased or fabricated source in support of an argument.
  • False authority (single authority) – using an expert of dubious credentials or using only one opinion to promote a product or idea. Related to the appeal to authority.
  • False dilemma (false dichotomy, fallacy of bifurcation, black-or-white fallacy) – two alternative statements are given as the only possible options when, in reality, there are more.[31]
  • False equivalence – describing two or more statements as virtually equal when they are not.
  • Slippery slope (thin edge of the wedge, camel's nose) – asserting that a proposed, relatively small, first action will inevitably lead to a chain of related events resulting in a significant and negative event and, therefore, should not be permitted.[43]
  • Special pleading – the arguer attempts to cite something as an exemption to a generally accepted rule or principle without justifying the exemption (e.g.: a defendant who murdered his parents asks for leniency because he is now an orphan).
  • Etc., etc., etc. 
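
As a quick aside on the double counting item above, here is a tiny worked example (my own hypothetical illustration, not from the cited list): adding the probabilities of two overlapping events without removing the overlap counts some outcomes twice, which is how a "total probability" can exceed 1.

```python
# Double counting illustration (hypothetical example): probability of rolling
# an even number OR a number <= 4 on one fair six-sided die.
from fractions import Fraction

outcomes = range(1, 7)
even = {n for n in outcomes if n % 2 == 0}      # {2, 4, 6}
at_most_4 = {n for n in outcomes if n <= 4}     # {1, 2, 3, 4}

def p(event):
    # Probability of an event = favorable outcomes / total outcomes.
    return Fraction(len(event), 6)

naive = p(even) + p(at_most_4)    # 7/6 -- exceeds 1 because 2 and 4 are counted twice
correct = p(even | at_most_4)     # 5/6 -- count each outcome once by taking the union

print("naive sum:", naive)        # 7/6
print("correct:  ", correct)      # 5/6
```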

Red herring fallacies

  • Ad hominem – attacking the arguer instead of the argument. (Note that "ad hominem" can also refer to the dialectical strategy of arguing on the basis of the opponent's own commitments. This type of ad hominem is not a fallacy.)
    • Circumstantial ad hominem – stating that the arguer's personal situation or perceived benefit from advancing a conclusion means that their conclusion is wrong.[70]
    • Poisoning the well – a subtype of ad hominem presenting adverse information about a target person with the intention of discrediting everything that the target person says.[71]
    • Appeal to motive – dismissing an idea by questioning the motives of its proposer.
    • Tone policing – focusing on emotion behind (or resulting from) a message rather than the message itself as a discrediting tactic.
    • Traitorous critic fallacy (ergo decedo, 'thus leave') – a critic's perceived affiliation is portrayed as the underlying reason for the criticism and the critic is asked to stay away from the issue altogether. Easily confused with the association fallacy ("guilt by association") below.
  • Appeal to authority (argument from authority, argumentum ad verecundiam) – an assertion is deemed true because of the position or authority of the person asserting it.[72][73]
  • Straw man fallacy – misrepresenting an opponent's argument by broadening or narrowing the scope of a premise and/or refuting a weaker version of their argument (e.g.: if someone says that killing animals is wrong because we are animals too, replying "It is not true that humans have no moral worth" would be a straw man, since they have not asserted that humans have no moral worth, but rather that the moral worth of animals and humans is equivalent).[105]
  • Texas sharpshooter fallacy – improperly asserting a cause to explain a cluster of data.[106] This fallacy is an informal fallacy which is committed when differences in data are ignored, but similarities are overemphasized. From this reasoning, a false conclusion is inferred.[1] This fallacy is the philosophical or rhetorical application of the multiple comparisons problem (in statistics) and apophenia (in cognitive psychology). It is related to the clustering illusion, which is the tendency in human cognition to interpret patterns where none actually exist. The name comes from a joke about a Texan who fires some gunshots at the side of a barn, then paints a target centered on the tightest cluster of hits and claims to be a sharpshooter.
  • Tu quoque ('you too' – appeal to hypocrisy, whataboutism) – stating that a position is false, wrong, or should be disregarded because its proponent fails to act consistently in accordance with it.[107]
  • Two wrongs make a right – assuming that, if one wrong is committed, another wrong will rectify it.

As one can see, there are a heck of a lot of ways to derail focused, principled debate into fluff, false beliefs, social discord, etc. Skilled trolls, professional propagandists and most hard core ideologues are familiar with these tactics. Most people and interests that use dark free speech (~97%?) do so without hesitation or moral qualm. Even people who try to stay principled can engage in logic fallacies without being aware of it.


Given the way the human mind evolved to work, existing research evidence indicates that relative to principled debate grounded in honest speech, dishonest debate grounded in DFS can be and often is more persuasive. In my opinion, reasonable sounding DFS, usually not crackpottery like the trash that QAnon spews, tends to be about 2-4-fold more effective in influencing public opinion. Being limited to facts, true truths and sound reasoning forecloses a whole lot of rhetorical territory and tactics that can be used to describe real or fake facts, truths and reality. 

Some logic fallacies were discussed here several times before, e.g., this chapter review.

One moral argument holds that feeding people DFS, false beliefs, misinformation, disinformation and the like deprives them of the power to decide and act based on truth and reality. A counter moral argument is that the ends justify the means, and thus lies, deceit and irrational emotional manipulation are morally justified. I consider the counter moral argument to be inherently anti-democratic and pro-authoritarian.


Questions: 
1. Is it reasonable to believe that DFS is more effective than honest speech in convincing people to believe things?

2. Since both DFS and honest speech are legal and constitutionally protected, are both morally equivalent?

Monday, February 1, 2021

The Application of Logos

Logos is a Greek term meaning "discourse" or "plea," and it's essentially argumentation.


We use it when we engage in debate. We can employ informal logic to articulate and critically examine positions through logos.


This is probably familiar to most of you.


If you're going to employ it, it helps to understand common fallacies that come up in debate. Things like burning straw men, appeals to hypocrisy, appeals to nature, appeals to tradition, appeals to emotion, appeals to authority, and even appeals to logical fallacies are often fallacious.


Here's the issue with it. It usually doesn't help, as per what I call John Stuart Mill's lament. He writes in "The Subjection of Women":

The difficulty is that which exists in all cases in which there is a mass of feeling to be contended against. So long as opinion is strongly rooted in the feelings, it gains rather than loses in stability by having a preponderating weight of argument against it. For if it were accepted as a result of argument, the refutation of the argument might shake the solidity of the conviction; but when it rests solely on feeling, the worse it fares in argumentative contest, the more persuaded adherents are that their feeling must have some deeper ground, which the arguments do not reach; and while the feeling remains, it is always throwing up fresh intrenchments of argument to repair any breach made in the old. And there are so many causes tending to make the feelings connected with this subject the most intense and most deeply-rooted of those which gather round and protect old institutions and customs, that we need not wonder to find them as yet less undermined and loosened than any of the rest by the progress of the great modern spiritual and social transition;

 

I only disagree with him on one aspect of this, and that is that he doesn't include thinking errors in his analysis. In fact, I'd say thoughts - more specifically thinking errors - are more profound than feelings in terms of causing us to hold incorrect beliefs. Feelings are where our investment in those thoughts is grounded. They work in tandem, but they are distinct, as I'm sure most any mental health professional familiar with cognitive behavioral therapy will tell you.

Given he wrote this in 1869, we can afford him some leeway in terms of how he conceptualizes the way we think, as he's close enough.

Untangling thinking errors is a personal thing. I've got loads of them due to a messy childhood and mental illness. The only way to untangle them is to want to. It has to start with the person themselves.

Logic isn't going to help instill the desire to change beliefs. Pain and loss due to those beliefs will, as long as the person can see the connection. Self-interest will. This makes debate almost futile except in the unfortunately rare cases where all parties are interested in self-examination and self-correction, rather than self-preservation.

Take a page from Plato. Where logos is profoundly helpful - I'd argue most helpful - is when we debate ourselves - and do so honestly. Our ego spends much of its conscious time preserving our id. This includes defending our worldview, however flawed. We can apply critical thinking to our own internal rhetoric, and that is probably the most effective use of logos, because if you're willing to do so, you're receptive to change as a matter of course.

I'll go further and say that whether it's internal or external debate, another aspect of debating effectively is humility. If you already think you know everything you're going to defend it rather than be open to learning something new or being corrected. Humility is a foundational component - perhaps the foundational building block of wisdom, and it's central to allowing us to learn.

The question then becomes, are you capable of being humble and honest with yourself? It's not automatic. It takes work. Sometimes it even takes therapy, rather than a cathartic Internet debate. The work however, is good for you.

If you think you're immune to this, or think you've already mastered it, then it will make you more susceptible to thinking errors in your complacency. None of us have mastered it because the kind of eternal and incessant vigilance required to check every one of our beliefs simply isn't human. We don't have the mental throughput to do that. That said, we can check the important ones, and be more open to others checking them on our behalf. Ultimately they're doing you a favor.

Sunday, October 11, 2020

Climate Science Denial: The Motte-and-Bailey Logic Fallacy

The motte is the structure on the high ground and the bailey is 
below and inside the fenced area:
the bailey is easier to attack than the motte
(10th century technology)


Wikipedia: The motte-and-bailey fallacy (named after the motte-and-bailey castle) is a form of argument and an informal fallacy where an arguer conflates two positions which share similarities, one modest and easy to defend (the "motte") and one much more controversial (the "bailey").[1] The arguer advances the controversial position, but when challenged, they insist that they are only advancing the more modest position.[2][3] Upon retreating to the motte, the arguer can claim that the bailey has not been refuted (because the critic refused to attack the motte)[1] or that the critic is unreasonable (by equating an attack on the bailey with an attack on the motte).


Employing logic fallacies to deceive, distract, disinform and so forth is a common tactic among purveyors of dark free speech or epistemic terrorism. In the vice presidential debate, Mike Pence used the motte-and-bailey fallacy to deceive and confuse people about climate change. At the Neurologica blog, Steve Novella explains it nicely:
“Pence represented the typical denial strategy. He started by saying that the climate is changing, we just don’t know why or what to do about it. This is the motte and bailey fallacy in action – pull back from the position that is untenable to defend an easier position, but don’t completely surrender the outer position. Pence was not about to deny that global warming is happening at all in that forum because he would be too easily eviscerated, so he just tried to muddy the waters on what may seem like an easier point.

But of course, he is completely wrong on both counts. We do know what is causing climate change, it is industrial release of CO2 and other greenhouse gases. At least there is a strong consensus of scientists who are 95% confident or more this is the major driver, and there is no tenable competing theory. That is what a scientific fact looks like. We also know what to do about it – decrease global emissions of CO2 and other greenhouse gases. And we know how to do that – change our energy infrastructure to contain more carbon neutral sources with the goal of decarbonizing energy. Change our transportation industry as much as possible over to electric (or perhaps hydrogen) vehicles. Advance other industrial processes that release significant amounts of CO2. And look for ways to improve energy efficiency and sequester carbon efficiently. It’s not like there aren’t actual detailed published plans for exactly what to do about it.

Pence, however, will rush from his perceived motte into the bailey of total denial when he feels he has an opening. So he also said that the “climate change alarmists” are warning about hurricanes, but we are having the same number of hurricanes today as we did 100 years ago. This is not literally true (there were six hurricanes so far this year in the North Atlantic, and four in 1920), and it looks from the graph like there is a small uptick, but let’s say it’s true enough that statistically there isn’t a significant change in the number of hurricanes. This is called lying with facts – give a fact out of context that creates a deliberately false impression. In this case the false impression is also a straw man, because climate scientists don’t claim that global warming increases the number of hurricanes. They claim (their models predict) that warming increases the power and negative effects from the hurricanes that do occur.

Pence next tried to take credit for dropping CO2 release from the US, as if this is tied to pulling out of the Paris Accord. It is true that CO2 emissions are decreasing, but this is a trend that has been fairly linear since 2005. Between 2005 and 2018 US CO2 emissions dropped 12%. This is largely due to shifting energy production to less CO2 producing methods, including rising renewables. But also, I will acknowledge, this is partly due to a shift from coal to natural gas. There has been a huge drop in coal as a percentage of US energy. Pence selectively used this fact to defend natural gas, glossing over the fact that this is a greater knock against coal, which he does not want to criticize.

Admittedly a live debate is not the place to get into all these details, but pretty much everything Pence said on the climate was misleading and tracked with fossil fuel industry talking points rather than the scientific consensus.”

A couple of things merit comment. 

First, Trump, Pence and the GOP generally have been ruthlessly using logic flaws, lies and deceptive rhetoric for decades to confuse people and sow doubt in the face of contrary climate science evidence they cannot refute using either evidence (facts) or sound reasoning (~logic). Since they do that with climate science, it seems reasonable to believe that they would do that for all other things they dislike or want to deny, science-related or not.

Second, special interests with threatened economic interests have been doing the same thing for decades. 

Third, conservative politicians and special interests who distort or deny realities based on science or anything else are deeply immoral in their unwarranted distortions and denials. In this regard, they are moral cowards.

Sunday, September 6, 2020

Asymmetric Warfare: Propaganda Has a Huge Advantage




“.... what Gilbert demonstrated is that if the brain is overloaded, it will accept lies as truth. The reason is that when the brain becomes taxed, it essentially shuts down. .... As Gilbert explains it, “when resource-depleted persons are exposed to .... propositions they would normally disbelieve, their ability to reject those propositions is markedly reduced. .... We wear helmets to protect our brains from physical injury but no such device exists to prevent us from mental entanglements [lies and manipulation]. Until then, the best we can do is to avoid shallow forms of information or anything that is likely to contain a lie.”

This general concept of an imbalance of power has been on my mind for at least 3-4 years. It seems timely and urgent now. Liars, emotional manipulators and purveyors of flawed reasoning are out in force on both the hard core left and right. We are absolutely awash in lies, deceit, manipulation and logic fallacies. People who support Trump claim this is true for people who oppose the president and essentially all of the mainstream media. They believe that extreme crackpot liar sources such as Breitbart, Rush Limbaugh and Fox News are telling the truth. People on the hard core left rely on extreme crackpot liar sources such as Sputnik News and RT News because they are perceived as telling the truth.

Both sides absolutely rip mainstream sources such as the New York Times, Washington Post, CNN, MSNBC, CBS and NPR as extremist liars and deceivers.[1] In my lifetime, I've never seen anything close to this kind of extreme polarization and bitter disagreement. Facts are subjective and personal, not objective. At present, there is probably no way to bridge the different perceptions of reality, possibly absent a massive shock accompanied by mass destruction and/or death. But maybe even a disaster won't help much. The liars are probably never going to go away. There is little or no penalty for lying and destroying civil society.


What does it all mean?
Obviously opinions will differ. In my opinion, it means we are in very serious trouble. Authoritarianism on the left and right is pressing hard to fill the power vacuum left as democracy recedes. Power flows from the government tasked with protecting the people and their liberties to powerful authoritarian ideologues, special interests and the politicians they can buy or coerce. The relentless attacks from the left and right and their lies and false realities are slowly pushing democracy, truth and the rule of law aside.

Authoritarians the world over, including Putin, Trump and Xi, are rejoicing over the fall of truth for so many people. Their tyrant power increases in step with our ebbing democratic power.

What does it mean for political and broader discourse? It means that people who still believe in values like truth, democracy and the rule of law need to up their messaging game.[2] They need to do that because they are at a major disadvantage in messaging wars. Why? Because the messaging available to people who rely on facts, truths and sound reasoning is far more constrained and significantly less persuasive than lies, emotional manipulation and bogus reasoning.

The human mind evolved to respond strongly to irrational emotional manipulation, especially fear, but also to other negative emotions such as anger, disgust, distrust, intolerance and bigotry. The best way to foment those things is to lie. Truth tends to have less emotional impact because reality is usually less awful and threatening than lies and irrational manipulation can easily make it seem. If truth and sound reason have X persuasive power, my estimate is that lies, deceit, manipulation and irrationality have about 3-5X persuasive power.

Why might that be? Because if the world of rhetoric grounded in truth and sound reason is Y big, the world of lies, deceit, manipulation and irrationality is at least about 30-50Y, and most of the useful stuff is at least 3-5Y. Just consider how freeing it is to just blow off facts, true truths and sound reasoning. Even I can make up some whoppers if I just blow off facts and reason.[3]

In other words, the good guys are fighting with one hand tied. The fight isn't fair. Maybe that just reflects the fact that politics isn't usually fair. Neither is life.


Footnotes:
1. The left and right attack and reject even the fact checkers as a pack of lying liars. No source is respectable any more except those that convey their own versions of reality, truth and reason. That's a huge win for things like actual and aspiring demagogues, tyrants, kleptocrats and liars. It is a huge loss for things like American democracy, the rule of law, truth, reason and civil society.

2. That assumes we are not going to engage in a full-blown civil war with tens of millions of deaths and mass destruction of infrastructure running in the tens of trillions.

3. For example, by blowing off facts I could argue this: The president demands that pro-military personnel interests buy at least $400 million per year from his commercial properties, even if the price is inflated two-fold for such special guests. In return for such "honest" business and the enhanced profits that would flow to the president, he has agreed to not cut military and veterans salaries and benefits by 50%.

Or, I could argue that the president secretly promised his supporters at least 25% lower taxes for voting for him, while taxes for Biden voters will be increased by 50%.

Well, at least I hope those are whoppers.

Monday, August 3, 2020

The Human Mind and the Hot-Cold Empathy Gap

Prior research has shown that people mispredict their own behavior and preferences across affective states. When people are in an affectively “cold” state, they fail to fully appreciate how “hot” states will affect their own preferences and behavior. When in hot states, they underestimate the influence of those states and, as a result, overestimate the stability of their current preferences. The same biases apply interpersonally; for example, people who are not affectively aroused underappreciate the impact of hot states on other people’s behavior. After reviewing research documenting such intrapersonal and interpersonal hot–cold empathy gaps, this article examines their consequences for medical, and specifically cancer-related, decision making, showing, for example, that hot–cold empathy gaps can lead healthy persons to expose themselves excessively to health risks and can cause health care providers to undertreat patients for pain. -- George Loewenstein, Carnegie Mellon University, Health Psychology, Vol. 24, No. 4 (Suppl.), S49–S56, 2005 [1]


The Hot-Cold Empathy Gap
An NPR broadcast of Hidden Brain discussed research on strong physiological states (hunger, sexual arousal, pain) and emotional states (fear, anger, disgust) that can move people's minds from cold states to hot states. In hot states, physiology and/or emotions take control, and at the same time memory of cold state knowledge and logic or reasoning is unavailable to shape behavior. In hot states, things just happen, and sometimes (usually?) they are bad or dumb things.

The comments below are mostly based on the broadcast from the start to about 20:40 and ~50:00 to 53:00. Maybe most people here will already understand all of this. Nonetheless, it should help to keep this important aspect of the human mind in easily accessed memory.


People in a cold state tend to misjudge what their behavior would be when they are in a hot state. Men's behavior when sexually aroused changes compared to when not aroused. When arousal passes, people appear to forget and downplay the intensity of the hot state. Studies show that after experiencing a hot state and returning to a cold state, people are generally worse at predicting what their behavior would be if they returned to the hot state.

The data indicates that the hot-cold empathy gap works two ways across time, prospective and retrospective. The prospective gap leads people to misjudge their future behavior if they re-experience a hot state they have experienced before, such as sexual arousal. The hypothesis here is that the memory that people have of their own hot state experience is softened or distorted, leading them to misjudge themselves in the past and their future hot state behavior.

The retrospective empathy gap is also hypothesized to involve the same memory tricks, which can happen literally within a minute or two of a hot state situation such as feeling pain. People who experienced pain and then had the pain source withdrawn immediately misjudge and overestimate their ability to handle the same pain again. The same phenomenon applies to hunger, addiction and depression. The cold state mind and what it knows is unable to access the hot state mind, making the hot state version of a person incomprehensible. The hot state mind cannot access the cold state logic. One woman, Irene, in a cold state said this about her own hot state sexual arousal experiences: "I don't know that girl."

That was cold Irene talking about hot Irene.

This phenomenon also applies to other people. The empathy gap can literally blind us to how other people feel and why they do some of the things they do.


The Empathy Gap and Politics
Maybe this restates the obvious, but it still is worth saying. When politicians, special interests, ideologues and others use dark free speech (lies, deceit, emotional manipulation) (collectively 'bad people') to create false realities, leverage flawed reasoning and win support, they are generally trying to push listeners into a hot state. Fear is probably the most powerful emotion that bad people have in their dark free speech arsenal. Anger, bigotry, disgust, distrust and intolerance are other powerful emotions that bad people play on to try to foment hot states and irrationality.

People in hot states are more susceptible to lies, deceit and flawed reasoning, including logic fallacies. That is why it is important to at least try to maintain emotional control when engaging in politics. And when control is lost, it is usually best to walk away until control is regained. The cooling off period can be very useful to help maintain rationality, even if it requires backing away overnight.


Footnote:
1. Loewenstein also writes:
"Affect has the capacity to transform us, as human beings, profoundly; in different affective states, it is almost as if we are different people. Affect influences virtually every aspect of human functioning: perception, attention, inference, learning, memory, goal choice, physiology, reflexes, self-concept, and so on. Indeed, it has been argued that the very function of affect is to orchestrate a comprehensive response to critical situations that were faced repeatedly in the evolutionary past (Cosmides & Tooby, 2000)."


Sunday, August 2, 2020

SOMETHING TO THINK ABOUT

Thinking about thinking (without the BS)



The first logic class I ever took was in a philosophy course in college. And that’s part of the problem.
I don’t mean the problem with me, though there are many. I mean the problem with our politics, our civics, and just the way we get along (or don’t) right now. One way we could help with all that, believe it or not, would be to teach logic the way we teach math: Start early, keep at it, and make it required. I’ve taught logic to fourth graders, proof you don’t need a Ph.D. to share the basics and get kids in the habit of evaluating claims and thinking about their own thinking.
One deceptively simple definition of logic is "the study of correct reasoning, especially regarding making inferences."
Logic is about understanding what follows from something else, what must be true, given a certain premise. It’s about the leap from A to B, or in logic parlance, from p to q, as in “if p, then q.” Logic is what takes us from a premise, via inference, to a conclusion. Let’s say all cats have tails. In that universe, if it’s a cat, then it must have a tail. Get it? Of course you do.
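For readers who want to see the machinery, a minimal, illustrative Python sketch of that idea might look something like the following (the helper names and toy setup are mine, just for illustration). Read p as "it's a cat" and q as "it has a tail": the "if p, then q; p; therefore q" pattern (modus ponens) comes out valid, while the reversed "it has a tail, therefore it's a cat" pattern does not.

from itertools import product

def valid(premises, conclusion):
    # An argument form is valid if the conclusion is true in every
    # combination of truth values where all the premises are true.
    return all(conclusion(p, q)
               for p, q in product([True, False], repeat=2)
               if all(prem(p, q) for prem in premises))

implies = lambda p, q: (not p) or q   # "if p, then q"

# Modus ponens: "if p then q" and "p", therefore "q" -- valid.
print(valid([implies, lambda p, q: p], lambda p, q: q))   # True

# Affirming the consequent: "if p then q" and "q", therefore "p" -- invalid.
print(valid([implies, lambda p, q: q], lambda p, q: p))   # False

The invalid pattern fails on the case of a tailed non-cat, which is exactly the kind of counterexample logic trains you to look for.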
But speaking of cute (we hope), imagine a toddler who lives with a cat and recently learned the word “kitty.” One day, the toddler is cruising around in the back of mom’s car and spots a fuzzy, four-legged animal. The toddler joyously points at this poodle and yells “Kitty! Kitty!” Mom smiles and chooses not to shatter the happy moment with a distracted-driving lecture on logical fallacies.
I, however, have no such qualms (sorry kid): This toddler, perhaps forgivably, assumed all cute fuzzy four-legged animals are “kitty.” That’s a common flaw in logic, a logical fallacy, and not just among toddlers; it’s often called hasty generalization or overgeneralization. And this type of fallacy and others are everywhere. They’re used, believed, repeated, broadcast, printed, and repeated some more, sometimes knowingly, sometimes unknowingly. Once you’re familiar with them, you see them everywhere, especially in election season. I’ll bet a beer and a biscuit that after reading the prime offenders below, you’ll notice them regularly between now and November, and maybe for the rest of your life (again, sorry, but you’re better off). So here are just seven of many deadly logic sins, a most-wanted list of tried-and-true, mass-misleading fallacies, simplified and combined for easy reading:
Fancy Latin name: ad hominem ("to the person")
Simple description: Attacking the person, not the argument or position.
Example: In a debate, Candidate A makes a policy recommendation. Opposing Candidate B says, “What do you know? You’re just a [insert any term seen as denigrating]!” Candidate B has certainly disparaged Candidate A but in no way addressed the policy suggestion. Fallacious fail.
A similarly invalid and unfair cousin of ad hominem is guilt by association. A more positive but equally fallacious relative is appeal to authority. (Seen any attack ads or endorsements lately?)
Fancy Latin name: post hoc ergo propter hoc ("after this therefore because of this")
Simpler science-y description: correlation is not causation.
So-simplified-it-actually-had-to-be-longer explanation: Just because event A precedes event B does not mean A caused B.
Example: In February of a U.S. president’s first term, the unemployment rate falls sharply. The president declares, “See! I’m a job-creating president!” In reality, it’s unlikely that the president — though his paddle is bigger than the average citizen’s — significantly changed the course of the supertanker that is the U.S. economy in one month. There are likely other reasons or causes for the improvement.
Yummier example: Crime rates rise as ice-cream consumption rises (that’s generally true, by the way). Fallacious reasoning: Clearly, ice cream is making people go insane with pleasure and commit crimes, plus ice-cream addicts are jacking people to get ice-cream money.
Actually, it’s just that ice-cream consumption and crime rates both tend to rise in summer. Along these lines, with the clear exception of my magic Boston Celtics socks, your lucky hat, lucky shoes, or — apologies to an AL.com Pulitzer-Prize-winning columnist — lucky fish before Alabama games did not cause your team to win. Unless, of course, you literally (and accurately) threw it in the opposing team’s faces at a key moment during a game.
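A tiny simulation can make the ice-cream point concrete. This is an illustrative Python sketch with made-up numbers: a lurking third factor, summer, pushes both ice-cream sales and crime up, so the two end up strongly correlated even though, by construction, neither one causes the other.

import random

random.seed(0)
ice_cream, crime = [], []
for day in range(365):
    summer = 1 if 150 <= day < 240 else 0                     # crude warm-season flag
    ice_cream.append(50 + 40 * summer + random.gauss(0, 5))   # sales driven by the season
    crime.append(20 + 10 * summer + random.gauss(0, 3))       # crime driven by the season

def corr(xs, ys):
    # Pearson correlation, computed by hand to keep the sketch dependency-free.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(round(corr(ice_cream, crime), 2))   # strongly positive, yet ice cream causes no crime here

Drop the summer term from either line and the correlation collapses toward zero, which is the whole point: the association came from the shared cause, not from one variable acting on the other.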
Fancy name: false dichotomy
Simple name: either-or thinking
Simple description: Simplistically presenting the complex, gray-area world as if there are only two choices.
Real example: After the 9-11 terror attacks, some political leaders said, in effect or exactly word-for-word, “If you’re not with us, you’re with the terrorists.” Uhm … actually, no. Someone can hate the terrorists and be against what you’re doing, too. Reality is not nearly as simple as your kindergarten-level portrayal. It almost never is. Advertising often relies on a false dichotomy, too: Use this product or you’re a chump. Again, no. I can avoid your product as if it’s a smelly guy with a bad cough and a machete and yet still not be a chump. Matter of fact, since you tried that fake, weak, fallacious Jedi mind-trick to try to capitalize on insecurity, using your product is what would actually make me a chump.
Simple name: straw man
Simple description: Distorting an opposing argument so you can more easily knock it down.
Example: Candidate A says, “Foreign aid often includes products that U.S. businesses make, then get paid for, and even so, it accounts for less than 1% of the national budget. I’m OK with keeping foreign aid expenditures where they are.”
Candidate B responds indignantly, knowing a loud show of emotion will be broadcast all over, "Why do you care more about foreigners than you do about U.S. citizens?!?"
That, of course, is not what Candidate A said, but it might soon be spread around the world.
By the way, this response also includes another type of logical fallacy, a non sequitur, Latin for “does not follow.” Fallacious panderers often get their money’s worth by using several fallacies simultaneously.
Simple name: overgeneralization
Simple description: Drawing a conclusion based on too little evidence.
Toddler example: See above how every cute fuzzy four-legged animal equals a kitty.
Adult but still cat-lover example: “My cat has a tail, and so does every other cat I’ve seen, so all cats have tails.” (Understandable, but wrong. See Manx cats, mutations, rocking chairs.)
One dangerous brand of overgeneralization is stereotyping, or unfairly attributing a quality to an entire group of people, like “all Asians are _____,” or “women are _____.” Stereotypes are sometimes positive, often negative, but always wrong with specific, actual people. They’re also straightforward examples of how simplistic, sloppy thinking can hurt people.
Simple name or description: slippery slope
Simple description: You assume, without evidence, that one event will lead to other, often undesirable, events.
Real example: A well-known pundit in 2009 repeatedly said that allowing same-sex marriage could lead to humans marrying animals, including goats, ducks, dolphins and turtles. If it came down to it, I guess I’d choose a dolphin (I value intelligence and love to swim), but to my knowledge, there have been no hot-zones of inter-species matrimony since gay marriage became legal. Likewise, no matter your views on the subject, we can all agree that few if any human-turtle hybrids are walking around, which helps show the fallaciousness of that particular slippery-slope argument.
Simple names: false equivalence or false analogy.
Simple description: You assume things that are alike in one way are alike in other ways.
This fallacy is painfully common in politics and media perception. It’s even a crutch or a byproduct of overworked, lazy or otherwise compromised news producers: “I don’t care that 99.9% of the field is saying X! Get that bombastic suspiciously funded contrarian who’s saying Y in the studio and give him equal time — that’ll make for interesting (and misleading) TV!” Or, “You’re saying my political party is corrupt. So is yours!” Or, “You’re saying my news source is slanted. So is yours!” This reflexive both-side-ism appeals to our American egalitarianism. But facts aren’t egalitarian. As the heartless killer Marlo in “The Wire” explained, they’re one way, not another way. It’s highly unlikely that Political Party A and Political Party B commit identical transgressions and to an identical degree. It’s also highly unlikely that News Outlets C and D are biased, inaccurate, misleading or damaging in the same way, to the same degree, and to the same number of people.
These are some of the most common errors in logic that can mislead us even from true premises to false conclusions. But even airtight logic can bring us to false conclusions if a premise is false. Logic matters, and the facts it depends on matter, too.
Learning about logic, which is what joins facts into the web of how we understand the world, is one type of a valuable but rare endeavor: thinking about our own thinking. I know some of you would love to get that clueless uncle or gullible Facebook friend thinking, period, but thinking about our own thinking does improve thinking in general. It makes it less automatic, less reflexive, less taken for granted, and less impervious to the insane idea that we might be wrong. That’s crucial because, in addition to swimming in logical fallacies and purposeful misinformation, we’re all lugging around an unfortunate filter psychologists call “confirmation bias.” It’s one of the most important truths anyone can grasp: We all tend to accept evidence that supports what we already believe but dismiss what would undercut our beliefs. Given that backdrop, skilled media manipulators, and bias-boosting social-media algorithms, bad logic that seems like common sense is all the more seductive and misleading.
Carsen is a reporter and editor turned teacher who lives in Birmingham.
https://www.al.com/news/2020/08/thinking-about-thinking-without-the-bs.html