DP Etiquette

First rule: Don't be a jackass. Most people are good.

Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, provide sources for the facts and truths you rely on if asked. If emotion is getting out of hand, get it back in hand. To limit dehumanizing people, don't call people or whole groups disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion; insult makes people angry and defensive. All points of view are welcome: right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.
Showing posts sorted by relevance for query: logic fallacies.

Saturday, October 9, 2021

Dark free speech tactics: Sealioning, Gish gallop and other popular deceit and manipulation tactics



“The mind is divided into parts, like a rider (controlled processes) on an elephant (automatic processes). The rider evolved to serve the elephant. . . . . intuitions come first, strategic reasoning second. Therefore, if you want to change someone’s mind about a moral or political issue, talk to the elephant first. .... Republicans understand moral psychology. Democrats don’t. Republicans have long understood that the elephant is in charge of political behavior, not the rider, and they know how elephants work. Their slogans, political commercials and speeches go straight for the gut . . . . Republicans don’t just aim to cause fear, as some Democrats charge. They trigger the full range of intuitions described by Moral Foundations Theory.” -- Psychologist Jonathan Haidt, The Righteous Mind: Why Good People Are Divided by Politics and Religion, 2012


Dark free speech (DFS): Constitutionally or legally protected (1) lies and deceit to distract, misinform, confuse, polarize and/or demoralize, (2) unwarranted opacity to hide inconvenient truths, facts and corruption (lies and deceit of omission), (3) unwarranted emotional manipulation (i) to obscure the truth and blind the mind to lies and deceit, and (ii) to provoke irrational, reason-killing emotions and feelings, including fear, hate, anger, disgust, distrust, intolerance, cynicism, pessimism and all kinds of bigotry including racism, and (4) ideologically-driven motivated reasoning and other ideologically-driven biases that unreasonably distort reality and reason. Germaine, ~2016


There are many ways to engage in debate that can feel right and principled but in effect subvert principled, focused debate into far less rational engagement. Provoking frustration, impatience and anger are common goals of these subverting rhetorical tactics. Logic fallacies are a common tactic of people who have to rely on weak or non-existent fact, truth and/or reasoning, e.g., the claim that the 2020 election was stolen. Denying, distorting or irrationally downplaying inconvenient facts and truths is also popular and present in some form in nearly all DFS.

Here is how some of these things are described.

Sealioning (also spelled sea-lioning and sea lioning) is a type of trolling or harassment that consists of pursuing people with persistent requests for evidence or repeated questions, while maintaining a pretense of civility and sincerity.[1][2][3][4] It may take the form of "incessant, bad-faith invitations to engage in debate".[5] The term originated with a 2014 strip of the webcomic Wondermark by David Malki.

The troll feigns ignorance and politeness, so that if the target is provoked into making an angry response, the troll can then act as the aggrieved party.[7][8] Sealioning can be performed by a single troll or by multiple ones acting in concert.[9] The technique of sealioning has been compared to the Gish gallop and metaphorically described as a denial-of-service attack targeted at human beings.[10]

An essay in the collection Perspectives on Harmful Speech Online, published by the Berkman Klein Center for Internet & Society at Harvard, noted:

Rhetorically, sealioning fuses persistent questioning—often about basic information, information easily found elsewhere, or unrelated or tangential points—with a loudly-insisted-upon commitment to reasonable debate. It disguises itself as a sincere attempt to learn and communicate. Sealioning thus works both to exhaust a target's patience, attention, and communicative effort, and to portray the target as unreasonable. While the questions of the "sea lion" may seem innocent, they're intended maliciously and have harmful consequences. — Amy Johnson, Berkman Klein Center for Internet & Society (May 2019) (emphasis added)

The Gish gallop is a rhetorical technique in which a debater attempts to overwhelm an opponent with an excessive number of arguments, without regard for the accuracy or strength of those arguments. The term was coined by Eugenie Scott, who named it after Duane Gish. Scott argued that Gish used the technique frequently when challenging the scientific fact of evolution.[1][2] It is similar to a method used in formal debate called spreading.

During a Gish gallop, a debater confronts an opponent with a rapid series of many specious arguments, half-truths, and misrepresentations in a short space of time, which makes it impossible for the opponent to refute all of them within the format of a formal debate.[3][4] In practice, each point raised by the "Gish galloper" takes considerably more time to refute or fact-check than it did to state in the first place.[5] The technique wastes an opponent's time and may cast doubt on the opponent's debating ability for an audience unfamiliar with the technique, especially if no independent fact-checking is involved[6] or if the audience has limited knowledge of the topics.
In the case of the Gish gallop, the dark free speech proponent can play on a person's ignorance to make arguments and asserted facts or truths seem at least plausible. It shifts the burden to the principled participant to fact check, which often takes more time and effort than is reasonable and is often frustrating. That tends to degrade the quality and social usefulness of the debate.


Whataboutism: Whataboutism or whataboutery (as in "what about…?") is a variant of the tu quoque logical fallacy, which attempts to discredit an opponent's position by charging hypocrisy without directly refuting or disproving the argument (Germaine: or without showing its relevance). 

Whataboutism is usually embedded in false narratives implied through irrelevant questions. When cornered, there are two typical strategies. One, claim "I'm just asking questions!" Two, claim "I can't prove it, but it sounds right!"


Wikipedia on false balance or bothsidesism: False balance, also bothsidesism, is a media bias in which journalists present an issue as being more balanced between opposing viewpoints than the evidence supports. Journalists may present evidence and arguments out of proportion to the actual evidence for each side, or may omit information that would establish one side's claims as baseless. False balance has been cited as a cause of misinformation.[1]

False balance is a bias, which usually stems from an attempt to avoid bias, and gives unsupported or dubious positions an illusion of respectability. It creates a public perception that some issues are scientifically contentious, though in reality they aren't, therefore creating doubt about the scientific state of research, and can be exploited by interest groups such as corporations like the fossil fuel industry or the tobacco industry, or ideologically motivated activists such as vaccination opponents or creationists.[2]

Examples of false balance in reporting on science issues include the topics of man-made versus natural climate change, the health effects of tobacco, the alleged relation between thiomersal and autism,[3] and evolution versus intelligent design.

A fallacy is reasoning that is logically incorrect, undermines the logical validity of an argument, or is recognized as unsound. All forms of human communication can contain fallacies.

Because of their variety, fallacies are challenging to classify. They can be classified by their structure (formal fallacies) or content (informal fallacies). Informal fallacies, the larger group, may then be subdivided into categories such as improper presumption, faulty generalization, error in assigning causation and relevance, among others.

The use of fallacies is common when the speaker's goal of achieving common agreement is more important to them than utilizing sound reasoning. When fallacies are used, the premise should be recognized as not well-grounded, the conclusion as unproven (but not necessarily false), and the argument as unsound.

Informal fallacies

Informal fallacies – arguments that are logically unsound for lack of well-grounded premises.[14]
  • Argument to moderation (false compromise, middle ground, fallacy of the mean, argumentum ad temperantiam) – assuming that a compromise between two positions is always correct.[15]
  • Continuum fallacy (fallacy of the beard, line-drawing fallacy, sorites fallacy, fallacy of the heap, bald man fallacy, decision-point fallacy) – improperly rejecting a claim for being imprecise.[16]
  • Correlative-based fallacies
    • Suppressed correlative – a correlative is redefined so that one alternative is made impossible (e.g., "I'm not fat because I'm thinner than John.").[17]
  • Definist fallacy – defining a term used in an argument in a biased manner (e.g., using "loaded terms"). The person making the argument expects that the listener will accept the provided definition, making the argument difficult to refute.[18]
  • Divine fallacy (argument from incredulity) – arguing that, because something is so incredible or amazing, it must be the result of superior, divine, alien or paranormal agency.[19]
  • Double counting – counting events or occurrences more than once in probabilistic reasoning, which leads to the sum of the probabilities of all cases exceeding unity.
  • Equivocation – using a term with more than one meaning in a statement without specifying which meaning is intended.[20]
    • Ambiguous middle term – using a middle term with multiple meanings.[21]
    • Definitional retreat – changing the meaning of a word when an objection is raised.[22] Often paired with moving the goalposts (see below), as when an argument is challenged using a common definition of a term in the argument, and the arguer presents a different definition of the term and thereby demands different evidence to debunk the argument.
    • Motte-and-bailey fallacy – conflating two positions with similar properties, one modest and easy to defend (the "motte") and one more controversial (the "bailey").[23] The arguer first states the controversial position, but when challenged, states that they are advancing the modest position.[24][25]
    • Fallacy of accent – changing the meaning of a statement by not specifying on which word emphasis falls.
    • Persuasive definition – purporting to use the "true" or "commonly accepted" meaning of a term while, in reality, using an uncommon or altered definition.
    • (cf. the if-by-whiskey fallacy)
  • Ecological fallacy – inferring about the nature of an entity based solely upon aggregate statistics collected for the group to which that entity belongs.[26]
  • Etymological fallacy – assuming that the original or historical meaning of a word or phrase is necessarily similar to its actual present-day usage.[27]
  • Fallacy of composition – assuming that something true of part of a whole must also be true of the whole.[28]
  • Fallacy of division – assuming that something true of a composite thing must also be true of all or some of its parts.[29]
  • False attribution – appealing to an irrelevant, unqualified, unidentified, biased or fabricated source in support of an argument.
  • False authority (single authority) – using an expert of dubious credentials or using only one opinion to promote a product or idea. Related to the appeal to authority.
  • False dilemma (false dichotomy, fallacy of bifurcation, black-or-white fallacy) – two alternative statements are given as the only possible options when, in reality, there are more.[31]
  • False equivalence – describing two or more statements as virtually equal when they are not.
  • Slippery slope (thin edge of the wedge, camel's nose) – asserting that a proposed, relatively small, first action will inevitably lead to a chain of related events resulting in a significant and negative event and, therefore, should not be permitted.[43]
  • Special pleading – the arguer attempts to cite something as an exemption to a generally accepted rule or principle without justifying the exemption (e.g.: a defendant who murdered his parents asks for leniency because he is now an orphan).
  • Etc., etc., etc. 

Red herring fallacies

  • Ad hominem – attacking the arguer instead of the argument. (Note that "ad hominem" can also refer to the dialectical strategy of arguing on the basis of the opponent's own commitments. This type of ad hominem is not a fallacy.)
    • Circumstantial ad hominem – stating that the arguer's personal situation or perceived benefit from advancing a conclusion means that their conclusion is wrong.[70]
    • Poisoning the well – a subtype of ad hominem presenting adverse information about a target person with the intention of discrediting everything that the target person says.[71]
    • Appeal to motive – dismissing an idea by questioning the motives of its proposer.
    • Tone policing – focusing on emotion behind (or resulting from) a message rather than the message itself as a discrediting tactic.
    • Traitorous critic fallacy (ergo decedo, 'thus leave') – a critic's perceived affiliation is portrayed as the underlying reason for the criticism and the critic is asked to stay away from the issue altogether. Easily confused with the association fallacy ("guilt by association") below.
  • Appeal to authority (argument from authority, argumentum ad verecundiam) – an assertion is deemed true because of the position or authority of the person asserting it.[72][73]
  • Straw man fallacy – misrepresenting an opponent's argument by broadening or narrowing the scope of a premise and/or refuting a weaker version of their argument (e.g.: if someone says that killing animals is wrong because we are animals too, replying "It is not true that humans have no moral worth" would be a straw man, since they have not asserted that humans have no moral worth, but rather that the moral worth of animals and humans is equivalent).[105]
  • Texas sharpshooter fallacy – improperly asserting a cause to explain a cluster of data.[106] This fallacy is an informal fallacy which is committed when differences in data are ignored, but similarities are overemphasized. From this reasoning, a false conclusion is inferred.[1] This fallacy is the philosophical or rhetorical application of the multiple comparisons problem (in statistics) and apophenia (in cognitive psychology). It is related to the clustering illusion, which is the tendency in human cognition to interpret patterns where none actually exist. The name comes from a joke about a Texan who fires some gunshots at the side of a barn, then paints a target centered on the tightest cluster of hits and claims to be a sharpshooter.
  • Tu quoque ('you too' – appeal to hypocrisy, whataboutism) – stating that a position is false, wrong, or should be disregarded because its proponent fails to act consistently in accordance with it.[107]
  • Two wrongs make a right – assuming that, if one wrong is committed, another wrong will rectify it.

As one can see, there are a heck of a lot of ways to derail focused, principled debate into fluff, false beliefs, social discord, etc. Skilled trolls, professional propagandists and most hard-core ideologues are familiar with these tactics. Most people and interests that use dark free speech (~97%?) do so without hesitation or moral qualm. Even people who try to stay principled can engage in logic fallacies without being aware of it.


Given the way the human mind evolved to work, existing research evidence indicates that, relative to principled debate grounded in honest speech, dishonest debate grounded in DFS can be and often is more persuasive. In my opinion, reasonable-sounding DFS, as opposed to crackpottery like the trash QAnon spews, tends to be about 2- to 4-fold more effective in influencing public opinion. Being limited to facts, truths and sound reasoning forecloses a whole lot of rhetorical territory and tactics that can otherwise be used to describe real or fake facts, truths and reality.

Some logic fallacies were discussed here several times before, e.g., this chapter review.

One moral argument holds that leading people to decide and act based on DFS, false beliefs, misinformation, disinformation and the like deprives them of the power to decide and act based on truth and reality. A counter moral argument is that the ends justify the means, and thus lies, deceit and irrational emotional manipulation are morally justified. I consider the counter moral argument to be inherently anti-democratic and pro-authoritarian.


Questions: 
1. Is it reasonable to believe that DFS is more effective than honest speech in convincing people to believe things?

2. Since both DFS and honest speech are legal and constitutionally protected, are both morally equivalent?

Tuesday, July 21, 2020

The Great Logic Fallacy of the Science Deniers



The current issue of Scientific American has a short but interesting article by Naomi Oreskes, The False Logic of Science Denial. Climate science deniers dislike her for trying to debunk climate science denial. She points out that logic fallacies are common, and even scientists fall prey to them over things they really should know better about.[1] Oreskes writes:
"All this is to say that logical fallacies are everywhere and not always easily refuted. Truth, at least in science, is not self-evident. And this helps to explain why science denial is easy to generate and hard to slay. Today we live in a world where science denial, about everything from climate change to COVID-19, is rampant, informed by fallacies of all kinds. 
But there is a meta-fallacy—an über-fallacy if you will—that motivates these other, specific fallacies. It also explains why so many of the same people who reject the scientific evidence of anthropogenic climate change also question the evidence related to COVID-19. 
Given how common it is, it is remarkable that philosophers have failed to give it a formal name. But I think we can view it as a variety of what sociologists call implicatory denial. I interpret implicatory denial as taking this form: If P, then Q. But I don't like Q! Therefore, P must be wrong. This is the logic (or illogic) that underlies most science rejection. 
Climate change: I reject the suggestion that the “magic of the market” has failed and that we need government intervention to remedy the market failure. Evolutionary theory: I am offended by the suggestion that life is random and meaningless and that there is no God. COVID-19: I resent staying home, losing income or being told by the government what to do."

So there it is. Implicatory denial is much of the explanation[2] for why climate science is deniable. Deniers don't like the idea of human-caused climate change and/or the idea of government doing anything about it. Therefore, there is no climate change, for whatever reason(s) make that believable.

The same flawed logic applies to denying vaccine usefulness, COVID-19, or whatever other accepted science gets rejected.


Footnotes:
1. Oreskes argues that a common and "vexing" fallacy among scientists is this: If theory P is correct, then Q is predicted. An experiment is run to see if Q pops up, and it does. Therefore, theory P is true. This conclusion is based on a logic fallacy: Q could pop up for one or more reasons unrelated to P. The frequency of that fallacy led philosopher Karl Popper to argue that the method of science should be falsification. Popper's logic was that a theory cannot be proved true, because not every circumstance can be tested, but a single counterexample proves a theory false.

Oreskes asserts that Popper's theory was itself based on a logic flaw. Specifically, an experiment can fail for reasons unrelated to the theory being tested. Reasons for failure include insufficient sensitivity to detect the predicted effect, or analysis software that throws out real data points it was programmed to reject as spurious. She argues that there is no logical resolution to this, so scientists generally deal with it through consilience. That means looking to find or infer the explanation most consistent with evidence from a variety of sources. That approach looks at a problem from a variety of angles to see what holds up best.
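The two inference patterns in footnote 1 can be checked mechanically. Below is a minimal Python truth-table sketch (my illustration, not from Oreskes or the blog): it shows that affirming the consequent (if P then Q; Q; therefore P) has a counterexample row, while Popper-style modus tollens (if P then Q; not Q; therefore not P) has none.

```python
# Truth-table check of the two argument forms discussed in footnote 1.
from itertools import product

def implies(p, q):
    # Material conditional: "if p then q" is false only when p is true and q false.
    return (not p) or q

# Affirming the consequent: rows where both premises (P -> Q, Q) hold
# but the conclusion (P) fails. Any such row proves the form invalid.
ac_counterexamples = [
    (p, q) for p, q in product([True, False], repeat=2)
    if implies(p, q) and q and not p
]

# Modus tollens: rows where both premises (P -> Q, not Q) hold
# but the conclusion (not P) fails. No such row exists, so the form is valid.
mt_counterexamples = [
    (p, q) for p, q in product([True, False], repeat=2)
    if implies(p, q) and not q and p
]

print(ac_counterexamples)  # [(False, True)] -> affirming the consequent is invalid
print(mt_counterexamples)  # []              -> modus tollens is valid
```

The single counterexample row (P false, Q true) is exactly Oreskes' point: Q can pop up for reasons unrelated to P, so observing Q never proves P.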

2. The paper that Oreskes cites for implicatory denial, Sociological Explanations for Climate Change Denial, asserts that it is one of two common forms of science denial. The other is called interpretative denial, where the facts are accepted but interpreted so that they lead to a different (flawed) conclusion than the one unbiased people would usually come to.

Over the years, my personal experience has been that implicatory denial is more common than interpretative denial. But that's just anecdote, not solid data.



Tuesday, February 25, 2020

Logic Fallacies: Hypocrisy and Whataboutism

One thing I used to assert, when it seemed reasonable, was that politicians and other players were hypocrites for blatantly doing the same things, or worse variants, that they bitterly criticized their political opposition for doing.[1] By the time the president won the electoral college in 2016, political hypocrisy on the right was simply mind-boggling. What about hypocrisy on the left? It was still there, but it had not reached the quantity and quality of the hypocrisy the right routinely practiced right out in the open. There was, and still is, very little moral, political or social equivalence on this point between the left and right.

A logical fallacy is a reasoning mistake or error that makes an argument invalid. Logical fallacies are non-sequiturs, i.e., arguments where the conclusion doesn't follow logically from what preceded it. In essence, a logic fallacy is an invalid connection between a premise or premises (facts) and the conclusion, because the conclusion does not necessarily flow from the premises. Often the facts themselves are disputed as not being facts. The human mind did not evolve to do precise logic. People make various kinds of mistakes unless they are aware of the errors and consciously try to avoid them. Instead of using formal logic, humans usually rely on informal logic, which is probably best called reasoning.

One source says this about appeals to hypocrisy: “Tu Quoque [an appeal to hypocrisy] is a very common fallacy in which one attempts to defend oneself or another from criticism by turning the critique back against the accuser. This is a classic Red Herring since whether the accuser is guilty of the same, or a similar, wrong is irrelevant to the truth of the original charge. However, as a diversionary tactic, Tu Quoque can be very effective, since the accuser is put on the defensive, and frequently feels compelled to defend against the accusation.”

Defending oneself from a hypocrisy charge makes the rhetorical mistake called stepping into an opponent’s frame, as mentioned here yesterday. That's probably why charges of hypocrisy in politics are almost always ignored and not even denied. Even a short, simple denial steps into the opponent’s frame, thereby strengthening the opponent’s argument.


Is alleging hypocrisy a logic fallacy?
Whataboutism or hypocrisy is a fallacy sometimes based on the argument that since someone or some group did something bad in the past, doing it now is justified. Sometimes that is true and sometimes it isn’t. An appeal to hypocrisy is an informal fallacy that intends to discredit an opponent’s argument by asserting the opponent’s failure to act consistently in accord with its conclusion(s). The logic looks like this:

1. Person A makes claim X, e.g., the president claims Hillary Clinton was sloppy about national security for using an unsecured personal server for official government business.
2. Person B asserts that A's actions or past claims are inconsistent with the truth of claim X, e.g., a critic claims the president is sloppy about national security for using an unsecured cell phone for official government business.
3. Therefore, X is false.

A Wikipedia article asserts that the conclusion, X is false, is “a fallacy because the moral character or actions of the opponent are generally irrelevant to the logic of the argument. It is often used as a red herring tactic and is a special case of the ad hominem [personal attack] fallacy, which is a category of fallacies in which a claim or argument is rejected on the basis of facts about the person presenting or supporting the claim or argument.”[2]

Is that true? Sometimes it is, but sometimes it doesn’t seem to be. Why? Because the moral character or actions of the opponent are clearly relevant to both the facts and the logic of the argument. In the national security sloppiness example above, X is true because the underlying facts and logic apply to the same concern, i.e., national security sloppiness. In Clinton’s server case, she was sloppy and X is clearly true. In the president’s cell phone case, he is still being sloppy. This particular appeal to hypocrisy therefore does not seem to be a logic fallacy. It points out truth in two different situations.


Q: Does the foregoing analysis get it wrong? Is a charge of hypocrisy or whataboutism never logically sound because the underlying facts and logic always have to be evaluated independently?


Footnotes:
1. One example is the president criticizing the Clintons for having conflicts of interest due to their charity, while the president operates with conflicts of interest by continuing to profit from his for-profit businesses. The degree of the conflict the president is subject to is 100-fold to 1000-fold bigger financially than anything the Clinton charity ever constituted. Assuming the Clinton charity constituted an unacceptable conflict of interest, and it did, the situation for the president is far worse both qualitatively and quantitatively, but both situations constituted actual conflicts of interest.

Another example is how the GOP treated the impeachment of Bill Clinton for perjury (lying under oath) and an alleged obstruction of justice. The GOP enthusiastically pursued investigations into Clinton’s bad acts. By contrast, the GOP rejected and/or ignored evidence of obstruction of justice by the president, including blatant obstruction of congress during the impeachment inquiry. The GOP opposed any investigation by the House, Senate and the Department of Justice. The two situations are vastly different. Clinton’s bad acts constituted instances of bad judgment in lying under oath and immoral personal sexual behavior. On the other hand, the president’s bad acts go straight to corrupting governance and betraying the trust people put in him to be an honest politician while in office. The two situations are different but both still focus on differences in how evidence of bad acts is treated.

2. Wikipedia cites this as an example of the fallacy: “In the trial of Nazi war criminal Klaus Barbie, the controversial lawyer Jacques Vergès tried to present what was defined as a Tu Quoque Defence—i.e., that during the Algerian War, French officers such as General Jacques Massu had committed war crimes similar to those with which Barbie was being charged, and therefore the French state had no moral right to try Barbie. This defense was rejected by the court, which convicted Barbie.”

Sunday, August 2, 2020

SOMETHING TO THINK ABOUT

Thinking about thinking (without the BS)



The first logic class I ever took was in a philosophy course in college. And that’s part of the problem.
I don’t mean the problem with me, though there are many. I mean the problem with our politics, our civics, and just the way we get along (or don’t) right now. One way we could help with all that, believe it or not, would be to teach logic the way we teach math: Start early, keep at it, and make it required. I’ve taught logic to fourth graders, proof you don’t need a Ph.D. to share the basics and get kids in the habit of evaluating claims and thinking about their own thinking.
One deceptively simple definition of logic is "the study of correct reasoning, especially regarding making inferences."
Logic is about understanding what follows from something else, what must be true, given a certain premise. It’s about the leap from A to B, or in logic parlance, from p to q, as in “if p, then q.” Logic is what takes us from a premise, via inference, to a conclusion. Let’s say all cats have tails. In that universe, if it’s a cat, then it must have a tail. Get it? Of course you do.
But speaking of cute (we hope), imagine a toddler who lives with a cat and recently learned the word “kitty.” One day, the toddler is cruising around in the back of mom’s car and spots a fuzzy, four-legged animal. The toddler joyously points at this poodle and yells “Kitty! Kitty!” Mom smiles and chooses not to shatter the happy moment with a distracted-driving lecture on logical fallacies.
I, however, have no such qualms (sorry kid): This toddler, perhaps forgivably, assumed all cute fuzzy four-legged animals are “kitty.” That’s a common flaw in logic, a logical fallacy, and not just among toddlers; it’s often called hasty generalization or overgeneralization. And this type of fallacy and others are everywhere. They’re used, believed, repeated, broadcast, printed, and repeated some more, sometimes knowingly, sometimes unknowingly. Once you’re familiar with them, you see them everywhere, especially in election season. I’ll bet a beer and a biscuit that after reading the prime offenders below, you’ll notice them regularly between now and November, and maybe for the rest of your life (again, sorry, but you’re better off). So here are just seven of many deadly logic sins, a most-wanted list of tried-and-true, mass-misleading fallacies, simplified and combined for easy reading:
Fancy Latin name: ad hominem ("to the person")
Simple description: Attacking the person, not the argument or position.
Example: In a debate, Candidate A makes a policy recommendation. Opposing Candidate B says, “What do you know? You’re just a [insert any term seen as denigrating]!” Candidate B has certainly disparaged Candidate A but in no way addressed the policy suggestion. Fallacious fail.
A similarly invalid and unfair cousin of ad hominem is guilt by association. A more positive but equally fallacious relative is appeal to authority. (Seen any attack ads or endorsements lately?)
Fancy Latin name: post hoc ergo propter hoc ("after this therefore because of this")
Simpler science-y description: correlation is not causation.
So-simplified-it-actually-had-to-be-longer explanation: Just because event A precedes event B does not mean A caused B.
Example: In February of a U.S. president’s first term, the unemployment rate falls sharply. The president declares, “See! I’m a job-creating president!” In reality, it’s unlikely that the president — though his paddle is bigger than the average citizen’s — significantly changed the course of the supertanker that is the U.S. economy in one month. There are likely other reasons or causes for the improvement.
Yummier example: Crime rates rise as ice-cream consumption rises (that’s generally true, by the way). Fallacious reasoning: Clearly, ice cream is making people go insane with pleasure and commit crimes, plus ice-cream addicts are jacking people to get ice-cream money.
Actually, it’s just that ice-cream consumption and crime rates both tend to rise in summer. Along these lines, with the clear exception of my magic Boston Celtics socks, your lucky hat, lucky shoes, or — apologies to an AL.com Pulitzer-Prize-winning columnist — lucky fish before Alabama games did not cause your team to win. Unless, of course, you literally (and accurately) threw it in the opposing team’s faces at a key moment during a game.
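The ice-cream example can even be sketched in a few lines of code. The following Python snippet (a toy illustration with invented numbers, not real crime data) simulates a hidden common cause, summer heat, driving both ice-cream sales and crime, producing a strong correlation between two things that do not cause each other:

```python
# Toy simulation: a hidden common cause (heat) makes two unrelated
# variables (ice-cream sales and crime) correlate strongly.
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
temps = [random.uniform(30, 100) for _ in range(365)]   # daily temperature
ice_cream = [t + random.gauss(0, 10) for t in temps]    # driven by heat
crime = [t + random.gauss(0, 10) for t in temps]        # also driven by heat

print(round(pearson(ice_cream, crime), 2))  # strong positive correlation, zero causation
```

Dropping the shared temperature term from either variable collapses the correlation toward zero, which is the tell that the heat, not the ice cream, is doing the causal work.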
Fancy name: false dichotomy
Simple name: either-or thinking
Simple description: Simplistically presenting the complex, gray-area world as if there are only two choices.
Real example: After the 9-11 terror attacks, some political leaders said, in effect or exactly word-for-word, “If you’re not with us, you’re with the terrorists.” Uhm … actually, no. Someone can hate the terrorists and be against what you’re doing, too. Reality is not nearly as simple as your kindergarten-level portrayal. It almost never is. Advertising often relies on a false dichotomy, too: Use this product or you’re a chump. Again, no. I can avoid your product as if it’s a smelly guy with a bad cough and a machete and yet still not be a chump. Matter of fact, since you tried that fake, weak, fallacious Jedi mind-trick to try to capitalize on insecurity, using your product is what would actually make me a chump.
Simple name: straw man
Simple description: Distorting an opposing argument so you can more easily knock it down.
Example: Candidate A says, “Foreign aid often includes products that U.S. businesses make, then get paid for, and even so, it accounts for less than 1% of the national budget. I’m OK with keeping foreign aid expenditures where they are.”
Candidate B responds indignantly, knowing a loud show of emotion will be broadcast all over, "Why do you care more about foreigners than you do about U.S. citizens?!?"
That, of course, is not what Candidate A said, but it might soon be spread around the world.
By the way, this response also includes another type of logical fallacy, a non sequitur, Latin for “does not follow.” Fallacious panderers often get their money’s worth by using several fallacies simultaneously.
Simple name: overgeneralization
Simple description: Drawing a conclusion based on too little evidence.
Toddler example: See above how every cute fuzzy four-legged animal equals a kitty.
Adult but still cat-lover example: “My cat has a tail, and so does every other cat I’ve seen, so all cats have tails.” (Understandable, but wrong. See Manx cats, mutations, rocking chairs.)
One dangerous brand of overgeneralization is stereotyping, or unfairly attributing a quality to an entire group of people, like “all Asians are _____,” or “women are _____.” Stereotypes are sometimes positive, often negative, but always wrong when applied to specific, actual people. They’re also straightforward examples of how simplistic, sloppy thinking can hurt people.
Simple name or description: slippery slope
Simple description: You assume, without evidence, that one event will lead to other, often undesirable, events.
Real example: A well-known pundit in 2009 repeatedly said that allowing same-sex marriage could lead to humans marrying animals, including goats, ducks, dolphins and turtles. If it came down to it, I guess I’d choose a dolphin (I value intelligence and love to swim), but to my knowledge, there have been no hot-zones of inter-species matrimony since gay marriage became legal. Likewise, no matter your views on the subject, we can all agree that few if any human-turtle hybrids are walking around, which helps show the fallaciousness of that particular slippery-slope argument.
Simple names: false equivalence or false analogy.
Simple description: You assume things that are alike in one way are alike in other ways.
This fallacy is painfully common in politics and media perception. It’s even a crutch or a byproduct of overworked, lazy or otherwise compromised news producers: “I don’t care that 99.9% of the field is saying X! Get that bombastic suspiciously funded contrarian who’s saying Y in the studio and give him equal time — that’ll make for interesting (and misleading) TV!” Or, “You’re saying my political party is corrupt. So is yours!” Or, “You’re saying my news source is slanted. So is yours!” This reflexive both-side-ism appeals to our American egalitarianism. But facts aren’t egalitarian. As the heartless killer Marlo in “The Wire” explained, they’re one way, not another way. It’s highly unlikely that Political Party A and Political Party B commit identical transgressions and to an identical degree. It’s also highly unlikely that News Outlets C and D are biased, inaccurate, misleading or damaging in the same way, to the same degree, and to the same number of people.
These are some of the most common errors in logic that can mislead us even from true premises to false conclusions. But even airtight logic can bring us to false conclusions if a premise is false. Logic matters, and the facts it depends on matter, too.
Learning about logic, which is what joins facts into the web of how we understand the world, is one type of a valuable but rare endeavor: thinking about our own thinking. I know some of you would love to get that clueless uncle or gullible Facebook friend thinking, period, but thinking about our own thinking does improve thinking in general. It makes it less automatic, less reflexive, less taken for granted, and less impervious to the insane idea that we might be wrong. That’s crucial because, in addition to swimming in logical fallacies and purposeful misinformation, we’re all lugging around an unfortunate filter psychologists call “confirmation bias.” It’s one of the most important truths anyone can grasp: We all tend to accept evidence that supports what we already believe but dismiss evidence that would undercut our beliefs. Against that backdrop, and with skilled media manipulators and bias-boosting social-media algorithms at work, bad logic that seems like common sense is all the more seductive and misleading.
Carsen is a reporter and editor turned teacher who lives in Birmingham.
https://www.al.com/news/2020/08/thinking-about-thinking-without-the-bs.html

Friday, November 15, 2019

Chapter Review: Arguments and Logical Fallacies

This is a review of Chapter 10, Arguments and Logical Fallacies, of Steven Novella's 2018 book, The Skeptic’s Guide to the Universe: How to Know What’s Really Real in a World Increasingly Full of Fake. In this chapter, Novella marches through basic logic and the kinds of logical fallacies that people tend to rely on to support their beliefs. The fallacies are usually committed unconsciously. The content of this chapter seems timely in view of the completely contradictory facts and arguments the two sides in the impeachment inquiry are hurling at each other. Novella points out that, in situations like this, one or both sides can be mostly wrong, but both cannot be mostly right.

The point of chapter 10 is to try to lay out the skills needed for critical thinking, something that humans are usually not good at unless they try to be good at it. Novella asserts: “Arguing is something that everyone does but few understand. Yet arguing is an essential skill of critical thinking.” Fortunately, the understanding needed is easy to grasp. Unfortunately, it takes time and sustained effort to learn to apply it.

Basic terminology
Logical fallacy: A logical fallacy is a reasoning mistake that makes an argument invalid. All logical fallacies are non sequiturs, arguments in which the conclusion doesn't follow logically from what preceded it. Novella describes it like this: “A logical fallacy is an invalid connection between a premise and a conclusion, where the conclusion does not necessarily flow from the premise(s) but is argued as if it does.” The human mind did not evolve to do precise logic, and people make various kinds of mistakes unless they are aware of the errors and consciously try to avoid them. Instead of using formal logic, humans usually rely on informal logic.

Argument: An argument is what connects premises (facts) with conclusions (beliefs). Although people tend to see arguments as contests to be won and beliefs as positions to be defended, that isn't how Novella sees it. Instead, an argument is an effort to find reasoned truth, not to win points. To help find truth, people engaged in argument would do best to find as much common ground as possible and then carefully engage with their differences in belief.

Assertion: An assertion is a stated or argued premise or conclusion without supporting evidence.

Premise: A premise is an asserted fact that an argument is based on. These days, many, arguably most, political disagreements are pointless because the people arguing do not agree on the facts. There needs to be a logical connection showing that the premises necessarily lead to the conclusion. If there are sufficient true premises and the logic is valid (and thus the argument is “sound”), then the conclusion must be true. For completeness, a conclusion based on an unsound argument can be either true or false, e.g., all spheres are pretty, therefore the sun is a sphere.
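The notion of validity can be made concrete with a brute-force truth-table check. This short Python sketch (my own illustration, not from Novella's book) tests whether an argument form is valid, i.e., whether any assignment of truth values makes all the premises true while the conclusion is false:

```python
# Brute-force validity check for two-variable propositional arguments:
# an argument form is valid if no truth assignment makes every premise
# true while the conclusion is false.
from itertools import product

def valid(premises, conclusion):
    return all(
        conclusion(p, q)
        for p, q in product([True, False], repeat=2)
        if all(prem(p, q) for prem in premises)
    )

implies = lambda a, b: (not a) or b  # material implication

# Modus ponens: P implies Q; P; therefore Q.
print(valid([lambda p, q: implies(p, q), lambda p, q: p],
            lambda p, q: q))   # True (valid)

# Affirming the consequent: P implies Q; Q; therefore P.
print(valid([lambda p, q: implies(p, q), lambda p, q: q],
            lambda p, q: p))   # False (a non sequitur)
```

Note that validity alone is not soundness: modus ponens applied to a false premise still yields an unreliable conclusion, which is why checking premises, as the next section discusses, matters as much as checking logic.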

Novella makes an important point: “There is no way to objectively resolve a difference of opinion regarding aesthetics, for example.” Thus, to avoid bickering endlessly over inherently unresolvable disagreements, people can simply agree that the disagreement is unresolvable as a matter of aesthetics, moral choice and so forth. Inherent irresolvability appears to apply to many (most?) political disagreements where moral judgments are involved, e.g., what constitutes an impeachable act by a sitting president.[1]

Checking premises
Before beginning to engage in argument, people would do well to check their premises or facts. Four problems can occur: (1) the asserted facts or premises are simply wrong, (2) the asserted facts or premises are possibly wrong and not sufficiently verified, (3) a premise is hidden, e.g., evolution is false because there are not ‘enough’ transitional fossils, but the definition of transitional is different from the standard science definition, which makes the disagreement unresolvable, and (4) a premise is based on a subjective judgment, e.g., an information source is ‘reliable’ without an independent assessment, or a premise simply ‘feels’ correct to the person asserting it.

Logical fallacies
1. Non-sequitur: All logical fallacies are non-sequiturs: the conclusion doesn't necessarily follow from the premises. In giving his version of economic conditions in the US a few weeks ago, the president Tweeted: “Nobody has ever heard of numbers like that, so people want to find out: Why was it so corrupt during that election? And I want to find out more than anybody else.” Here, the non-sequitur was a false connection between the economy in October of 2019 and the 2016 election.

2. Argument from authority: Appeal to authority can be probative, but it needs to be used carefully. Some non-experts in climate science, like me, tend to point to expert consensus about global warming, the human role in it and options to reduce it. Consensus expert opinion does carry some legitimate weight, but sometimes consensus is wrong. Sometimes the appealed-to authority really isn't an expert. Sometimes the appealed-to expert is an expert in one field but not the one at issue. Either problem undermines the persuasive power of the appeal.

3. Post hoc fallacy: This is among the most common fallacies. The fallacy goes like this: Since event Y followed event X, event Y must have been caused by event X. This argument is common in defenses of alternative medicines: “I took the pills and then felt better, therefore the pills worked.” The erroneous assumption is that because of their different positions on a timeline, the first event caused the second event.

The president used a post hoc fallacy when he asserted: “Since my election, Ford, Fiat Chrysler, General Motors, Sprint, SoftBank, Lockheed, Intel, Walmart and many others have announced that they will invest billions of dollars in the United States and will create tens of thousands of new American jobs.” Fact checkers found that those business decisions were made before the president was elected and not due to his role as president.

4. Whataboutism (tu quoque): This fallacy argues that because someone or some group did something in the past, doing it now is justified. The president and his supporters sometimes defend his actions on the grounds that Democrats did the same thing. From my point of view, the whataboutism tactic appears to lead to a downward spiral in civility and social norms. For example, the president asserted: “I will release my tax returns — against my lawyer’s wishes — when [Hillary Clinton] releases her 33,000 emails that have been deleted.”

5. Ad hominem fallacy: This is an argument that attacks the opponent or their motivations instead of their arguments or conclusions. Asserting that an opponent is closed-minded is a common form of this attack. Novella asserts that people who accuse their opponents of being closed-minded tend to be “closed to the possibility that they are wrong.” That said, there are times when the person one is engaged with is in fact closed-minded.

6. Appeal to ignorance (proving a negative, ad ignorantiam): This is a fairly common fallacy based on the belief that something is true because it has not been shown to be false. Because proving a negative is often difficult or impossible, this fallacy can be hard to counter. For example, the president asserted the following to CNN about his election in 2016: “What PROOF do you have Donald Trump did not suffer from millions of FRAUD votes? Journalists? Do your job!” and “Pathetic – you have no sufficient evidence that Donald Trump did not suffer from voter fraud, shame! Bad reporter.”

7. False analogy: Two things that are similar in one way are falsely claimed to be similar in a different way. An example is the president's complaint about how he sees his treatment by Democrats: “All Republicans must remember what they are witnessing here — a lynching. But we will WIN!” The president is being investigated and criticized, but that is simply not the same as being lynched. The president's claim ignores the difference.

8. Slippery slope: This fallacy assumes that one action or policy will necessarily lead to other, worse outcomes. The mistake here is the belief that one action, e.g., a law that requires universal background checks for gun purchases, will lead inevitably to an extreme ultimate position, e.g., all guns in private hands will be taken away by force.

9. Straw man fallacy: Here, a person uses a weak version or caricature of an opponent's argument and then attacks that. The opponent may not even hold the asserted straw man position. Novella argues that critical thinking demands that the strongest version of an opponent’s argument should be assumed and addressed. Examples include assertions by the president that (1) Democrats “don’t mind executing babies AFTER birth” and (2) Democrats “have become the party of crime. [They] want to open our borders to a flood of deadly drugs and ruthless gangs [and] turn America into a giant sanctuary for criminal aliens and MS-13 thugs.”

The red herring fallacy is similar to the straw man, but it asserts a fact or premise that looks true but is either false or irrelevant. An example is the president’s Tweet two days after Attorney General Sessions recused himself from Justice Department investigations of Russian attacks on the 2016 election: “Terrible. Just found out that Obama had my ‘wires tapped’ in Trump Tower just before the victory.”

10. Tautology (begging the question): This fallacy relies on circular reasoning in which the premise assumes the conclusion. The argument is that since A = B, therefore A = B. The two statements of A = B tend to be worded differently, making them sometimes hard to spot. One example is the president’s argument that the impeachment inquiry is illegitimate because he did nothing wrong. Another example is expressed in a legal memo the president relies on in his own defense: “The President’s actions here, by virtue of his position as the chief law enforcement officer, could neither constitutionally nor legally constitute obstruction because that would amount to him obstructing himself.” That falsely argues the president cannot obstruct justice because the Justice Department works for him: since the president tells the DOJ what to do, the memo argues, any action he takes is leading justice, not obstructing it.

There are other fallacies, but these account for most of the common ones.

Footnote:
1. Pragmatic rationalism compared to arguments & logical fallacies: For people familiar with the pragmatic rationalism anti-ideology ideology argued here from time to time, its moral basis will probably jump right out as being in full accord with logic and what critical thinking requires. Specifically, the first two moral values are (i) a conscious effort to try to see facts with less bias or distortion, and (ii) a conscious effort to try to apply less biased conscious reason (arguments) to the facts that people think they see. The broad scope of disagreements that are not logically or objectively resolvable accords with the idea, asserted here many times, that the best people in civil, rational political disagreement can do is try to reach stasis, the point at which each side understands why they disagree. Based on disagreements in my experience, about 85% of disagreements arise from disagreements over facts.