


DP Etiquette

First rule: Don't be a jackass.

Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, please provide some sources for the facts and truths you rely on if you are asked for that. If emotion is getting out of hand, get it back in hand. To limit dehumanizing people, don't call people or whole groups of people disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion. Insult makes people angry and defensive. All points of view are welcome, right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.
Showing posts sorted by relevance for query: logic fallacies.

Saturday, October 9, 2021

Dark free speech tactics: Sealioning, Gish gallop and other popular deceit and manipulation tactics



“The mind is divided into parts, like a rider (controlled processes) on an elephant (automatic processes). The rider evolved to serve the elephant. .... Intuitions come first, strategic reasoning second. Therefore, if you want to change someone’s mind about a moral or political issue, talk to the elephant first. .... Republicans understand moral psychology. Democrats don’t. Republicans have long understood that the elephant is in charge of political behavior, not the rider, and they know how elephants work. Their slogans, political commercials and speeches go straight for the gut. .... Republicans don’t just aim to cause fear, as some Democrats charge. They trigger the full range of intuitions described by Moral Foundations Theory.” -- Psychologist Jonathan Haidt, The Righteous Mind: Why Good People Are Divided by Politics and Religion, 2012


Dark free speech (DFS): Constitutionally or legally protected (1) lies and deceit to distract, misinform, confuse, polarize and/or demoralize, (2) unwarranted opacity to hide inconvenient truths, facts and corruption (lies and deceit of omission), (3) unwarranted emotional manipulation (i) to obscure the truth and blind the mind to lies and deceit, and (ii) to provoke irrational, reason-killing emotions and feelings, including fear, hate, anger, disgust, distrust, intolerance, cynicism, pessimism and all kinds of bigotry including racism, and (4) ideologically-driven motivated reasoning and other ideologically-driven biases that unreasonably distort reality and reason. Germaine, ~2016 or thereabouts


There are lots of ways to engage in debate that can feel right and principled but in effect subvert principled, focused debate into far less rational and focused engagement. Provoking frustration, impatience and anger are common goals of these subverting rhetorical tactics. Logic fallacies are a common tactic of people who have to rely on weak or non-existent fact, truth and/or reasoning positions, e.g., the claim that the 2020 election was stolen. Denying, distorting or irrationally downplaying inconvenient facts and truths is also popular and usually present in some form in nearly all DFS.

Here is how some of these things are described.

Sealioning (also spelled sea-lioning and sea lioning) is a type of trolling or harassment that consists of pursuing people with persistent requests for evidence or repeated questions, while maintaining a pretense of civility and sincerity.[1][2][3][4] It may take the form of "incessant, bad-faith invitations to engage in debate".[5] The term originated with a 2014 strip of the webcomic Wondermark by David Malki.

The troll feigns ignorance and politeness, so that if the target is provoked into making an angry response, the troll can then act as the aggrieved party.[7][8] Sealioning can be performed by a single troll or by multiple ones acting in concert.[9] The technique of sealioning has been compared to the Gish gallop and metaphorically described as a denial-of-service attack targeted at human beings.[10]

An essay in the collection Perspectives on Harmful Speech Online, published by the Berkman Klein Center for Internet & Society at Harvard, noted:

Rhetorically, sealioning fuses persistent questioning—often about basic information, information easily found elsewhere, or unrelated or tangential points—with a loudly-insisted-upon commitment to reasonable debate. It disguises itself as a sincere attempt to learn and communicate. Sealioning thus works both to exhaust a target's patience, attention, and communicative effort, and to portray the target as unreasonable. While the questions of the "sea lion" may seem innocent, they're intended maliciously and have harmful consequences. — Amy Johnson, Berkman Klein Center for Internet & Society (May 2019) (emphasis added)

The Gish gallop is a rhetorical technique in which a debater attempts to overwhelm an opponent with an excessive number of arguments, without regard for the accuracy or strength of those arguments. The term was coined by Eugenie Scott, who named it after Duane Gish. Scott argued that Gish used the technique frequently when challenging the scientific fact of evolution.[1][2] It is similar to a method used in formal debate called spreading.

During a Gish gallop, a debater confronts an opponent with a rapid series of many specious arguments, half-truths, and misrepresentations in a short space of time, which makes it impossible for the opponent to refute all of them within the format of a formal debate.[3][4] In practice, each point raised by the "Gish galloper" takes considerably more time to refute or fact-check than it did to state in the first place.[5] The technique wastes an opponent's time and may cast doubt on the opponent's debating ability for an audience unfamiliar with the technique, especially if no independent fact-checking is involved[6] or if the audience has limited knowledge of the topics.
In the case of the Gish gallop, the dark free speech proponent can play on a person's ignorance to make arguments and asserted facts or truths seem at least plausible. It shifts the burden to the principled participant to fact check, which often takes more time and effort than is reasonable and is often frustrating. That tends to degrade the quality and social usefulness of the debate.


Whataboutism: Whataboutism or whataboutery (as in "what about…?") is a variant of the tu quoque logical fallacy, which attempts to discredit an opponent's position by charging hypocrisy without directly refuting or disproving the argument (Germaine: or without showing its relevance). 

Whataboutism is usually embedded in false narratives implied through irrelevant questions. When cornered, the whataboutist typically falls back on one of two strategies. One, claim "I'm just asking questions!" Two, claim "I can't prove it, but it sounds right!"


Wikipedia on false balance or bothsidesism: False balance, also bothsidesism, is a media bias in which journalists present an issue as being more balanced between opposing viewpoints than the evidence supports. Journalists may present evidence and arguments out of proportion to the actual evidence for each side, or may omit information that would establish one side's claims as baseless. False balance has been cited as a cause of misinformation.[1]

False balance is a bias, which usually stems from an attempt to avoid bias, and gives unsupported or dubious positions an illusion of respectability. It creates a public perception that some issues are scientifically contentious, though in reality they aren't, therefore creating doubt about the scientific state of research, and can be exploited by interest groups such as corporations like the fossil fuel industry or the tobacco industry, or ideologically motivated activists such as vaccination opponents or creationists.[2]

Examples of false balance in reporting on science issues include the topics of man-made versus natural climate change, the health effects of tobacco, the alleged relation between thiomersal and autism,[3] and evolution versus intelligent design.

A fallacy is reasoning that is logically incorrect, undermines the logical validity of an argument, or is recognized as unsound. All forms of human communication can contain fallacies.

Because of their variety, fallacies are challenging to classify. They can be classified by their structure (formal fallacies) or content (informal fallacies). Informal fallacies, the larger group, may then be subdivided into categories such as improper presumption, faulty generalization, error in assigning causation and relevance, among others.

The use of fallacies is common when the speaker's goal of achieving common agreement is more important to them than utilizing sound reasoning. When fallacies are used, the premise should be recognized as not well-grounded, the conclusion as unproven (but not necessarily false), and the argument as unsound.

Informal fallacies

Informal fallacies – arguments that are logically unsound for lack of well-grounded premises.[14]
  • Argument to moderation (false compromise, middle ground, fallacy of the mean, argumentum ad temperantiam) – assuming that a compromise between two positions is always correct.[15]
  • Continuum fallacy (fallacy of the beard, line-drawing fallacy, sorites fallacy, fallacy of the heap, bald man fallacy, decision-point fallacy) – improperly rejecting a claim for being imprecise.[16]
  • Correlative-based fallacies
    • Suppressed correlative – a correlative is redefined so that one alternative is made impossible (e.g., "I'm not fat because I'm thinner than John.").[17]
  • Definist fallacy – defining a term used in an argument in a biased manner (e.g., using "loaded terms"). The person making the argument expects that the listener will accept the provided definition, making the argument difficult to refute.[18]
  • Divine fallacy (argument from incredulity) – arguing that, because something is so incredible or amazing, it must be the result of superior, divine, alien or paranormal agency.[19]
  • Double counting – counting events or occurrences more than once in probabilistic reasoning, which leads to the sum of the probabilities of all cases exceeding unity.
  • Equivocation – using a term with more than one meaning in a statement without specifying which meaning is intended.[20]
    • Ambiguous middle term – using a middle term with multiple meanings.[21]
    • Definitional retreat – changing the meaning of a word when an objection is raised.[22] Often paired with moving the goalposts (see below), as when an argument is challenged using a common definition of a term in the argument, and the arguer presents a different definition of the term and thereby demands different evidence to debunk the argument.
    • Motte-and-bailey fallacy – conflating two positions with similar properties, one modest and easy to defend (the "motte") and one more controversial (the "bailey").[23] The arguer first states the controversial position, but when challenged, states that they are advancing the modest position.[24][25]
    • Fallacy of accent – changing the meaning of a statement by not specifying on which word emphasis falls.
    • Persuasive definition – purporting to use the "true" or "commonly accepted" meaning of a term while, in reality, using an uncommon or altered definition.
    • (cf. the if-by-whiskey fallacy)
  • Ecological fallacy – inferring about the nature of an entity based solely upon aggregate statistics collected for the group to which that entity belongs.[26]
  • Etymological fallacy – assuming that the original or historical meaning of a word or phrase is necessarily similar to its actual present-day usage.[27]
  • Fallacy of composition – assuming that something true of part of a whole must also be true of the whole.[28]
  • Fallacy of division – assuming that something true of a composite thing must also be true of all or some of its parts.[29]
  • False attribution – appealing to an irrelevant, unqualified, unidentified, biased or fabricated source in support of an argument.
  • False authority (single authority) – using an expert of dubious credentials or using only one opinion to promote a product or idea. Related to the appeal to authority.
  • False dilemma (false dichotomy, fallacy of bifurcation, black-or-white fallacy) – two alternative statements are given as the only possible options when, in reality, there are more.[31]
  • False equivalence – describing two or more statements as virtually equal when they are not.
  • Slippery slope (thin edge of the wedge, camel's nose) – asserting that a proposed, relatively small, first action will inevitably lead to a chain of related events resulting in a significant and negative event and, therefore, should not be permitted.[43]
  • Special pleading – the arguer attempts to cite something as an exemption to a generally accepted rule or principle without justifying the exemption (e.g.: a defendant who murdered his parents asks for leniency because he is now an orphan).
  • Etc., etc., etc. 

Red herring fallacies

  • Ad hominem – attacking the arguer instead of the argument. (Note that "ad hominem" can also refer to the dialectical strategy of arguing on the basis of the opponent's own commitments. This type of ad hominem is not a fallacy.)
    • Circumstantial ad hominem – stating that the arguer's personal situation or perceived benefit from advancing a conclusion means that their conclusion is wrong.[70]
    • Poisoning the well – a subtype of ad hominem presenting adverse information about a target person with the intention of discrediting everything that the target person says.[71]
    • Appeal to motive – dismissing an idea by questioning the motives of its proposer.
    • Tone policing – focusing on emotion behind (or resulting from) a message rather than the message itself as a discrediting tactic.
    • Traitorous critic fallacy (ergo decedo, 'thus leave') – a critic's perceived affiliation is portrayed as the underlying reason for the criticism and the critic is asked to stay away from the issue altogether. Easily confused with the association fallacy ("guilt by association") below.
  • Appeal to authority (argument from authority, argumentum ad verecundiam) – an assertion is deemed true because of the position or authority of the person asserting it.[72][73]
  • Straw man fallacy – misrepresenting an opponent's argument by broadening or narrowing the scope of a premise and/or refuting a weaker version of their argument (e.g.: if someone says that killing animals is wrong because we are animals too, replying "It is not true that humans have no moral worth" would be a straw man, since they have not asserted that humans have no moral worth, only that the moral worth of animals and humans is equivalent).[105]
  • Texas sharpshooter fallacy – improperly asserting a cause to explain a cluster of data.[106] This fallacy is an informal fallacy which is committed when differences in data are ignored, but similarities are overemphasized. From this reasoning, a false conclusion is inferred.[1] This fallacy is the philosophical or rhetorical application of the multiple comparisons problem (in statistics) and apophenia (in cognitive psychology). It is related to the clustering illusion, which is the tendency in human cognition to interpret patterns where none actually exist. The name comes from a joke about a Texan who fires some gunshots at the side of a barn, then paints a target centered on the tightest cluster of hits and claims to be a sharpshooter.
  • Tu quoque ('you too' – appeal to hypocrisy, whataboutism) – stating that a position is false, wrong, or should be disregarded because its proponent fails to act consistently in accordance with it.[107]
  • Two wrongs make a right – assuming that, if one wrong is committed, another wrong will rectify it.

As one can see, there are a heck of a lot of ways to derail focused, principled debate into fluff, false beliefs, social discord, etc. Skilled trolls, professional propagandists and most hard core ideologues are familiar with these tactics. Most people and interests that use dark free speech (~97%?) do so without hesitation or moral qualm. Even people who try to stay principled can engage in logic fallacies without being aware of it.


Given the way the human mind evolved to work, existing research evidence indicates that relative to principled debate grounded in honest speech, dishonest debate grounded in DFS can be and often is more persuasive. In my opinion, reasonable sounding DFS, usually not crackpottery like the trash that QAnon spews, tends to be about 2-4-fold more effective in influencing public opinion. Being limited to facts, true truths and sound reasoning forecloses a whole lot of rhetorical territory and tactics that can be used to describe real or fake facts, truths and reality. 

Some logic fallacies were discussed here several times before, e.g., this chapter review.

One moral argument holds that feeding people DFS, false beliefs, misinformation, disinformation and the like deprives them of the power to decide and act based on truth and reality. A counter moral argument is that the ends justify the means, and thus lies, deceit and irrational emotional manipulation are morally justified. I consider the counter moral argument to be inherently anti-democratic and pro-authoritarian.


Questions: 
1. Is it reasonable to believe that DFS is more effective than honest speech in convincing people to believe things?

2. Since both DFS and honest speech are legal and constitutionally protected, are both morally equivalent?

Tuesday, July 21, 2020

The Great Logic Fallacy of the Science Deniers



The current issue of Scientific American has a short but interesting article by Naomi Oreskes, The False Logic of Science Denial. Climate science deniers dislike her for trying to debunk climate science denial. She points out that logic fallacies are common, and that even scientists fall prey to them in areas where they really should know better.[1] Oreskes writes:
"All this is to say that logical fallacies are everywhere and not always easily refuted. Truth, at least in science, is not self-evident. And this helps to explain why science denial is easy to generate and hard to slay. Today we live in a world where science denial, about everything from climate change to COVID-19, is rampant, informed by fallacies of all kinds. 
But there is a meta-fallacy—an über-fallacy if you will—that motivates these other, specific fallacies. It also explains why so many of the same people who reject the scientific evidence of anthropogenic climate change also question the evidence related to COVID-19. 
Given how common it is, it is remarkable that philosophers have failed to give it a formal name. But I think we can view it as a variety of what sociologists call implicatory denial. I interpret implicatory denial as taking this form: If P, then Q. But I don't like Q! Therefore, P must be wrong. This is the logic (or illogic) that underlies most science rejection. 
Climate change: I reject the suggestion that the “magic of the market” has failed and that we need government intervention to remedy the market failure. Evolutionary theory: I am offended by the suggestion that life is random and meaningless and that there is no God. COVID-19: I resent staying home, losing income or being told by the government what to do."

So there it is. Implicatory denial is much of the explanation[2] for why climate science gets denied. Deniers don't like the idea of human-caused climate change and/or the idea of government doing anything about it. Therefore, climate change must not be real, for whatever reason(s) make that believable.
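To make the gap in that reasoning explicit, here is a minimal formalization of the pattern (my own rendering, not Oreskes', assuming a LaTeX setup with amsmath loaded). The form mimics modus tollens, but a preference about Q is substituted for evidence that Q is false, so nothing follows about P.

```latex
% Implicatory denial: looks like modus tollens, but the second premise is
% "I don't like Q", not "Q is false", so the inference to not-P fails.
\[
  \frac{P \rightarrow Q \qquad \text{``I don't like } Q\text{''}}{\therefore\ \lnot P}
  \qquad \text{(invalid)}
\]

% The valid form (modus tollens) requires showing that Q is actually false:
\[
  \frac{P \rightarrow Q \qquad \lnot Q}{\therefore\ \lnot P}
\]
```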

The same flawed logic applies to denying vaccine usefulness, COVID-19, or whatever other accepted science gets rejected.


Footnotes:
1. Oreskes argues that a common and "vexing" fallacy among scientists is this: If theory P is correct, then Q is predicted. An experiment is run to see if Q pops up, and it does. Therefore, theory P is true. That conclusion is based on a logic fallacy: Q could pop up for one or more reasons unrelated to P. The frequency of that fallacy led philosopher Karl Popper to argue that the method of science should be falsification. Popper's logic was that a theory cannot be proved true, because not every circumstance can be tested, but a single counterexample proves a theory false.
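For readers who want the footnote's two inference patterns side by side, here is a minimal sketch (my own rendering, again assuming amsmath): observing the predicted Q cannot establish P, while a genuine failure to observe Q does refute P, which is the logic behind falsification.

```latex
% Affirming the consequent (invalid): Q can occur for reasons unrelated to P,
% so observing Q does not prove theory P.
\[
  \frac{P \rightarrow Q \qquad Q}{\therefore\ P}
  \qquad \text{(invalid)}
\]
% Popper's falsification instead relies on the valid modus tollens pattern shown
% earlier: P -> Q together with not-Q yields not-P, provided the failure to
% observe Q is not an artifact of the experiment itself (Oreskes' caveat below).
```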

Oreskes asserts that Popper's theory was itself based on a logic flaw. Specifically, an experiment can fail for reasons unrelated to the theory being tested. Reasons for failure include insufficient sensitivity to detect the predicted effect, or analysis software that throws out real data points it is programmed to reject as spurious. She argues that there is no logical resolution to this, so scientists generally deal with it through consilience. That means looking for, or inferring, the explanation that is most consistent with evidence from a variety of sources. That approach looks at a problem from a variety of angles to see what holds up best.

2. The paper that Oreskes cites for implicatory denial, Sociological Explanations for Climate Change Denial, asserts that it is one of two common forms of science denial. The other is called interpretative denial, where the facts are accepted but are interpreted to lead to a different (flawed) conclusion than unbiased people would usually come to.

Over the years, my personal experience has been that implicatory denial is more common than interpretative denial. But that's just anecdote, not solid data.



Tuesday, February 25, 2020

Logic Fallacies: Hypocrisy and Whataboutism

One thing I used to assert, when it seemed reasonable, was that politicians and other players were hypocrites for blatantly doing the same things, or worse variants of them, that they bitterly criticized their political opposition for doing.[1] By the time the president won the electoral college in 2016, political hypocrisy on the right was simply mind-boggling. What about hypocrisy on the left? It was still there, but it had not reached the quantity and quality of hypocrisy the right routinely practiced right out in the open. There was and still is very little moral, political or social equivalence on this point between the left and right.

A logical fallacy is a reasoning mistake or error that makes an argument invalid. Logical fallacies are non-sequiturs, i.e., arguments where the conclusion doesn't follow logically from what preceded it. In essence, a logic fallacy is an invalid connection between one or more premises (facts) and the conclusion, because the conclusion does not necessarily flow from the premises. Often the asserted facts are disputed as not being facts at all. The human mind did not evolve to do precise logic. People make various kinds of mistakes unless they are aware of the errors and consciously try to avoid them. Instead of using formal logic, humans usually rely on informal logic, which is probably best called reasoning.

One source says this about appeals to hypocrisy: “Tu Quoque [an appeal to hypocrisy] is a very common fallacy in which one attempts to defend oneself or another from criticism by turning the critique back against the accuser. This is a classic Red Herring since whether the accuser is guilty of the same, or a similar, wrong is irrelevant to the truth of the original charge. However, as a diversionary tactic, Tu Quoque can be very effective, since the accuser is put on the defensive, and frequently feels compelled to defend against the accusation.”

Defending oneself from a hypocrisy charge makes the rhetorical mistake called stepping into an opponent’s frame, as mentioned here yesterday. That's probably why charges of hypocrisy in politics are almost always ignored and not even denied. Even a short, simple denial steps into the opponent’s frame, thereby strengthening the opponent’s argument.


Is alleging hypocrisy a logic fallacy?
Whataboutism or hypocrisy is a fallacy sometimes based on the argument that since someone or some group did something bad in the past, doing it now is justified. Sometimes that is true and sometimes it isn’t. An appeal to hypocrisy is an informal fallacy that intends to discredit an opponent’s argument by asserting the opponent’s failure to act consistently in accord with its conclusion(s). The logic looks like this:

1. Person A makes claim X, e.g., the president claims Hillary Clinton was sloppy about national security for using an unsecured personal server for official government business.
2. Person B asserts that A's actions or past claims are inconsistent with the truth of claim X, e.g., a critic claims the president is sloppy about national security for using an unsecured cell phone for official government business.
3. Therefore, X is false.

A Wikipedia article asserts that the conclusion, X is false, is “a fallacy because the moral character or actions of the opponent are generally irrelevant to the logic of the argument. It is often used as a red herring tactic and is a special case of the ad hominem [personal attack] fallacy, which is a category of fallacies in which a claim or argument is rejected on the basis of facts about the person presenting or supporting the claim or argument.”[2]

Is that true? Sometimes it is, but sometimes it doesn’t seem to be. Why? Because the moral character or actions of the opponent are clearly relevant to both the facts and the logic of the argument. In the national security sloppiness example above, X is true because the underlying facts and logic apply to the same concern, i.e., national security sloppiness. In Clinton’s server case, she was sloppy and X is clearly true. In the president’s cell phone case, he is still being sloppy. This particular appeal to hypocrisy therefore does not seem to constitute a logic fallacy. It points out truth in two different situations.
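One way to state the point at issue precisely (my own framing, not taken from the cited article): the fallacy, when there is one, lives in the inference to "X is false." The hypocrisy premise can be perfectly true, and it does support a separate conclusion about the accuser's consistency, which is what the server/cell phone example illustrates.

```latex
% Tu quoque as a fallacy: the premises do not entail that X is false.
\[
  \frac{A \text{ asserts } X \qquad A \text{ acts inconsistently with } X}
       {\therefore\ \lnot X}
  \qquad \text{(invalid)}
\]

% What the same premises do support is a different, often true, conclusion:
\[
  \therefore\ A \text{ is being hypocritical about } X
\]
```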


Q: Does the foregoing analysis get it wrong? Is a charge of hypocrisy or whataboutism never logically sound because the underlying facts and logic always have to be evaluated independently?


Footnotes:
1. One example is the president criticizing the Clintons for having conflicts of interest due to their charity, while the president operates with conflicts of interest by continuing to profit from his for-profit businesses. The degree of the conflict the president is subject to is 100-fold to 1000-fold bigger financially than anything the Clinton charity ever constituted. Assuming the Clinton charity constituted an unacceptable conflict of interest, and it did, the situation for the president is far worse both qualitatively and quantitatively, but both situations constituted actual conflicts of interest.

Another example is how the GOP treated the impeachment of Bill Clinton for perjury (lying under oath) and an alleged obstruction of justice. The GOP enthusiastically pursued investigations into Clinton’s bad acts. By contrast, the GOP rejected and/or ignored evidence of obstruction of justice by the president, including blatant obstruction of congress during the impeachment inquiry. The GOP opposed any investigation by the House, Senate and the Department of Justice. The two situations are vastly different. Clinton’s bad acts constituted instances of bad judgment in lying under oath and immoral personal sexual behavior. On the other hand, the president’s bad acts go straight to corrupting governance and betraying the trust people put in him to be an honest politician while in office. The two situations are different but both still focus on differences in how evidence of bad acts is treated.

2. Wikipedia cites this as an example of the fallacy: “In the trial of Nazi war criminal Klaus Barbie, the controversial lawyer Jacques Vergès tried to present what was defined as a Tu Quoque Defence—i.e., that during the Algerian War, French officers such as General Jacques Massu had committed war crimes similar to those with which Barbie was being charged, and therefore the French state had no moral right to try Barbie. This defense was rejected by the court, which convicted Barbie.”

Thursday, February 6, 2020

Facts About Trump and His Presidency

Now that the impeachment is over, the president can get to the important stuff, like firing the people who testified in the House hearings or said dumb things like, yeah there was a quid pro quo you fool, get over it. One can also expect the stream of lies, unwarranted emotional manipulation and incoherence to continue with vigor and enthusiasm. In other words, fake reality is going to get even faker. It's time for some epistemology about reality and important related whatnot. Let’s start with deeply confused concepts like facts, truths and logic as opposed to lies, deceit, BS, untruths and motivated reasoning, a/k/a flawed logic. Pictures are always popular, so here’s one:



An epistemological definition of knowledge


We can reasonably expect that the rhetoric flowing from the president, his charming minions and many or most of his rank and file supporters will be some combination of lies of commission and/or omission, BS, false beliefs and emotional manipulation. Most of that, say about 95%, will not be within the scope of the truth or knowledge circles.

In a recent exchange with a Trump supporter, I pointed out some of what I saw as the negative aspects of the president and what he has done so far. I mentioned things like lots of added federal Trump debt, a society torn apart and full of unfounded hate, rage and bigotry, a stock market that reflects the irrational exuberance the trickle up of wealth to the top has fomented, an economy based on wanton pollution, deep corruption and/or disregard for the massive damage to democracy and the rule of law, the rise of some sort of kleptocratic tyranny, and disregard for America's fading infrastructure.

In response, the Trump supporter offered things that all fell outside the truth and knowledge circles. For example, the supporter asserted the falsehoods that (i) America today is just as torn apart as it was under Obama, and (ii) race relations have improved dramatically under Trump. Other whoppers raised in defense of Trump were false assertions that (i) income gains under Trump have favored the poor twice as much as they have favored the rich, (ii) EPA rules under Trump are the same as under Obama, (iii) the corruption comes from the Dems in the bogus impeachment and the Deep State trying to oust Trump, (iv) Mueller exonerated Trump, and (v) the only kleptocracy is from Obama and the Bidens, i.e., “the entire family.” Also a whopper was the assertion that Obama was responsible for America’s bad infrastructure. All of those fake reality assertions can be shown to be false and/or logical fallacies.

I was flabbergasted and discombobulated at the sheer incorrectness of the false assertions and the incoherence of whatever logic was rummaging around in that poor person's confused mind. Nonetheless, this is what passes for facts, truths and logic in Trumplandia. I just didn't want to spend the time looking all the sources up to show that confused mind all the fact, truth and logic errors its beliefs were based on. That takes a lot of time. It also unfairly shifts the burden of proof to minds better grounded in facts, truths and logic.


THE EPIPHANY!!
Then, out of nowhere, both of Germaine’s neurons fired at the same time and an idea was hatched. My thinking was something like this: “Self, this is nuts. I can't just keep going out to look for the evidence that shows Trumplandia is a construct built on fake facts, alt-reality and deranged logic. I will put all the evidence together in one happy post so that when some Trumplandia pops up, I can just provide the link to this OP with links to real facts, reality and logic.”

And that, gentle reader, is what this OP starts to do. I'll add to this OP over time to have a fact, truth and logic basis for responding to at least the fairly common Trumplandia blither, numb nuttery and Tomfoolery that is raised in defense of the president.

Request for help: If anyone has a link to reliable info that would help build a fact, truth and logic compendium for the whole world to refer to, feel free to put it in a comment. Also, if I get something wrong, let me know and I'll fix it. If you have a topic that should be included, let me know. At least for now, this is organized by source of Trumplandia nuttery, e.g., deceit and lies or social division and discord. Over time this could get to be pretty long, but whatever. As the philosopher Popeye says, I yam what I yam and that’s all that I yam.


THE TRUMPLANDIA ANTIDOTE COMPENDIUM

Deceit and lying to the public

Facts: As of Dec. 10, his 1,055th day in office, Trump had made 15,413 false or misleading claims, according to the Fact Checker’s database that analyzes, categorizes and tracks every suspect statement he has uttered. That’s an average of more than 32 claims a day since our last update 62 days ago. 12/16/19

The Fact Checker has evaluated false statements President Trump has made repeatedly and analyzed how often he reiterates them. The claims included here – which we're calling "Bottomless Pinocchios" – are limited to ones that he has repeated 20 times and were rated as Three or Four Pinocchios by the Fact Checker. 1/19/20

President Trump’s State of the Union speech once again was chock-full of stretched facts and dubious figures. Many of these claims have been fact-checked repeatedly, yet the president persists in using them. Here, in the order in which he made them, are 31 statements by the president. 2/4/20

WASHINGTON (AP) — Exacting swift punishment against those who crossed him, an emboldened President Donald Trump ousted two government officials who had delivered damaging testimony against him during his impeachment hearings. The president took retribution just two days after his acquittal by the Senate. First came news Friday that Trump had ousted Lt. Col. Alexander Vindman, the decorated soldier and national security aide who played a central role in the Democrats’ impeachment case. Vindman’s lawyer said his client was escorted out of the White House complex Friday, told to leave in retaliation for ‘telling the truth.’ ‘The truth has cost Lt. Col. Alexander Vindman his job, his career, and his privacy,’ attorney David Pressman said in a statement. Vindman’s twin brother, Lt. Col. Yevgeny Vindman, also was asked to leave his job as a White House lawyer on Friday, the Army said in a statement. Both men were reassigned to the Army. 2/6/20


Trump's affinity for lies was known before the November 2016 election


Logic, truths & opinions-beliefs: Not only does the president make thousands of false or misleading statements, he refuses to correct his statements when the facts and logic are pointed out. From that, one can logically conclude that the president is a chronic liar. He cannot logically be given the benefit of any doubt that his statements are just mistakes in view of the sheer number of them, and his refusal to correct any of them. There is nothing mistaken in refusal to correct mistakes. Because he fires people around him who tell truths he does not want told, logic also supports a belief that the president has no concern over inconvenient facts, truths and logic. He simply ignores, denies or distorts whatever he wants that he judges to be inconvenient for himself. In view of the facts, one can reasonably conclude that there is no objective basis to believe anything the president says about himself, his private business dealings or his actions or rhetoric while in office. His determined efforts to hide truths, e.g., his tax returns, add to the evidence that the president is hostile to inconvenient facts, truths and logic, most likely because he has a lot to hide.

Trumplandia defenses: Common defenses are to (i) simply deny that the president lies, (ii) claim he doesn’t lie more than other politicians (as if that justifies any elected politician’s lies), (iii) claim the fact checkers are liars, socialists and/or Democrats out to smear the president, or (iv) claim that his false statements are just exaggerations, not lies. The facts contradict all of those assertions. There is no significant evidence that undermines the objective evidence of the president's false and misleading statements. Exaggerations, often a mix of fact and lies or untruths, are intended to deceive or mislead, and can thus logically be considered lies.


Social division and discord
Facts: Social relations: The 2018 Presidential Greatness Survey of experts in presidents and presidential politics ranked President Trump (1) last in terms of greatness, and (2) highest in terms of being polarizing and divisive. Jan. 2018

Three years ago, Pew Research Center found that the 2016 presidential campaign was “unfolding against a backdrop of intense partisan division and animosity.” Today, the level of division and animosity – including negative sentiments among partisans toward the members of the opposing party – has only deepened. 10/10/19

Race relations: An overwhelming majority of black voters — 85 percent — said in a new Hill-HarrisX poll that they would choose any Democratic presidential candidate over President Trump. The survey, which was released on Monday, found this sentiment to be particularly true among black voters along partisan lines. 10/7/19

Even before President Donald Trump’s racist tweets toward four Democratic congresswomen of color, Americans considered race relations in the United States to be generally bad — and said that Trump has been making them worse. .... And Americans think Trump is contributing to the problem. A Pew Research Center poll earlier this year showed 56% of Americans saying Trump has made race relations worse. 7/16/19

Logic, truths & opinions: The fact evidence supports a logical conclusion that the president has deepened social and racial division rather than reduced it.


Trumplandia defenses: The president’s supporters claim that social and race relations have improved under Trump, reversing an alleged trend set by the divisive racist Barack Obama. Poll data does show that social and racial divisions increased under both Bush and Obama, with the trend continuing under the polarizing Trump.






Health care malarkey
Facts:


Logic, truths & opinions:


Trumplandia defenses: 


Attacks on science
Facts: https://www.nytimes.com/2019/11/07/climate/trump-alabama-sharpie-hurricane.html
https://www.nytimes.com/2019/06/08/climate/rod-schoonover-testimony.html
https://www.nytimes.com/2017/10/22/climate/epa-scientists.html


Logic, truths & opinions:


Trumplandia defenses: 




Wealth trickle up
Facts: 



Logic, truths & opinions:


Trumplandia defenses: 






Booming economy
Facts: 



Logic, truths & opinions:


Trumplandia defenses: 






Pollution & environment
Facts: A New York Times analysis, based on research from Harvard Law School, Columbia Law School and other sources, counts more than 90 environmental rules and regulations rolled back under Mr. Trump. Our list represents two types of policy changes: rules that were officially reversed and rollbacks still in progress. .... In some cases, the administration has failed to provide a strong legal argument in favor of proposed changes and agencies have skipped key steps in the rulemaking process, like notifying the public and asking for comment. In several cases, courts have ordered agencies to enforce their own rules. updated 12/21/19


WASHINGTON — The Trump administration is preparing to significantly limit the scientific and medical research that the government can use to determine public health regulations, overriding protests from scientists and physicians who say the new rule would undermine the scientific underpinnings of government policymaking. .... “This means the E.P.A. can justify rolling back rules or failing to update rules based on the best information to protect public health and the environment, which means more dirty air and more premature deaths,” said Paul Billings, senior vice president for advocacy at the American Lung Association. .... When gathering data for their research, known as the Six Cities study, scientists signed confidentiality agreements to track the private medical and occupational histories of more than 22,000 people in six cities. They combined that personal data with home air-quality data to study the link between chronic exposure to air pollution and mortality. .... But the fossil fuel industry and some Republican lawmakers have long criticized the analysis and a similar study by the American Cancer Society, saying the underlying data sets of both were never made public, preventing independent analysis of the conclusions. 11/11/19


Logic, truths & opinions:


Trumplandia defenses: 



The Mueller Report - Obstruction of Justice
Facts: The key question is how Robert Mueller and his team assessed the three elements “common to most of the relevant statutes” relating to obstruction of justice: an obstructive act, a nexus between the act and an official proceeding, and corrupt intent. As Mueller describes, the special counsel’s office “gathered evidence … relevant to the elements of those crimes and analyzed them within an elements framework—while refraining from reaching ultimate conclusions about whether crimes were committed,” because of the Office of Legal Counsel (OLC)’s guidelines against the indictment of a sitting president. 4/21/19







Logic, truths & opinions:



Trumplandia defenses: 






Federal debt
Facts: x




Logic, truths & opinions:



Trumplandia defenses: 







Sunday, August 2, 2020

SOMETHING TO THINK ABOUT

Thinking about thinking (without the BS)



The first logic class I ever took was in a philosophy course in college. And that’s part of the problem.
I don’t mean the problem with me, though there are many. I mean the problem with our politics, our civics, and just the way we get along (or don’t) right now. One way we could help with all that, believe it or not, would be to teach logic the way we teach math: Start early, keep at it, and make it required. I’ve taught logic to fourth graders, proof you don’t need a Ph.D. to share the basics and get kids in the habit of evaluating claims and thinking about their own thinking.
One deceptively simple definition of logic is "the study of correct reasoning, especially regarding making inferences."
Logic is about understanding what follows from something else, what must be true, given a certain premise. It’s about the leap from A to B, or in logic parlance, from p to q, as in “if p, then q.” Logic is what takes us from a premise, via inference, to a conclusion. Let’s say all cats have tails. In that universe, if it’s a cat, then it must have a tail. Get it? Of course you do.
But speaking of cute (we hope), imagine a toddler who lives with a cat and recently learned the word “kitty.” One day, the toddler is cruising around in the back of mom’s car and spots a fuzzy, four-legged animal. The toddler joyously points at this poodle and yells “Kitty! Kitty!” Mom smiles and chooses not to shatter the happy moment with a distracted-driving lecture on logical fallacies.
I, however, have no such qualms (sorry kid): This toddler, perhaps forgivably, assumed all cute fuzzy four-legged animals are “kitty.” That’s a common flaw in logic, a logical fallacy, and not just among toddlers; it’s often called hasty generalization or overgeneralization. And this type of fallacy and others are everywhere. They’re used, believed, repeated, broadcast, printed, and repeated some more, sometimes knowingly, sometimes unknowingly. Once you’re familiar with them, you see them everywhere, especially in election season. I’ll bet a beer and a biscuit that after reading the prime offenders below, you’ll notice them regularly between now and November, and maybe for the rest of your life (again, sorry, but you’re better off). So here are just seven of many deadly logic sins, a most-wanted list of tried-and-true, mass-misleading fallacies, simplified and combined for easy reading:
Fancy Latin name: ad hominem ("to the person")
Simple description: Attacking the person, not the argument or position.
Example: In a debate, Candidate A makes a policy recommendation. Opposing Candidate B says, “What do you know? You’re just a [insert any term seen as denigrating]!” Candidate B has certainly disparaged Candidate A but in no way addressed the policy suggestion. Fallacious fail.
A similarly invalid and unfair cousin of ad hominem is guilt by association. A more positive but equally fallacious relative is appeal to authority. (Seen any attack ads or endorsements lately?)
Fancy Latin name: post hoc ergo propter hoc ("after this therefore because of this")
Simpler science-y description: correlation is not causation.
So-simplified-it-actually-had-to-be-longer explanation: Just because event A precedes event B does not mean A caused B.
Example: In February of a U.S. president’s first term, the unemployment rate falls sharply. The president declares, “See! I’m a job-creating president!” In reality, it’s unlikely that the president — though his paddle is bigger than the average citizen’s — significantly changed the course of the supertanker that is the U.S. economy in one month. There are likely other reasons or causes for the improvement.
Yummier example: Crime rates rise as ice-cream consumption rises (that’s generally true, by the way). Fallacious reasoning: Clearly, ice cream is making people go insane with pleasure and commit crimes, plus ice-cream addicts are jacking people to get ice-cream money.
Actually, it’s just that ice-cream consumption and crime rates both tend to rise in summer. Along these lines, with the clear exception of my magic Boston Celtics socks, your lucky hat, lucky shoes, or — apologies to an AL.com Pulitzer-Prize-winning columnist — lucky fish before Alabama games did not cause your team to win. Unless, of course, you literally (and accurately) threw it in the opposing team’s faces at a key moment during a game.
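The ice-cream example is easy to see in a toy simulation. The sketch below is my addition, not the columnist's, and it assumes NumPy is available: hypothetical temperature data drives both ice-cream sales and crime, producing a strong correlation that disappears once temperature is controlled for.

```python
import numpy as np

# Hypothetical data: summer heat (the confounder) drives both series.
rng = np.random.default_rng(0)
temperature = rng.uniform(0, 35, size=500)                   # daily highs, deg C
ice_cream = 2.0 * temperature + rng.normal(0, 8, size=500)   # sales track heat
crime = 1.5 * temperature + rng.normal(0, 8, size=500)       # crime tracks heat too
# Note: ice_cream never appears in the formula that generates crime.

# Raw correlation is strongly positive anyway.
r = np.corrcoef(ice_cream, crime)[0, 1]
print(f"correlation(ice cream, crime) = {r:.2f}")

# Regress temperature out of both series; the leftover correlation is near zero.
resid_ice = ice_cream - np.poly1d(np.polyfit(temperature, ice_cream, 1))(temperature)
resid_crime = crime - np.poly1d(np.polyfit(temperature, crime, 1))(temperature)
print(f"after controlling for temperature = {np.corrcoef(resid_ice, resid_crime)[0, 1]:.2f}")
```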
Fancy name: false dichotomy
Simple name: either-or thinking
Simple description: Simplistically presenting the complex, gray-area world as if there are only two choices.
Real example: After the 9-11 terror attacks, some political leaders said, in effect or exactly word-for-word, “If you’re not with us, you’re with the terrorists.” Uhm … actually, no. Someone can hate the terrorists and be against what you’re doing, too. Reality is not nearly as simple as your kindergarten-level portrayal. It almost never is. Advertising often relies on a false dichotomy, too: Use this product or you’re a chump. Again, no. I can avoid your product as if it’s a smelly guy with a bad cough and a machete and yet still not be a chump. Matter of fact, since you tried that fake, weak, fallacious Jedi mind-trick to try to capitalize on insecurity, using your product is what would actually make me a chump.
Simple name: straw man
Simple description: Distorting an opposing argument so you can more easily knock it down.
Example: Candidate A says, “Foreign aid often includes products that U.S. businesses make, then get paid for, and even so, it accounts for less than 1% of the national budget. I’m OK with keeping foreign aid expenditures where they are.”
Candidate B responds indignantly, knowing a loud show of emotion will be broadcast all over, "Why do you care more about foreigners than you do about U.S. citizens?!?"
That, of course, is not what Candidate A said, but it might soon be spread around the world.
By the way, this response also includes another type of logical fallacy, a non sequitur, Latin for “does not follow.” Fallacious panderers often get their money’s worth by using several fallacies simultaneously.
Simple name: overgeneralization
Simple description: Drawing a conclusion based on too little evidence.
Toddler example: See above how every cute fuzzy four-legged animal equals a kitty.
Adult but still cat-lover example: “My cat has a tail, and so does every other cat I’ve seen, so all cats have tails.” (Understandable, but wrong. See Manx cats, mutations, rocking chairs.)
One dangerous brand of overgeneralization is stereotyping, or unfairly attributing a quality to an entire group of people, like “all Asians are _____,” or “women are _____.” Stereotypes are sometimes positive, often negative, but always wrong with specific, actual people. They’re also straightforward examples of how simplistic, sloppy thinking can hurt people.
Simple name or description: slippery slope
Simple description: You assume, without evidence, that one event will lead to other, often undesirable, events.
Real example: A well-known pundit in 2009 repeatedly said that allowing same-sex marriage could lead to humans marrying animals, including goats, ducks, dolphins and turtles. If it came down to it, I guess I’d choose a dolphin (I value intelligence and love to swim), but to my knowledge, there have been no hot-zones of inter-species matrimony since gay marriage became legal. Likewise, no matter your views on the subject, we can all agree that few if any human-turtle hybrids are walking around, which helps show the fallaciousness of that particular slippery-slope argument.
Simple names: false equivalence or false analogy.
Simple description: You assume things that are alike in one way are alike in other ways.
This fallacy is painfully common in politics and media perception. It’s even a crutch or a byproduct of overworked, lazy or otherwise compromised news producers: “I don’t care that 99.9% of the field is saying X! Get that bombastic suspiciously funded contrarian who’s saying Y in the studio and give him equal time — that’ll make for interesting (and misleading) TV!” Or, “You’re saying my political party is corrupt. So is yours!” Or, “You’re saying my news source is slanted. So is yours!” This reflexive both-side-ism appeals to our American egalitarianism. But facts aren’t egalitarian. As the heartless killer Marlo in “The Wire” explained, they’re one way, not another way. It’s highly unlikely that Political Party A and Political Party B commit identical transgressions and to an identical degree. It’s also highly unlikely that News Outlets C and D are biased, inaccurate, misleading or damaging in the same way, to the same degree, and to the same number of people.
These are some of the most common errors in logic that can mislead us even from true premises to false conclusions. But even airtight logic can bring us to false conclusions if a premise is false. Logic matters, and the facts it depends on matter, too.
Learning about logic, which is what joins facts into the web of how we understand the world, is one type of a valuable but rare endeavor: thinking about our own thinking. I know some of you would love to get that clueless uncle or gullible Facebook friend thinking, period, but thinking about our own thinking does improve thinking in general. It makes it less automatic, less reflexive, less taken for granted, and less impervious to the insane idea that we might be wrong. That’s crucial because, in addition to swimming in logical fallacies and purposeful misinformation, we’re all lugging around an unfortunate filter psychologists call “confirmation bias.” It’s one of the most important truths anyone can grasp: We all tend to accept evidence that supports what we already believe but dismiss what would undercut our beliefs. Given that backdrop, skilled media manipulators, and bias-boosting social-media algorithms, bad logic that seems like common sense is all the more seductive and misleading.
Carsen is a reporter and editor turned teacher who lives in Birmingham.
https://www.al.com/news/2020/08/thinking-about-thinking-without-the-bs.html