Dark free speech (DFS): Constitutionally or legally protected (1) lies and deceit to distract, misinform, confuse, polarize and/or demoralize, (2) unwarranted opacity to hide inconvenient truths, facts and corruption (lies and deceit of omission), (3) unwarranted emotional manipulation (i) to obscure the truth and blind the mind to lies and deceit, and (ii) to provoke irrational, reason-killing emotions and feelings, including fear, hate, anger, disgust, distrust, intolerance, cynicism, pessimism and all kinds of bigotry including racism, and (4) ideologically-driven motivated reasoning and other ideologically-driven biases that unreasonably distort reality and reason. Germaine, ~2016 or thereabouts
There are many ways to engage in debate that can feel right and principled but in effect subvert principled, focused debate into far less rational or focused engagements. Provoking frustration, impatience and anger are common goals of subverting rhetorical tactics. Logic fallacies are a common tactic of people who have to rely on weak or non-existent fact, truth and/or reasoning positions, e.g., the claim that the 2020 election was stolen. Denying, distorting or irrationally downplaying inconvenient facts and truths is also popular and usually present in some form in nearly all DFS.
Here is how some of these things are described.
Sealioning (also spelled sea-lioning and sea lioning) is a type of trolling or harassment that consists of pursuing people with persistent requests for evidence or repeated questions, while maintaining a pretense of civility and sincerity.[1][2][3][4] It may take the form of "incessant, bad-faith invitations to engage in debate".[5] The term originated with a 2014 strip of the webcomic Wondermark by David Malki.
The troll feigns ignorance and politeness, so that if the target is provoked into making an angry response, the troll can then act as the aggrieved party.[7][8] Sealioning can be performed by a single troll or by multiple ones acting in concert.[9] The technique of sealioning has been compared to the Gish gallop and metaphorically described as a denial-of-service attack targeted at human beings.[10]
An essay in the collection Perspectives on Harmful Speech Online, published by the Berkman Klein Center for Internet & Society at Harvard, noted: Rhetorically, sealioning fuses persistent questioning—often about basic information, information easily found elsewhere, or unrelated or tangential points—with a loudly-insisted-upon commitment to reasonable debate. It disguises itself as a sincere attempt to learn and communicate. Sealioning thus works both to exhaust a target's patience, attention, and communicative effort, and to portray the target as unreasonable. While the questions of the "sea lion" may seem innocent, they're intended maliciously and have harmful consequences. — Amy Johnson, Berkman Klein Center for Internet & Society (May 2019) (emphasis added)
The Gish gallop is a rhetorical technique in which a debater attempts to overwhelm an opponent with an excessive number of arguments, without regard for the accuracy or strength of those arguments. The term was coined by Eugenie Scott, who named it after Duane Gish. Scott argued that Gish used the technique frequently when challenging the scientific fact of evolution.[1][2] It is similar to a method used in formal debate called spreading.

During a Gish gallop, a debater confronts an opponent with a rapid series of many specious arguments, half-truths, and misrepresentations in a short space of time, which makes it impossible for the opponent to refute all of them within the format of a formal debate.[3][4] In practice, each point raised by the "Gish galloper" takes considerably more time to refute or fact-check than it did to state in the first place.[5] The technique wastes an opponent's time and may cast doubt on the opponent's debating ability for an audience unfamiliar with the technique, especially if no independent fact-checking is involved[6] or if the audience has limited knowledge of the topics.
In the case of the Gish gallop, the dark free speech proponent can play on a person's ignorance to make arguments and asserted facts or truths seem at least plausible. It shifts the burden to the principled participant to fact check, which often takes more time and effort than is reasonable and is often frustrating, which tends to degrade the quality and social usefulness of the debate.
Whataboutism: Whataboutism or whataboutery (as in "what about…?") is a variant of the tu quoque logical fallacy, which attempts to discredit an opponent's position by charging hypocrisy without directly refuting or disproving the argument (Germaine: or without showing its relevance).
Whataboutism is usually embedded in false narratives implied through irrelevant questions. When cornered, there are two typical strategies. One, claim "I'm just asking questions!" Two, claim "I can't prove it, but it sounds right!"
Wikipedia on false balance or bothsidesism: False balance, also bothsidesism, is a media bias in which journalists present an issue as being more balanced between opposing viewpoints than the evidence supports. Journalists may present evidence and arguments out of proportion to the actual evidence for each side, or may omit information that would establish one side's claims as baseless. False balance has been cited as a cause of misinformation.[1]
False balance is a bias, which usually stems from an attempt to avoid bias, and gives unsupported or dubious positions an illusion of respectability. It creates a public perception that some issues are scientifically contentious, though in reality they aren't, therefore creating doubt about the scientific state of research, and can be exploited by interest groups such as corporations like the fossil fuel industry or the tobacco industry, or ideologically motivated activists such as vaccination opponents or creationists.[2]
Examples of false balance in reporting on science issues include the topics of man-made versus natural climate change, the health effects of tobacco, the alleged relation between thiomersal and autism,[3] and evolution versus intelligent design.
A fallacy is reasoning that is logically incorrect, undermines the logical validity of an argument, or is recognized as unsound. All forms of human communication can contain fallacies.
Because of their variety, fallacies are challenging to classify. They can be classified by their structure (formal fallacies) or content (informal fallacies). Informal fallacies, the larger group, may then be subdivided into categories such as improper presumption, faulty generalization, error in assigning causation and relevance, among others.
The use of fallacies is common when the speaker's goal of achieving common agreement is more important to them than utilizing sound reasoning. When fallacies are used, the premise should be recognized as not well-grounded, the conclusion as unproven (but not necessarily false), and the argument as unsound.

Informal fallacies
Informal fallacies – arguments that are logically unsound for lack of well-grounded premises.[14]
- Argument to moderation (false compromise, middle ground, fallacy of the mean, argumentum ad temperantiam) – assuming that a compromise between two positions is always correct.[15]
- Continuum fallacy (fallacy of the beard, line-drawing fallacy, sorites fallacy, fallacy of the heap, bald man fallacy, decision-point fallacy) – improperly rejecting a claim for being imprecise.[16]
- Correlative-based fallacies
- Suppressed correlative – a correlative is redefined so that one alternative is made impossible (e.g., "I'm not fat because I'm thinner than John.").[17]
- Definist fallacy – defining a term used in an argument in a biased manner (e.g., using "loaded terms"). The person making the argument expects that the listener will accept the provided definition, making the argument difficult to refute.[18]
- Divine fallacy (argument from incredulity) – arguing that, because something is so incredible or amazing, it must be the result of superior, divine, alien or paranormal agency.[19]
- Double counting – counting events or occurrences more than once in probabilistic reasoning, which leads to the sum of the probabilities of all cases exceeding unity.
- Equivocation – using a term with more than one meaning in a statement without specifying which meaning is intended.[20]
- Ambiguous middle term – using a middle term with multiple meanings.[21]
- Definitional retreat – changing the meaning of a word when an objection is raised.[22] Often paired with moving the goalposts (see below), as when an argument is challenged using a common definition of a term in the argument, and the arguer presents a different definition of the term and thereby demands different evidence to debunk the argument.
- Motte-and-bailey fallacy – conflating two positions with similar properties, one modest and easy to defend (the "motte") and one more controversial (the "bailey").[23] The arguer first states the controversial position, but when challenged, states that they are advancing the modest position.[24][25]
- Fallacy of accent – changing the meaning of a statement by not specifying on which word emphasis falls.
- Persuasive definition – purporting to use the "true" or "commonly accepted" meaning of a term while, in reality, using an uncommon or altered definition.
- Ecological fallacy – inferring about the nature of an entity based solely upon aggregate statistics collected for the group to which that entity belongs.[26]
- Etymological fallacy – assuming that the original or historical meaning of a word or phrase is necessarily similar to its actual present-day usage.[27]
- Fallacy of composition – assuming that something true of part of a whole must also be true of the whole.[28]
- Fallacy of division – assuming that something true of a composite thing must also be true of all or some of its parts.[29]
- False attribution – appealing to an irrelevant, unqualified, unidentified, biased or fabricated source in support of an argument.
- Fallacy of quoting out of context (contextotomy, contextomy; quotation mining) – selective excerpting of words from their original context to distort the intended meaning.[30]
- False authority (single authority) – using an expert of dubious credentials or using only one opinion to promote a product or idea. Related to the appeal to authority.
- False dilemma (false dichotomy, fallacy of bifurcation, black-or-white fallacy) – two alternative statements are given as the only possible options when, in reality, there are more.[31]
- False equivalence – describing two or more statements as virtually equal when they are not.
- Slippery slope (thin edge of the wedge, camel's nose) – asserting that a proposed, relatively small, first action will inevitably lead to a chain of related events resulting in a significant and negative event and, therefore, should not be permitted.[43]
- Special pleading – the arguer attempts to cite something as an exemption to a generally accepted rule or principle without justifying the exemption (e.g.: a defendant who murdered his parents asks for leniency because he is now an orphan).
- Etc., etc., etc.
Red herring fallacies
- Ad hominem – attacking the arguer instead of the argument. (Note that "ad hominem" can also refer to the dialectical strategy of arguing on the basis of the opponent's own commitments. This type of ad hominem is not a fallacy.)
- Circumstantial ad hominem – stating that the arguer's personal situation or perceived benefit from advancing a conclusion means that their conclusion is wrong.[70]
- Poisoning the well – a subtype of ad hominem presenting adverse information about a target person with the intention of discrediting everything that the target person says.[71]
- Appeal to motive – dismissing an idea by questioning the motives of its proposer.
- Tone policing – focusing on emotion behind (or resulting from) a message rather than the message itself as a discrediting tactic.
- Traitorous critic fallacy (ergo decedo, 'thus leave') – a critic's perceived affiliation is portrayed as the underlying reason for the criticism and the critic is asked to stay away from the issue altogether. Easily confused with the association fallacy ("guilt by association") below.
- Appeal to authority (argument from authority, argumentum ad verecundiam) – an assertion is deemed true because of the position or authority of the person asserting it.[72][73]
- Straw man fallacy – misrepresenting an opponent's argument by broadening or narrowing the scope of a premise and/or refuting a weaker version of their argument (e.g., if someone says that killing animals is wrong because we are animals too, replying "It is not true that humans have no moral worth" would be a straw man, since they did not assert that humans have no moral worth, but rather that the moral worth of animals and humans is equivalent).[105]
- Texas sharpshooter fallacy – improperly asserting a cause to explain a cluster of data.[106] This fallacy is an informal fallacy which is committed when differences in data are ignored, but similarities are overemphasized. From this reasoning, a false conclusion is inferred.[1] This fallacy is the philosophical or rhetorical application of the multiple comparisons problem (in statistics) and apophenia (in cognitive psychology). It is related to the clustering illusion, which is the tendency in human cognition to interpret patterns where none actually exist. The name comes from a joke about a Texan who fires some gunshots at the side of a barn, then paints a target centered on the tightest cluster of hits and claims to be a sharpshooter.
- Tu quoque ('you too' – appeal to hypocrisy, whataboutism) – stating that a position is false, wrong, or should be disregarded because its proponent fails to act consistently in accordance with it.[107]
- Two wrongs make a right – assuming that, if one wrong is committed, another wrong will rectify it.
As one can see, there are a heck of a lot of ways to derail focused, principled debate into fluff, false beliefs, social discord, etc. Skilled trolls, professional propagandists and most hard core ideologues are familiar with these tactics. Most people and interests that use dark free speech (~97%?) do so without hesitation or moral qualm. Even people who try to stay principled can engage in logic fallacies without being aware of it.
Given the way the human mind evolved to work, existing research evidence indicates that, relative to principled debate grounded in honest speech, dishonest debate grounded in DFS can be and often is more persuasive. In my opinion, reasonable-sounding DFS (as opposed to crackpottery like the trash QAnon spews) tends to be about 2- to 4-fold more effective in influencing public opinion. Being limited to facts, true truths and sound reasoning forecloses a whole lot of rhetorical territory and tactics that can be used to portray real or fake facts, truths and reality.
Some logic fallacies were discussed here several times before, e.g., this chapter review.
One moral argument holds that DFS, false beliefs, misinformation, disinformation and the like deprive people who decide and act based on them of the power to decide and act based on truth and reality. A counter moral argument is that the ends justify the means, and thus lies, deceit and irrational emotional manipulation are morally justified. I consider the counter argument to be inherently anti-democratic and pro-authoritarian.
Questions:
1. Is it reasonable to believe that DFS is more effective than honest speech in convincing people to believe things?
2. Since both DFS and honest speech are legal and constitutionally protected, are they morally equivalent?