Friday, June 19, 2020

Book Review: The Misinformation Age

Introductory comments
The authors of this book rely mostly on computer modeling to suggest means to combat dark free speech or propaganda. Their modeling analyses fit well with past events where false beliefs have arisen, usually spread by propagandists working for specific economic interests that (i) need to combat or negate scientific knowledge that threatens their profits, or (ii) want to create a false story that is not supported by evidence. Surprisingly to me, their conclusions usually match the beliefs I have arrived at by studying cognitive and social science and through years of personal experience in dealing with politics. Since my current beliefs were not formed by computer modeling of how information flows in social groups or networks, I consider the book to be an example of consilience, the phenomenon where evidence from independent, unrelated sources converges on the same conclusion, usually (but not always) making it more likely to be true.

Probably because this book fits so well with my own beliefs, I highly recommend it, along with Sissela Bok's 1999 book, Lying: Moral Choice in Public and Private Life (a chapter review is here). Both are short, non-technical, easy-to-read books. Together they give people a solid handle on (1) how America managed to sink this far in terms of detachment from reality, rationality, and honesty in politics and political discourse, and (2) the undeniable, profound immorality behind what got us here. I found the two books to be perfectly compatible: The Misinformation Age is weak on the relevant moral concerns, but Bok covers them extremely well.

Because there is so much great material in this book, I will probably write one or more separate reviews of the four chapters The Misinformation Age contains. This review is kept general and emphasizes the main themes: how propaganda arises and persists, and the steps that can be taken to blunt it to some extent.


Book review
The 2019 book, The Misinformation Age: How False Beliefs Spread, was written by Cailin O’Connor and James Weatherall (O&W), professors of logic and philosophy of science at the University of California, Irvine. The book is fairly short (186 pages) and easy to read for a general non-science audience. It focuses mainly on how information flows in social networks and how that flow can be corrupted by both propagandists and well-meaning scientists. The book does not spend much time on human psychology or other factors such as personal biases or morals; those topics are relevant and important but have been covered in other books. The key point the book tries to convey is that social factors are central to understanding how information spreads, especially false information.
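To give a feel for the kind of modeling involved, here is a minimal sketch of a Bala-Goyal-style network epistemology model, the general family of models O&W build on. This is my own illustration, not the authors' code; the network structure, agent count, trial size, effect size, and Bayesian update rule are all simplifying assumptions chosen for readability.

```python
import random

# Minimal sketch of a Bala-Goyal-style network epistemology model,
# in the spirit of the models O&W describe (details are my assumptions).
# Agents choose between action A (known success rate 0.5) and action B
# (true success rate 0.5 + EPSILON, initially uncertain). Each round,
# agents who currently favor B test it, and everyone updates on the
# results produced by their network neighbors.

N_AGENTS = 10
TRIALS = 20          # Bernoulli trials per experimenting agent per round
EPSILON = 0.05       # B really is slightly better than A
ROUNDS = 200

# Complete network: every agent sees every other agent's results.
neighbors = {i: [j for j in range(N_AGENTS) if j != i] for i in range(N_AGENTS)}

# Each agent's credence that B is the better action.
credence = [random.random() for _ in range(N_AGENTS)]

def likelihood_ratio(successes, trials):
    """Bayes factor for 'B is better' vs. 'B is no better' given trial data."""
    p_good, p_bad = 0.5 + EPSILON, 0.5
    return (p_good ** successes * (1 - p_good) ** (trials - successes)) / \
           (p_bad ** successes * (1 - p_bad) ** (trials - successes))

for _ in range(ROUNDS):
    # Agents who think B is better (credence > 0.5) run experiments on B.
    results = {}
    for i in range(N_AGENTS):
        if credence[i] > 0.5:
            results[i] = sum(random.random() < 0.5 + EPSILON for _ in range(TRIALS))
    # Everyone updates on their own and their neighbors' evidence via Bayes' rule.
    for i in range(N_AGENTS):
        for j in [i] + neighbors[i]:
            if j in results:
                lr = likelihood_ratio(results[j], TRIALS)
                c = credence[i]
                credence[i] = c * lr / (c * lr + (1 - c))

# Most runs converge near 1.0 (true belief), but some runs lock in a
# false consensus near 0.0 -- the phenomenon O&W study.
print([round(c, 3) for c in credence])
```

Even this toy version shows the book's central point: whether the community reaches the truth depends on social structure and the flow of shared evidence, not just on individual rationality.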

Not surprisingly, the role that social media plays in the spread and persistence of misinformation is a theme that runs through the book. In essence, social media supplies the networks that propagandists and special interests now routinely rely on to spread their messages and influence public opinion and policy. O&W describe in detail five or six situations where an important truth was subverted for a period of time, sometimes decades. These include detailed discussions of how:
1. the tobacco industry created false doubt about, and stalled regulation of, the harmful health effects of cigarettes;
2. the chemical industry created false doubt about, and stalled regulation of, the damage that chemicals (CFCs) do to the ozone layer;
3. the electrical utility industry created false doubt about, and stalled regulation of, the harmful effects of acid rain from coal-fired power plants; and
4. industries of all kinds created false doubt about, and stalled regulation of, the harmful effects of carbon dioxide produced by burning fossil fuels (this doubt and profound industry resistance to truth still exist today and still block meaningful policy action).

Common industry tactics to derail the spread of truth to both society and policymakers include (i) creating doubt about how solid the data are and advocating delay for more data and analysis before regulations are put in place, (ii) manufacturing controversy by recruiting respected experts to advocate the false realities industry wants people to believe, (iii) attacking scientists as politically motivated, which distracts attention from data that cannot easily be attacked directly, and (iv) funding scientists who are inclined to believe the industry's version of reality.

Regarding the false ozone propaganda, O&W write:
“Industry advocates [e.g., DuPont] urging a wait-and-see approach to CFCs into the late 1980s and beyond were right that the evidence linking CFCs to ozone depletion was not definitive. It still isn't. We cannot be absolutely certain about the ozone hole, about whether CFCs caused it, or even about whether ozone is essential for protecting human health. The reason we cannot be certain is that all the evidence we have for this claim is inductive -- and as Hume taught us, inductive evidence cannot produce certainty.” 
The issue O&W refer to here is how propagandists literally question what truth is, a line of thinking that goes back to the Greek skeptics a few millennia ago, who held that perhaps nothing can be truly known. From what I can tell, this exploits a human trait that evolution left us: it is incredibly easy to create doubt about things that are true but unfamiliar to most people. That happens all the time with complicated science-related issues, including climate change and vaccines.[1] Because no one can know everything, including scientists, we have no choice but to rely on the knowledge of others. That means we are susceptible to misinformation and cannot avoid it. O&W write:
“When we open channels for social communication, we immediately face a trade-off. .... Most of us get our false beliefs from the same places we get our true ones, and if we want the good stuff, we risk getting the bad as well.”


Suggested defense tactics
O&W suggest tactics that can reduce the ability of false information to spread and persist. One is to somehow get social media sites to change their algorithms so that insular groups are exposed to content from other sources. In essence, this is a proposal to weaken the silos or echo chambers in which people never encounter contrary information or ideas. Another tactic is to recognize that some beliefs aren't worth supporting, or even allowing, on private platforms such as social media. O&W comment:
“.... we should stop thinking that the ‘marketplace of ideas’ can effectively sort fact from fiction. .... Unfortunately, this marketplace is a fiction, and a dangerous one. We do not want to limit free speech, but we do want to strongly advocate that those in positions of power or influence see their speech for what it is -- an exercise of power capable of doing real harm. It is irresponsible to advocate for unsupported views, and doing so needs to be thought of as a moral wrong, not just a harmless addition to some kind of ideal ‘marketplace.’”
A third suggestion is to change the way science is done and reported. One proposal is to increase the statistical power of research protocols, which typically means increasing the size of test groups. For statistical reasons, propagandists can find support for false beliefs more easily in small studies than in large ones: larger studies give the ‘wrong’ result less often. Exploiting that statistical noise is a common propagandist tactic for sowing doubt about truth. Among members of the public there is a widespread false belief that a single study establishes truth; in most cases it does not, and multiple studies are needed. By widely publicizing the occasional false result that even independent, honest scientists generate through ordinary statistical variability, propagandists can make a few negative results appear to outweigh the majority of positive ones. A rough simulation of the effect is sketched below.
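The statistical point can be illustrated with a short simulation. This is my own hedged example, not from the book; the 55% “true” success rate, the baseline, and the study sizes are assumptions chosen to make the effect visible.

```python
import random

# Hedged illustration (my example, not the authors'): why small studies
# hand propagandists more 'wrong' results to cherry-pick. We assume a
# treatment with a real 55% success rate vs. a 50% baseline and count
# how often a study of a given size points the wrong way.

TRUE_RATE = 0.55   # assumed real effect
BASELINE = 0.50
N_STUDIES = 2_000  # simulated studies per size

def wrong_result_rate(study_size):
    """Fraction of simulated studies whose observed success rate falls at
    or below the baseline -- studies a propagandist could cite as
    'evidence' that the treatment does nothing."""
    wrong = 0
    for _ in range(N_STUDIES):
        successes = sum(random.random() < TRUE_RATE for _ in range(study_size))
        if successes / study_size <= BASELINE:
            wrong += 1
    return wrong / N_STUDIES

for size in (20, 100, 1000):
    print(f"study size {size:5d}: {wrong_result_rate(size):.1%} point the wrong way")
```

On a typical run, roughly 40% of the 20-subject studies point the wrong way, versus well under 1% of the 1,000-subject studies. With enough small studies being published, a propagandist will always have a few “negative” results to wave around.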

To deal with human bias and psychology, O&W recommend a complete cessation of industry funding for research. Both real-world and modeling data show that even honest scientists tend to be influenced in the direction industry wants the data to support. The problem is exacerbated by sophisticated propagandists who seek out scientists to fund based on their use of protocols that tend to give the results industry wants. This is a subtle way of influencing both the scientific community and the public to accept false beliefs.

O&W describe other tactics to blunt the power and persistence of false information. They see this as an endless war between the forces of truth and forces that want society and/or government to accept a different but false reality. Once a defense tactic that blocks or neutralizes lies and deceit is found, it creates a powerful incentive for propagandists to find a way to neutralize that defense. O&W admit that the problem looks bleak and complicated, but they argue that democracy requires nothing less than an aggressive defense, even if it is expensive and labor intensive.

One thing O&W believe is reasonably certain is this: propagandists are not going to stop doing what they are doing. The damage they cause is sufficient to warrant a reevaluation of democracy and its institutions, which O&W believe are failing in the face of new mass communication technologies and the tidal wave of lies and deceit those technologies spread throughout society and government.

I find their arguments persuasive.


Footnote:
1. I'm unsure that the evidence linking ozone to adverse health effects really is purely inductive. Induction means inferring general laws from particular instances. The evidence is rock solid that (i) ozone absorbs high-energy UV light, (ii) some high-energy UV (UVB) can cause skin cancer and eye damage, and (iii) decreases in atmospheric ozone lead to increases in ground-level UV radiation. I guess I don't understand what induction is.

