Thursday, August 22, 2019
Book Review: The Knowledge Illusion
“Now if arguments were in themselves enough to make men good, they would justly . . . . have won very great rewards . . . . But as things are . . . . they are not able to encourage the many to nobility and goodness . . . . What argument would remold such people? It is hard, if not impossible, to remove by argument the traits that have long since been incorporated by character.”
Aristotle, on the distinction between unconscious intuitive-emotional and conscious deliberative thinking
Summary: The Knowledge Illusion: Why We Never Think Alone (Riverhead Books, New York, 2017), like the 1991 book The User Illusion, focuses on how the human mind operates in a bubble of self-deceit about how much it knows and understands. The User Illusion emphasizes human data processing power, information theory, the second law of thermodynamics and the physiology of cognition as it was understood at the time. The Knowledge Illusion uses current cognitive and social biology research to ask basically the same questions about the human condition. Both come to essentially the same conclusion about the vast gulf between how little humans can and do know compared to how much they think they know.
The Knowledge Illusion builds on the existing concept of innate human limitations. The book describes profound insights about what human cognitive limitations mean for how we do politics and almost everything else and, by clear implication, for the well-being of the human species.
Review: The Knowledge Illusion was written by two cognitive scientists, Steven Sloman (professor of cognitive, linguistic and psychological sciences, Brown University, and Editor-in-Chief of the journal Cognition) and Philip Fernbach (professor of marketing, University of Colorado, Leeds School of Business). Fernbach's academic affiliation points to a segment of American society, marketing, that has long understood human cognitive and social biology and used that knowledge to sell to the public. Along with politicians, political groups and special interests backed by professional public relations efforts, marketers are experts in human cognitive biology and how to appeal to the unconscious human mind to get what they want.
The Knowledge Illusion is very easy to read and well organized. It is written for a general audience. It uses only a few technical terms, which makes it easy to focus on the ideas without much effort to digest terminology. The few core technical terms that are used are important and necessary to describe the book's core concepts. This book is well worth reading for anyone wanting easy access to some current insights about (i) how the human mind perceives, thinks about and deals with the world and politics, and (ii) how to see and do things differently.
The following illustrates where the current science stands.
1: A test for ignorance - the illusion of understanding: It wasn't until 1998 that a simple, reliable method to measure self-deceit was devised. This test has turned out to be very reliable: “We have been studying psychological phenomena for a long time and it is rare to come across one as robust as the illusion of understanding.” The basic test consists of the following three questions.
1. On a scale of 1 to 7 (1 = no understanding, 7 = complete understanding), how well do you understand X, where X is anything from how a zipper or a flush toilet works to a political issue?
2. In as much detail as you can give, how does X work or what is X, e.g., how does a zipper work or what is the thinking behind climate change belief?
3. On the 1 to 7 scale, how well do you understand X?
When most (but not all) people find they know little or nothing about the topic at hand, their second score drops. Their illusion of understanding (called the “illusion of explanatory depth”) has been broken. When these questions center on issues that implicate politics, such as climate change or genetically modified foods, people with extreme beliefs tend to become less certain and less extreme.
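To make the mechanics of the test concrete, here is a minimal Python sketch of how the before-and-after self-ratings might be scored. It is only an illustration; the function names and example ratings are hypothetical and are not taken from the book or the underlying studies.

# Hypothetical sketch: scoring the illusion-of-explanatory-depth test.
# Each respondent rates their understanding of a topic on a 1-7 scale before
# and after trying to explain it (questions 1 and 3); the average drop in the
# rating indicates how much of the sense of understanding was illusory.

def average_rating_drop(before, after):
    """Average per-person drop in self-rated understanding (1-7 scale)."""
    drops = [b - a for b, a in zip(before, after)]
    return sum(drops) / len(drops)

# Made-up example ratings, for illustration only.
before_explaining = [6, 5, 7, 4, 6]   # question 1: "How well do you understand X?"
after_explaining = [3, 4, 5, 4, 2]    # question 3: same question, after explaining X

print(average_rating_drop(before_explaining, after_explaining))  # prints 2.0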
Authors Sloman and Fernbach point out that this method of punching holes in personal belief works by using question 2 to force people to think outside their personal belief systems. The simple belief- or ideology-neutral question ‘how does it work?’ isn't psychologically threatening until people begin to realize how little they actually know. That cognitive trick forces recognition of the disconnect between reality and belief. By the time people understand their own ignorance, it is too late to raise personal belief defenses.
For political issues, this piercing of the veil of ignorance cannot be done by providing explanations of climate change or genetically modified foods and then pointing to policies that make sense based on reality. That direct attack simply doesn't work; external facts and logic are unpersuasive on their own, so most people have to be ‘tricked’ into seeing their own ignorance.
2: Two minds and two operating systems: Unconscious-emotional and conscious deliberative: Sloman and Fernbach describe data showing that people who tend to think slowly and consciously do not show a statistically significant drop in their scores on the ignorance test described above. People who are fast, intuitive, unconscious thinkers, about 80% of adults, generally show a significant score drop on the three-question ignorance test.
Interestingly, the following three-question test is sufficient to distinguish unconscious, intuitive thinkers from conscious, reasoning thinkers (answers in footnote 1 below).
1. A bat and a ball cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?
2. There is a patch of lily pads in a lake. The patch doubles in size every day. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?
3. If it takes 5 machines 5 minutes to make 5 widgets, how long does it take 100 machines to make 100 widgets?
People who get all three questions right are slow, conscious thinkers, while people who get one or more wrong are fast, intuitive thinkers; various differences between the two groups are measurable. The three questions are designed so that an incorrect answer jumps right out, and that is the answer most people give. By contrast, avoiding the wrong answer requires a mindset that, in essence, checks its work before answering. The slow thinkers do not change their scores on the ignorance test because they are more deliberative about what they think they know. Deliberative thinkers are better grounded in reality than intuitive thinkers.
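As a rough illustration of the distinction, the three-question test above can be scored in a few lines of Python. The answer key and the “jump right out” intuitive answers follow footnote 1; the simple all-or-nothing classification rule is an assumption used here for illustration, not the authors' actual scoring method.

# Hypothetical sketch: scoring the three-question reflection test above.
CORRECT_ANSWERS = {
    "ball_cost_dollars": 0.05,  # ball $0.05 + bat $1.05 = $1.10
    "lily_pad_days": 47,        # the patch doubles on the final day
    "widget_minutes": 5,        # each machine makes one widget in 5 minutes
}

INTUITIVE_ANSWERS = {           # the answers that "jump right out"
    "ball_cost_dollars": 0.10,
    "lily_pad_days": 24,
    "widget_minutes": 100,
}

def classify(responses):
    """Label a respondent 'deliberative' only if all three answers are correct."""
    num_correct = sum(
        1 for question, answer in responses.items()
        if abs(answer - CORRECT_ANSWERS[question]) < 1e-9
    )
    return "deliberative" if num_correct == 3 else "intuitive"

print(classify(CORRECT_ANSWERS))    # deliberative
print(classify(INTUITIVE_ANSWERS))  # intuitive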
3: We don't like seeing our personal illusions shattered: Shattering political illusions by coaxing people to think outside their belief systems elicits a backlash in response to (i) seeing reality for what it is, and (ii) seeing how different reality is from what they had believed. The implication for political leadership is obvious. Sloman and Fernbach sum it up like this:
“Unfortunately, the procedure does have a cost. Exposing people's illusions can upset them. . . . We had hoped that shattering the illusion of understanding would make people more curious and more open to new information . . . . This is not what we have found. If anything, people are less inclined to seek new information after finding out that they were wrong. . . . . people don't like having their illusion shattered. In the words of Voltaire: ‘Illusion is the first of all pleasures.’ . . . . People like to feel successful, not incompetent. . . . . A good leader must be able to help people realize their ignorance without making them feel stupid. This is not easy.”
Echoing Aristotle, Sloman and Fernbach observe that “scientific attitudes are not based on rational evaluation of evidence, and therefore providing information does not change them. Attitudes are determined instead by a host of contextual and cultural factors that make them largely immune to change. . . . . beliefs are deeply intertwined with other beliefs, shared cultural values, and our identities. . . . . The power that culture has over cognition just swamps [any] attempts at education.”
Importantly, the authors constantly point out that the world is far too complex for anyone to have broad, deep knowledge. They argue that, in view of amazingly severe human cognitive limitations, we have no choice but to rely on other people and the world itself for data and analysis. That reliance is where illusions of knowledge come from, and its ramifications shoot through all of politics.
The Knowledge Illusion is highly recommended. There is much more to it, and this short review cannot do the book justice. In particular, this book will help (i) people with the moral courage to begin a serious, unsettling journey in self-reflection, and (ii) people interested in trying to understand why politics is what it is.
This stuff isn't for the faint of heart or for hard-core political ideologues. For those open to it, this kind of knowledge can challenge and upset their worldview and self-image. That is not the ideologue's mindset.
Footnote:
1. (1) The ball costs 5 cents ($0.05 for the ball plus $1.05 for the bat equals $1.10), (2) 47 days (the patch doubles each day, so it covers half the lake one day before it covers all of it), (3) 5 minutes (each machine takes 5 minutes to make one widget, no matter how many machines are running).
B&B orig: 9/5/17