Given the generality of their work on human cognition, thinking and decision-making, it is reasonable to expect that their work will heavily influence research in many other areas of human activity over time. Whether the new knowledge will translate into American society's thinking and behavior is another matter; for the foreseeable future, that seems very unlikely.
Daniel Kahneman
For anyone interested in politics, the question of how the field of psychology went from mostly nonsense to relevant, serious science that could no longer be ignored by the 1980s makes this book well worth the money and time. The book is written for a general audience and is an easy read. It is light on technical details but nonetheless clearly conveys the state of psychology and cognitive biology, and how the field moved from the end of its dark ages in the mid-1900s to core modern relevance.
The book's central theme revolves around the intense academic relationship between two basically incompatible geniuses. Tversky was an organized but arrogant, optimistic and self-confident master of mathematical psychology. By contrast, Kahneman was disorganized, pessimistic and riddled with self-doubt, but he had an amazing capacity to see core problems in psychology (quirks of human thinking and behavior) that the rest of the field simply could not see. Kahneman's creative insights and his ability to articulate and experimentally get at the root of a problem were, and probably still are, astounding. Tversky's capacities were similar.
Eventually their academic relationship came to a prolonged, unpleasant end. Tversky died of cancer in 1996, some years after the break. Kahneman is professor emeritus at Princeton.
The book's title, The Undoing Project, refers to the effort of the two scientists to "undo", among other things,
(i) the then-dominant 'utility theory' of decision making that underpinned economic theory and belief; and
(ii) the human mind's intense desire to, and ease of, erasing (undoing) "what was surprising or unexpected."
The rational man: One area their research profoundly affected was economics and its 1700s-vintage utility theory. The theory was based on the assumption that people are usually rational in the economic decisions they make. Kahneman-Tversky research found that wasn't true.[0] One source of systematic error was a common human cognitive trait, a 'belief in the law of small numbers'. They found that people, including professional statisticians and experimental psychologists who should know better, often drew conclusions from amounts of evidence far too small to support any conclusion. The data was clear that "people mistook even a very small part of a thing for the whole." The normal human belief is that ANY sample of a large population is more representative of the population than it really is. Humans simply did not evolve to think in terms of statistics.
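To make the small-samples point concrete, here is a minimal Python sketch of the trap; it is my illustration, not anything from the book or the original papers, and the 40%-60% cutoff and sample sizes are arbitrary choices.

# A minimal sketch (not from the book or the original papers) of the "law of
# small numbers" trap: small samples of a fair coin routinely land far from the
# true 50/50 split, yet people treat them as representative of the whole.
import random

random.seed(1)  # reproducible illustration

def extreme_share(sample_size: int, trials: int = 20_000) -> float:
    """Fraction of samples whose heads-rate falls outside the 40%-60% band."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        rate = heads / sample_size
        if rate < 0.4 or rate > 0.6:
            extreme += 1
    return extreme / trials

for n in (5, 10, 50, 500):
    print(f"sample size {n:>3}: {extreme_share(n):.1%} of samples look 'unrepresentative'")

With five or ten flips, a large share of samples look lopsided; with five hundred, almost none do. That is the intuition people reliably fail to have.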
Heuristics: Tversky and Kahneman's research identified four basic rules (heuristics) the human mind uses to help make decisions, even when the degree of uncertainty is unknowable. In essence, the human mind is a pleasure machine.[1] People's biological desire to avoid a loss is greater than their desire to secure a similar gain. From an evolutionary point of view, that makes sense: during evolution, people who underestimated risk tended to get eliminated from the gene pool.
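The loss-gain asymmetry can be put in rough numbers. The sketch below uses the value function from Kahneman and Tversky's prospect theory, with the parameter estimates commonly cited from their later published work (curvature about 0.88, loss-aversion multiplier about 2.25); the code is only an illustration of the idea, not something taken from Lewis's book.

def prospect_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a gain or loss of size x; losses loom larger than gains."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

gain = prospect_value(100.0)    # subjective value of winning $100
loss = prospect_value(-100.0)   # subjective value of losing $100
print(f"winning $100 feels like {gain:+.1f}")
print(f"losing  $100 feels like {loss:+.1f}")  # roughly 2.25 times as intense

On these commonly cited parameters, a $100 loss registers as roughly 2.25 times as intense as a $100 gain, which is the asymmetry the review describes.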
Amos Tversky
The blowback: Kahneman and Tversky lost faith in decision analysis in the context of the wars Israel fought. Kahneman expressed the problem in public talks he called "Cognitive Limitations and Public Decision Making." Their goal was to inject the implications of their research into high-stakes, real-world decision making and government. They tried to do that by getting decision-making experts to assign odds to every possible outcome, e.g., war, peace, border skirmishes, or an attack by only some adversaries rather than all of them at once.
In practice, the exercise failed. Despite their successful efforts to get Israeli intelligence agencies and politicians to think about scenarios in terms of probabilities, the data and analysis fell on deaf ears. Specifically, Israeli intelligence estimates gave a 10% increase in the risk of another war if Henry Kissinger's peace efforts with Syria failed. Despite the warning, Israeli foreign minister Yigal Allon wasn't impressed and didn't work to bolster Kissinger's peace efforts. Kahneman said "That was the moment I gave up on decision analysis. No one ever made a decision because of a number. They need a story. . . . the understanding of numbers is so weak that they don't communicate anything. Everyone feels that those probabilities are not real -- that they are just something on somebody's mind."
Lewis puts it like this: "He [Allon] preferred his own internal probability calculator: his gut."
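For readers unfamiliar with decision analysis, the exercise amounts to listing the possible outcomes, attaching a probability and a payoff to each, and comparing expected values across courses of action. The Python sketch below is a toy version of that logic; every number in it is invented for illustration, and only the roughly 10-point shift in war risk echoes the review.

def expected_value(scenarios):
    """scenarios: list of (probability, payoff) pairs; probabilities must sum to 1."""
    assert abs(sum(p for p, _ in scenarios) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * payoff for p, payoff in scenarios)

# Hypothetical payoffs on an arbitrary scale (higher is better); three scenarios:
# durable peace, an uneasy status quo, and another war.
support_peace_effort  = [(0.60, 10), (0.30, 0), (0.10, -100)]  # war risk 10%
let_peace_effort_fail = [(0.50, 10), (0.30, 0), (0.20, -100)]  # war risk up ~10 points

print("support the peace effort:", expected_value(support_peace_effort))    # -4.0
print("let the peace effort fail:", expected_value(let_peace_effort_fail))  # -15.0

Laid out this way, even a modest shift in the probability of the worst outcome dominates the comparison, which is exactly the kind of message the decision makers reportedly ignored in favor of their guts.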
One bright spot - the young: Both Tversky and Kahneman had taught the biology of judgment to elementary or high school students and the two wrote in an unpublished manuscript that "we found these experiences highly encouraging." Lewis writes: "Adult minds were too self-deceptive. Children's minds were a different matter."
Kahneman wrote: "We have attempted to teach people at various levels in government, army, etc. but achieved only limited success."
Under the current retrograde political conditions, the public schools option seems to be the ONLY path to possibly injecting this new knowledge into mainstream American politics and society.
The lost cause - post-truth politics: Unfortunately, the impact of the new knowledge of human cognition and social behavior on politics is weak. It's not non-existent, but current political conditions strongly disfavor rationality. There's a faint pulse, at least for now, but it will be easy to kill.[2]
For decision making based on modern cognitive and social biology, the obvious and probably only path to that lofty goal is to require at least one semester, probably two, of instruction in human cognitive and social biology in all public schools. Absent that, it's highly likely (>95% chance?) that politics will remain as irrational and fantasy-based as it is now and as it will be for at least the upcoming 4 or 8 years.
Lewis' book has lots of other gems in it, for example, describing the impact of emotional states such as potential hope or regret on perceived experiences or reality. The human mind has many ways of distorting both reality and reason. This book makes that crystal clear using both real life anecdotes and descriptions of research by Kahneman, Tversky and others. Given the role of human emotions, reality (including fact) is mostly personal and subjective, not mostly objective.
And, there's this nugget: "To Danny the whole idea of proving that people weren't rational felt a bit like proving that people didn't have fur. Obviously people were not rational, in any meaningful sense of that term."[3]
Questions: Is it true or at least plausible that children can be taught to self-question but adults cannot? If so, is there any point in even discussing this kind of science in the context of politics, given that adults are a lost cause?
Footnotes:
0. A personal guess as to why psychology had to stay in the dark ages until about the mid-1900s (1960s and later): (a) more wealth allowed more decisions that weren't just survival-based (data show that the more survival-critical a decision is, the more rational it usually is; poverty or near-survival living focuses the mind on what's needed to survive), and (b) the rise of machines that could analyze far more data than people with just fingers and toes, an abacus or a slide rule.
1. The mind is also an impressive false-reality-creating machine. In the context of driving a car: "The brain is limited. There are gaps in our attention. The mind contrives to make those gaps invisible to us. We think we know things we don't. . . . It's that they [people] don't appreciate the extent to which they are fallible."
2. Given his rhetoric and animosity toward (i) all that went before and (ii) truth, it seems more likely than not that Donald Trump will act to kill Obama's 2015 Behavioral Science Insights Policy Directive, which was based on work by Kahneman and Tversky as adapted for politics by Richard Thaler, a behavioral scientist and economist.
3. And there was this bizarre attack from an academic critic in 1979 who felt that Kahneman and Tversky were being too pessimistic about human cognitive limitations. Lewis wrote: "The masses are not equipped to grasp Amos and Danny's message. The subtleties were beyond them. People needed to be protected from misleading themselves into thinking that their minds were less trustworthy than they actually were. 'I do not know whether you realize just how far that message has spread, or how devastating its effects have been'. . . . Even sophisticated doctors were getting from Danny and Amos only the crude, simplified message that their minds could never be trusted.** What would become of medicine? Of intellectual authority? Of experts?" The critic's fear was obvious and palpable. In the current political climate, the knowledge that Kahneman and Tversky generated will probably fall on deaf ears, or maybe even be subject to vicious post-truth political attacks.
** That attack was typical - critics often exaggerated the message and ignored what Kahneman and Tversky kept saying explicitly in their publications, i.e., the mind isn't always wrong, but it is subject to errors that are often systematic (not random), predictable and uncomfortably frequent.
B&B orig: 1/16/17