Friday, January 15, 2016

Thinking, Fast and Slow: Book Review



Book review: Thinking, Fast and Slow
Daniel Kahneman
Farrar, Straus and Giroux, New York
Original publication: 2011

Dr. Kahneman, a psychologist, won a Nobel Prize in economics for his work on prospect theory, in which he began to build a more accurate description of the biological basis of decision-making. That work is directly relevant to politics: the biology behind seeing and thinking distorts reality, and those distortions shape political decision-making and policy choices.

Kahneman’s book is based on several decades of research by Kahneman and other social scientists. It focuses on the contrast between two ways of perceiving reality and thinking about what we think we see; those modes of cognition can be simultaneous and overlapping. Kahneman’s research led him to describe a mental “System 1” and “System 2”. The two systems do not necessarily correspond to distinct parts of the brain; they are labels for what is going on in our heads. System 1 is an instinctive, intuitive-emotional and fast way of seeing the world and thinking about it, and it operates mostly unconsciously and without deliberate effort. Although we don’t know it, System 1 usually dominates our mental activity, perceptions of reality, judgments and choices.

Not nearly as rational as we think
By contrast, Kahneman’s System 2 is slower but more logical and calculating. System 2 requires biologically measurable work, tires easily, and is lazy, preferring to do the least work needed to reach a solution, even a wrong one. System 2 is the conscious part of human cognition, the part we are aware of when we look at the world or at political issues and consciously think about them. Applying this logical aspect of cognition requires motivation and conscious effort. Because this mode of thinking is the one we are aware of, we tend to believe that our conscious, “rational” thoughts constitute the main or only way we think. For most people, this aspect of our biology fosters a hard-to-reject but false belief that we are quite rational and well-grounded in reality.

Thinking in System 1 and System 2 is shaped by powerful innate but unconscious biases that distort both facts (reality) and logic (common sense). In that regard, our innate biases can be considered “de-rationalizing”: they prevent us from seeing unbiased reality and from applying unbiased common sense to what we think we see. They also powerfully shape policy choices to fit personal ideology and/or morals. Kahneman’s book describes the research showing why people place too much confidence in human judgment, including their own.

Biases with real bite
The list of unconscious, de-rationalizing cognitive biases is long and surprising. It includes:

  • Kahneman’s powerful “what you see is all there is” bias (related to his “illusion of validity”), which leads to perceptions and choices based on what we directly see rather than on relevant information we are unaware of, and which tends to kill our motivation to look for that missing information, especially information that could contradict what we believe or want to be true;
  • Framing effects, in which merely changing how an issue is presented leads to different perceptions and choices, even though the underlying facts are identical however the issue or problem is framed;
  • An unconscious bait-and-switch (Kahneman calls it “substitution”) that unknowingly swaps an easy, intuitively answerable question for a hard one that requires conscious effort and System 2 logic, reflecting System 2’s laziness;
  • Loss aversion, the tendency to irrationally overweight potential losses and underweight potential gains, so that avoiding a loss matters more to us than securing an equivalent gain (a short illustrative sketch follows this list);
  • An energy-related bias in which hunger or low blood sugar degrades judgment, e.g., the sentencing decisions that theoretically impartial judges hand down to convicts; and
  • An illusion of understanding situations or issues even when we lack the information needed to understand them. We fit past events and facts into a “logical” story, come to believe we understand, and then cannot imagine things having turned out differently. The human mind is highly adept at (i) unconsciously and rapidly making “sense” of things from insufficient information and (ii) drawing often flawed judgments from that “sense.”
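
To make the loss-aversion bullet concrete, here is a minimal sketch in Python. It is not taken from the book: the roughly 2-to-1 loss-aversion coefficient is a commonly cited ballpark from Kahneman and Tversky’s research, and the names (LOSS_AVERSION, felt_value, evaluate_gamble) are illustrative inventions, not the book’s formulation.

  # Illustrative sketch of loss aversion (assumed ~2x coefficient, a
  # commonly cited ballpark from Kahneman and Tversky's research).
  LOSS_AVERSION = 2.0  # losses are felt roughly twice as strongly as gains

  def felt_value(outcome):
      # Psychological value of a dollar outcome for a loss-averse person.
      return outcome if outcome >= 0 else LOSS_AVERSION * outcome

  def evaluate_gamble(gain, loss, p_gain=0.5):
      # Average felt value of a bet: win `gain` with probability p_gain,
      # otherwise lose `loss`.
      return p_gain * felt_value(gain) + (1 - p_gain) * felt_value(-loss)

  # A 50/50 bet to win $150 or lose $100 is worth +$25 in expected dollars,
  # yet its felt value comes out negative, so most people turn it down.
  print(evaluate_gamble(150, 100))  # -25.0

The only point of the sketch is that a gamble with a clearly positive dollar expectation can still feel like a bad deal once losses are overweighted.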


Being subjective-intuitive and irrational isn’t hopeless . . . .
For people looking for more objectivity and rationality in politics, human biology sounds like a very big impediment, and it is. Fortunately, that isn’t the whole story. Political ideologues whose personal ideology or morals distort facts and logic can become aware of their own reality-distorting biology, and that self-awareness helps reduce the influence of irrational biases on perceptions of reality and common sense. Progress toward objectivity requires the moral courage to accept our biology for what it actually is, not what we think it is.

. . . . but change isn’t going to be easy
One unusually self-aware political ideologue explained the difficulty this way: “My libertarian beliefs have not always served me well. Like most people who hold strong ideological convictions, I find that, too often, my beliefs trump the scientific facts. This is called motivated reasoning, in which our brain reasons our way to supporting what we want to be true. Knowing about the existence of motivated reasoning, however, can help us overcome it when it is at odds with evidence.”

Although Kahneman is silent on the issue, susceptibility to cognitive distortion may vary among political or ideological groups. That supposition generally accords with research showing that more intense personal ideological belief impairs judgment. The American public is in a period of increasing ideological rigidity and political polarization, which is an impediment to acceptance of political objectivity.

Another impediment is the two-party system itself. Both parties, most of their politicians, most partisan pundits, much of the press and media most of the time, and players with campaign-contribution money foster an image of partisan political rationality and opposition irrationality. Fostering strongly held partisan ideological beliefs feeds our unconscious biases. Playing on our illusions of rationality defends and maintains the status quo, e.g., it hides the usually tenuous-to-nonexistent connection between political rhetoric and both reality and logic. That doesn’t serve the public interest.

If they want objectivity, which is an open question, the American people have a lot of introspecting and learning to do. Objectivists have their work cut out for them and a long, long way to go.
