The 2017 book Moral Psychology: A Multidisciplinary Guide, edited by Benjamin Voyer and Tor Tarantola, is an academic publication intended to begin unifying research on the extremely subtle and difficult topic of morality. Chapter 2, An Evolutionarily Informed Study of Moral Psychology, was written by Max Krasnow, an evolutionary psychologist at Harvard.
This book is written for an academic audience of researchers and scholars who study morality-related topics such as what morals are, where they come from and why, and what they do. To advance knowledge, researchers in a range of disciplines including psychology, philosophy, anthropology, sociology, evolutionary biology, cognitive science, neuroscience and computer science need to become more aware of what other disciplines are doing and have discovered. Given the complexity of morality, convergence of knowledge from all of these disciplines will be necessary to move the field from its current infancy to a basis that can accelerate progress. Right now, progress is painfully slow. This book will generally be quite difficult for a lay audience to understand.
Why morals are not intended or needed to reflect objective truth
Assuming that evolutionary forces shaped morals and moral behavior, Krasnow argues there are three closely related reasons why morality does not need to reflect objective truth. That does not mean that morals, and the behaviors they motivate, are nonsense or useless. The disconnect from objective truth reflects the severely limited data-processing bandwidth of the human brain in the face of an essentially infinitely complex world, one that includes humans, human societies, and technologies and environments that change over time.

Krasnow's first reason for the reality disconnect is that knowing objective truth is often completely irrelevant to survival. A human who understands that gravity is a distortion of space-time has no survival advantage over one who merely knows not to walk over the edge of a cliff or fall out of a tree, because things not supported by the ground will fall, and that can hurt a lot. In such situations, random mutations will lead to increasing survival fitness, but not necessarily to increasing objective knowledge. This evolutionary pressure applies to morals and moral behavior.
Krasnow’s second, related reason is that there is an asymmetry in the evolutionary penalties for some mistakes in perceiving reality. A well-known example is the fear and automatic defense response. A person who mistakes ground-level movement caused by a breeze for a snake and jumps away suffers only a minimal loss of effort avoiding a non-existent threat. Here the goal isn't objective accuracy; the goal is not getting bitten by a snake. By contrast, ignoring a movement that turns out to be a snake can be very costly if the person gets bitten.
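The logic of this asymmetry is essentially an expected-cost calculation, and it can be made concrete with a short sketch. The probabilities and costs below are invented purely for illustration (Krasnow gives no numbers); only the asymmetry between a cheap false alarm and an expensive miss matters.

```python
# Illustrative expected-cost comparison for the snake example.
# All numbers are made up; only the asymmetry between costs matters.

P_SNAKE = 0.01            # assumed probability an ambiguous movement is a snake
COST_FALSE_ALARM = 1      # small energy cost of jumping away needlessly
COST_MISS = 1000          # large cost of a venomous bite

# Strategy 1: always jump at ambiguous movement (jumping avoids the bite).
cost_jumpy = (1 - P_SNAKE) * COST_FALSE_ALARM

# Strategy 2: always ignore ambiguous movement.
cost_calm = P_SNAKE * COST_MISS

print(f"expected cost if always jumping: {cost_jumpy:.2f}")
print(f"expected cost if always ignoring: {cost_calm:.2f}")
```

Even though the jumpy strategy is objectively "wrong" 99% of the time under these assumptions, its expected cost is far lower, which is why selection can favor systematically biased perception over accurate perception.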
In terms of cooperative behavior with strangers, usually considered a moral thing to do, tentative, limited cooperation can lead to more cooperation, and that can lead to a higher payoff if the stranger turns out to be trustworthy. If the stranger is untrustworthy from the outset, the penalty for misplaced trust is low. In Pleistocene times (about 2.6 million years ago until about 11,700 years ago), when morals are believed to have been most prominently shaped, people lived in small groups and everyone knew everyone else. Under those social conditions, cheaters were spotted quickly, so a significant level of cheating was not possible for long. By contrast, in modern times a Ponzi scheme can run for years because people do not know each other. Ponzi schemers play on the normal human default tendency to trust. Presumably, most people would consider a Ponzi scheme to be immoral to at least some extent.
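The payoff structure of tentative cooperation can be sketched as a toy game. The stakes, the doubling rule, and the stopping rule below are all assumptions invented for illustration, not a model from the book; the point is only that escalating trust caps the loss from a defector while letting gains compound with a trustworthy partner.

```python
# Toy sketch of "tentative, limited cooperation" with a stranger.
# Payoff numbers and the escalation rule are invented for illustration.

def tentative_cooperation(stranger_is_trustworthy, rounds=5):
    """Start with a small stake; raise it only while the stranger reciprocates."""
    stake = 1
    total = 0
    for _ in range(rounds):
        if stranger_is_trustworthy:
            total += stake        # cooperation pays off
            stake *= 2            # trust grows, so the stakes can grow
        else:
            total -= stake        # betrayed: lose only the small current stake
            break                 # stop cooperating immediately
    return total

print(tentative_cooperation(True))   # large cumulative payoff
print(tentative_cooperation(False))  # small, bounded loss
```

The early `break` is what small Pleistocene groups provided for free: cheaters were detected after one move. A Ponzi scheme works precisely because modern anonymity delays that detection for many rounds.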
Krasnow’s third reason focuses on the fact that humans are inherently social creatures. We cooperated in groups to survive, and that meant engaging in social behaviors that helped the group survive. Thus appearing to others as cooperative and doing good deeds, even when no payoff is obvious, typically leads to social acceptance in the group, and that acceptance enhances the good person’s survival fitness. Beliefs, behaviors and opinions that signal pro-social cooperativity can therefore conflict with the objective world but nonetheless still be selected for. Krasnow points out that this is tricky: if a pro-social-acting person is seen to be insincere, social acceptance tends to suffer, because people distrust phonies. Krasnow comments: “In the moral domain, the selection pressures responsible for our moral sentiments -- our concern for the sick, our outrage at the oppressor, etc. -- may be more about what these sentiments signal to others than anything to do with objective truth seeking.”
Krasnow argues that those three reasons are why we often disconnect morals and moral beliefs and behaviors from objective reality: “Taking these points together -- that the objective truth is often fitness irrelevant, that the right kind of error is often ecologically rational, and that the adaptive problem is at least sometimes about changing someone else’s behavior -- helps suggest a program for an evolutionarily informed study of human moral psychology. The first task is to identify the major filters -- that is, the adaptive problems -- that components of moral psychology have been designed to solve.”