
Saturday, March 7, 2020

Chapter Review: An Evolutionarily Informed Study of Moral Psychology

“Philosophers, research scientists, and scholars across the academy have been wrestling with how to define, measure and think about morality for thousands of years. .... Why is it that the study of morality is such a difficult task? .... While the human mind is not usually considered an impediment to scientific progress, it may present particular barriers to accurate models of the nature of morality and moral psychology. This is not the first research question that has been hampered by the fact that science is done by humans. Often, the problem is that we have a powerful intuition or perception of how the world seems or ought to be that gets in the way of scientifically understanding how the world really is. .... Here and elsewhere, the fact that human intuition or perception does not well map the real world has made humans worse at real science. .... Whatever the specific design of our psychology turns out to be, results have been collected that make it very hard to believe that this design is simply a machine for uncovering the objective truth of the world.” -- Max Krasnow, Moral Psychology: A Multidisciplinary Guide, 2017, page 29


The 2017 book Moral Psychology: A Multidisciplinary Guide, edited by Benjamin Voyer and Tor Tarantola, is an academic publication intended to begin a process of unifying research on the extremely subtle and difficult topic of morality. Chapter 2, An Evolutionarily Informed Study of Moral Psychology, was written by Max Krasnow, an evolutionary psychologist at Harvard.

This book is written for an academic audience of researchers and scholars who study morality-related topics such as what morals are, where they come from and why, and what they do. To advance knowledge to a higher level, researchers in a range of disciplines, including psychology, philosophy, anthropology, sociology, evolutionary biology, cognitive biology, neuroscience and computer science, need to become more aware of what the other disciplines are doing and have discovered. Given the complexity of morality, convergence and merging of knowledge from all of these disciplines will be necessary to move understanding from its current infancy to a basis that can accelerate progress. Right now, progress is painfully slow. The book will generally be quite difficult for a lay audience to understand.


Why morals are not intended or needed to reflect objective truth
Assuming that evolutionary forces shaped morals and moral behavior, Krasnow argues there are three closely related reasons that morality does not need to reflect objective truth. That does not mean that morals, or the behaviors that morals motivate, are nonsense or useless. The disconnect from objective truth reflects the severely limited data-processing bandwidth of the human brain in the face of an essentially infinitely complex world. That world includes humans, human societies, and technologies and environments that change over time.

Krasnow’s first reason for the reality disconnect is that knowing objective truth is often completely irrelevant to survival. A human who understands that gravity is a distortion of space-time has no survival advantage over another who merely knows enough not to walk over the edge of a cliff or fall out of a tree, because things that are not supported by the ground or a tree will fall to the ground, which can hurt a lot. In such situations, random mutations that increase survival fitness will be selected for, whether or not they increase objective knowledge. This evolutionary pressure applies to morals and moral behavior as well.

Krasnow’s second, related reason is that there is an evolutionary pressure asymmetry in the penalties for some mistakes in perceptions of reality. A well-known example is the fear and automatic defense response. A person who mistakes ground-level movement caused by a breeze for a snake and jumps away suffers only a minimal loss of effort to avoid a non-existent threat. Here the goal is not objective accuracy; the goal is not getting bitten by a snake. By contrast, ignoring a movement that turns out to be a snake can inflict a very high penalty for being wrong if the person gets bitten.
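
To make the asymmetry concrete, here is a minimal numerical sketch of the expected-cost logic. The costs and probability below are made-up illustrative numbers, not figures from Krasnow or the book; they only show why a "jumpy" rule that is objectively wrong far more often can still beat a rule that tracks reality more accurately.

# Minimal sketch of the error-cost asymmetry, using made-up numbers
# (illustrative only; these are not Krasnow's figures).

COST_FALSE_ALARM = 1      # assumed cost of jumping away from a harmless breeze
COST_MISS = 1000          # assumed cost of ignoring a movement that is a snake
P_SNAKE = 0.01            # assumed chance that a given rustle is really a snake

def expected_cost(always_jump: bool) -> float:
    """Expected cost per rustle for a 'jumpy' rule vs. a 'skeptical' rule."""
    if always_jump:
        # Pays the small false-alarm cost whenever the rustle is harmless.
        return (1 - P_SNAKE) * COST_FALSE_ALARM
    # Pays the large miss cost whenever the rustle really is a snake.
    return P_SNAKE * COST_MISS

print("jumpy rule:    ", expected_cost(True))   # 0.99
print("skeptical rule:", expected_cost(False))  # 10.0

Under these assumed numbers, the jumpy rule is wrong about reality 99 percent of the time, yet its expected cost is far lower, which is the kind of selection pressure Krasnow describes.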

Cooperation with strangers, usually considered a moral behavior, works similarly. Tentative, limited cooperation can lead to more cooperation, and that can lead to a higher payoff if the stranger turns out to be trustworthy. If the stranger is untrustworthy from the outset, the penalty for the misplaced initial trust is low. In Pleistocene times (about 2.6 million years ago until about 11,700 years ago), when morals are believed to have been shaped most prominently, people lived in small groups and everyone knew everyone else. Under those social conditions, cheaters were spotted quickly, and a significant level of cheating was not possible for long. By contrast, in modern times a Ponzi scheme can run for years because people do not know each other. Ponzi schemers play on the normal human default tendency to trust. Presumably, most people would consider a Ponzi scheme to be immoral to at least some extent.
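
The logic of tentative, escalating cooperation can be sketched the same way. The payoffs and escalation rule below are hypothetical illustrations, not anything from the book; the point is only that starting with a small stake caps the loss to a cheater while leaving the upside with a trustworthy partner open.

# Hypothetical sketch of tentative, escalating cooperation with a stranger
# (made-up payoffs and rules; not from Krasnow or the book).

def play_rounds(partner_cooperates):
    """Raise the stake each round the partner reciprocates;
    stop at the first defection, losing only the current small stake."""
    payoff = 0.0
    stake = 1.0                      # start small: limited initial trust
    for cooperated in partner_cooperates:
        if cooperated:
            payoff += stake * 1.5    # assumed gain from mutual cooperation
            stake *= 2               # trust a little more next round
        else:
            payoff -= stake          # lose only the current stake
            break                    # stop cooperating with a cheater
    return payoff

print(play_rounds([True, True, True, True]))   # trustworthy partner: 22.5
print(play_rounds([False]))                    # cheater from the outset: -1.0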

Krasnow’s third reason focuses on the fact that humans are inherently social creatures. We cooperated in groups to survive, and that meant engaging in social behaviors that helped the group survive. Thus, the appearance to others that a person in the group is cooperative and does good deeds, even if no payoff is obvious, typically leads to social acceptance in the group. That social acceptance enhances the good person’s survival fitness. As a result, beliefs, behaviors and opinions that signal pro-social cooperativity can conflict with the objective world but nonetheless still be selected for. Krasnow points out that this is tricky: if a person acting pro-socially is seen to be insincere, that tends to dampen social acceptance. People tend to distrust phonies. Krasnow comments: “In the moral domain, the selection pressures responsible for our moral sentiments -- our concern for the sick, our outrage at the oppressor, etc. -- may be more about what these sentiments signal to others than anything to do with objective truth seeking.”

Krasnow argues that those three reasons are why we often disconnect morals, moral beliefs and moral behaviors from objective reality: “Taking these points together -- that the objective truth is often fitness irrelevant, that the right kind of error is often ecologically rational, and that the adaptive problem is at least sometimes about changing someone else’s behavior -- helps suggest a program for an evolutionarily informed study of human moral psychology. The first task is to identify the major filters -- that is, the adaptive problems -- that components of moral psychology have been designed to solve.”
