DP Etiquette

First rule: Don't be a jackass.

Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, provide sources for the facts and truths you rely on if asked. If emotion is getting out of hand, get it back in hand. To avoid dehumanizing people, don't call people or whole groups of people disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion; insult makes people angry and defensive. All points of view are welcome: right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.

Tuesday, August 13, 2019

Pragmatic ideology: The rational politics power-shifting goal



I have described some of the core logic that underpins a pragmatic 'anti-bias' political ideology based on cognitive and social science knowledge. The aim is to foster cognitive and social beliefs that tend to reduce biases and distortions in (i) perceptions of reality and facts, and (ii) the subjective, personal conscious reason we apply to the reality and facts we think we see.

The basic anti-bias concept envisions replacing the morals or principles of standard 'pro-bias' ideologies**, e.g., liberalism, conservatism, socialism, capitalism, libertarianism, populism, etc., with core morals or principles that foster a more open, less biased mind set. Evidence that such a mind set can exist and can foster more rational, less biased thinking has also been described, e.g., Philip Tetlock's identification of superforecasters and their relatively open (~anti-bias) mind sets.

** Pro-bias ideologies are a major source of unconscious confirmation bias and of the more powerful unconscious fact- and reason-distorting bias called motivated reasoning.

None of that sheds light on any purpose for an anti-bias political ideology or mind set. The science only provides a rationale for the possibility (not certainty) of reducing subjectivity in politics by reducing distortions in perceptions of reality and in application of common sense. So, the question remains: What's the purpose?

A balance of power shift to the public interest: The purpose of an anti-bias ideology is to shift, to some meaningful extent, the balance of power from where it is now in America's representative democracy to the public interest (a public interest conception is described below). Specifically, social science research clearly shows that:

1. Power in the sense of dictating policy choices does not reside with voters or the will of the people -- average people and public sentiment have no statistically detectable impact on setting policy, while organized special interests (including both political parties) exert essentially all policy-setting power; and
2. The most powerful tool the existing two-party status quo has at its disposal is a constitutionally protected free speech right to influence and distort average people's perceptions of reality and their conscious reason by fostering normal pro-bias cognitive and social identity traits through lying, deceiving, misinforming, irrational emotionalizing and so on.

In other words, the two-party system plays on normal human biology by deceiving people with misinformation, deceit, lies, emotional appeals and other spin tactics that are constitutionally protected free speech. The two-party system relies heavily on (i) distorted perceptions of reality and facts, and (ii) distorted or flawed conscious reason applied to those distorted perceptions. The hypothesis is that we are being heavily manipulated by shrewd appeal to human cognitive and social biology.

If one accepts that it is basically true that we are being played and the resulting deceit keeps the balance of power tipped in favor of special interests and both major parties at the expense of the public interest, then what can one do about it?

Logic would seem to argue that if deceit, and the distorted reality and conscious reason that flow from it, keep power in the hands of the elites, then adopting an anti-bias mind set to partially reduce those distortions would better empower average people. In politics, unbiased information and unbiased reason are power. Thus, instead of liberals and conservatives endlessly fighting over unresolvable ideological differences, the pragmatic anti-bias mind set would focus on less distorted reality and less personally biased conscious reason in an effort to serve the public interest, not in an effort to vindicate and defend liberal or conservative political morals or principles.

In other words, people would be less distracted and less deceived by endless, unstoppable status quo deceit. In an anti-bias scenario, the focus would be more on finding the shape of reality for any given issue and then devising a roughly same-shaped policy choice to deal with it. As it is now, liberals see issues as liberal-shaped pegs (distorted reality) and try to pound those pegs into liberal-shaped holes. Conservatives do the same. The problem with those pro-bias mind sets is that reality doesn't care about liberal- or conservative-shaped pegs. Reality just is what it is, and it has its own reality- or human-shape.

Is it credible to argue that the two parties shrewdly use liberal and conservative ideology to distract and to build and maintain false reality and flawed reason to keep the public polarized and distrustful, while leaving elites free to exert power? Or, is it the case that only the liberal or conservative side does this, while the other side is mostly honest and rational?

Serving the public interest -- one conception: Service to the public interest means governance based on identifying a rational, optimum balance among public, individual and commercial interests through objective, fact- and logic-based analysis of competing policy choices, while (1) being reasonably transparent and responsive to public opinion, (2) protecting and growing the American economy, (3) fostering individual economic and personal growth opportunity, (4) defending personal freedoms and the American standard of living, (5) protecting national security and the environment, (6) increasing transparency, competition and efficiency in commerce when possible, and (7) fostering global peace, stability and prosperity whenever reasonably possible. All of that is constrained by (i) honest, reality-based fiscal sustainability that limits the scope and size of government and regulation to no more than what is needed, and (ii) genuine respect for the U.S. Constitution and the rule of law, with particular concern for limiting unwarranted legal complexity and ambiguity that create opportunities to subvert the Constitution and the law.





B&B orig: 11/18/16

Fear and anger are more powerful than hope and empathy

Hauya elegans - Mexico

Dr. Michael Shermer, publisher of Skeptic magazine, authored a short analysis piece for Scientific American magazine on the psychology of political pessimism. In his article, Shermer observes that based on several objective measures, now is the best time in human history to be alive. Despite that, many or most Americans seem to believe that we are in very bad times or even on the verge of collapse and/or civil war.

As previously noted, economist Bryan Caplan pointed to irrational pessimism as one of the biases that cause systematic (not random) irrationality in the economic realm. There is “a pessimistic bias that leads to underestimation of current economic conditions, often expressed as a nostalgia for earlier times with conditions not as good as people usually imagine they were.” Something akin to that bias seems to play out about the same way for politics.

Part of this is due to the press and media usually presenting bad news, ranging from car accidents and social mayhem to the brutality of war. That preference for bad news plays into an unconscious bias that social scientist Daniel Kahneman called the “what-you-see-is-all-there-is” bias. Simply put, when people see or hear mostly bad news, they tend to think that’s all there is.

Shermer points out three other unconscious biases at play. They are (i) “loss aversion”, which causes people to generally feel that “losses hurt twice as much as gains feel good”, (ii) the endowment effect, in which people put more value on something they own than on an equivalent thing they don’t own, and (iii) the status quo effect, in which people generally prefer “existing personal, social, economic and political arrangements over proposed alternatives.”

Those three biases are grounded in human evolution. According to Shermer: “. . . . in our evolutionary past there was an asymmetry of payoffs in which the fitness cost of overreacting to a threat was less than the fitness cost of underreacting. The world was more dangerous in our evolutionary past, so it paid to be risk-averse and highly sensitive to threats, and if things were good, then the status quo was worth maintaining.” In other words, evolution has biased humans to varying degrees to resist change.

Politicians and partisans play on our pessimism biases. They argue that “once upon a time things were bad, and now they’re good thanks to our party” or “once upon a time things were good, but now they’re bad thanks to the other party.” For better or worse, “. . . . bad information is processed more thoroughly than good. Bad impressions and bad stereotypes are quicker to form and more resistant to disconfirmation than good ones.”

Finding a way to a less biased, more positive reality is the trick.

PS: For those interested in a bit of the cognitive science: some of our unconscious biases are hard wired and acquired from evolution. Loss aversion is one example. A loss aversion curve appears in Daniel Kahneman’s book, Thinking, Fast and Slow (see figure 10). Note its asymmetry, with the slope of the response to loss in the lower left quadrant being steeper than the response to gain in the upper right quadrant. The asymmetric S shape is based on human response data to risk-reward questions. Kahneman comments on the curve: “. . . . losses loom larger than gains. This asymmetry between the power of positive and negative expectations or experiences has an evolutionary history. Organisms that treat threats as more urgent than opportunities have a better chance to survive and reproduce.” The asymmetry was one of the three characteristics of Prospect Theory that Kahneman, a psychologist, proposed as an alternative to the dominant Utility Theory in economics. He received a Nobel Prize in economics for his Prospect Theory contributions.
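For readers who want the underlying math, below is a minimal sketch of the standard prospect-theory value function. The parameter values (alpha = beta = 0.88, loss aversion lambda = 2.25) are the commonly cited Tversky and Kahneman (1992) estimates, used here only to illustrate the asymmetry; they are not taken from the book's figure.

# A minimal sketch of the prospect-theory value function described above.
# Parameter values are the commonly cited Tversky & Kahneman (1992) estimates,
# included only to illustrate the loss/gain asymmetry.

def prospect_value(x, alpha=0.88, beta=0.88, loss_aversion=2.25):
    """Subjective value of a gain (x >= 0) or loss (x < 0) of size x."""
    if x >= 0:
        return x ** alpha                    # diminishing sensitivity to gains
    return -loss_aversion * (-x) ** beta     # losses loom larger than gains

gain = prospect_value(100)    # about +57.5
loss = prospect_value(-100)   # about -129.5
print(f"value of +$100: {gain:.1f}")
print(f"value of -$100: {loss:.1f}")
print(f"loss/gain magnitude ratio: {abs(loss) / gain:.2f}")  # about 2.25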



B&B orig: 11/4/16

Motivated reasoning in politics

“We apply fight-or-flight reflexes not only to predators, but to data itself.” Chris Mooney, science journalist, referring to unconscious defense reflexes to unpleasant or disagreeable information

Motivated reasoning is an unconscious bias that’s associated with people who hold strong personal or ideological beliefs. According to one source, “motivated reasoning leads people to confirm what they already believe, while ignoring contrary data. But it also drives people to develop elaborate rationalizations to justify holding beliefs that logic and evidence have shown to be wrong. Motivated reasoning responds defensively to contrary evidence, actively discrediting such evidence or its source without logical or evidentiary justification. Clearly, motivated reasoning is emotion driven.”

fMRI Brain scans - speaking, finger tapping, listening

One observer notes that scientists have found that “one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. . . . . when we think we're reasoning, we may instead be rationalizing. We may think we're being scientists, but we're actually being lawyers. Our ‘reasoning’ is a means to a predetermined end—winning our ‘case’—and is shot through with biases.”

What’s the evidence?: It’s fair to ask if there’s any tangible evidence that unconscious motivated reasoning bias is real. There is. In 2006, researcher Drew Westen and colleagues published a paper showing brain activity in committed political partisans during the 2004 presidential election (Bush’s re-election). The paper, Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgment in the 2004 U.S. Presidential Election, Journal of Cognitive Neuroscience, vol. 18, issue 11, pages 1947-1958, looked at implicit (unconscious) brain activity in response to information that was threatening to the partisan’s own candidate, the opposing candidate or an individual who was neutral to the partisan.

The researchers summarized their results like this: “Research on political judgment and decision-making has converged with decades of research in clinical and social psychology suggesting the ubiquity of emotion-biased motivated reasoning. Motivated reasoning is a form of implicit emotion regulation in which the brain converges on judgments that minimize negative and maximize positive affect states associated with threat to or attainment of motives. . . . As predicted, motivated reasoning was not associated with neural activity in regions previously linked to cold reasoning tasks and conscious (explicit) emotion regulation.”

fMRI brain scanner (functional magnetic resonance imaging)

That was the first neuroimaging evidence for motivated reasoning, implicit emotion regulation, and psychological defense. The brain imaging data suggested that “motivated reasoning is qualitatively distinct from reasoning* when people do not have a strong emotional stake in the conclusions reached.” In other words, when partisans were presented with threatening information, they reacted emotionally and unconsciously rather than through conscious reasoning.

* Meaning conscious thought or thinking about information that was positive or neutral for the partisan’s candidate – “cold reasoning tasks” and conscious emotion control.

Despite motivated reasoning’s power to distort perceptions of reality or facts and how we apply common sense to what we think we perceive, simply knowing about its existence can help the conscious mind reduce the distortions. Of course, doing that requires a will or mind set that’s motivated to reduce unconscious distortions.

It all boils down to one’s personal mind set. One libertarian partisan became aware of the distorting influence his own strongly held political ideology had on his perceptions of facts and his common sense in thinking about the facts. He described his experience like this: “Ever since college I have been a libertarian—socially liberal and fiscally conservative. I believe in individual liberty and personal responsibility. I also believe in science as the greatest instrument ever devised for understanding the world. So what happens when these two principles are in conflict? My libertarian beliefs have not always served me well. Like most people who hold strong ideological convictions, I find that, too often, my beliefs trump the scientific facts. This is called motivated reasoning, in which our brain reasons our way to supporting what we want to be true. Knowing about the existence of motivated reasoning, however, can help us overcome it when it is at odds with evidence.”

Questions: Does being a responsible citizen come with a moral obligation to be aware of human biases and their tendency to distort reality and common sense, so that one can try to reduce the distortion? Or, because facing unbiased reality is psychologically uncomfortable and threatening to personal self-image and self-esteem, is there no such obligation, leaving citizens free to operate on the false belief that they do not in fact bias facts or common sense?

B&B orig: 10/27/16

The biology of subjective facts

In politics, finding objective facts is usually hard. In addition to the ocean of spin and lies we are inundated with, another barrier is that we are subjective creatures from a biological point of view. The power of our inherently subjective minds cannot be overstated. We are humans. Humans are inherently subjective, and our minds evolved to work that way. The laws of the universe require that our minds work that way. It is both biologically and mathematically impossible for humans to operate in politics on the basis of pure conscious reason and objective fact or truth.

Given human biology and the laws of the universe, we have to operate on the basis of mental rules or shortcuts that our minds actually can work with. Those rules simplify and distort reality, including facts, and conscious reason. None of that is a criticism of anyone, any group or the human species. Those are objective fact statements based on human biology and the laws that govern the universe.

Because of those truths, finding objective fact in politics isn't nearly as easy as one might envision. The human mind operates mostly on the basis of unconscious thinking. That thought mode is heavily influenced by (i) biases all of us got from evolution* (nature), and (ii) biases from personal morals we grew into or learned (nurture). In seeing and hearing the world, we first register input unconsciously; that input is filtered through our unconscious biases; only after that filtering do we become consciously aware of maybe 0.001% of what our unconscious minds registered.

* For example, humans do not think in terms of statistics. That kind of conscious thinking has to be learned. Humans did not need to think statistically to survive; if we had, we would either not exist or we would already think statistically by instinct. Our innate failure to properly account for numbers explains, for example, why most Americans grossly overestimate the danger of personal harm or attack from terrorists on American soil. That's just one bias we got from evolution, but the distortions of reality that it generates can be overcome to some extent by learning and conscious effort. Other evolutionary biases can be harder to partly or mostly counteract. I'm not sure any evolutionary bias can be fully overcome.
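To make concrete what 'thinking in terms of statistics' means in practice, here is a minimal sketch of the kind of base-rate arithmetic our intuition skips. All of the counts and the population figure are illustrative placeholders, not real statistics; the point is only the mechanics of comparing annual risks.

# A minimal sketch of base-rate arithmetic -- the deliberate, learned kind of
# thinking that intuition skips. All numbers are illustrative placeholders,
# not real statistics.

def annual_risk(deaths_per_year, population):
    """Rough per-person annual probability of dying from a given cause."""
    return deaths_per_year / population

population = 330_000_000  # placeholder population on a U.S. scale

# Hypothetical annual death counts for a vivid-but-rare cause and a
# mundane-but-common cause (placeholders only).
rare_vivid = annual_risk(deaths_per_year=50, population=population)
common_mundane = annual_risk(deaths_per_year=40_000, population=population)

print(f"vivid but rare cause:     about 1 in {1 / rare_vivid:,.0f} per year")
print(f"mundane but common cause: about 1 in {1 / common_mundane:,.0f} per year")
print(f"the mundane cause is roughly {common_mundane / rare_vivid:,.0f}x more likely")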

Our feeble conscious minds: The little dribble of information we do become consciously aware of has been filtered through our unconscious biases. Those biases distort both the facts or reality we think we see and hear and the common sense we apply to what we think we see and hear. Being more objective (less biased) is tricky. It requires self-awareness and a will to be more objective. Being completely unbiased is impossible. Being less biased is possible.

In other words, our conscious minds are often fooled right from the get-go. That makes finding objective facts significantly more complicated than one might think. That's why, when liberals and conservatives disagree on something, it is the norm for them to significantly or completely disagree on what the facts are. Their different unconscious biases (morals, political ideology) often lead most (>95%?) people to see things that fit their biases or fail to see things that contradict their biases. This vignette explains how that works for one political ideologue (a self-aware libertarian) who woke up to understand how his ideology had been distorting both the facts and the common sense he applied to the facts he thought he saw:

"Ever since college I have been a libertarian—socially liberal and fiscally conservative. I believe in individual liberty and personal responsibility. I also believe in science as the greatest instrument ever devised for understanding the world. So what happens when these two principles are in conflict? My libertarian beliefs have not always served me well. Like most people who hold strong ideological convictions, I find that, too often, my beliefs trump the scientific facts. This is called motivated reasoning [an unconscious bias], in which our brain reasons our way to supporting what we want to be true. . . . Take gun control. I always accepted the libertarian position of minimum regulation in the sale and use of firearms because I placed guns under the beneficial rubric of minimal restrictions on individuals. Then I read the science on guns and homicides, suicides and accidental shootings . . . . . Although the data to convince me that we need some gun-control measures were there all along, I had ignored them because they didn't fit my creed."

If one accepts the reality of how the human mind usually or always operates in a subjective, reality distorting mode, it is easy to see the basis for profound disagreements over facts between liberals, conservatives and populists in the current presidential election.

If Americans were truly interested in being less biased, their differences of opinion and perceived facts would not disappear. However, they would narrow. The problem is getting past our innate human subjectivity and the massive difficulty in changing one's personal mind set.

B&B orig: 10/13/16

Empathy, conflict and war



Context: Among other aspects of human cognitive biology, social and cognitive science is intensely probing into the biological roots of conflict and war within societies and between nations. Given the disturbing human propensity for sectarian conflict and war in the nuclear bomb age, that is arguably one of the most important topics that science can explore.

After carefully listening to the Clinton-Trump debate last night and people's reactions to it, it now seems undeniable that relative to recent history, American politics is on a new and very dangerous path. Two major factors that underpin America's new direction are Donald Trump's caustic personality and public discontent, fear, anger and distrust. Given the biology of human cognition, that combination is toxic.

The science of empathy: This discussion is an attempt to describe some of the human cognitive biology that is driving a significant portion of the American mind set into treacherous territory. The following is based on a February 2016 interview with the cognitive scientist Emile Bruneau, an empathy researcher at MIT, and other sources, including the book The Righteous Mind: Why Good People Are Divided by Politics and Religion, by social psychologist Jonathan Haidt.

Bruneau observes that humans have biases that we may not always be willing or able to admit to. A large portion of our brain operates implicitly (unconsciously), and we don't have conscious control over what happens there, including our biases or prejudices. This aspect of how our brain works allows humans to respond to the world and guide behavior without our knowledge or ability to control the process.

A decrease in empathy often arises when people in a group or society encounter opinions or arguments that run counter to the group's beliefs. Even well-reasoned counter-opinions and objective facts are not persuasive for most people faced with contrary logic or fact. That isn't surprising. Human biases operate to inhibit people from reasoning objectively. Instead, we normally apply subjective reasoning to the world we think we see and the facts we believe are true. This is routine in politics.

In disagreements, e.g., liberal vs. conservative vs. populist, people in each group generally are uncritical in accepting arguments and interpretations of events that favor their opinions, while critically examining or rejecting opposing interpretations and arguments. These biases are endemic and part of human biology. It isn't inevitable that biases always dominate, but our brains are potentiated or sensitized to think and act in accord with personal biases.

Overcoming those biases to some extent is difficult, and doing it requires a will to do so and significant cognitive effort. It's hard work but, for better or worse, humans are usually lazy and easily distracted. Some people can overcome their group's prejudices, but what drives that is not understood and is now under study.

The second Clinton-Trump debate: By his explicit language and on-stage demeanor, Trump has divided people into groups. He and his group relentlessly attack the opposition. Clinton is now responding in kind. Public reactions to the debate make it clear that the two sides profoundly detest and distrust each other. That drains empathy and dehumanizes the opposition. Dehumanizing the opposition makes the door to sectarian conflict easier to open. In terms of international relations, Trump's fury-driven attitude and words open the door to international conflict, which can lead to war.

B&B orig: 10/10/16

The rationally irrational citizen

In his book, The Myth of the Rational Voter: Why Democracies Choose Bad Policies, economist Bryan Caplan posits that most people operate in economic and other areas largely on the basis of rational irrationality. Caplan observes that although the private cost of an irrational personal action can be negligible, the social cost can be very high, and vice versa. As in economic models for most any product, the forces at play in shaping personal rational irrationality include preferences (personal demand) for an irrational behavior and the price the irrational actor pays for acting irrationally.

The implication is that as the personal price for an irrational act increases, the person is increasingly incentivized to consume less irrationality, i.e., they tend to act more rationally. All other things being equal, a high personal preference for a given irrational act tends to lead to more irrational behavior. When viewed like this, there is logic in irrationality, hence the label rational irrationality. And, it becomes clear that rational irrationality is not quite the same as rational ignorance, where voters stop searching for truth when the cost in effort to find truth is too high. By contrast, rational irrationality posits that people actively, but mostly unconsciously, avoid the truth.

This line of thinking gets even stranger when it’s applied to politics. It is well known in social and political science that people’s beliefs and behaviors are often contradictory under varying circumstances. For example, people who assert strong protectionist beliefs about trade policy usually don’t give much weight to a product’s national origin relative to the more important factors of the product’s price and quality.

That’s an example of people responding to fluctuating incentives, which unconsciously causes consumers to change viewpoints depending on the circumstances. One can stand back and level an accusation of hypocrisy, but this kind of behavior reflects a natural working of the human mind. So, if a politically protectionist consumer has a choice of buying a pair of jeans made in China for $40.00 or an equivalent pair made in the US for $60.00, it’s not unusual for the consumer to pick the imported product. In this example, the high cost of being politically rational or ideologically consistent is $20.00 per pair of jeans. That’s enough of an incentive to increase the politically irrational act of buying the import from the buyer’s point of view (but it’s a rational choice from the buyer’s economic point of view). If the price differential was lower, say only $8.00 per pair of jeans, maybe most protectionists would opt for the US product over the import to vindicate their ideological belief.

The point is that fluctuating incentives lead to different behaviors.

Caplan goes on to point out the psychological plausibility of rational irrationality, which he asserts “appears to map an odd route to delusion” in three steps. First, a person tries to find the truth (real or imagined), second they weigh the psychological cost of rejecting truth vs. the material (real world) costs, and third, if the psychological benefits of being wrong outweigh the material costs, the person will often “purge the truth from their mind and embrace error.” That self-delusion process may sound implausible, but it’s not. The mental process is mostly tacit or unconscious.

Looked at another way, people psychologically can afford to be irrational on topics where they have little or no emotional or psychological attachment to a given choice or answer; e.g., buying the cheap jeans from China imposes no psychological cost on people who aren’t politically protectionist. However, when there is an emotional or psychological attachment to a given choice or answer, but there’s little or no material cost of error, people will tend to believe whatever makes them feel best, even if they are wrong. On the other hand, if there’s a significant material cost of error, people tend to become more objective and to weigh more critically and consciously the psychological cost of breaking “comforting illusions” against the material cost of error.
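To pin down the logic of the last two paragraphs, the sketch below models the trade-off as a toy decision rule: a person indulges the comforting belief whenever its psychological payoff exceeds the material price of acting on it. The dollar figure for the psychological payoff is hypothetical and the prices only mirror the jeans example above; this is an illustration of the idea, not Caplan’s own formalism.

# A toy model of rational irrationality: indulge a comforting belief whenever
# its psychological payoff exceeds the material price of acting on it.
# The numbers are hypothetical and only mirror the jeans example in the text.

def indulges_belief(psychological_payoff, material_cost):
    """Return True if the comforting (protectionist) choice wins out."""
    return psychological_payoff > material_cost

# Suppose vindicating a protectionist belief 'feels worth' about $12 to a
# given shopper (a hypothetical figure).
payoff = 12.00

# Scenario 1: US jeans at $60 vs. imported jeans at $40 -> loyalty costs $20.
print(indulges_belief(payoff, material_cost=60.00 - 40.00))  # False: buys the import

# Scenario 2: the price gap narrows to $8 -> the belief is now cheap to indulge.
print(indulges_belief(payoff, material_cost=8.00))           # True: buys American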

Caplan takes care to point out that rational irrationality does not mean that all political views are always senseless or in error. Instead, it casts doubt on everyone’s political beliefs. The problem with rational irrationality is that it fosters both mistaken beliefs about how the world works and support for counterproductive political policies. Unlike shoppers for consumer goods, voters do not have clear incentives to be rational. Voting is not a slight variation on shopping. However, there are major psychological incentives for voters to set objectivity aside and be irrational.

As Caplan puts it: “Political behavior seems weird because the incentives that voters face are weird.” Maybe weird political behavior isn't weird. Weird politics is normal from the point of view of human cognitive biology.

B&B orig: 10/6/16