Pragmatic politics focused on the public interest for those uncomfortable with America's two-party system and its way of doing politics. Considering the interface of politics with psychology, cognitive biology, social behavior, morality and history.
Etiquette
DP Etiquette
First rule: Don't be a jackass. Most people are good.
Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, please provide some sources for the facts and truths you rely on if you are asked for that. If emotion is getting out of hand, get it back in hand. To limit dehumanizing people, don't call people or whole groups of people disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion. Insult makes people angry and defensive. All points of view are welcome, right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.
Tuesday, August 13, 2019
Pragmatic ideology: The rational politics power shifting goal
I have described some of the core logic that underpins a pragmatic 'anti-bias' political ideology based on cognitive and social science knowledge. The point is an attempt to foster cognitive and social beliefs that would tend to reduce biases and distortions in (i) perceptions of reality and facts, and (ii) the subjective, personal conscious reason we apply to the reality and facts we think we see.
The basic anti-bias concept envisions replacing the morals or principles of standard 'pro-bias' ideologies**, e.g., liberalism, conservatism, socialism, capitalism, libertarianism, populism, etc, with core morals or principles that foster a more open, less biased mind set. Evidence that such a mind set can exist and can foster more rational, less biased thinking has also been described, e.g., Philip Tetlock's finding of superforecasters and their relatively open (~anti-bias) mind sets.
** Pro-bias ideologies are a major source of unconscious confirmation bias and of the more powerful unconscious fact- and reason-distorting bias called motivated reasoning.
None of that sheds light on any purpose for an anti-bias political ideology or mind set. The science only provides a rationale for the possibility (not certainty) of reducing subjectivity in politics by reducing distortions in perceptions of reality and in application of common sense. So, the question remains: What's the purpose?
A balance of power shift to the public interest: The purpose of an anti-bias ideology is to shift, to some meaningful extent, the balance of power in America's representative democracy from where it is now toward the public interest (one conception of the public interest is described below). Specifically, social science research clearly shows that:
1. Power in the sense of dictating policy choices does not reside with voters or the will of the people -- average people or public sentiment have no statistically detectable impact on setting policy, while organized special interests (including both political parties) exert essentially all policy setting power; and
2. The most powerful tool the existing two-party status quo has at its disposal is a constitutionally protected free speech right to influence and distort average people's perceptions of reality and their conscious reason by playing to normal pro-bias cognitive and social identity traits through lying, deceiving, misinforming, irrational emotionalizing and the like.
In other words, the two-party system plays on normal human biology by deceiving people with misinformation, deceit, lies, emotional appeals and other spin tactics that are constitutionally protected free speech. The two-party system relies relentlessly on (i) distorted perceptions of reality and facts, and (ii) distorted or flawed conscious reason applied to those distorted perceptions. The hypothesis is that we are being heavily manipulated by shrewd appeals to human cognitive and social biology.
If one accepts that it is basically true that we are being played and the resulting deceit keeps the balance of power tipped in favor of special interests and both major parties at the expense of the public interest, then what can one do about it?
Logic would seem to argue that if deceit, and the distorted reality and conscious reason that flow from it, keeps power in the hands of the elites, then adopting an anti-bias mind set to partially reduce those distortions would better empower average people. In politics, unbiased information and unbiased reason are power. Thus, instead of liberals and conservatives endlessly fighting over unresolvable ideological differences, the pragmatic anti-bias mind set would focus on less distorted reality and less personally biased conscious reason in an effort to serve the public interest, not to vindicate and defend liberal or conservative political morals or principles.
In other words, people would be less distracted and less deceived by endless, unstoppable status quo deceit. In an anti-bias scenario, the focus would be more on finding the shape of reality for any given issue and then devising a roughly same-shaped policy choice to deal with the issue. As it is now, liberals see issues as liberal-shaped pegs (distorted reality) and they try to pound those pegs into liberal-shaped holes. Conservatives do the same. The problem with those pro-bias mind sets is that reality doesn't care about liberal- or conservative-shaped pegs. Reality just is what it is and it has its own reality- or human-shape.
Is it credible to argue that the two parties shrewdly use liberal and conservative ideology to distract and to build and maintain false reality and flawed reason to keep the public polarized and distrustful, while leaving elites free to exert power? Or, is it the case that only the liberal or conservative side does this, while the other side is mostly honest and rational?
Serving the public interest -- one conception: Service to the public interest means governance based on identifying a rational, optimum balance between serving public, individual and commercial interests based on an objective, fact- and logic-based analysis of competing policy choices, while (1) being reasonably transparent and responsive to public opinion, (2) protecting and growing the American economy, (3) fostering individual economic and personal growth opportunity, (4) defending personal freedoms and the American standard of living, (5) protecting national security and the environment, (6) increasing transparency, competition and efficiency in commerce when possible, and (7) fostering global peace, stability and prosperity whenever reasonably possible, all of which is constrained by (i) honest, reality-based fiscal sustainability that limits the scope and size of government and regulation to no more than what is needed and (ii) genuine respect for the U.S. Constitution and the rule of law, with a particular concern for limiting unwarranted legal complexity and ambiguity so as to limit opportunities to subvert the Constitution and the law.
B&B orig: 11/18/16
Fear and anger are more powerful than hope and empathy
Hauya elegans - Mexico
Dr. Michael Shermer, publisher of Skeptic magazine, authored a short analysis piece for Scientific American magazine on the psychology of political pessimism. In his article, Shermer observes that based on several objective measures, now is the best time in human history to be alive. Despite that, many or most Americans seem to believe that we are in very bad times or even on the verge of collapse and/or civil war.
As previously noted, economist Bryan Caplan pointed to irrational pessimism as one of the biases that cause systematic (not random) irrationality in the economic realm. There is “a pessimistic bias that leads to underestimation of current economic conditions, often expressed as a nostalgia for earlier times with conditions not as good as people usually imagine they were.” Something akin to that bias seems to play out about the same way for politics.
Part of this is due to the press and media mostly presenting bad news, ranging from car accidents and social mayhem to the brutality of war. That preference for bad news plays into an unconscious bias that psychologist Daniel Kahneman called the “what-you-see-is-all-there-is” bias. Simply put, when people see or hear mostly bad news, they tend to think that’s all there is.
Shermer points out the influence of three other unconscious biases at play. They are (i) “loss aversion”, which causes people to generally feel that “losses hurt twice as much as gains feel good”, (ii) the endowment effect, in which people put more value on something they own than what they don’t own, and (iii) the status quo effect, in which people generally prefer “existing personal, social, economic and political arrangements over proposed alternatives.”
Those three biases are grounded in human evolution. According to Shermer: “. . . . in our evolutionary past there was an asymmetry of payoffs in which the fitness cost of overreacting to a threat was less than the fitness cost of underreacting. The world was more dangerous in our evolutionary past, so it paid to be risk-averse and highly sensitive to threats, and if things were good, then the status quo was worth maintaining.” In other words, evolution has biased humans to varying degrees to resist change.
Politicians and partisans play on our pessimism biases. They argue that “once upon a time things were bad, and now they’re good thanks to our party” or “once upon a time things were good, but now they’re bad thanks to the other party.” For better or worse, “. . . . bad information is processed more thoroughly than good. Bad impressions and bad stereotypes are quicker to form and more resistant to disconfirmation than good ones.”
Finding a way to a less biased, more positive reality is the trick.
PS: For those interested in a bit of the cognitive science. Some of our unconscious biases are hard-wired, acquired from evolution. Loss aversion is one example. A loss aversion curve from Daniel Kahneman’s book, Thinking, Fast and Slow, is here (scroll down to figure 10). Note its asymmetry, with the slope of response to loss in the lower left quadrant being steeper than the response to gain in the upper right quadrant. The asymmetric S shape is based on human response data to risk-reward questions. Kahneman comments on the curve: “. . . . losses loom larger than gains. This asymmetry between the power of positive and negative expectations or experiences has an evolutionary history. Organisms that treat threats as more urgent than opportunities have a better chance to survive and reproduce.” The asymmetry was one of the three characteristics of Prospect Theory that Kahneman, a psychologist, proposed as an alternative to the dominant Utility Theory in economics. He received a Nobel Prize in economics for his Prospect Theory contributions.
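The asymmetric curve described above can be sketched numerically. The sketch below uses the prospect-theory value function with the median parameter estimates Tversky and Kahneman reported in 1992; the particular numbers are illustrative of the idea, not a claim about the specific figure in the book:

```python
# Prospect-theory value function, which produces the asymmetric S-curve:
# concave for gains, steeper and convex for losses. Parameters are the
# median estimates from Tversky & Kahneman (1992).
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain or loss x (0 = the reference point)."""
    if x >= 0:
        return x ** alpha            # diminishing sensitivity to gains
    return -lam * (-x) ** beta       # losses weighted by lambda > 1

# "Losses loom larger than gains": with these parameters, a $100 loss
# carries 2.25 times the subjective weight of a $100 gain.
print(value(100))    # subjective value of a $100 gain
print(value(-100))   # subjective value of a $100 loss (larger magnitude)
```

With alpha equal to beta, the loss/gain ratio at any stake is exactly lambda, which is one simple way to express the "twice as much" asymmetry quoted earlier.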
B&B orig: 11/4/16
Motivated reasoning in politics
“We apply fight-or-flight reflexes not only to predators, but to data itself.” Chris Mooney, science journalist referring to unconscious defense reflexes to unpleasant or disagreeable information
Motivated reasoning is an unconscious bias that’s associated with people who hold strong personal or ideological beliefs. According to one source, “motivated reasoning leads people to confirm what they already believe, while ignoring contrary data. But it also drives people to develop elaborate rationalizations to justify holding beliefs that logic and evidence have shown to be wrong. Motivated reasoning responds defensively to contrary evidence, actively discrediting such evidence or its source without logical or evidentiary justification. Clearly, motivated reasoning is emotion driven.”
One observer notes that scientists have found that “one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. . . . . when we think we're reasoning, we may instead be rationalizing. We may think we're being scientists, but we're actually being lawyers. Our ‘reasoning’ is a means to a predetermined end—winning our ‘case’—and is shot through with biases.”
What’s the evidence?: It’s fair to ask whether there’s any tangible evidence that unconscious motivated reasoning bias is real. There is. In 2006, researcher Drew Westen and colleagues published a paper showing brain activity in committed political partisans during the 2004 presidential election (Bush’s re-election). The paper, Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgment in the 2004 U.S. Presidential Election, Journal of Cognitive Neuroscience, vol. 18, issue 11, pages 1947-1958, examined partisans’ implicit (unconscious) brain activity in response to information that was threatening to their own candidate, the opposing candidate or an individual who was neutral to the partisan.
The researchers summarized their results like this: “Research on political judgment and decision-making has converged with decades of research in clinical and social psychology suggesting the ubiquity of emotion-biased motivated reasoning. Motivated reasoning is a form of implicit emotion regulation in which the brain converges on judgments that minimize negative and maximize positive affect states associated with threat to or attainment of motives. . . . As predicted, motivated reasoning was not associated with neural activity in regions previously linked to cold reasoning tasks and conscious (explicit) emotion regulation.”
That was the first neuroimaging evidence for motivated reasoning, implicit emotion regulation, and psychological defense. The brain imaging data suggested that “motivated reasoning is qualitatively distinct from reasoning* when people do not have a strong emotional stake in the conclusions reached.” In other words, when partisans were presented with threatening information, their reaction was unconscious and emotional, not conscious and reasoned.
* Meaning conscious thought or thinking about information that was positive or neutral for the partisan’s candidate – “cold reasoning tasks” and conscious emotion control.
Despite motivated reasoning’s power to distort perceptions of reality or facts and how we apply common sense to what we think we perceive, simply knowing about its existence can help the conscious mind reduce the distortions. Of course, doing that requires a will or mind set that’s motivated to reduce unconscious distortions.
It all boils down to one’s personal mind set. One libertarian partisan became aware of the distorting influence his own strongly held political ideology had on his perceptions of facts and his common sense in thinking about the facts. He described his experience like this: “Ever since college I have been a libertarian—socially liberal and fiscally conservative. I believe in individual liberty and personal responsibility. I also believe in science as the greatest instrument ever devised for understanding the world. So what happens when these two principles are in conflict? My libertarian beliefs have not always served me well. Like most people who hold strong ideological convictions, I find that, too often, my beliefs trump the scientific facts. This is called motivated reasoning, in which our brain reasons our way to supporting what we want to be true. Knowing about the existence of motivated reasoning, however, can help us overcome it when it is at odds with evidence.”
Questions: Does being a responsible citizen come with a moral obligation to be aware of human biases and their tendency to distort reality and common sense, so that one can try to reduce the distortion? Or, because facing unbiased reality is psychologically uncomfortable and threatening to personal self-image and self-esteem, are citizens free to ignore their own unconscious biases and to operate on the false belief that those biases do not distort facts or common sense?
B&B orig: 10/27/16
The biology of subjective facts
In politics, finding objective facts is usually hard. In addition to being inundated by an ocean of spin and lies, another barrier is that we are subjective creatures from a biological point of view. The power of our inherently subjective minds cannot be overstated. Humans are inherently subjective, and our minds evolved to work that way. It is both biologically and mathematically impossible for humans to operate in politics on the basis of pure conscious reason and objective fact or truth.
Given human biology and the laws of the universe, we have to operate on the basis of mental rules or shortcuts that our minds actually can work with. Those rules simplify and distort reality, including facts, and conscious reason. None of that is a criticism of anyone, any group or the human species. Those are objective fact statements based on human biology and the laws that govern the universe.
Because of those truths, finding objective fact in politics isn't nearly as easy as one might envision. The human mind operates mostly on the basis of unconscious thinking. That thought mode is heavily influenced by (i) biases all of us got from evolution* (nature), and (ii) biases from personal morals we grew into or learned (nurture). In seeing and hearing the world, we first become unconsciously aware of what we see and hear; that input is then filtered through our unconscious biases; and only after that filtering do we become consciously aware of maybe 0.001% of what our unconscious minds were aware of.
* For example, humans do not innately think in terms of statistics; that kind of conscious thinking has to be learned. If statistical thinking had been necessary for survival, we would either think statistically by instinct or not exist at all. Our innate failure to properly account for numbers explains, for example, why most Americans grossly overestimate the danger of personal harm or attack from terrorists on American soil. That's just one bias we got from evolution, but the distortions of reality it generates can be overcome to some extent by learning and conscious effort. Other evolutionary biases can be harder to counteract even partially. I'm not sure any evolutionary bias can be fully overcome.
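The statistics blind spot in the footnote above is easy to demonstrate with Bayes' rule, which untrained intuition routinely gets wrong. The numbers below are hypothetical, chosen only to show how a low base rate undermines an intuitively "accurate" signal:

```python
# Base-rate neglect: intuition reads a "99% accurate" test as meaning a
# positive result is 99% likely to be real. Bayes' rule says otherwise
# when the thing being tested for is rare.
def posterior(base_rate, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' rule."""
    true_pos = base_rate * sensitivity
    false_pos = (1 - base_rate) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# A rare condition (1 in 1,000) and a test that is right 99% of the time:
p = posterior(base_rate=0.001, sensitivity=0.99, false_positive_rate=0.01)
print(round(p, 3))  # ~0.09 -- roughly 9 in 10 positives are false alarms
```

The same arithmetic applies to rare threats like terrorism: vivid signals (news coverage) swamp the tiny base rate, so intuitive risk estimates run far above the real probability.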
Our feeble conscious minds: The little dribble of information we do become consciously aware of has been filtered through our unconscious biases. Those biases distort both the facts or reality we think we see and hear and the common sense we apply to what we think we see and hear. Being more objective (less biased) is tricky. It requires self-awareness and a will to be more objective. Being completely unbiased is impossible. Being less biased is possible.
In other words, our conscious minds are often fooled right from the get-go. That makes finding objective facts significantly more complicated than one might think. That's why, when liberals and conservatives disagree on something, it is the norm for them to significantly or completely disagree on what the facts are. Their different unconscious biases (morals, political ideology) often lead most (>95%?) people to see things that fit their biases or to miss things that contradict them. This vignette shows how that worked for one political ideologue (a self-aware libertarian) who woke up to how his ideology had been distorting both the facts he saw and the common sense he applied to them:
"Ever since college I have been a libertarian—socially liberal and fiscally conservative. I believe in individual liberty and personal responsibility. I also believe in science as the greatest instrument ever devised for understanding the world. So what happens when these two principles are in conflict? My libertarian beliefs have not always served me well. Like most people who hold strong ideological convictions, I find that, too often, my beliefs trump the scientific facts. This is called motivated reasoning [an unconscious bias], in which our brain reasons our way to supporting what we want to be true. . . . Take gun control. I always accepted the libertarian position of minimum regulation in the sale and use of firearms because I placed guns under the beneficial rubric of minimal restrictions on individuals. Then I read the science on guns and homicides, suicides and accidental shootings . . . . . Although the data to convince me that we need some gun-control measures were there all along, I had ignored them because they didn't fit my creed."
If one accepts the reality of how the human mind usually or always operates in a subjective, reality distorting mode, it is easy to see the basis for profound disagreements over facts between liberals, conservatives and populists in the current presidential election.
If Americans were truly interested in being less biased, their differences of opinion and perceived facts would not disappear. However, they would narrow. The problem is getting past our innate human subjectivity and the massive difficulty in changing one's personal mind set.
B&B orig: 10/13/16