Causal illusions consist of believing that there is a causal relationship between events that are actually unrelated. This bias is associated with pseudoscience, stereotypes, and other unjustified beliefs. .... [The research] included a pilot study (n = 287, grades 8-9), a large-scale implementation (n = 1,668; 40 schools, grades 8-10) and a six-month follow-up (n = 353). Results showed medium-to-large and long-lasting effects on the reduction of causal illusions. To our knowledge, this is the first research showing the efficacy and long-term effects of a debiasing intervention against causal illusions that can be used on a large scale through the educational system.
The ability to infer causal relationships in our environment has conferred important survival advantages on both humans and other animals, since it allows us to anticipate changes and adjust our behavior accordingly. However, our ability to detect causal patterns is not error-free. In some cases, it has been shown that individuals can erroneously infer a cause-effect relationship between events that are actually unrelated. This cognitive bias is known as causality bias, or causal illusion.
Causal illusions often lead to suboptimal decisions and can produce undesirable outcomes that underlie many social issues. For example, they have been associated with social stereotypes, ideological extremism, epistemically unwarranted beliefs such as paranormal, superstitious, and pseudoscientific beliefs, and the use of alternative and complementary medicine, among others.
There is evidence that not only adults but also children can show causal illusions. In fact, children and adolescents might be especially vulnerable to causal illusion as they lack the basic cognitive skills and background knowledge exhibited by adults, which are important characteristics involved in causal judgment. (citations removed for clarity)
In the face of these threats to human well-being that are associated with causal illusions as well as with other cognitive biases, the design of debiasing methods represents a major goal of modern psychology.
That accords with the intent behind pragmatic rationalism as an anti-biasing, anti-ideology ideology. This research is an example of the feasibility of teaching school children defense against the dark arts of dark free speech, crackpot reasoning in this case.
As usual, this research needs to be replicated and expanded to verify the results and to see whether more effective protocols would give better results. The protocol amounted to a single 90-minute session with an initial bias-induction phase, followed by a training (debiasing) phase. The bias-induction phase showed the students that they themselves, not just other people, are vulnerable to forming causal illusions, and demonstrated the potential adverse consequences of holding false beliefs. That teaching was intended to motivate students to learn how to correct the problem in the second phase of the protocol. The researchers described key parts of the illusion-induction phase like this:
Thus, the induction phase was conducted with the aim of generating a biased judgment about the effectiveness of a target product among the participants, by using techniques that are common in advertising and pseudoscience. The target product was a metal ring (replacing the small piece of ferrite used by Barberia et al., 2013) that participants were asked to wear on their fingers. They were told that the product was made of a new material recently developed in a top research laboratory, which endowed it with special properties. It was explained to them that when the product contacted the skin, it increased the physical and cognitive capacity of the wearer. Following the strategy commonly used in pseudoscience, a hyper-technical explanation of the product was offered.
To increase their perception that the product was effective, participants were told that individuals in previous tests reported feeling that they had performed the tasks particularly well when using the ring. Second, participants engaged in two physical exercises (stability and flexibility), similar to those advertised by companies trying to show how some popular products improve sports performance, such as the Power Balance bracelet, which has been shown not to work.
The researchers described part of the training, or debiasing, phase of the 90-minute session like this:
This phase started by revealing the ineffectiveness of the ring. Then, an explanation was provided about the mistakes that were made when testing the ring, what could have been done to detect the fraud, and how to apply the experimental method and adequate control conditions when testing causal links. Specifically, we instructed participants in the importance of control conditions in order to evaluate a causal relationship. First, using the example of an alleged remedy against a common cold, they were educated on the necessity of comparing the probability of recovery from the cold when using the remedy with the probability of recovering spontaneously with no intake of the remedy. This comparison, and not simply the fact of recovery being very likely when taking the remedy, was presented as the key element to consider when evaluating its effectiveness.
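The comparison the researchers describe can be sketched as a simple contingency computation. The sketch below uses hypothetical numbers (not data from the study) to show why a high recovery rate with the remedy is not enough on its own: what matters is the difference between the two conditional probabilities.

```python
# Hypothetical 2x2 contingency table for an alleged cold remedy.
# The counts are invented for illustration only.
recovered_with_remedy = 80
not_recovered_with_remedy = 20
recovered_without_remedy = 75
not_recovered_without_remedy = 25

# P(recovery | remedy): looks impressive on its own.
p_with = recovered_with_remedy / (recovered_with_remedy + not_recovered_with_remedy)

# P(recovery | no remedy): the control condition the training emphasizes.
p_without = recovered_without_remedy / (recovered_without_remedy + not_recovered_without_remedy)

# Delta-P, a standard contingency index: near zero means the remedy
# adds almost nothing over spontaneous recovery, despite p_with being high.
delta_p = p_with - p_without

print(f"P(recovery | remedy)    = {p_with:.2f}")    # 0.80
print(f"P(recovery | no remedy) = {p_without:.2f}")  # 0.75
print(f"Delta-P                 = {delta_p:.2f}")    # 0.05
```

A judgment based only on the first number (80% of remedy takers recovered) invites the causal illusion; the control comparison reveals that nearly everyone recovers anyway.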
Just think about that. Marketers use known techniques to induce causal illusions in consumers so that consumers buy what are, in essence, products that often do not work at all. They are literally selling an illusion. I could not find published data on the annual marketing value of induced causal illusions, but it is probably worth tens of billions in sales each year. A fascinating 2023 research paper, Scarcity affects cognitive biases: The case of the illusion of causality, made the interesting observation that people experiencing monetary scarcity tended to be more resistant to forming causal illusions than people with fewer money concerns. Perplexity nicely summarized the results of that paper.
The importance of causal illusion on political ideology and extremism is not clear to me. A 2018 research paper commented:
Research on the related literature of “motivated cognition” suggests that people’s causal inferences can be either accurate or biased, depending on which outcome better fits previous beliefs, opinion, and worldview. Thus, we take this argument further and propose that the causal illusion will be developed selectively to favor those conclusions that align with previous beliefs and ideology.
That paper found that people exhibit causal illusions selectively, especially in situations that favor their existing political views. Thus, participants who identified as left-wing tended to form the illusion that a left-wing ruling party was more successful in improving city indicators than a right-wing party, while right-wing participants showed the opposite pattern. This selective causal illusion occurred despite all participants being presented with the same information. Pre-existing ideology or belief appears to influence the kind of illusion a person will tend to generate, i.e., an illusion that distorts reality to fit pre-existing ideology.