People are generally very good at detecting cause-effect relationships. This ability helps us understand the world, learn, make decisions, and predict the future. In short, it helps us adapt and survive. In fact, we are so good at spotting causal patterns that sometimes we find connections that don't really exist. As a result, we fall into the so-called causal illusion, i.e., we mistakenly believe that one event causes another, when, in fact, both are unrelated. A typical example occurs in the field of health when we assume that a pseudoscientific treatment is effective (therefore, it causes healing), despite it having no real effect.
To avoid this type of error, it is essential to develop scientific thinking, which establishes cause-and-effect relationships only when they are supported by evidence. But how can we encourage scientific thinking when dealing with causal relationships? We can consider two approaches: increasing people's motivation (for example, by offering rewards for correct answers), and providing people with adequate information on how to solve the problem.
Aranzazu Vinas and Helena Matute (researchers at the University of Deusto) and Fernando Blanco (University of Granada) wanted to answer this question and better understand the mechanisms involved in causal learning processes. The results of their research have been published in the journal Royal Society Open Science.
The study comprised three experiments conducted online. Participants were asked to imagine they were doctors and were presented with a series of records of fictitious patients, to whom they could choose to administer a treatment or not. Immediately after each decision, they saw whether the patient recovered. At the end, participants judged to what extent they believed the treatment was effective. Importantly, the treatment was not actually effective: patients recovered at the same rate whether they received it or not.
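This kind of null contingency can be illustrated with the ΔP index, a standard contingency measure in causal learning research: the difference between the probability of recovery with the treatment and without it. The simulation below is a minimal sketch, assuming an illustrative recovery rate and trial count that are not taken from the paper.

```python
import random

random.seed(42)
P_RECOVERY = 0.75  # assumed recovery rate, identical with and without treatment

def simulate_trials(n=1000):
    """Simulate n fictitious patient records with zero real contingency."""
    records = []
    for _ in range(n):
        treated = random.random() < 0.5
        recovered = random.random() < P_RECOVERY  # independent of treatment
        records.append((treated, recovered))
    return records

def delta_p(records):
    """ΔP = P(recovery | treatment) - P(recovery | no treatment)."""
    treated = [r for t, r in records if t]
    untreated = [r for t, r in records if not t]
    return sum(treated) / len(treated) - sum(untreated) / len(untreated)

records = simulate_trials()
print(delta_p(records))  # hovers near 0: the treatment does nothing
```

Because recovery is generated independently of the decision to treat, ΔP converges to zero; a participant who judges the treatment effective is exhibiting exactly the causal illusion the study measures.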
In the first two experiments, half of the participants were offered a financial reward for answering correctly, while the other half were not. Both groups developed causal illusions to the same extent; that is, the reward did nothing to reduce the illusion.
In the third experiment, half of the participants were given a brief explanation that people tend to develop causal illusions and that, to avoid this error, it is important to consider all the available information: not only what happens when the potential cause (the treatment) is present, but also what happens when it is absent (i.e., to think scientifically). The other half received no such explanation. This simple advice significantly reduced the causal illusion, although it was not enough to eliminate it completely.
In summary, this research confirms that the causal illusion is a common error that is difficult to eliminate completely. However, it also shows that we can help people think scientifically and thereby reduce their causal illusion. To this end, instructing them on how to critically evaluate all the available information is more effective (and often cheaper) than increasing their motivation with financial rewards. Even a simple written instruction can make a difference.
Source:
Vinas, A., Blanco, F., & Matute, H. (2025). Reducing the causal illusion: A question of motivation or information? Royal Society Open Science, 12, 250082. https://doi.org/10.1098/rsos.250082