Metascience – the study of science and how to improve it – is not just a scientific field. It is also a movement. What's more, it is a movement fueled in part by moral force – a desire to change culture and conduct based on strong beliefs about the right way to do scientific work, and about why what many of the unconverted or uninitiated do is wrong.
Movements like this enter inherently dangerous territory. As the playwright Arthur Miller wrote in his autobiography, "Nothing is so visionary or blinding as moral outrage." There is an ever-present risk of failing to recognize when you are wrong. In addition, the rise of charismatic visionaries is more or less guaranteed, and scientists can be dazzled by charisma as much as anyone else. They can form echo chambers, develop "us and them" thinking, and close ranks in loyalty. Conflicts of interest also arise as soon as funding, projects, and organizations come into play. It is a very risky mix.
This makes questions of accountability and self-skepticism critical. But it also makes them harder. For example, the camaraderie of "us and them" thinking within a movement leads some people to view criticism of fellow travelers as inappropriate "friendly fire," and so to keep their criticisms to themselves. Others may go into defensive attack mode instead of reflecting when they are criticized – perhaps especially when the critics are not true believers in the movement.
I don't think the metascience movement is doing well enough at avoiding the risks of movements. Although many practice what they preach – there are some great examples – some clearly don't. There are several levels we need to consider: how we conduct our own research, how research is done on the issues we believe in and on our proposals for change, and how we do advocacy. I will focus more on the first of these – how we conduct our own research – and less on the others, but they all matter. Just because we believe something is theoretically good, and are idealistic about it, doesn't mean our idea will have the effects we anticipate. Even the best-laid plans aren't guaranteed to succeed – and anything powerful enough to have a positive impact can have unforeseen effects as well. And once the environment changes, or other things change, new issues will arise: each solution tends to cause new problems down the line. If we are to be sure that things really are better, we need to test our ideas and be critical of our own practices as well as those of others.
Unfortunately, we are better at talking about biases and cataloging them than at controlling them in ourselves. As critical as mitigating cognitive bias is, there is far too little consideration of how to do it, and far too little research on methods for doing it. The metascience movement tends to focus on research and analytical skills, which are of course vital. But values, integrity, and cognitive skills matter just as much, and we don't consider them enough. All the technical skill in the world can be completely undermined by unrecognized conflicts of interest.
In this context, what more do we need? We need a broad approach to the risk of bias in metascience – one that encompasses our high risk of confirmation bias and ideological thinking, as well as our intellectual and financial conflicts of interest. We need to be self-critical and to value independent evaluation of our cherished ideas and practices. We need to take our role as critical peers seriously. We need to take criticisms seriously and value them, even when they are antagonistic. And we need to get very good at admitting that we are wrong.
This article is based on remarks prepared for the panel “Bolstering accountability and self-skepticism in the metascience movement”, at Metascience 2021.
The cartoons are mine (CC BY-NC-ND license). (More cartoons at Statistically Funny.)