Never trust anyone who doesn’t change their mind

One of the enormous ironies of the Covid-19 pandemic is that what should be an unquestioned triumph for science seems to have actually reduced trust in science for many. In less than a year science provided the tools to end a global pandemic, including an understanding of transmission, sophisticated models of epidemiology, and multiple safe and highly effective vaccines. You’d think that would bring folks once and for all into the science-is-great-and-I’m-thankful camp – but no.

Instead, we’ve seen lots of folks reacting to the way organizations like the US Centers for Disease Control and Prevention and the Canadian National Advisory Committee on Immunization have revised guidance as more information came in. Transmission was surfaces, then droplets, then aerosols; we should wash our groceries, but then not; blood clots caused by the AstraZeneca vaccine were vanishingly rare, then moderately rare, then (just) common enough to prefer other vaccines.* The result has been incendiary media stories posing as exposés, a torrent of quacks and science deniers seizing on every shift as if it supported their twisted version of reality, and – most sadly – what seems like a lot of ordinary people throwing up their hands and deciding that if guidance keeps changing, they shouldn’t listen to any of it.

That is very much not the right response.

Experts changing their minds isn’t a sign of weakness or confusion. It’s quite the opposite: willingness to change their mind is part of how an expert can be – or at least, should be – recognized. Willingness to change one’s mind is how we distinguish science from cults. In fact, science (as a process) is arguably nothing but an elaborate system for knowing when to change one’s mind. Show me someone who gave unchanging advice through a global pandemic of a novel disease, and I’ll show you either a charlatan or someone afraid of saying anything that matters.

But this isn’t just about the pandemic, of course. Willingness to change one’s mind is how we advance our understanding of the natural world. We test hypotheses, and when they fail, we discard them.** When we don’t do that, we’re saddled with a zombie idea – one that persists even though we’ve long since accumulated data that conflict with it (we’ll let Paul Krugman handle those in economics, and we’ll let Jeremy Fox handle those in ecology). Human knowledge has advanced so enormously precisely because we’ve been willing to learn new things and discard the mistakes we made before. (We once thought that caterpillars were spontaneously generated from dew, and had nothing to do with butterflies. We once thought burning substances released phlogiston. I could go on.)

I’ve seen suggestions that CDC and NACI and others made their own problems by poor messaging around their mind-changing. That might be partly true, but I’m not sure that any amount of better messaging could get past our society’s crush on gotcha journalism and motivated reasoning. What we really need is for the general public to understand the absolutely key point: that they should never trust anyone who doesn’t change their mind. That would sure be easier if we could silence the charlatans who deliberately distort science and seize on mind-changing as if it were a problem. No, I don’t know how to do that either.

As scientists, of course, we understand all this, and we change our minds as needed and value others who do that too. Wait, did I forget to light up my “sarcasm” sign? If you clicked on that link about ecological zombie ideas, you’ll have seen a strong argument that ecology, at least, could benefit from changing its mind a little more often. As a general rule, scientists are very smart about some things, but are regular humans who think fuzzily about lots of other things. Every field has examples.

What about me? Do I change my mind? Well, this post isn’t quite about what I thought it was about when I started writing it, so there’s a small example.*** More generally: yes, I do; but likely not as often as I should. I’m human like all of us.

So, change your mind (except about reading Scientist Sees Squirrel), and value those who change theirs. Especially if they can explain why.

© Stephen Heard August 4, 2021

Image: changing your mind © Andrew Doane via the Noun Project CC BY 4.0

*For what it’s worth, I had two doses of the AstraZeneca vaccine and was grateful – no, elated – to have that chance.

**Although this Popperian caricature is a major oversimplification of what we really do. That’s a subject for an entire course in philosophy of science.

***OK, it seems only fair to offer a bigger one. As a grad student, I argued strongly that we should study ecology in pristine ecosystems, where the processes we were studying weren’t all messed up by human disturbance. Now I understand that (1) Earth doesn’t have any “pristine” ecosystems; (2) the particular sites I chose to work at were way more impacted than I thought; and (3) there’s an equally good argument for seeing human disturbance as something that can reveal, rather than hide, process.


10 thoughts on “Never trust anyone who doesn’t change their mind”

  1. Pavel Dodonov

    Great post 🙂 Somehow changing one’s mind is too often seen as a weakness rather than an important ability to adapt to changing circumstances and additional knowledge. And somehow the ability to change one’s mind is not usually listed among the characteristics of scientists in popular media and elsewhere. Perhaps science outreach should try to focus more on how science works rather than just on the results? There are some initiatives in this regard, but still.

    And I also had one AstraZeneca dose and am eagerly awaiting the second!


  2. Ken Hughes

    I agree with the gist of the post, but there has to be a limit, right? For example, imagine a scenario in which new scientific results—that led to altered public health guidelines—came in every single day. If the people setting guidelines were to change their mind that often, it wouldn’t breed confidence that the current guideline is meaningful, and the public would be right to be skeptical.

    That, of course, is an extreme scenario. But I think it demonstrates that, above a certain frequency, changing one’s mind is no longer beneficial.


    1. ScientistSeesSquirrel Post author

      Well, here we have updating information (mind-changing) needing to coexist with the inability of (some) members of the public to understand mind-changing. So there’s some interesting game theory going on. Ideally, we’d have a public that understands, so that even new guidance daily would be received well. I do agree we don’t have that public. So the point of the post (in the fairy-tale world where everyone read my posts and learned from them!) would be to help create that public. (Of course I’m using “daily” just like you, as an extreme example for the sake of argument!)


  3. janig717

    I thought the way science moves forward might be a problem for the general public to understand, especially when it’s happening in real time. But I think the bigger problem is that scientists, at least the ones most often in the media, presented science through the lens of the US presidential election. Expressing concern over some crowd-generating activities and not others, or kids in school, for example, was inconsistent policy and clearly not based on biology. I was shocked that even high-impact journals went down this rabbit hole.


  4. Jeff Houlahan

    I don’t know what I think about this. On the one hand, I admire people who are willing to change their minds in the face of conflicting evidence. But a willingness to change your mind doesn’t mean you are trustworthy – it’s all about why you change your mind. And maybe if science was presented with more uncertainty we wouldn’t have had such a decline in public faith – six months ago anybody that believed the virus could have come from the Wuhan lab was considered a conspiracy theorist and a science-denier, and today it’s considered to be a plausible hypothesis.

    In addition, it’s not unfounded to believe that the messaging on coronavirus has at times been intended to mislead – the early messaging on masks was deliberately misleading because there was a concern there would be a ‘rush’ on masks. The claims by economists for free trade were often misleading because there was a belief that if they acknowledged the downside of free trade it would lose support. It is not crazy to accuse scientists of being overconfident and arrogant, nor is it crazy to believe that “science” has been used to mislead the public. I think there are coherent and defensible reasons for mistrust of science, and ‘they changed their mind’ doesn’t sound like a very useful heuristic for me in deciding who to trust.

    Using our “failures” as a sign of why people should pay more attention to us strikes me as odd and would, I expect, strike most people as odd. And, make no mistake, when we change our minds, it is the result of a ‘failure’. Not necessarily a failure in process – the scientific process requires failures on the way to success – but that doesn’t mean they aren’t failures. Maybe if we stopped deifying science (and I don’t think that overstates how science is treated in some contexts) and were more clear about the uncertainty in our own conclusions, it wouldn’t be so disillusioning for people when we get it wrong.


    1. ScientistSeesSquirrel Post author

      Well, Jeff, I’ll point you back to my last sentence: “especially if they can explain why”. Yes, you’re right that changing your mind *without* new information isn’t a sign of reliable expertise. I guess I thought that went without saying 🙂

      As for our process requiring failure: I get your point, but I think the wording is terrible – just asking to be misinterpreted the way “significant” effects are. It’s important for people to understand that not getting the expected result, or sometimes not rejecting the null, isn’t “failure” in the everyday sense! As you say, it’s exactly how the process is supposed to work.


      1. Jeff Houlahan

        Sure – but maybe we shouldn’t be so quick to insist on the absolute priority of science when it is a process that relies so heavily on failure. When getting it wrong many times is often what is required to get it right, maybe we should be less surprised by and less dismissive of people who aren’t willing to make all of their decisions based on science. For example, if somebody has never had the flu but has had a very bad reaction to a flu vaccination, it’s not crazy for that individual to choose not to be vaccinated. As somebody who has been twice-vaccinated, I still think they’re bucking the odds…but I don’t think they’re irrational.


  5. Pingback: Leadership lessons from the vaccination-mandate fiasco | Scientist Sees Squirrel

  6. Marco Mello

    Nice point! Learning to change your own mind and to cherish people able to change their minds is maybe one of the most difficult lessons in life. Sadly, it’s indeed one pillar of the scientific culture, which makes it very hard for laypeople to trust us. Especially in a Post-Truth Era boosted by social media. No trivial mission…


  7. dolphinwrite

    Never completely give your trust to anyone, ever. Now, having said that, if I’m in a war, I trust my buddies alongside, fighting for them as they fight for me, and I would say the same with firefighters, police, and such, also good friends and family. But the trust also understands other people’s fallibility. They make mistakes. And this is more clear in the arena of science and health. They simply don’t know everything, and we have the responsibility to think for ourselves. And if one is afraid to be responsible for wrong decisions, say goodbye to freedom.


