Image: Books 5 – 9 in Louise Penny’s Three Pines series, featuring Armand Gamache.
“I don’t know. I was wrong. I’m sorry.” Lacoste recited them slowly, lifting a finger to count them off.
“I need help,” the Chief said, completing the statements. The ones he’d taught young Agent Lacoste many years ago. The ones he recited to all his new agents.
Chief Inspector Armand Gamache, of the Sûreté du Québec, knows a lot about homicide detection. Gamache is the protagonist of Louise Penny’s Three Pines series of crime novels. Over 15 novels so far, Penny has portrayed the usual assortment of crimes and their solutions, but also (unusually for the genre) Gamache’s approach to managing and mentoring the earlier-career detectives assigned to his unit. His management philosophy can be summed up as willingness to utter, whenever appropriate, the Four Statements:
- I don’t know.
- I was wrong.
- I’m sorry.
- I need help.
These work very well for Gamache in the novels. I’ve found they work pretty well in science, too.
“I don’t know” is the whole point of science. We can’t get anywhere at all without first recognizing what it is we don’t know. We’re very good at this (grant writing forces us to be so) when it comes to what I’d call science content: the things we know about how nature works*. It’s interesting, though, that we’re considerably less good at this with respect to science process: the ways we do and report science, and the ways our scientific community works. Shockingly often, we express strong opinions about something for which the answer isn’t known – or if it is known, the answer (the data) is in a science-studies literature that we don’t read. Just as a few blood-pressure-raising examples: Does one space or two after a full stop make text more readable? Will open review or double-blind review do a better job of removing biases from peer review? If we make our papers open-access, will more people read them? Does post-publication peer review work? Is Comic Sans an abomination? I’m willing to bet some cash that you have a strong opinion about at least one of those five things; I’m willing to bet even more cash that you can’t support that opinion with strong evidence.
“I don’t know” is also, I’ve discovered, very powerful in teaching. This is something I didn’t understand early in my career, when I thought my students expected me to know everything. Which they do, actually, and that’s exactly why I’m now willing – even eager – to answer a question in class with “I don’t know”. (And then, of course, we talk about how one might find out.) Saying “I don’t know” represents what science actually is, and puts responsibility for learning on students, where it should be. My upper-year students, at least, seem to respect “I don’t know” as an answer more, not less, than bluster; and they’re even motivated by it.
“I was wrong” is something that’s obviously fundamental to science; it’s also, unfortunately, problematic. Max Planck claimed that science advances one funeral at a time – that new ideas replace old ones not because people change their minds but because adherents of the old ideas retire or die. That’s both too cynical and not cynical enough. Too cynical, because of course we change our minds sometimes; but not cynical enough, because it’s becoming clear that just waiting for funerals isn’t enough to fix issues of bias in science (just as one example).
For science content, “I was wrong” is an interesting construct. It’s built right into our routine use of significance testing in statistical analysis: we posit a null hypothesis, and attempt to show that we’re wrong about it. Since we’re usually interested in the alternative, in a way we have to be wrong to be right! But it’s an open question whether this routine wrongness makes us more willing to admit being wrong about substantive hypotheses. There’s increasingly copious evidence that people p-hack a lot, and reluctance to say “I was wrong” is surely involved there. I have enormous respect for scientists who can reject or abandon a pet hypothesis, not just a null or a strawperson one. I wish there were more.** (If anyone has ever changed their mind about one vs. two spaces after a full stop, I’ll be astounded.)
“I’m sorry” is, of course, key to being human, not just to being a scientist. All of us make mistakes, and while an apology isn’t the end of one’s post-mistake obligations, it’s nearly always the right way to start. Once, when I was a newly minted Department Chair, I made a fellow named Will MacDonald (OK, that’s not his real name) furiously, indignantly, incandescently angry. I was doubly surprised: I’d asked him to do something that I thought was pretty reasonable; and Will is the kind of mellow, easygoing, Grateful-Dead-infused pull-togetherer that you’d never expect to see past one raised eyebrow. I knew I’d done something wrong (it turned out the previous Chair had promised we would never, ever ask him to do the thing I’d asked him to do). But I started with “I’m sorry”, moved on to find out what was wrong and how we could fix it, and we’re friendly collaborators to this day. All of us make mistakes; what matters is what we do next.
My story about Will is about life, not science, but “I’m sorry” has scientific lessons too. In particular, there’s a bizarre belief that as long as we’re acting in the pursuit of knowledge, any kind of behaviour is OK (and thus we never need to apologize). Think of the savaging we give papers in journal clubs (at least, the kind of bad journal clubs I used to enjoy; good journal clubs focus on finding the good in papers). Or think of the poor behaviour we see, occasionally but still too often, from anonymous peer reviewers or from senior folk asking aggressive or even harassing questions after seminars and conference talks. These folks would do better, of course, to be kind to begin with; but at a minimum they owe science an “I’m sorry” afterward. You don’t hear enough of those.
“I need help” can be a simple but powerful tool. Doing science frequently means needing to accomplish something non-scientific: hiring somebody, or negotiating the intricacies of university purchasing policies, or getting someone to repair the noisy fume hood in your lab. I’ve discovered that “I need help” is a near-irresistible*** opener on the phone with that person in HR, or Financial Services, who holds the key to what you need. It puts you and that person on the same team, and shows you recognize their expertise and value to the organization. I’ve watched people open the same conversations with peremptory demands and a clear attitude of superiority, and I can assure you that’s a very efficient way to become someone’s very lowest priority.
But “I need help”, like “I’m sorry”, has a particularly scientific context too. Once upon a time, science was pursued by individuals in secrecy: alchemists jealously guarding their data in the hopes of profiting when they finally turned base metals into gold. But for 400 years, science has grown increasingly collaborative. We invented journals (thus recruiting editors and publishers) and scientific societies (recruiting society officers). We invented peer review. We invented coauthorship (and, arguably, took it to extremes). We adopted sophisticated methods of analysis that rely on statisticians and software writers. We discovered that progress is made more rapidly, and better, by teams that find synergy in complementary expertise and that find new problems and new tools in interdisciplinarity. And the very fact that all this seems trivial makes it surprising, when you think about it, to see people rejecting help: most obviously, perhaps, insistence that we don’t need peer review because all reviewers do is make unreasonable demands. I’ll admit, I have those moments myself, usually just after I’ve read a new set of reviews; but I’ve learned to set a review aside, and it’s truly rare to find one that isn’t helpful.
And there you have the scientific wisdom of Armand Gamache: I don’t know. I was wrong. I’m sorry. I need help. Schmaltz? Maybe. But I’ve learned that schmaltz can be powerful – and that fictional homicide investigators can know a lot about science.
© Stephen Heard November 12, 2019
Tune in next week for the scientific wisdom of Detective Superintendent Alan Banks. OK, just kidding, but Peter Robinson’s Inspector Banks crime novels are superb.
*I’m not saying we’re perfect. There are, as Jeremy Fox is surely protesting furiously as he reads this, zombie ideas in science: things we think we know, despite the fact that they’re either unsupported or contradicted by evidence. But I think the very fact that overturning a supposed fact is a high-profile thing to do means that true zombies are not that common.
**Somewhere I have a half-written post about the death of one of my pet hypotheses. It’s a bit embarrassing, in the context of this post, that I’ve never finished it.
***I did say “near”. I could tell you some stories, most of which involve computer “support”.