Admitting Our Shortcomings

Nicholas P. Holmes published an eye-opening article in Nature after critiquing his own previously published papers on Twitter. First of all, his humility is admirable. How many researchers criticize their own work behind closed doors, much less in front of an oftentimes merciless Twitter audience? Beyond that, though, he wrote, "Every year in June, I discover that the most self-critical scientists are final-year undergraduates. In the results section of their dissertations, they mercilessly apply the rules that we teach them. Their discussions are largely of limitations, catalogues of failure. Their conclusions can be brutal. Somewhere between graduating and beginning our careers, we researchers seem to lose this flair for self-criticism." I can confirm that my dissertation also included a long section of well-deserved limitations.

He goes on to offer a few suggestions for how to improve academic research. First, he explains, "be your own harshest critic." I think that most of us are okay with this part because nothing is really at stake yet. I would hope that most people critique their own work heavily before they publish it. Before I submit an article, I know that I tend to get a little bit obsessive about it. I do try to critique my work ahead of time to put my best foot forward.

His second suggestion gets more to the root of my point in this post. He writes, "In addition, the systems of publication, funding and employment need to nurture and reward such honesty." Imagine a system that prioritizes taking responsibility. Taking responsibility requires humility, and penalties may still accompany that. If a scientist continually publishes research that needs to be withdrawn, it is reasonable to assume that perhaps the scientist needs further scrutiny before being offered book deals. Trust needs to be earned and often regained. However, building a system that encourages taking responsibility and acknowledging shortcomings is much more beneficial than a system that incentivizes covering up failures and allowing those problems to continue indefinitely.

For any of you who have watched the sitcom Family Ties, you might remember an episode where Alex is a research assistant for an economics professor who is close to presenting his magnum opus. As Alex is running over the numbers one last time, he realizes that there is no support for the professor's conclusions. When he tells the professor about this problem, the faculty member becomes very defensive and insists that he needs to present this research and ultimately publish it because of the competitive nature of the academy. Of course, because it is a family sitcom, the professor comes clean about everything, and they all live happily ever after. However, the concerns in that episode are certainly well-founded.

The system incentivizes people to get away with whatever they can. Hypothetically, no one else would have known about the professor's faulty numbers, he would have ultimately published a paper or a book, and it would have benefited him on his performance review at the end of the year. Peer review is good, but it is not perfect. Mistakes happen. Sometimes the only person who would know about these errors is the author. All the incentives encourage the professor to publish, though. While I hope that we all have our moral compasses rightly oriented, I am not sure that we always do.

Taking responsibility is the morally right thing to do. And taking responsibility is not necessarily an admission of unethical behavior. You did the best you could at the time, and now you realize that you could have done something better. As I said above, if retractions become a habit, your reputation may suffer because people will question whether anything you publish will eventually be withdrawn. I am not saying that we need to reward terrible work or lower our expectations of quality.

Instead, I am speaking in very general terms: the entire point of academic research, the peer review process, and the academy's tradition is to continually move toward more knowledge. Some theories are correct, while others are wrong. In our current environment, sticking to your guns is beneficial, while humility is detrimental. Rather than shame people who have moved closer to more knowledge, we ought to extend more grace than we might naturally prefer when they take responsibility for their errors. Unethical behavior is still unacceptable, but honest mistakes need to be seen as building blocks rather than lead weights. Just as entire fields continue to grow and gain more knowledge, individual researchers do as well. The system ought to incentivize that progress.
