Halle is the new Data Librarian at the University of Nevada, Las Vegas University Libraries. In addition, she is certified as an instructor with The Carpentries. Her current research interests include data literacy, digital humanities, and improving the accessibility of data science and technology education.
Making mistakes (and owning up to them) can be terrifying. It is more than just admitting something went wrong; it is potentially acknowledging that you were wrong. “Everyone makes mistakes,” writes Paul Krugman in his op-ed “America’s Epidemic of Infallibility” for the New York Times, “…Nobody is perfect. The point, however, is to try to do better — which means owning up to your mistakes and learning from them.” However, “owning up to mistakes,” to failure, to confessing you were wrong is easier said than done.
Throughout our years in academia as students, we took many classes and participated in projects and studies. We learned research methods, writing, and statistical analysis. We studied great failures, oversights, misinformation, and scandals. It was impressed upon us never to put ourselves in that position, never to repeat those mistakes. These high-profile people and cases seemed a world away. This could never happen to us, we assured ourselves, confident that we would make better decisions.
While these anecdotes are important examples of extreme cases of error, they felt too far removed from us; they happened to other people and represented the worst of the worst. No, it was the smaller mistakes, sneaky ones that crept up when we were not vigilant, that we were more likely to make. But when that time inevitably came, when that error finally was made, no matter how small, what did we do? Those types of errors were never discussed. For students, budding researchers in increasingly competitive fields, even the smallest of errors might have a great impact.
As a young, early-career, tenure-track librarian, I hold a brand new position at my institution in a rising field. I feel pressure not just to engage in research but to engage in it perfectly, to prove myself in an errorless fashion. This fear (let’s call it what it is) can be stagnating, ultimately impacting research yet to be produced. This fear is normal. It is valid. But this aim for infallibility is limiting and makes confronting mistakes, which will inevitably occur, harder to face.
Enter: Fuckup Nights, “a global movement and event series that shares stories of professional failure.” While this movement has largely been focused on entrepreneurs and the business industry, the concept of sharing and learning from mistakes can be relevant to academia and the research process. With a mission to “help decision-makers make better-informed decisions,” Fuckup Nights allows people to engage directly with stories of failure, to hear them, and to learn from them. This is an action that is not just cleansing for the person admitting their mistakes but also for their listeners, who, more likely than not, can relate on some level.
Several academic institutions have begun adapting this movement for their own audiences. The University of Rochester holds “Screw Up Nights” featuring panels of faculty who speak to students about rejection and failure. This type of method is admirable. It breaks down this barrier of fear, not just for students but for faculty as well. While we are still accountable for our actions, there is a strength that stems from admitting error. If we as faculty members cannot admit our own lack of infallibility, what message does that send to the students with whom we engage regularly, the researchers we are training for their careers?
This is a hard conversation for faculty to have: not just admitting error, but doing so publicly. In the article “Intellectual humility: the importance of knowing you might be wrong” for Vox, Brian Resnick writes that “In order for us to acquire more intellectual humility, we all, even the smartest among us, need to better appreciate our cognitive blind spots.” He goes on to state:
“Even when we overcome that immense challenge and figure out our errors, we need to remember we won’t necessarily be punished for saying, “I was wrong.” And we need to be braver about saying it. We need a culture that celebrates those words.”
As a new faculty member, as someone afraid to mess up, and inspired by Fuckup Nights, I have begun trying to normalize those words. In the workshops I teach through my position at the library, I make a point of including anecdotes. Not just those news-breaking, global anecdotes, but personal ones as well. In front of those audiences, I admit to my failures, those smaller, sneaky mistakes (and, in some cases, larger ones) that I myself was never taught to handle.
I speak of a time during graduate school, struggling through a semester-long research project. Efforts stemming from this project were meant to be submitted to conferences, added to portfolios, and showcased to future employers. My difficulties finding appropriate datasets, dealing with copyright concerns, and extracting and analyzing the data were all hurdles, ones I convinced myself were just steps towards my final creation. I was wrong. I had made an error early in the process: for the resources available to me, I chose a bad research question. But, in denial, afraid of what failure would look like, I continued to push through.
In those last few weeks, before the final project was due, I gathered my courage and admitted my mistakes to the instructor. She helped me pull together the scraps of my project, worked with me to salvage what little I had left, and aided me in understanding where I went wrong and why. My presentation topic went from being on my nonexistent research to “lessons learned in the process,” a positive spin on a not-so-positive experience.
So in these workshops that I now teach, I discuss my failures. I talk about that research project or even the time I almost lost 4 months’ worth of data (don’t get me started on that one). But, in addition, I try to model the behavior that my instructor showed me: empathy, understanding, and the importance of communication (especially when things go wrong). In my workshops, I say, “I messed up.” Together, we discuss those mistakes and what could have been done better, breaking down that barrier of failure and working together to learn from each other.
To engage in research is to risk. That’s half the fun of it: acknowledging that failure is possible. Risk allows us to take chances, to branch out, to explore new and novel ideas fully. Nothing interesting was ever achieved by remaining stagnant. Mistakes happen. Whether small or large, everyone makes them. The difference is being able to acknowledge their existence, being able to say, “I was wrong.” After all, can we truly ever blossom as academics, confined by this fear of, well, fucking up?
Acknowledgments: I would like to acknowledge Sue Wainscott, Mark Lenker, and Brittani Sterling for reading my drafts of this post. In addition, a huge thank you to Melissa Bowles-Terry and Chelsea Heinbach for encouraging my enthusiasm for talking about failure.
This work is licensed under a Creative Commons Attribution 4.0 International License
The expressions of the writer do not reflect anyone’s views but their own