The Way I See It
Library Assessment
Taking a More Active Role in Telling Our Story
© 2025 Cori Biddle.
“I have always depended on the kindness of strangers.”
—Blanche DuBois, A Streetcar Named Desire, Tennessee Williams
I am not sure why this line now lives, rent-free, in my head. I am not a theater fan; I read the play once in a college English class and have probably seen clips of the classic film. Maybe it’s because, with the increased importance given to library assessment, I have a growing empathy for Blanche and the inherent helplessness in this line.
Academic libraries are by nature support entities on campus. As such, we often determine our value in relation to others: supporting faculty research, assisting student success, or partnering with other institutional departments. There is nothing inherently wrong with this model, except when we are asked to provide evidence of our value to administrative stakeholders. Often they require libraries to gather specific quantitative assessment metrics like gate counts, circulation statistics, or program attendance. While these numbers are easy to gather, and easy to present in a table, they lack the nuance to tell our complete story. In this structure libraries are dependent on others, both because the assessment process is less than favorable and because what it measures is out of our control. Though these limitations are obvious to us, libraries are often not given the opportunity to provide context and explanation for these numbers. Major administrative decisions are being made based on this incomplete data, but is there any way that libraries can take a more active role in justifying our existence?
One of the first places we have experienced this scenario is in reference and instruction. Impact was easily captured with tally marks, adding up the total number of one-shot instruction sessions or reference questions at the service desk. In years past these sorts of quantifiable metrics, easily plugged into an Excel sheet, were enough to fill an annual report and satisfy any curious administrator. However, in the post-COVID years, the cracks in this assessment model have come to light. Numbers in isolation are no longer enough, but it is unclear how we can paint a more nuanced picture and still maintain the digestibility of a table, graph, or chart. How do we add context to temper concern over declining instruction numbers? How do we explain the increasing quality of reference support, even as the quantity of interactions dwindles?
Now I see a similar situation occurring with library programming and outreach. Students do not attend workshops or in-person events because of… reasons. They may feel overworked, uncomfortable with face-to-face interactions, or embarrassed by needing help or remediation. Once we have a successful program, such as coffee bars/study breaks, student study spaces, or sensory/relaxation rooms, others on campus tend to duplicate our efforts, leading to a drop in our numbers because of oversaturation.
It’s not that the value of our services has decreased; it’s just that the need for those services has been affected by external forces. The “kindness” of strangers has reached its limits with changing student demographics, and faculty and other campus departments experience their own pressure to show results. How does the library then take control of this narrative and mitigate any negative effects on the administration’s view of the library? It can be tempting to pad our numbers: Schedule any instruction requests even if they do not involve a research project or count any interaction (even just a “hello”) as a reference interaction. But that is only a temporary solution and does nothing to solve the problem. Perhaps there are other paths that we can take, though I doubt that any of them will be easy.
We can push back on the quick and easy quantifiable data and put effort into identifying measures that capture the library’s impact more comprehensively. We are all surveyed to death: participation in comprehensive surveys, quick exit tickets, and other feedback mechanisms has become inconsistent at best. We need to explore other ways to tell our stories, such as student impact statements or focus groups. We may even attempt to tie library usage to the holy grail of student retention or graduation rates (though many examples of this so far show correlation rather than causation). Once we decide to look beyond the overused and unsatisfying, we will likely identify even more novel assessment strategies. Any alternative we offer to administration, though, will require a strong and prolonged advocacy campaign to gain acceptance from both the library profession and higher education in general.
Libraries tend to want to be “all things to all people,” but is that sustainable? We strive to demonstrate high impact in a variety of areas: academic success, student belonging, and faculty research support. While this is laudable, we no longer have the capacity to do it all. We need to think critically about where we are spending our time and resources. We can reach out to faculty and try to make connections by offering departmental programming on a trending topic (artificial intelligence, anyone?), but what do we do when no one shows up? Do we continue to offer it because all the other libraries are doing it? Or do we instead transfer that effort into our leisure reading program because that is where we had the most engagement last semester? This process requires a continual shifting of our library paradigm and a rescaling of our goals and visions.
In any case, we need to stop being so reliant on existing expectations and instead rediscover the fundamental role we want to play at our institutions. We need to build a more active view of ourselves and our work. While it might be a harder route, it will hopefully be more sustainable. To quote another movie (and mix even more metaphors), “If you build it, they will come.” But what happens when they don’t? Our impact and value need to extend beyond that measure.