Scholarly Communication
Preparing for the Best
Adapting Collection Assessment for an Era of Transition
© 2024 Taylor Ralph
We all know that libraries, and more specifically our collections, have adapted to significant challenges as our world changes. Now, more than three years after the onset of the COVID-19 pandemic spotlighted economic crises specific to higher education, we find ourselves navigating a new kind of aftermath. During and after the pandemic, libraries have endured budget cuts while meeting the call to support or even extend remote services, and they have had to think more creatively about how to maximize un-paywalled access to the most important resources. Solutions include prioritizing access over ownership and embracing open scholarship: greater reliance on interlibrary loan, pay-per-view, and linking services, as well as the adoption of transformative agreements and other open access (OA) resources. However, little attention has been paid to how these increasingly popular methods will both change and complicate the process of collection assessment, especially as it relates to assessing quantitative usage data. Comprehensive assessment is complicated enough, and it is more important than ever to start asking how these shifts in the scholarly communication landscape will affect these processes and the people tasked with them.
Changes to Collection Development
Though stemming from stress, current developments in the scholarly communication landscape can grow to be quite positive. One of the most exciting opportunities for libraries in this post-pandemic environment involves transforming our primary identity: shifting from “the traditional scholarly repository . . . into the scholarly communication hubs our campus communities need us to be.”1 Ithaka’s 2022 US Library Survey reflects these adjusting attitudes, noting as a key finding that library “priorities continue to shift from collections to services.”2 These findings do not negate the importance of collections in libraries, but they do indicate that monetary resources may be dedicated to areas other than research materials. The concept of “just-in-case” collection development is fading as we prioritize new endeavors and as prices for electronic resources continue to increase at a pace budgets cannot match.
One logical step libraries have been taking is to disinvest in “Big Deal” subscription packages. Especially as gold OA content becomes more prominent in these packages, libraries question the investment not only because of rising subscription prices, multiple fees, and high article processing charges (APCs), but also because the overwhelming majority of content in these packages is not used by researchers.3 In these cases, diverting monetary investment from larger subscription packages to interlibrary loan costs and intermediary services such as Get It Now, Reprints Desk, Article Galaxy Scholar, and more could result in cost savings. Additionally, by providing pay-per-view access to a larger spectrum of titles at the article level, “libraries are more likely to meet the information needs of their patrons.”4 Libraries are taking advantage of these access models at different levels, and research shows that, overall, document delivery for articles with little staff mediation is a “feasible and sustainable alternative to expensive serials subscriptions.”5
Libraries are also investing in services that link researchers to different levels of open resources, whether published in OA journals or platforms or available as preprint versions of articles. Some of these services, such as LibKey, “have been well-established in academic libraries,”6 and the APIs for these and other linking tools are continually improving. These services link out directly from a library’s discovery service, without requiring researchers to search through lists of open resources.
While green OA promotion is further developing in libraries, publishers have been increasing open output through gold OA or hybrid OA models through transformative agreements. These models, which over the past few years have been steadily gaining in popularity, rely on APCs to make content openly available.7 Currently, gold OA accounts for the highest number of research articles and journals considered open access.8 Though the sustainability and overall affordability of these models may be questioned,9 libraries continue to negotiate for open content with publishers as a way to provide access to affordable and reliable resources.
Impact on Assessment
So how does the adoption of these access models impact collection assessment work in libraries? Understandably, collection assessment practices vary widely between libraries, especially given differences in library type, size, and staffing levels. Some libraries engage in a regular assessment cycle, others assess on a project or as-needed basis, and some not at all. Assessing electronic resources (e-resources) can be complicated, as it requires some expertise in usage data collection and analysis. Most e-resources, and almost all of those provided by large publishers, can be analyzed through standardized COUNTER reporting, currently in release 5, which is delivered as quantitative usage data. COUNTER reports were created by Project COUNTER as a way to standardize e-resource usage data across publishers and vendors. In theory, these reports make it easier to compare resource usage at the title, database, platform, and item level between publishers with the specification and download of just a few select metrics.
If you have enabled your integrated library system (ILS) or an alternate subscription tool to automatically harvest these reports via a SUSHI API (a protocol that automates the retrieval of COUNTER usage data), the burden of gathering this data is even lower. However, with large subscription packages becoming less common and other services and models becoming more popular, usage data becomes more disparate and the convenience of COUNTER reports is lessened. When access to e-resources comes through large subscription packages, usage assessment may be accomplished by comparing a few large reports. But breaking up these subscriptions and relying on smaller publishers, open resources, and pay-per-view and linking services requires collecting data from multiple tools with little to no standardization, or leaves some usage data inaccessible altogether.
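To make the harvesting step concrete, here is a minimal sketch of tallying one common metric, Unique_Item_Requests, per title from a COUNTER 5 title report as delivered in SUSHI's JSON format. The journal title and counts below are invented for illustration; in practice the payload would come from an HTTP GET to a vendor's COUNTER_SUSHI `/reports/tr` endpoint, authenticated with your institution's customer ID and API key.

```python
from collections import defaultdict

def total_unique_item_requests(report):
    """Sum Unique_Item_Requests per title from a COUNTER 5 SUSHI TR payload."""
    totals = defaultdict(int)
    for item in report.get("Report_Items", []):
        title = item.get("Title", "Unknown")
        for perf in item.get("Performance", []):
            for inst in perf.get("Instance", []):
                if inst.get("Metric_Type") == "Unique_Item_Requests":
                    totals[title] += inst.get("Count", 0)
    return dict(totals)

# Invented sample shaped like a COUNTER_SUSHI /reports/tr response.
sample = {
    "Report_Header": {"Report_ID": "TR", "Release": "5"},
    "Report_Items": [
        {"Title": "Journal of Example Studies",
         "Performance": [
             {"Period": {"Begin_Date": "2024-01-01", "End_Date": "2024-01-31"},
              "Instance": [{"Metric_Type": "Unique_Item_Requests", "Count": 40},
                           {"Metric_Type": "Total_Item_Requests", "Count": 55}]}]},
    ],
}

print(total_unique_item_requests(sample))  # → {'Journal of Example Studies': 40}
```

The point of the sketch is that once data arrives in this standardized shape, per-title comparison is a few lines of code; it is precisely this uniformity that is lost when usage must instead be pulled from pay-per-view dashboards, link resolvers, and ILL systems.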
At Oregon State University, a large public institution, we regularly assess e-resource usage to determine whether a resource should be renewed, canceled, upgraded, or potentially swapped out. These reviews are done not only as a cost-saving measure but also to ensure we remain responsible stewards of our collection. Even when OA resources carry no outright subscription cost, and pay-per-view services carry a more manageable, per-use cost, it is still important to assess these resources to keep the collection relevant, and because they require a substantial amount of staff time for implementation, discovery, and maintenance. Now, in addition to consulting COUNTER statistics for e-resource usage, libraries will have to consider usage from various platforms and services that are not subscription based to make the best decisions about future collections, and of course these all come with different metrics and measures.
Quantitatively assessing OA content presents its own unique and well-established challenges. For most OA resources, especially those that are green or bronze, there is no way to gather standard usage statistics. Providers do not keep them, because there is no way to track usage through the IP addresses or API keys that most subscription services require. Depending on the platform, gold OA and hybrid usage may be obtained through COUNTER reports, but these reports are often limited to the title level rather than the item level, which makes it difficult to get an accurate measure. Though conversations about OA in libraries have been happening for years, “most academic librarians have not explicitly articulated how OA materials and services should be treated in their local settings,”10 which includes assessment. There are strides in this area, such as the new COUNTER 5.1 release prioritizing usage of OA content at the item level,11 but complexity remains for fully OA databases or journal collections.
Strategies and Generative Questions
To tackle the issue of difficult-to-obtain usage statistics, there are a few strategies for libraries to consider and even more questions we should be asking. Many of the pay-per-view, linking, and interlibrary loan platforms provide their own types of usage data: commonly the number of purchases or clicks attributed to a resource at the title level, the number of individual users, and more. Many also provide visualizations of usage trends in an attempt to demonstrate growth by month, year, and so on, and looking at these requests can sometimes hint at which titles may need to be added to the collection or identify subject areas of potential growth. However, it is important to note that depending on how your authentication methods are set up for individual users, some of the data gathered may not be granular enough to support financial decision-making. Library systems themselves could also provide a creative solution to gauging usage, especially of OA materials. Link resolver usage is one way to determine whether researchers are using the library’s discovery system to access included OA resources, and the same goes for any usage reported by the institutional repository. Of course, this cannot show how many of our users are relying on Google to find open content, but it may provide a general idea of popular research areas.
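Amalgamating these disparate signals usually starts with mapping each vendor export onto one minimal common schema. The sketch below illustrates that idea; all field names and counts are invented, since actual export formats differ by vendor, and the combined tally mixes metrics (requests, purchases, clicks) that are not directly comparable, so it should be read only as a rough signal of interest per title.

```python
from collections import Counter

def normalize(records, source, title_key, count_key):
    """Map a heterogeneous usage export onto a minimal common schema."""
    return [{"source": source,
             "title": r[title_key],
             "uses": int(r[count_key])} for r in records]

# Invented rows standing in for three different export formats.
counter_rows = [{"Title": "Journal A", "Unique_Item_Requests": "120"}]
ppv_rows = [{"article_title": "Journal A", "purchases": "7"}]
linker_rows = [{"journal": "Journal B", "clicks": "33"}]

combined = (normalize(counter_rows, "COUNTER", "Title", "Unique_Item_Requests")
            + normalize(ppv_rows, "pay-per-view", "article_title", "purchases")
            + normalize(linker_rows, "link resolver", "journal", "clicks"))

# Rough per-title tally across sources; the metrics are not equivalent,
# so this indicates interest, not a single comparable usage figure.
by_title = Counter()
for row in combined:
    by_title[row["title"]] += row["uses"]

print(dict(by_title))  # → {'Journal A': 127, 'Journal B': 33}
```

Keeping the `source` field on every normalized row preserves the ability to break the tally back out by service when a renewal decision requires it.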
Consulting multiple systems, tools, and COUNTER reports to get an idea of this anomalous e-resource usage is time-consuming and presents readily apparent challenges. Before diving in and gathering data for the sake of data, it is important that library employees take a step back and ask questions about what is truly needed to accomplish an e-resource collection assessment that is comprehensive enough to meet collections goals. Some major questions to consider include the following:
- Who is going to be responsible for gathering the data?
- Who has access to the administrative functions for these pay-per-view, linking, or interlibrary loan systems?
- Who will be responsible for writing and maintaining documentation on these processes?
- Where will we store, keep, or amalgamate this data?
- What daily work needs to be re-prioritized to make room for complicated assessment procedures?
- How will usage data influence our decision-making related to collection development?
- If usage is low, how will we decide how to move forward?
- How will data be interpreted and presented to relevant communities?
Conclusion
By preparing ourselves through asking questions and pre-planning for these changes, we can accomplish three things: ensure the health and relevance of our e-resource collections as user needs evolve, make sure we have the data necessary to make informed decisions about collections budgets, and practice care for the library staff who will need to engage in this work. Collection assessment processes will have to adapt, and our work priorities with them. Adjusting collection assessment to shifts in the scholarly communication landscape, especially those relating to open scholarship, is something that might be considered but is not often explicitly named or planned for. Making scholarship more easily accessible for our users through open or on-demand resources while finding cost savings for ourselves is an ultimate goal, but it does not come without complications. Hopefully, as we continue to make progress for our communities, we can maintain that level of progress for ourselves as well.
Notes
1. Angie L. Ohler and Joelle Pitts, “From Peril to Promise: The Academic Library Post–COVID,” College & Research Libraries News 82, no. 1 (2021): 42, https://doi.org/10.5860/crln.82.1.41.
2. Ioana G. Hulbert, “US Library Survey 2022,” Ithaka S+R, March 30, 2023, https://sr.ithaka.org/publications/us-library-survey-2022/.
3. Philippe Mongeon, Kyle Siler, Antoine Archambault, Cassidy R. Sugimoto, and Vincent Larivière, “Collection Development in the Era of Big Deals,” College & Research Libraries 82, no. 2 (2021): 220, https://doi.org/10.5860/crl.82.2.219.
4. Gail Perkins Barton, George E. Relyea, and Steven A. Knowlton, “Rethinking the Subscription Paradigm for Journals: Using Interlibrary Loan in Collection Development for Serials,” College & Research Libraries 79, no. 2 (2018), https://doi.org/10.5860/crl.79.2.279.
5. Julie A. Murphy and Chad E. Buckley, “Document Delivery as a Supplement or Replacement for Serial Subscriptions,” Serials Review 44, no. 3 (2018): 245, https://doi.org/10.1080/00987913.2018.1525238.
6. Ashley Zmau and Holly Talbott, “New Developments in Library Discovery and Access,” Library Technology Reports 58, no. 7 (2022): 34.
7. Leigh-Ann Butler, Lisa Matthias, Marc-André Simard, Philippe Mongeon, and Stefanie Haustein, “The Oligopoly’s Shift to Open Access: How For-Profit Publishers Benefit from Article Processing Charges,” Zenodo (2022): 5, https://doi.org/10.29173/cais1262.
8. Sarah Jurchen, “Open Access and the Serials Crisis: The Role of Academic Libraries,” Technical Services Quarterly 37, no. 2 (2020): 161, https://doi.org/10.1080/07317131.2020.1728136.
9. Mihoko Hosoi, “Negotiating Open Access Journal Agreements: An Academic Library Case Study,” Pennsylvania Libraries: Research & Practice 9, no. 1 (2021): 59.
10. Barbara M. Pope, “Adventures in Rightsizing: Enhancing Discovery and Research with Open Access Journals in the University Library,” Kansas Library Association College and University Libraries Section Proceedings 9, no. 1 (2019): 4.
11. Tasha Mellins-Cohen, “Community Consultation for COUNTER Release 5.1,” Project Counter Blog, July 5, 2022, https://www.projectcounter.org/community-consultation-for-counter-release-5-1/.