International Insights
Academic library assessment
Barriers and enablers for global development and implementation
© Martha Kyrillidou
Academic library assessment has grown as a field over the last 20 years. Increased competition for scarce resources and rapid technological change have put pressure on academic libraries, as they have on their parent institutions. In the midst of all these pressures and transformations lies a strong desire to be user-focused and responsive to the changing needs of faculty and students.
Academic library assessment, as it is called in the United States, or performance measurement, as it is typically called in the United Kingdom, has flourished as an area of research that informs practice, or as an area where practice engages systematic research methods for continuous improvement. Academic library assessment, therefore, is tightly linked to university and higher education transformation, performance, and effectiveness across the globe. Academic libraries advance the disciplines and also serve as the nexus where disciplinary perspectives come together, where an engineer can read about policies and laws and an accountant can understand the political environment.
How do we know that academic libraries are truly contributing to both economic development and informed citizenry outcomes? By focusing on the user.1
The development of library assessment is an indicator of how robustly academic libraries focus on the quality of the services they deliver to users, a focus reflected in the user-centered approaches that have dominated in recent years with the development of User Experience and Assessment (UXA) or Assessment and User Experience (AUX) programs.
How do we know what we know about academic library assessment?
How well developed are libraries across the world? To answer this accurately, one would need to be knowledgeable about languages and cultures across the world. Like every researcher, I acknowledge certain limitations here: the analysis below carries a bias toward the English-speaking world, reflected in the resources it draws on.
OCLC offers a website on global statistics2 that is a good starting point for learning how many libraries, librarians, volumes, and registered users we have in libraries across the globe, and how much is expended on them. The site breaks down the information for academic, national, public, school, and special libraries, and the data are compiled from a variety of sources available at the country level. To place academic library statistics in the context of other developments in libraries, the table below presents select data for China, the United States, and South Africa, for academic libraries and for libraries overall. The disparities among regions of the world, as viewed from these representative countries, are vast. For China and the United States, roughly a third of the library economy is made up of the resources represented by academic libraries (see Table 1).
| Total libraries | China | United States | South Africa |
| --- | --- | --- | --- |
| Expenditures | $152,000,440 | $21,759,280,324 | $484,705,816 |
| Librarians | 58,953 | 157,685 | 2,341 |
| Libraries | 109,673 | 101,349 | 11,406 |
| Users | 15,160,109 | 231,262,659 | N/A |
| Volumes | 1,063,356,687 | 2,580,863,485 | 52,756,234 |

| Academic libraries | China | United States | South Africa |
| --- | --- | --- | --- |
| Expenditures | $55,412,959 | $7,008,113,939 | $7,212,590 |
| Librarians | 30,894 | 26,606 | 648 |
| Libraries | 3,842 | 3,793 | 66 |
| Users | 4,272,000 | 7,641,610 | N/A |
| Volumes | 447,893,493 | 1,099,951,212 | 14,411,691 |
Table 1. Select data for China, United States, and South Africa from OCLC Global Statistics
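To see where the "about a third" observation comes from, the short Python sketch below computes the academic share of total library expenditures and volumes directly from the Table 1 figures. It is only a quick arithmetic check on the published numbers, not part of the OCLC service.

```python
# Quick check of the "about a third" claim using the Table 1 figures.
totals = {
    "China":         {"expenditures": 152_000_440,    "volumes": 1_063_356_687},
    "United States": {"expenditures": 21_759_280_324, "volumes": 2_580_863_485},
    "South Africa":  {"expenditures": 484_705_816,    "volumes": 52_756_234},
}
academic = {
    "China":         {"expenditures": 55_412_959,     "volumes": 447_893_493},
    "United States": {"expenditures": 7_008_113_939,  "volumes": 1_099_951_212},
    "South Africa":  {"expenditures": 7_212_590,      "volumes": 14_411_691},
}

for country in totals:
    exp_share = academic[country]["expenditures"] / totals[country]["expenditures"]
    vol_share = academic[country]["volumes"] / totals[country]["volumes"]
    print(f"{country}: academic share of expenditures {exp_share:.0%}, of volumes {vol_share:.0%}")

# Approximate output: China 36% / 42%, United States 32% / 43%, South Africa 1% / 27%
```

For China and the United States the expenditure share lands near a third, while for South Africa academic libraries account for a far smaller share of expenditures, underscoring the disparities noted above.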
Barriers to global benchmarking: The LibQUAL+ example
We need to increase our efforts to open opportunities across the globe and counteract the forces that are creating systemically increasing inequalities, a trend not easy to shift. Any comparison of academic libraries at the global level needs to be contextualized with the socio-economic forces that shape the performance of organizations, research and development, and the well-being of the users and citizens of each country.
For one thing, wariness about participating in global benchmarking of academic library service quality may arise once a local institution perceives that it is not on a par with institutions in more advanced countries and environments. I have observed such behavior with LibQUAL+, the widely adopted international protocol for measuring library service quality.
Rooted in the tradition of the services marketing field, LibQUAL+ formally started as a grant-funded project3 in 2000. After widespread deployment in the United States over the first three years, our French-speaking Canadian colleagues and our U.K. colleagues worked with us to create French and British English versions. The protocol spread internationally, gradually adding language translations, including Hebrew, Arabic, Greek, and Swahili, among a total of more than 39 language variations. LibQUAL+ became a widely used way of measuring library service quality across the globe. However, not every library in the world that wants to deploy LibQUAL+ participates in the official service we created at the Association of Research Libraries (ARL). Many implementations have taken place on a research basis outside the ARL infrastructure, as can be seen from published articles. Furthermore, systematic local implementations and adaptations of LibQUAL+ often appear through peer-reviewed publications.
In conversations I have had with colleagues in South America and in Greece, a few reasons emerged for why such implementations are not happening in closer collaboration with the U.S.-based offering managed by ARL. One reason is that what may be viewed as a modest participation fee ($3,200) in the U.S. is still a relatively expensive proposition for academic library assessment in many other countries, such as Peru, Chile, or even Greece.
A second reason is the observation that scores in these other settings are lower, which creates hesitation about engaging at the same level as institutions that are doing better. A third reason is a desire to customize the protocol, experiment with it, and publish alternative and more locally useful versions. Though these reasons may not exhaust all the possibilities, they offer some insight into the obstacles to widespread international collaboration in academic library assessment.
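For readers less familiar with the protocol's mechanics, LibQUAL+ asks respondents to rate each service item on minimum, desired, and perceived levels, and analysis typically centers on the resulting gap scores. The sketch below is a minimal illustration of computing adequacy and superiority gaps with pandas; the item codes and sample ratings are hypothetical and do not reflect the official LibQUAL+ data files.

```python
import pandas as pd

# Hypothetical LibQUAL+-style responses: each row is one respondent's rating
# of one survey item on a 1-9 scale for minimum, desired, and perceived levels.
responses = pd.DataFrame({
    "item":      ["AS-1", "AS-1", "IC-2", "IC-2", "LP-3", "LP-3"],
    "minimum":   [6, 5, 7, 6, 5, 6],
    "desired":   [8, 8, 9, 8, 7, 8],
    "perceived": [7, 6, 6, 7, 6, 7],
})

# Adequacy gap = perceived - minimum (negative values mean service falls below
# the minimum acceptable level); superiority gap = perceived - desired.
responses["adequacy_gap"] = responses["perceived"] - responses["minimum"]
responses["superiority_gap"] = responses["perceived"] - responses["desired"]

summary = responses.groupby("item")[["adequacy_gap", "superiority_gap"]].mean()
print(summary)
```

Lower average gaps in one setting than another are exactly the kind of comparison that can make institutions hesitant to benchmark internationally, even though the gaps are most useful when read against local expectations rather than other libraries' scores.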
Enablers of global benchmarking for academic library assessment
The world of academic library assessment does come together for the exchange of ideas and perspectives through a few venues that have consistently helped inform the debate around international developments in academic library assessment over the last few years. For the U.S. environment, much insight can be gained from the proceedings of the biennial ACRL Conference, where the latest research is featured, and, for the international environment, the International Federation of Library Associations and Institutions (IFLA) conference offers some insights on library assessment.4 More specifically, on library assessment issues at the international level, we have specialized events such as the following:
- Library Assessment Conference (LAC) (biennial, started in 2006, United States)5
- International Conference on Performance Measurement in Libraries (biennial, started in 1995, United Kingdom, formerly known as Northumbria)6
- Evidence-Based Library and Information Practice Conference (EBLIP) (biennial, started in 2001, rotating among United States, United Kingdom, Australia, and Canada)7
- Qualitative and Quantitative Methods for Libraries (QQML) (annual, started in 2009, rotating among Greece, Ireland, Italy, Turkey, United Kingdom, and France, so far)8
These events have consistently produced research that is subsequently published in peer-reviewed journals in the profession. The themes of these conferences are often similar, focusing on issues of transformation, user outcomes, user experience, and institutional assessment, as well as some international comparisons. Some of these venues are more focused on methods (LAC) and others are more focused on context (QQML). Transformations related to library spaces and user experience have dominated the research trends presented. User experience (UX), with its roots in usability studies, also has its own separate conferences and publishing venues, as does digital library assessment.
The latest call for papers from EBLIP10 makes evident the current marriage of UX and assessment.9 This trend translates into assessment programs that are multifaceted and expand beyond institutional assessment. UX is also expanding beyond usability into a more holistic interpretation of the user experience, often including the physical environment and facilities aspects of a library’s operations. Thus, we are seeing academic libraries and librarians devoting more time to UX and assessment approaches and creating programs with professionals who specialize in approaches (quantitative or qualitative), tools (e.g., Tableau), and development (e.g., Python), often bundling project management services with assessment, planning, marketing, and outreach activities.
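As a small, hedged illustration of the day-to-day data work such programs take on, the Python sketch below combines two hypothetical extracts, gate counts and survey satisfaction, into a single tidy file that a visualization tool such as Tableau could consume. The file names and columns are invented for illustration only.

```python
import pandas as pd

# Hypothetical extracts an assessment/UX program might maintain.
gate_counts = pd.read_csv("gate_counts.csv")      # columns: week, branch, entries
survey = pd.read_csv("satisfaction_survey.csv")   # columns: week, branch, satisfaction

# Merge the two sources on week and branch so trends can be explored together.
merged = gate_counts.merge(survey, on=["week", "branch"], how="inner")

# Simple weekly roll-up per branch: total entries and mean satisfaction.
weekly = (
    merged.groupby(["branch", "week"], as_index=False)
          .agg(entries=("entries", "sum"), satisfaction=("satisfaction", "mean"))
)

# Export a tidy table that a dashboard tool (e.g., Tableau) can pick up.
weekly.to_csv("weekly_assessment_dashboard.csv", index=False)
```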
But is this all happening in a similar way across the globe? Is there a need for more participation from certain parts of the world? These developments are uneven, and there is an increasing need for knowledge transfer across the globe. With some libraries having multiple people with embedded assessment responsibilities or employed in UXA or AUX programs, the need to support people from less affluent regions to attend some of these events is paramount.
Among the four events highlighted, the representation of people from non-English speaking countries is more prevalent in the QQML conference, where one can interact with colleagues from countries in Africa, Asia, and Europe. The other three events tend to have a typical representation of 5% to 15% from regions outside the host countries.
What can be improved?
Academic library assessment can be improved at a personal, institutional, professional, policy, and standards level. Examples for improvements at each level include:
Exchanges/Internships/Training: The personal level
Professionals hired into the new and expanding roles of library assessment, digital development, project management, planning, marketing, and outreach are in high demand. It would be very useful if more opportunities existed for these professionals to offer targeted internships (online or in person, nationally and internationally) aimed at expanding know-how about tools and methods. Beyond the conferences, targeted training in this area is frequently offered by organizations like the National Information Standards Organization (NISO).10
Cross-library type assessments: The institutional level
UXA methods and tools are similar across types of libraries, and more dialogue is needed on how these approaches are implemented across different library types. Today’s high school student is tomorrow’s university student and next year’s faculty member. Understanding our users and their information needs in the context of their life cycle is extremely important, much as developmental psychology tries to understand human beings in the context of their life span.
Integrated library services: The professional level
Libraries have spent enormous amounts of energy developing integrated library systems but have failed to develop integrated library services. Our licensing approaches, which absorb larger and larger portions of our budgets, work against the concept of an integrated library service. Each vendor tries to create a monopoly with specialized bells and whistles, including user interface enhancements that give the product a unique and distinct identity but fail the user, who cannot search seamlessly, with a satisfactory experience, across products. Efforts like the development of FOLIO are trying to address some of these challenges, but further work is needed. Why do we keep licensing resources that perpetuate the bubble effect of privilege and exclusion at the price of a poor user experience?
Privacy: The policy level
As we deploy more systematic approaches to understanding our library users, we need to spend more time understanding the library privacy frameworks, policies, and procedures we need to have in place in order for our organizations to deploy not only robust but also ethical approaches to the development of new services and environments. We need to nurture a thoughtful understanding of how we use data in libraries to improve services and protect privacy, with nuanced approaches for different contexts.
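As one small, hedged illustration of such a nuanced approach (my example, not a prescribed framework), the sketch below pseudonymizes patron identifiers with a keyed hash before usage events reach an assessment dataset, so repeat use can be analyzed without storing raw IDs.

```python
import hashlib
import hmac

# A keyed hash lets analysts link repeat visits without ever storing the raw
# patron ID. The secret key would be managed outside the dataset (e.g., by
# campus IT) and rotated per policy; this is an illustrative sketch only.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(patron_id: str) -> str:
    """Return a stable pseudonym for a patron ID using HMAC-SHA256."""
    return hmac.new(SECRET_KEY, patron_id.encode("utf-8"), hashlib.sha256).hexdigest()

# Example: a raw circulation event is reduced to a pseudonym plus the
# minimum fields needed for assessment.
raw_event = {"patron_id": "A1234567", "branch": "Main", "action": "checkout"}
safe_event = {
    "patron": pseudonymize(raw_event["patron_id"]),
    "branch": raw_event["branch"],
    "action": raw_event["action"],
}
print(safe_event)
```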
Data dictionary: The standards level
One of the longest-standing standards for libraries and information services is the NISO Z39.7 standard entitled “Information Services and Use: Metrics and Statistics for Libraries and Information Providers Data Dictionary.” The work of this standard has given birth to multiple other standards, including COUNTER and SUSHI, and is currently undergoing another revision. It is one of a few venues where stakeholders from different types of libraries and vendors come together to develop consensus on what is what. Though much more needs to happen beyond agreeing on what is what, this standard offers an important foundation for related international work at the ISO level.
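To make the standards concrete: COUNTER defines the usage reports and SUSHI defines a machine-readable way to retrieve them. The sketch below is a minimal, hedged example of requesting a COUNTER Release 5 Title Master Report from a vendor's COUNTER_SUSHI API; the base URL and credentials are placeholders, and individual vendors may require additional parameters.

```python
import requests

# Placeholder endpoint and credentials; real values come from the vendor.
BASE_URL = "https://sushi.example-vendor.com/counter/r5"
params = {
    "customer_id": "YOUR_CUSTOMER_ID",
    "requestor_id": "YOUR_REQUESTOR_ID",
    "begin_date": "2018-01",
    "end_date": "2018-06",
}

# COUNTER_SUSHI (Release 5) exposes reports as JSON over HTTP, e.g. /reports/tr
# for the Title Master Report.
response = requests.get(f"{BASE_URL}/reports/tr", params=params, timeout=60)
response.raise_for_status()
report = response.json()

# Print the report header and the number of title-level items returned.
header = report.get("Report_Header", {})
items = report.get("Report_Items", [])
print(header.get("Report_Name"), header.get("Created"))
print(f"{len(items)} title records")
```

Harvesting usage this way depends on the shared definitions in the data dictionary and in COUNTER; without them, counts from different vendors could not be meaningfully combined.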
Conclusion
In summary, in the future we would like to see the integration of technology and library assessment solutions with a deep understanding of the psychology of the user. Such environments will allow us to address the needs of specialized populations, will be more inviting to everyone, and will allow us to bridge the widening global divides in a sustainable way.
Notes
1. Martha Kyrillidou, “From Input and Output Measures to Quality and Outcome Measures, or, from the User in the Life of the Library to the Library in the Life of the User,” Journal of Academic Librarianship 28 (1) (2002): 42-46.
2. OCLC Global Library Statistics, accessed September 23, 2018, https://www.oclc.org/en/global-library-statistics.html.
3. LibQUAL+: Measuring Library Service Quality, accessed September 23, 2018, http://www.libqual.org/home. Support for LibQUAL+ was provided over the years through the following funding agencies: Department of Education Fund for the Improvement of Post-secondary Education, National Science Foundation, and Institute of Museum and Library Services.
4. For those who plan to attend IFLA in August 2019 in Athens, Greece, we recommend visiting the Stavros Niarchos Foundation Cultural Center, where the new headquarters of the National Library of Greece are located, accessed September 23, 2018, https://www.snfcc.org/about/vision/the-national-library-of-greece/. If you’re planning to visit Thessaloniki, we recommend you visit the Municipal Public Library of Thessaloniki and the Central Library of the Aristotle University. If you’re planning to visit Veria on your way to the Vergina archeological site, we recommend you visit the award-winning Central Public Library of Veria.
5. Library Assessment Conference, accessed September 23, 2018, https://libraryassessment.org/.
6. International Conference on Performance Measurement in Libraries, accessed September 23, 2018, https://libraryperformance.org/.
7. Evidence-Based Library and Information Practice, accessed September 23, 2018, http://eblip9.org/ and https://library.usask.ca/ceblip/eblip/eblip-conferences1.php.
8. Qualitative and Quantitative Methods in Libraries, accessed September 23, 2018, http://qqml.org/.
9. Evidence-Based Library and Information Practice, accessed September 23, 2018, https://eblip10.org/CallforPapers/tabid/8101/Default.aspx.
10. NISO Training Series: Assessment Practices and Metrics for the 21st Century, accessed September 23, 2018, https://www.niso.org/events/2018/10/niso-training-series-assessment-practices-and-metrics-21st-century.