The 2018 ACRL Academic Library Trends and Statistics Annual Survey

Survey results and how to use them

Tracy Elliott is a member of the ACRL Academic Library Trends and Statistics Editorial Board and dean of the university library at San Jose State University, email: tracy.elliott@sjsu.edu

The editorial board of the ACRL Academic Library Trends and Statistics (ALTS) Annual Survey is thrilled to announce a 19.8% increase in survey participation over the past four years. Along with this increased participation comes a better understanding of what is happening in academic libraries and more impactful data for participants and researchers. Survey participants receive a complimentary link to summary data on the ACRLMetrics website. A subscription to ACRLMetrics provides access to all ALTS data from 1999 to the present. A print edition of the 2018 data is also available for purchase through the ALA Store. This article highlights some of the findings from the 2018 survey and identifies valuable ways the data from the survey can be used.

Data from the survey

At the 2019 ALA Annual Conference and Exhibition in Washington, D.C., Georgie Donovan, chair of the ALTS Editorial Board, along with ACRL Associate Director Mary Jane Petrowski and Lindsay Thompson, 2018 survey administrator, presented the 2018 survey results.1 Although the survey collects more than 60 different data points, the presenters focused on those they felt pointed to significant shifts in academic libraries. Petrowski presented numerical data on the factors that saw the greatest percentage change over the four-year period (2015-18), including staffing, expenditures, hours of operation, and gate counts. She also presented categorical data from the trends section of the survey, which in 2018 measured library contributions to student success, including four high-impact practices identified by the National Survey of Student Engagement. Survey participants have access to all of these findings and much more.


Staffing

Table 1 shows a four-year comparison of average full-time equivalent (FTE) librarians by Carnegie Classification. While community colleges (3.5%) and four-year (13.3%) institutions increased their number of professional librarians, comprehensive (-5.7%) and doctoral/research (-11.6%) institutions reduced their FTE librarians.

Table 1. FTE Librarians

Materials expenditures

Community colleges were the only Carnegie class to expend more on materials over the four-year period; all other institution types reduced their materials expenditures, with baccalaureate institutions reporting the largest decrease. See table 2 for the percentage change by institution type.

Table 2. Materials expenditures

Hours of operation

Although the survey data showed a decrease in materials expenditures over time, table 3 shows that three of the Carnegie classes increased their hours of operation. These figures are the average hours when classes are in session. Doctoral institutions were the only institution type to report a decrease in hours.

Table 3. Hours of operation

Annual gate count

Similarly, the institution types that increased their hours also experienced higher gate counts. Table 4 shows the average annual gate count by Carnegie Classification. Once again, doctoral institutions reported a decrease.

Table 4. Annual gate count


Trends section

Every year, the ACRL ALTS Editorial Board develops several questions based on a theme, with input from survey participants. This is referred to as the “trends section” of the survey. The 2018 theme was “library contributions to student success.” The survey asked libraries to indicate their participation in the high-impact practices identified by the National Survey of Student Engagement.2 The four most popular responses were 1) first-year seminars or experiences, 2) writing-intensive courses, 3) undergraduate research, and 4) a culminating senior experience.

Furthermore, the survey asked libraries to reveal metrics they use to measure their contributions to student success. The top sources for metrics included: 1) information literacy instruction, 2) reference transactions, 3) use of e-resources, 4) use of physical collections, and 5) research consultations. However, even with the level of reported participation and data collection, the overwhelming majority of survey respondents did not know if there was a correlation between library use and retention and/or graduation rates. The trends section for the 2019 survey includes questions on Open Educational Resources (OER).3

Using the survey results

One of the most pragmatic reasons for participating in the survey is that all of the federally required questions of the Integrated Postsecondary Education Data System (IPEDS) are included in the survey and can be exported to the institutional keyholder for input. Therefore, the data only have to be reported once. However, the data from the ALTS are available much sooner than the IPEDS data, and there are additional questions not captured by IPEDS that participants will find useful. All participating institutions have free access to summary results. Individual responses are published in a print edition entitled ACRL Academic Library Trends & Statistics. Users who subscribe to ACRLMetrics can access the individual responses, create reports, and access historical library IPEDS data from 2004, and National Center for Education Statistics Academic Library Survey data from 2000 to 2012. In his review of the tool, Christopher Stewart reported, “taxonomies created by ACRLMetrics allow for a layered approach to searching basic survey information as well as additional, value added data such as ratios, percentages, and other interesting metrics.”4


Benchmarking

Benchmarking is a highly recommended practice for library assessment and advocacy. In fact, the ACRL Standards for Libraries in Higher Education include 45 different recommended benchmarks for assessing library contributions to institutional effectiveness.5 Appendix 2 of the standards provides guidelines on how benchmarking and peer comparison can be used to advocate for more staffing or materials funding. Peer groups can be identified among the survey participants by geographic location, size, Carnegie Classification, and more. Oftentimes, an institution’s Office of Institutional Research has already identified the institution’s peer or aspirational groups. After identifying the peer group, participants can select specific data points for comparison, or review them all, to identify gaps and strengths. For example, a library may compare its number of FTE librarians to that of institutions similar in enrollment and classification. If the number is lower, the library can use that information to advocate for more librarians. By comparing the same number with aspirational institutions, the library can frame the hiring of more librarians as a strategy for enhancing institutional quality. Furthermore, accrediting agencies expect libraries to use benchmarking data to describe their effectiveness. The Southern Association of Colleges and Schools, for example, requires library benchmarking data related to research and learning. In that case, reporting that the library’s FTE librarian count equals or exceeds that of its peers could provide evidence of investment in the library’s contribution to overall institutional effectiveness.
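As a toy illustration of the peer-comparison step described above, the sketch below computes a library's gap against a peer-group average for a single data point, such as FTE librarians. This is not ACRLMetrics functionality; the institution names, figures, and `benchmark` function are invented for the example, and real values would come from an ACRLMetrics peer-group report.

```python
# Hypothetical sketch of a peer comparison on one data point.
# All names and figures are invented for illustration.

def benchmark(own_value, peer_values):
    """Return (peer average, gap), where gap = own value minus peer average."""
    peer_avg = sum(peer_values) / len(peer_values)
    return peer_avg, own_value - peer_avg

# Invented peer group: average FTE librarians at similarly sized institutions
peers = {"Peer A": 14.0, "Peer B": 12.0, "Peer C": 13.0}

peer_avg, gap = benchmark(10.0, list(peers.values()))
print(f"Peer average: {peer_avg:.1f} FTE librarians")
print(f"Gap vs. peers: {gap:+.1f}")  # a negative gap can support a case for more staff
```

In practice, the same comparison would be repeated across several data points (staffing, materials expenditures, hours of operation) and across both peer and aspirational groups.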

Strategic planning

The Association of Research Libraries (ARL) identifies promoting a “culture of assessment that informs evidence-based decision making” as one of its guiding principles.6 Heather Lewin and Sarah Passonneau suggested, “Library assessment can generate a road map to address changes to benefit the scholarly community.”7 By looking at the trends captured in the ALTS, whether their own data or those of the broader library community, libraries can identify strategies that will impact the effectiveness of their institutions. For example, the survey looks at how users access library resources, including through the library website and interlibrary loan. Furthermore, the trends section of the survey can be especially useful for strategic planning. Participating libraries guide the editorial board in the development of these questions, so they reflect the current roles of libraries in the research and educational mission of the institution.

In their book, Managing with Data: Using ACRLMetrics and PLAmetrics, Peter Hernon, Robert Dugan, and Joseph Matthews discuss the appropriate use of these datasets to promote library accountability and relevance. The authors provide a framework for how the ALTS data can be used for strategic planning and informed decision making.8


Research

The large datasets available through ACRLMetrics reflect years of data collection from a large number of participants. As stated at the beginning of this article, the number of participants continues to grow significantly. Researchers are therefore not limited by sample size, allowing for a multitude of statistical tests, longitudinal studies, and cross-validation models. For these reasons, the ALTS data serve as a powerful tool for researchers, and a number of studies have already been completed using these data. Jody Condit Fagan explored the impact of reference, instruction, and materials expenditures on database searches and full-text downloads.9 Holly H. Yu examined ACRL and ARL survey data to identify trends in research data services.10 Furthermore, the ability to layer IPEDS data with ALTS data allows researchers to examine library impact on student success and institutional missions. For example, Elizabeth M. Mezick identified a significant relationship between retention, materials expenditures, and the number of professional library staff.11

Participation in the survey

There is still time for libraries to participate in the 2019 survey. The survey collection period ends February 28, 2020. Please consider the value in participation, for your institution and the profession. For more information about the survey, go to https://acrl.libguides.com/stats/surveyhelp.


1. Georgie Donovan, “Update on ACRL 2018 Academic Library Trends & Statistics Survey,” PowerPoint presented at the ALA Annual Conference, Washington, D.C., June 20-25, 2019, https://acrl.libguides.com/c.php?g=576969&p=7017848.

2. “NSSE National Survey of Student Engagement: High Impact Practices,” accessed November 30, 2019, http://nsse.indiana.edu/html/high_impact_practices.cfm.

3. “2019 Survey Information,” ACRL Academic Library Trends and Statistics LibGuide, accessed November 30, 2019, https://acrl.libguides.com/stats/surveyhelp.

4. Christopher Stewart, “An Overview of ACRLMetrics, Part II: Using NCES and IPEDs Data,” The Journal of Academic Librarianship 38, no. 6 (2012): 342–45, https://doi.org/10.1016/j.acalib.2012.09.018.

5. “Standards for Libraries in Higher Education,” accessed November 30, 2019, http://www.ala.org/acrl/standards/standardslibraries.

6. “Who We Are,” Association of Research Libraries, accessed November 30, 2019, https://www.arl.org/who-we-are/.

7. Heather S. Lewin and Sarah M. Passonneau, “An Analysis of Academic Research Libraries Assessment Data: A Look at Professional Models and Benchmarking Data,” The Journal of Academic Librarianship 38, no. 2 (2012): 85–93, https://doi.org/10.1016/j.acalib.2012.01.002.

8. Peter Hernon, Robert E. Dugan, and Joseph R. Matthews, Managing with Data: Using ACRLMetrics and PLAmetrics (Chicago, IL: American Library Association, 2015).

9. Jody Condit Fagan, “The Effects of Reference, Instruction, Database Searches, and Ongoing Expenditures on Full-Text Article Requests: An Exploratory Analysis,” The Journal of Academic Librarianship 40, no. 3–4 (2014): 264–74, https://doi.org/10.1016/j.acalib.2014.04.002.

10. Holly H. Yu, “The Role of Academic Libraries in Research Data Service (RDS) Provision,” The Electronic Library 35, no. 4 (2017): 783–97, https://doi.org/10.1108/EL-10-2016-0233.

11. Elizabeth M. Mezick, “Return on Investment: Libraries and Student Retention,” The Journal of Academic Librarianship 33, no. 5 (2007): 561–66, https://doi.org/10.1016/j.acalib.2007.05.002.

Copyright Tracy Elliott
