
College & Research Libraries News

Using the ACRL performance manual: The LSU Libraries experience

By Barbara Wittkopf

Reference Librarian

Louisiana State University

and Patricia Cruse

Reference Librarian

Louisiana State University

Using Measuring Academic Library Performance makes it easier to plan, implement, and analyze user surveys.

User surveys are occasionally dismissed as being self-serving. In addition, surveys can be demanding and expensive to administer, the results difficult to interpret, and the data collected frequently go unused. The reality, however, is that surveys are often the only way a library can determine whether its services are meeting patrons’ needs. With the publication of the ACRL-commissioned manual Measuring Academic Library Performance: A Practical Approach, it is easier to plan, implement, and analyze a survey.1

Planning and implementation

In 1989 the Louisiana State University (LSU) Libraries conducted a user survey that revealed patrons were “very satisfied” with service at Middleton Library. The Reference Services Department decided to conduct a similar survey in 1990 using the procedures and forms provided in the ACRL manual. The goal was to maintain or exceed the perceived level of service reported in the previous year.

In using the manual, the following steps were addressed: (1) selecting appropriate surveys, (2) establishing a time frame, (3) adapting the manual, and (4) training the staff.

(1) Survey selection. Two surveys from the manual were selected to be conducted simultaneously: the Reference Satisfaction Survey and the Reference Transaction Survey. The Reference Satisfaction Survey (Form 14-1) was designed to qualitatively report the user’s response regarding:

• the outcome of the reference transaction (i.e., relevance, amount, and completeness of information)

• the service experience (i.e., the perceived helpfulness of the staff)

• the overall satisfaction with the transaction.

The second survey quantitatively measured the number of reference transactions occurring by hour during the survey periods, using the Reference Service Statistics form (Form 13-1). It was anticipated that data from this survey could be used to ascertain whether staffing was adequate during peak hours of service.
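To make that hourly tally concrete, here is a minimal sketch of the kind of count Form 13-1 yields. It is an illustration only, not part of the manual or the LSU procedure: the log file name, its format (one ISO timestamp per transaction), and the use of Python are all assumptions.

```python
# Minimal sketch of an hourly transaction tally in the spirit of Form 13-1.
# Assumes a hypothetical log file with one ISO timestamp per transaction,
# e.g. "1990-10-15T14:23" -- the file name and format are illustrative only.
from collections import Counter
from datetime import datetime

def transactions_by_hour(log_path):
    """Count reference transactions per hour of the day."""
    counts = Counter()
    with open(log_path) as log:
        for line in log:
            line = line.strip()
            if line:
                counts[datetime.fromisoformat(line).hour] += 1
    return counts

if __name__ == "__main__":
    hourly = transactions_by_hour("reference_log.txt")  # hypothetical file
    for hour in sorted(hourly):
        print(f"{hour:02d}:00  {hourly[hour]:3d} transactions")
```

Sorted by hour of the day, such a tally makes peak service periods, and thus staffing needs, visible at a glance.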

(2) Time frame. A decision was made to conduct a pilot survey in the fall of 1990 as a “dress rehearsal” for a second survey in the spring of 1991, prior to the 1992-94 University Accreditation Review. A representative week was chosen in the middle of each semester, avoiding midterms and holidays such as Thanksgiving, Mardi Gras, and spring break.

(3) Adapting the manual. The surveys in the manual were easily adapted. There were minor changes to the forms and a procedural change in the way the forms were distributed.

The “Reference Transaction.” Both the Reference Satisfaction Survey (Form 14-1) and the Reference Service Statistics (Form 13-1) are based on the premise that a “reference transaction” is taking place. The manual suggests that everyone who approaches the reference desk during the survey period be handed a form. Patrons who feel they have received “reference assistance” are asked to complete the form assessing the service received.

However, at LSU, instead of letting patrons decide whether they had “asked a reference question” (versus a directional question), the staff member involved in the transaction decided whether a reference transaction had occurred and was responsible for distributing the surveys to patrons receiving assistance. This guaranteed that only patrons who had actually received reference assistance were surveyed. This approach was felt to be more appropriate than letting the patron decide because the staff shared a common understanding of the term “reference transaction” as defined by the Integrated Postsecondary Education Data System (IPEDS). IPEDS defines a reference transaction as an information contact that involves the knowledge, use, recommendation, interpretation, or instruction in the use of one or more information sources by a member of the library staff.2


Distribution of forms. In the pilot survey the reference staff distributed the Reference Satisfaction Surveys (Form 14-1) to patrons at the end of the reference transactions. The manual recommends that third parties, such as students, distribute the forms in order to separate the reference transaction itself from the request to complete the survey. Although constraints in the student-worker budget for the fall semester prevented third parties from distributing the forms, the staff committed to following the manual’s procedure for the spring 1991 survey.

Therefore, during the spring 1991 survey, once a staff member had completed a reference transaction, s/he signaled non-verbally to the student worker on the other side of the desk to hand the form to the patron. The staff member then noted the transaction on the Reference Services Statistics form, which was taped to the desk.

Changes in forms. During the surveys, statistics were taken on an hourly basis. This is a change from the Reference Services Statistics form (Form 13-1) in the manual, which uses a combination of one- and two-hour time slots.

The changes to the Reference Satisfaction Survey (Form 14-1) were more numerous and were based on recommendations made in the ACRL manual and comments made at the 1990 ACRL program on performance measures. As a result of these recommendations and in keeping with other statistical surveys that have been taken at the LSU Libraries, the categories for respondents were adjusted to accommodate both LSU and non-LSU patrons. A category for elementary and secondary school students was added and “personal use” was included as a reason for using the library. The basic change to Form 14-1, however, as noted above, was to distribute this survey form only to users who had received reference assistance as determined by the staff. As a result, the line at the top of the form reading “If you were NOT asking a reference question today, please check here and stop” was omitted.

Data analysis. The ACRL manual is divided into two sections. The first section, titled “Measurement,” is a general overview of survey implementation. The second section, “The Measures,” is an in-depth description of the specific surveys. Both sections were extremely helpful in analyzing the collected data. The data for both surveys were loaded into LOTUS 1-2-3. Following the manual’s suggestions and directions, it was extremely easy to create spreadsheets and graphs of the collected data. However, because of limitations in the LOTUS program, the data could not be cross-tabulated, which would have yielded more meaningful results. For example, it might be very useful to know how non-LSU students rated Reference Services. In the future, SPSS-PC+ will be used (as suggested in the manual) in place of LOTUS in order to manipulate the data more efficiently.
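For readers curious what the missing cross-tabulation would look like, here is a minimal sketch. A modern data library (pandas) stands in for the SPSS-PC+ package the manual suggests, and the column names and sample responses are invented for illustration.

```python
# Minimal cross-tabulation sketch: satisfaction ratings broken down by patron
# category -- the analysis LOTUS 1-2-3 could not perform. Data are invented.
import pandas as pd

# Each row represents one returned Reference Satisfaction Survey form.
responses = pd.DataFrame({
    "patron_type": ["LSU undergraduate", "non-LSU student", "LSU faculty",
                    "non-LSU student", "LSU undergraduate"],
    "overall_satisfaction": [5, 3, 5, 4, 4],  # hypothetical 1 (low) to 5 (high) scale
})

# Cross-tabulate: how did each patron category rate Reference Services?
table = pd.crosstab(responses["patron_type"], responses["overall_satisfaction"])
print(table)
```

The resulting table answers exactly the kind of question raised above, such as how non-LSU students rated the service relative to other groups.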

Although the data collected at every institution are unique and should be treated as such, it would have been helpful if the manual had included a more in-depth discussion of data analysis, describing what the numbers mean and how to interpret them.

(4) Training staff. Reference desk service is provided by a team of librarians, associates, and library school graduate assistants (GAs) who share desk hours at three desks. Approximately two weeks before the pilot survey began, a memo announcing the surveys was distributed to the reference staff and the library administration. The surveys had previously been announced as an upcoming activity in the Department’s Annual Report.

The memo explained the purpose of the surveys, emphasized that “service is being evaluated, not individuals,” and described the components of each survey: the five questions regarding user satisfaction and the simultaneous recording of transactions by hour. The dates were announced and copies of the forms were distributed.

A meeting of the staff members who would be involved with the surveys was called. The procedures for both surveys were discussed in depth. This gave the staff the opportunity to ask questions about the content of the surveys and the procedures, as well as to offer comments and suggestions. The IPEDS definition was explained by way of examples. Before the spring survey, statistical graphs of the users’ responses from the fall pilot survey were shared with the staff. By making staff an integral part of the survey implementation and sharing data from the fall pilot survey, it was hoped that they would develop a vested interest in its execution and outcome.

Problems in implementation

(1) Staff burnout. Despite staff training before each survey, the number of surveys distributed and the number of transactions recorded daily decreased dramatically by the end of the week as staff participation in the process waned. It may have been helpful to keep the staff informed as the week progressed with a chart indicating the number of surveys returned each day alongside the desired response rate, as sketched below.
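A minimal sketch of such a progress chart follows. Every figure is invented, and the weekly target is hypothetical; the point is only to show how little effort the suggested mid-week feedback would require.

```python
# Minimal sketch of a mid-week progress chart: daily survey returns tracked
# against a desired weekly total. All numbers are invented for illustration.
DESIRED_RETURNS = 500  # hypothetical target for the survey week

daily_returns = {"Mon": 120, "Tue": 95, "Wed": 70, "Thu": 55, "Fri": 40}

running_total = 0
for day, returned in daily_returns.items():
    running_total += returned
    pct = 100 * running_total / DESIRED_RETURNS
    bar = "#" * (returned // 10)  # crude text bar, one mark per ten forms
    print(f"{day}: {returned:4d} returned  cumulative {pct:5.1f}% of target  {bar}")
```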

(2) Patron burnout. The user satisfaction survey requires a response each time a patron receives assistance during the week-long survey period. Frequent users of reference services commented verbally (and perhaps silently) to the reference staff that they had already completed a survey form and were not enthusiastic about completing multiple forms throughout the week. The manual emphasizes that the person distributing the forms must be quite aggressive in these circumstances.

(3) Survey comprehension. In the process of data analysis, two observations were made regarding patron comprehension of the user surveys. First, the terms were occasionally perceived as generic and somewhat vague; i.e., what does “completeness of the answer that you received” signify? Second, “overall satisfaction” was often interpreted as an opportunity to comment broadly on any area of the library, not just Reference Services; for example, “You need more serials” or “Why don’t you have more terminals on the third floor?” If the LSU Libraries conduct this survey in the future, a recommendation would be to qualify this question to read: “Overall, how satisfied are you with Reference Services today?”

Survey data and accreditation

In light of dwindling resources, accreditation review teams are now assessing qualitative as well as quantitative data. Attention is being given to service outputs at an institution as well as to budgets, staff size, and volume holdings.

Reference is considered a public service area that can be assessed along with other educational and research units within the university. The Southern Association of Colleges and Schools’ (SACS’) Resource Manual on Institutional Effectiveness suggests that a statement of purpose for these units might be that “the university maintains a major commitment to public service” and that the expected results would be that “client satisfaction with service provided would be consistently high.”3

In anticipation of the 1992-1994 SACS review at LSU, reference staff were pleased that the data from both the fall and spring surveys indicated that the patrons were still “very satisfied” with the reference services they received in Middleton Library.

Results

In addition to the positive feedback from users indicating a high degree of satisfaction with the service, the quantitative data reinforced the staff’s perception that the reference desks were adequately staffed during peak hours of service. The hourly data, by day, will also be useful should it become necessary or desirable to reduce hours of service. The manual made it easy to plan, implement, and analyze two reference surveys simultaneously.

Ed. note: Measuring Academic Library Performance: A Practical Approach by Nancy Van House, Beth Weil, and Charles McClure is available as a book alone or as a book-and-software package. The software package is designed for entering and analyzing performance measures data collected in surveys as suggested in Measuring Academic Library Performance. The software, developed on a runtime version of the popular database program Paradox, is meant to be easy to use by those without extensive computer experience. The software requires: an IBM-compatible computer with at least 640K of RAM (an 80286 or faster model is desired for adequate speed); DOS 3.1 or higher; a high-density 5 1/4" disk drive; and a hard disk drive with at least 3 megabytes of free space. Although the software development was somewhat delayed, the package is completed and ready for shipment.

Measuring Academic Library Performance is available from ALA Publishing Serv., Order Dept., 50 E. Huron St., Chicago, IL 60611; phone: (800) 545-2433, press 7; or fax (312) 944-2641. Book only: ISBN 0-8389-0529-3, $29.00; book and software package: ISBN 0-8389-0542-0, $70.00. ■

Notes

  1. Nancy Van House, Beth Weil, and Charles McClure, Measuring Academic Library Performance: A Practical Approach (Chicago: ALA, 1990).
  2. Ibid., p. 96.
  3. Resource Manual on Institutional Effectiveness, 2nd ed. (Atlanta: The Commission on Colleges of the Southern Association of Colleges and Schools, 1989), pp. 28-29.
Copyright © American Library Association
