College & Research Libraries News

Benchmarking waiting times

By Joy Tillotson, Janice Adlington, and Cynthia Holt

Joy Tillotson is head, Information Services, at Memorial University of Newfoundland; e-mail: joyt@morgan.ucs.mun.ca. Janice Adlington is network resources librarian at Trinity College; e-mail: janice.adlington@mail.cc.trincoll.edu. Cynthia Holt is reference librarian and bibliographer at the University of Manitoba; e-mail: holt@cc.umanitoba.ca.

“The Information Desk is always busy.” We heard it from students when we were at the desk, when we conducted focus group interviews about our reference service, and when we surveyed user satisfaction with reference service. We wanted to measure some aspect of the problem and see whether the steps we were taking to deal with it were having any effect. So we decided to measure the time people had to wait for service at the desk and to compare it over time and with waiting times at other institutions. This is part of the process called benchmarking.

What is benchmarking?

Balm defines benchmarking as “the ongoing activity of comparing one’s own process, product, or service against the best known similar activity, so that challenging but attainable goals can be set and a realistic course of action implemented to efficiently become and remain best of the best in a reasonable time.”1 Marshall and Buchanan report that relatively few examples of benchmarking exist in library reference services, and they suggest comparing a service against itself over time as well as, in the more traditional benchmarking sense, against service desks at other institutions.2

Background information

The Queen Elizabeth II Library is the main library of Memorial University of Newfoundland in St. John’s, Newfoundland. The St. John’s campus has 15,000 students and 800 faculty and is served by this large main library and a smaller health sciences library. In the main library, general reference service is offered by librarians at one centrally located Information Desk. Near the desk are 12 CD-ROM stations and a public Internet terminal. Library assistants circulate periodically in the CD-ROM area to handle problems with printers and the mechanics of searching.

What did we measure?

We measured the length of time users had to wait at the Information Desk before speaking to a librarian, the length of time librarians spent with each user, and the number of users who left the desk without speaking to a librarian. We also counted the total number of users arriving at the desk, whether they spoke to a librarian or not. Because we were concerned about the amount of time spent helping people with CD-ROM questions, we tracked the time spent on those questions separately.

How did we measure it?

We observed the Information Desk for a total of 30 hours in January–March 1993 and 30 hours in February–March 1995. Each hour between 10:00 a.m. and 4:00 p.m., Monday to Friday, was observed once during each period; these were felt to be the busiest times at the Information Desk. An observer sat in an office 25 feet from the desk and used stopwatches to time all interactions with the librarians and the length of time users remained at the desk. We also kept track of the number of users who went away without speaking to a librarian, and how long they waited before leaving.
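For anyone replicating this kind of study, the sketch below shows how the metrics we report (average wait, average service time, number left unserved, and the share of waits of four minutes or less) can be computed from timed observations. The record layout and sample data are our own illustration, not the actual observation logs.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Visit:
        """One user's visit to the desk; times are seconds from the start of observation."""
        arrived: float
        service_start: Optional[float]  # None if the user left unserved
        service_end: Optional[float]    # None if the user left unserved

    def summarize(visits: List[Visit]) -> dict:
        served = [v for v in visits if v.service_start is not None]
        waits = [v.service_start - v.arrived for v in served]
        times = [v.service_end - v.service_start for v in served]
        return {
            "total_users": len(visits),
            "left_unserved": len(visits) - len(served),
            "avg_wait_s": sum(waits) / len(waits) if waits else 0.0,
            "avg_service_s": sum(times) / len(times) if times else 0.0,
            "share_waits_4min_or_less": (
                sum(w <= 240 for w in waits) / len(waits) if waits else 0.0
            ),
        }

    # Illustrative records only, not real observations.
    sample = [
        Visit(0, 30, 210),      # waited 30 s, served for 3 min
        Visit(60, 300, 480),    # waited 4 min, served for 3 min
        Visit(90, None, None),  # left without speaking to a librarian
    ]
    print(summarize(sample))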

What did we find?

The biggest difference between the two years surveyed was in the number of users who came to the desk during the study period: 749 in 1993 and 973 in 1995. Despite this 30 percent increase in users, there was little change in the other figures. The average waiting time increased slightly, from 1 minute 50 seconds in 1993 to 1 minute 56 seconds in 1995. The average time spent with a user decreased slightly, from 3 minutes 13 seconds to 2 minutes 58 seconds. The number of users who left without being served increased by 17 percent, from 70 to 82, but the proportion who left unserved decreased slightly, from 9 percent to 8 percent. The average amount of time spent on CD-ROM questions decreased from 7 minutes 30 seconds to 6 minutes 6 seconds. However, the number of CD-ROM or other computer (e.g., Internet) questions increased from 36 to 57, so the total time spent on this type of question increased.
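For readers who want to retrace the percentages quoted above, here is the arithmetic spelled out (our own worked check, using only the figures reported in the text):

    # Figures reported in the text.
    users_1993, users_1995 = 749, 973
    unserved_1993, unserved_1995 = 70, 82

    print(f"increase in users: {(users_1995 - users_1993) / users_1993:.0%}")              # 30%
    print(f"increase in unserved: {(unserved_1995 - unserved_1993) / unserved_1993:.0%}")  # 17%
    print(f"unserved share, 1993: {unserved_1993 / users_1993:.0%}")                       # 9%
    print(f"unserved share, 1995: {unserved_1995 / users_1995:.0%}")                       # 8%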

How did it happen?

How did we cope with 30 percent more users without a major increase in average waiting time or a major decrease in the average time spent with users? One thing that happened was that users filled in the less busy times: there were no longer any hours that were noticeably less busy. It has also always been the case that, besides the two librarians scheduled to be on the desk, other librarians passing by will stop and help if there is a lineup. These “extra” librarians dealt with 74 users in 1995 (they were not counted separately in 1993). In 1993, library assistants helped people with CD-ROMs from 10:00 a.m. until 3:00 p.m., fixing paper jams, adding new paper and ink, showing people how to print results, and so on. Their interactions with users are not counted in the survey. By the time of the 1995 study, their hours had been extended to 4:00 p.m. This appeared to have an effect, since in the 1993 study, 3:00–4:00 p.m. was the hour with the greatest number of people leaving without talking to a librarian; we think this was because librarians were away from the desk more, dealing with CD-ROM problems, when the library assistants were not there. Between the two studies, the total number of users arriving between 3:00 and 4:00 p.m. decreased from 180 to 160 (11 percent), and the number who left unserved decreased from 21 to 13 (38 percent).

How did it compare with other organizations?

We searched the business and library literature and found two articles that reported waiting times in a service situation. An article in Inc. investigated waiting times on the customer service phone lines of 18 software publishers.3 The authors reported 11 waiting times from a “representative sample” of five companies, including Lotus, Microsoft, and WordPerfect. Waiting times varied from 10 seconds to 27 minutes, averaging 5 minutes over the 10 calls in which the caller got an answer; in the remaining case, the caller gave up after being on “silent hold” for “eons.” Of the calls that were answered, 70 percent were answered in 4 minutes or less. Katz, Larson, and Larson measured waiting times at a branch of the Bank of Boston.4 They reported waiting times from 0 to 13 minutes, with an average of 4 minutes; 59 percent of the waits were 4 minutes or less, and no one left without being served. By comparison, our waiting times for those who were served varied from 0 to 12 minutes, with an average of about 2 minutes, and 94 percent of those waits were 4 minutes or less. The figures were almost the same for people who left without speaking to a librarian, except that their longest waiting time was 6 minutes.

What does it all mean?

We think we are doing reasonably well. Compared with the outside organizations we could find, our waiting times are shorter. Compared with our own first study, we are reaching more people, losing no greater a proportion of them, keeping them waiting about the same length of time, and giving them about the same amount of time. The addition of library assistants during the 3:00–4:00 p.m. period may have been responsible for the smaller loss of users during that hour in 1995. The 1995 results are not solely due to our efforts: our users seem to have spread themselves more evenly across the day and to have become quicker at picking up the use of CD-ROM databases. There are still problems with waiting times that we hope to address, such as people who wait a very long time for a question that requires a very short answer.

Incidentally, we were able to quantify two things we had often speculated about: time spent at the desk by “extra” librarians (i.e., not just the two who were scheduled) and the difference between the number of questions recorded on the statistics sheets and the number of users who actually come to the desk. The extra librarians contributed 10 percent of the time recorded, and 15 percent more people were helped at the desk than were recorded on the statistics sheets.

The study’s greatest value has been to give us figures that we can use to bolster our arguments for maintaining at least current levels of staff.

Notes

  1. Gerald L. Balm, Benchmarking: A Practitioner’s Guide for Becoming and Staying Best of the Best (Schaumburg, Ill.: QPMA Press, 1992), 16.
  2. Joanne G. Marshall and Holly Shipp Buchanan, “Benchmarking Reference Services: An Introduction,” Medical Reference Services Quarterly 14, no. 3 (1995):59–73.
  3. Susan Greco and Chris Caggiano, “Software Support: Please Hold for Customer Service,” Inc. 13, no. 8 (1991):96.
  4. Karen L. Katz, Blair M. Larson, and Richard C. Larson, “Prescription for the Waiting-in-Line Blues: Entertain, Enlighten, and Engage,” Sloan Management Review 32, no. 2 (1991):44–53.
