College & Research Libraries News
How useful is your homepage? A quick and practical approach to evaluating a library’s Web site
A library’s Web site has become the tool most used by students and other information seekers to unearth the information riches of the Internet. As a result, the Web site has become one of the library’s most important resources, serving both as a source of information about the library and its holdings and as a gateway to the Web.
Because of its importance, we need to know more about the effectiveness of library Web sites, how they are used, what features our users like, and what is confusing to them. This article describes how the library at the University of the Sciences in Philadelphia (USP) tackled these and other issues. The information gained is now being used to revamp the library’s Web site.
The history of a Web site
When the USP library’s campus Web site was set up in 1995, the library’s homepage described the library and had links to hours, staff, explanations of services, and Web-accessible databases—at that time, maybe two or three. From there, the homepage steadily grew. In the ensuing years, databases were added to the site until there were 13 pages of them, along with collections of full-text, and explanations of what was there. The library’s Web master intended to do something different with the pages, but there always seemed to be more pressing items on her to-do list. When she was promoted to a campuswide position, it quickly became evident that someone else had to maintain the library’s site. No one else on our small staff expressed interest (in fact, they cringed at the thought), so the task fell to the library director, Mignon Adams. Since the campus Web site was also undergoing a redesign, redoing the library site seemed appropriate.
After attending several sessions on how to make Web sites more usable, and looking at the Web sites of other libraries, library staff realized that our database page was too long and cumbersome; students often did not know what “full-text” meant, let alone “bibliographic database”; and we had too many icons.
Last year the library’s Web site received 165,105 hits, second only to the “prospective student” sections of the campus Web site. But we had no idea who was visiting the site, for what reason, and whether they found what they wanted. Web usability studies indicate that one way to find out is to observe individual users as they look for specific kinds of information, but this sounded like a labor-intensive activity that would have to wait for a slow period to implement.
Richard Dougherty had given two workshops in the Philadelphia area exploring a technique called “RADAR” (Recognizing Actual Desires And Requirements), a facilitated process designed to gain input quickly from users and staff. Dougherty and Adams had been working together on another project, and she asked him if his methods could be applied to a Web site analysis.
About the authors
Mignon Adams is the director of libraries at the University of the Sciences in Philadelphia, e-mail: m.adams@usip.edu; Richard Dougherty, retired director of Libraries at the University of Michigan, is now president of Dougherty and Associates, e-mail: rmdoughe@umich.edu
Description of the process
RADAR is a tool specially designed to help a library’s staff stay in touch with its users. The objectives of the RADAR process are twofold: to generate planning information about the current and changing needs of library users and to identify constructive actions librarians can take to respond to identified users’ needs and desires.
The underlying premise of RADAR is that a library’s own staff knows user information needs and desires, and, because they work with users on a day-to-day basis, are able to place users’ needs into an overall service context.
A unique feature of RADAR is its panels of actual users. The panelists share how they obtain information, what sources they use, and why they make the choices they do. Panelists also offer suggestions to the library for improvement. With this information staff can assess what they hear, ask for clarification, and, more importantly, determine what actions should be taken in light of what they have heard.
The process is structured so that the staff/user interactions take place in an atmosphere that maximizes the willingness of staff to speak out without fear of contradiction.
Why not focus groups or a survey?
Well-designed surveys can generate useful information, particularly when the library requires quantitative data. Developing a questionnaire, however, and constructing a reliable sample of people who are actual users of a Web site are not easy tasks. Surveys take time, expertise, and money.
Carefully selected panels and well-designed questions with skillful facilitation can provide invaluable insights as well. But focus groups need people who are knowledgeable about the subject and who are opinion leaders. The most often-heard complaint about focus groups among librarians is the difficulty of finding faculty and students who are both knowledgeable and willing to participate in the sessions.
The use of panels in RADAR also avoids one of the limitations of library-oriented focus groups: cost. In the corporate world, a focus group will use well-trained, experienced facilitators and one-way mirrors with observers to identify topics that need to be probed. How often do we use professional facilitators who have enough experience in library issues to recognize when probing follow-up questions need to be asked? This is important because a librarian, even as an observer, should not be present. The RADAR approach avoids this problem of interpretation because panels interact directly with the staff, who are given an opportunity to assess what they have heard and ask probing questions.
How we created our user panels
In the USP test, faculty and students who were thought to be actual users of the library’s Web site were contacted. The director invited faculty whom she believed to be regular users of the library and its resources. She asked student workers who had the time available, and offered to pay them their student worker salaries. Even with this informal approach, the director had difficulty finding faculty and students with time to participate.
We assumed that the participants in our test panels were users of the library’s Web site, but as we found out later, about half of the panelists somewhat apologetically admitted that they were not frequent users. This proved to be an unexpected bonus because the staff was able to find out why, and, of course, these discussions led to other revelations, a sampling of which is presented in the following paragraphs.
What we heard from our customers
The informal presentations and the follow-up interactions between the panelists and the staff were revealing and produced a wealth of information. We learned:
• The library’s homepage was rarely used by faculty and students; some were unaware it was there.
• Students admitted that they were often confused by the Web site; they thought the database page was too complicated to figure out.
• Instead of using the library’s resources, students are more likely to use Yahoo or Google; they don’t use the directories of these search engines but type in their search requests in natural language.
• Jargon presents problems for some: what is a database; what do charged out, availability, and browse mean?
• Faculty are familiar with the databases that they most frequently use. However, they didn’t try out new ones, they didn’t read descriptions, and they were unaware of some useful sources.
• The layout of the Web site was criticized for having too many words and a type size that was too small.
After the panelists departed, staff were asked: “What is the meaning of what you have heard?”
The staff comments quickly validated what the students had been saying about going directly to search engines and bypassing the library and its resources. The staff weren’t at all surprised, and it took only a few minutes for them to compose a telling set of responses. Again, what follows is a sampling:
• Students don’t know how to use the resources the library is already providing.
• There is really a mismatch between what we are offering and what customers are using.
• They aren’t asking us; they are going directly to the Web.
• Ease of getting information is more important than the quality of information.
• There are lots of terms that aren’t understood.
• They are asking us to make the “stuff” simple.
The staff responds
The group generated a long list of suggestions and recommendations. Some of the ideas dealt with organization of the Web site, the presentation of information, and additional training workshops. Suggestions that received greatest priority through a voting process are listed here:
• Make the Web site more basic; the main page must be more direct—shorter and to the point.
• Don’t overwhelm students with terminology and jargon.
• Categorize resources by academic major: pharmacy, medicine, history, etc.
• Design the Web site for different audiences: only a first-time user needs a description of the library, its hours, or location.
• Give quick ways to get into the sources: the “top three” resources for psychology or biology, for example.
• Provide basic instructions.
Lessons learned
What did we learn from the RADAR experience that warrants telling others? First, providing an opportunity for staff to listen, comment, and offer recommendations has greatly enriched everyone’s understanding of what was right and wrong with the design of the library’s existing Web site. It also provided the Web master with valuable information about specific things that needed to be done and about how to achieve them.
Second, the staff now sees that everyone has a stake in making the Web site more responsive to the needs of users. We were convinced that the staff would be frank because they didn’t feel that they had a stake in the design of the current Web site. We were right about that, but didn’t fully appreciate how quickly some staff began talking in terms of “our tool” and what “we” need to do in the library to make this important tool more helpful and valuable to our users.
Third, simplicity is a virtue: less is more … less text … fewer items on the page. We really have to provide basic information for underclass students. The present Web site is, or at least appears to be, too complex for them.
Finally, our original design didn’t pay enough attention to the desires, needs, and preferences of our audiences. The fact that the homepage is so often not the entry point underscores the point that what we think of as our homepage is not relevant to our users’ needs.
Next steps
A staff task force has been appointed to help with the design of the revised Web page. Staff have already said that we need to pay more attention to what other academic libraries have done. Let’s take advantage of what others have done well.
Future designs will strive to incorporate what we heard the students and faculty telling us—keep it simple for the undergraduates.
There will have to be more instructional workshops for students, even though staff have been providing instruction that should have addressed many of the student and faculty questions and concerns. How we can do this better is just one of the challenges facing the library’s staff.
A staff member reminded us that the library needs to do a better job of marketing existing library services. She asked: “Why should they have to come to us? We need to reach out to them.” We hope that taking action based on what we learned represents a first step toward achieving this. ■