Association of College & Research Libraries (ACRL)

Recipe for disaster or formula for success? Creating and assessing a large-scale collaborative library introduction exercise for honors students

Anna Marie Johnson is coordinator for library instruction and Melissa Laning is team leader of assessment and resource planning at the University of Louisville; e-mail: annamarie@louisville.edu, malani@gwise.louisville.edu

Working with any large group of students in the library is always a challenge. It becomes even more so when the students number 160 plus and the exercise is supposed to introduce them to all parts of the university library in less than two hours! The logistics alone can cause huge headaches. At the University of Louisville (UL), we have attempted this three years in a row and have learned much about what works and what does not. This article describes the history of the project, our goals, how we have assessed the project’s effectiveness, what we have learned, and what we would recommend.

Our previous attempts

Working with the Honors sections of General Education 101, the university’s introduction to campus life, has provided us with a chance to try out instructional methods and content on a large scale. The Honors section classes take place over a three-day period during the weekend before the fall semester begins.

In 1997, three instructors from the university libraries presented simultaneous sessions to groups of ten to fifty. The sessions dealt only with researching on the Web and using university e-mail. Though the sessions were productive, it was impossible to know exactly how effective they were, because no formal assessment was conducted. The anecdotal feedback was that the students enjoyed it, but they wished it had included more about the libraries.

Based on this limited anecdotal feedback, we revised the course content for the following year by creating two different sessions. The first was held on a Friday morning and consisted of three consecutive sessions called “Critical Evaluation of Web Information.”

Each session was taught by a different library instructor, but the content was the same: a PowerPoint presentation of the key factors relating to the evaluation of information and an exercise using printed copies of two Web pages to be compared for their accuracy, authority, currency, coverage, and objectivity. The auditorium-style room and the lack of hands-on capability made the session frustrating.

The second part of the students’ orientation to the libraries consisted of a Saturday morning “scavenger hunt” exercise. This exercise required them to answer questions about various resources in different departments of the main library and then get an “information passport” stamped by someone in that area. Although this introduced the students to various parts of the library, it was problematic for several reasons. First, the library was scheduled to be closed on the day of the scavenger hunt, so extra personnel were required to open the building and staff it. Second, the size of the group made “traffic jams” a problem. Too many students would crowd an area, all looking for the same resource or all asking questions at the same time. There was also no assessment and no group discussion of the experience. The only feedback was again anecdotal, and the students reportedly said that they found the exercise “boring” and “not challenging enough.”

Learning from our mistakes

While the scavenger hunt did not necessarily qualify as a disaster, some new ideas and techniques were clearly needed, especially in terms of logistics planning. It would also be important to make the exercise more engaging for the students: our audience was Honors students, who generally tend to be self-motivated and to expect more challenging material.

We worked with the critical Web information piece to make it more participatory and “active-learning” oriented. We moved to another room on campus and broke the students into discussion groups. The exercises centered on “information dilemmas,” such as “term paper mills” on the Web, and the students were asked to discuss and then share the contents of their discussion with the rest of the group. We also changed the format of the scavenger hunt to make it more in-depth and interesting.

The students were divided among eight themes: Science & Technology, Life in the ’80s, Violence, Other Cultures, Wealth, the Arts, Documentary Photos, and Tarzan. The themes were developed by the librarians who created the exercises for them.

The thinking was that a theme for the exercises would provide an opportunity to engage the students in topical discussions. This divided the 160 students into groups of 20. Because twenty was still too large, each theme was divided into smaller segments of five students each. For example, the theme of Violence was broken into Murder, Terrorism, Gangs, and Violence in the Media. This way, no more than five students would be likely to be looking for a specific reference book or other resource.

Library exercise tip #1: Organizing the exercises around themes helped the librarians generate questions and made the exercises more interesting for the students.

At a planning meeting in May, it was decided not to have a specific number of questions for the assignment, but to design the questions in such a way that each small group visited at least three or four different areas of the library.

The instructors for this project came from all areas of the library: Media and Current Periodicals, Information Literacy, Reference, Technical Services, and Rare Books and Photoarchives. This allowed for a diversity of viewpoints as well as expertise. Each volunteer instructor was assigned a theme and was allowed to create his or her own subthemes and questions. Questions were shared among the instructors, but only a few of the questions were standardized across the whole group. This allowed for flexibility and made the most of each person’s expertise. Some instructors simply created questions relating to their theme, while others created scenarios to help engage the students’ attention and to give context to the questions.

Library exercise tip #2: Involving staff from all areas of the library helped generate variety and interest.

The questions ranged from the very specific, such as “Using the Statistical Abstract of the United States, find which state had the highest number of murders in 1995” to the more general, “Browse in the LC call number section of the reference area that you think is most related to your topic. Write down the citation for one book that you find there that you think might help with researching this topic. Tell why you think so.”

Although the flexibility and variety were nice, there was little standardization and little guidance as to how to write the questions. This may have created some discrepancies in the students’ experiences.

Library exercise tip #3: We wanted the students to think about library research as a concept, not just the day's activity.

We asked questions about the publication date, how it would affect their topic, and whether they would use the article. We also asked about the name of the journal, whether they knew its reputation, and why that might be important. Moving beyond the scavenger hunt concept as we understood it and channeling the students’ thinking about library research were key aspects of our questions.

Another primary goal was to help the students discover parts of the library with which they might not otherwise come into contact. For example, Government Publications, Photo Archives, and Rare Books are rich resources that many students never use. However, due to time constraints (the whole exercise was 50 minutes) and the sheer number of students, it was impossible for all students to experience all parts of the library.

To remedy this, we created a framework around the 50-minute exercise. The groups of 20 would come together at the start for a short orientation/explanation/pep talk for roughly 20 minutes. At the end of the 50-minute exercise, there would be a ten-minute preparation period and about thirty minutes for the groups of twenty students to share with one another what they had learned about the different parts of the library through short oral presentations. To help them prepare for this, the students were given a variety of guiding questions, such as “Describe your topic briefly”; “What did you have problems with and how did you solve them?”; “What factors affect what resources you use?”; and “Where would you start your next research paper?” The sharing period was meant to allow the students to learn from others’ experiences as well as their own.

UL Honors Program Expedition perception question and results

1. The library expedition introduced me to library research tools and collections that I did not already know about:

                    # of respondents    %
Strongly disagree          0             0%
Disagree                   4             3%
Agree                     86            56%
Strongly agree            64            42%
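As a check on the sidebar arithmetic, each percentage is simply the response count divided by the total number of respondents (154), rounded to a whole percent. A minimal sketch in Python, using the counts from the table above (the whole-percent rounding is an assumption about how the scanning software's report was generated):

```python
# Tally Likert responses into whole-number percentages,
# mirroring the sidebar table (154 total respondents).
counts = {
    "Strongly disagree": 0,
    "Disagree": 4,
    "Agree": 86,
    "Strongly agree": 64,
}

total = sum(counts.values())  # 154

percentages = {label: round(100 * n / total) for label, n in counts.items()}

for label, pct in percentages.items():
    print(f"{label}: {counts[label]} ({pct}%)")
# Agree: 86 (56%) and Strongly agree: 64 (42%), matching the sidebar.
```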

Feedback and assessment

At the end of this session, the students completed a ten-minute evaluation of the morning’s activities. Based on our experience, we knew that feedback from students was extremely valuable in the development of meaningful and effective instruction sessions; however, we had relied in the past on informal or indirect information. This time we created a brief survey to elicit direct and focused feedback from students regarding their experience.

The first consideration in our survey design was deciding what we wanted to learn. Many instruction evaluation surveys ask a series of questions to determine whether the respondent was satisfied with the instruction. Because feedback from the previous year indicated that the session was boring and too easy, we wanted to know whether the revised approach improved the quality of the students’ experiences. We were also looking for clues to possible future enhancements. Assessment in academic libraries is also increasingly concerned with finding out what students actually learn during information literacy classes.1 Because student learning is critical to our instruction mission at the UL, this was an important area to investigate further. On a more practical level, we also wanted to find out which of our nonstandardized exercises was most successful in conveying basic information evaluation concepts and library-use skills.

The second consideration in the survey design was related to length and format. Given the limited amount of time scheduled for the evaluation and the fact that it was right before lunch, we knew the survey had to be short and to the point. For that reason, we limited the evaluation form to one two-sided page and used mostly multiple-choice questions.

The evaluation had three sections. In the first section, we asked the students to identify their team and topic, and to indicate previous library usage. In the second section, we asked them to rate from “Strongly Agree” to “Strongly Disagree” a number of statements about their experience, such as “There was enough time to complete the library exercise for my topic.” The students were also asked specifically about the level of difficulty for their session. In the third section, we asked a series of multiple-choice questions aimed at discovering if the students knew when to use the libraries’ catalog, when to use an index, and how materials are organized in the library.2

The UL Libraries have the Bubble Publishing software, created by Scanning Dynamics, Inc., which allowed us to design and print our own evaluation forms.3 The software also allows us to scan the results and create a simple report from the information collected (see sidebar).

Overall, the evaluation results showed that the 1999 Honors Library Expedition was a successful program. We were especially pleased to see that 88% of the respondents thought that the level of difficulty was just right. More importantly, the information will help us make future adjustments to the program format and content.

For example, on the “experience satisfaction” type questions, we found that almost one-third of the respondents did not feel they had enough time to complete the library exercises, even though 97% of them thought they had learned something useful for their classes.

We hope to have a longer time period for future Library Expeditions, but if not, we may need to make the exercises shorter. On the “library knowledge” portion of the evaluation, we found that the most confusion centered on a question about what cannot be located using the library catalog. Using the reports module of the Bubble Publishing software, we can identify which exercises led to more or less accurate responses in this area. This information can be used to shape the development of new exercises.
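Identifying which exercises led to more or less accurate responses amounts to cross-tabulating a knowledge question by theme team. A hypothetical sketch of that tally (the team names and response records here are illustrative, not the actual survey data or the Bubble Publishing report format):

```python
# Cross-tabulate a knowledge question by theme team to see which
# exercises produced more accurate answers. Data is invented for
# illustration: (team, answered_correctly) pairs per scanned form.
from collections import defaultdict

responses = [
    ("Violence", True), ("Violence", False),
    ("Tarzan", True), ("Tarzan", True),
    ("Wealth", False), ("Wealth", True),
]

correct = defaultdict(int)
total = defaultdict(int)
for team, ok in responses:
    total[team] += 1
    if ok:
        correct[team] += 1

for team in sorted(total):
    pct = 100 * correct[team] / total[team]
    print(f"{team}: {correct[team]}/{total[team]} correct ({pct:.0f}%)")
```

Grouping by team rather than by individual student is the design choice that matters here: it ties accuracy back to the exercise each group worked through.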

Use of more formalized evaluation this year allows us to establish a baseline for future reference and a way to measure progress. One of the main enhancements planned for next year is the development of specific learning objectives that will be used for creating/revising the exercises. The question will be how to retain the creativity and flexibility of the exercise, while giving more structure to the question creation aspect.

Also, this event took an extraordinary amount of planning time on the part of the volunteers. Will each exercise need to be recreated each year? Can we build on what we already have? These are issues that we need to address. Another possibility would be to combine the General Education 101 experience with a composition class so that the research would have a concrete purpose and would be an integral part of the curriculum of the class.

While this year certainly was far from a disaster, we will work on our “formula” for next year, adding some of the above elements in an attempt to continually improve the library experience for this large and vital group of students.

UL Honors Program Expedition knowledge question and results

2. To find a book or periodical owned by the Ekstrom Library, you should use:

                   # of respondents    %
Alta Vista                0             0%
Reader's Guide            0             0%
Minerva 2000            149            97%
ProQuest                  5             3%

Notes

1. ACRL Task Force on Information Literacy Competency Standards, “Information Literacy Competency Standards for Higher Education (Draft), June 1999.” The final, approved version was published in the March 2000 issue of C&RL News.
2. Two good sources for the format and content of instruction evaluations are: Diana D. Shonrock, ed., Evaluating Library Instruction (Chicago: ALA, 1996), and Wanda K. Johnston, ed., Library and Learning Resource Programs: Evaluation and Self Study (Chicago: ACRL, 1998).
3. For more information about Bubble Publishing software, see: http://bubblepublishing.com/WwwOffice.htm.
Copyright © American Library Association
