College & Research Libraries News
The value of campus partnerships in redesigning library instruction: Administrators, faculty, and students get involved
Today’s academic libraries must be proactive and imaginative in developing instructional tools for the effective use of information technology. Ideas and comments from students provide valuable insights but may not typically be communicated to library professionals. This article focuses on the importance of gathering student input and addressing expressed academic library instruction needs. It also discusses the roles that external entities can play in guiding methodical implementations of new information technology.
Environment
The University Libraries of Notre Dame is served by a libraries advisory council composed of alumni and corporate and community leaders who are dedicated to supporting and underwriting the various functions of the library. Opportunities for interaction with library faculty and staff are scheduled during the advisory council’s semiannual meetings.
During the spring 2000 meeting, several members of the advisory council participated in a hands-on library instruction session, which highlighted the libraries’ electronic resources. One council member expressed concern that students had not taken full advantage of these important resources. Another member suggested that students would respond well to Web tutorials, possibly designed as computer games.
Since the concerns and suggestions of the advisory council members fit with the libraries’ goal of designing and implementing interactive, computer-based instructional modules, the decision was made to capitalize on the interest and possible support of the advisory council members. A library instruction task force, composed of four instruction librarians and chaired by the coordinator of library instruction, was formed to lead the project.
Questions and assumptions
The associate director for user services directed the task force to “determine strategies and ‘best practices’ for involving students, faculty, and librarians in identifying issues, characteristics, goals, measures, and existing models for developing and marketing a Web-based library instruction program.” The task force would also identify key issues related to the information-seeking habits of undergraduates. Three sets of questions were formulated:
1. What can students tell us about their information-seeking behavior? How, when, and where do they prefer to focus their information-seeking efforts?
About the authors
Hector Escobar Jr. is visiting staff librarian, Joni Kanzler is coordinator of library instruction, G. Margaret Porter is librarian, and Cheryl Smith is staff librarian at the University of Notre Dame, e-mail: hector.escobar.4@nd.edu, joni.e.kanzler.1@nd.edu, g.m.porter.2@nd.edu, cheryl.s.smith.454@nd.edu
2. How will the knowledge gathered from students affect our approach to teaching library skills and information literacy? In which formats should information be presented to reach the intended audience most effectively?
3. How do we develop, test, and evaluate library instructional programs that most effectively enable Notre Dame undergraduates to access and use information resources and services selected by the libraries?
Involving students in the process became the foundation for the work of the task force. We also identified individuals from other campus departments who would form the membership of two new groups: a project team and an advisory committee.
The project team and the advisory committee
Letters were sent to potential members of the two groups asking for their participation, and the task force was rewarded with a high level of cooperation and enthusiasm from across the campus. Although Notre Dame librarians have faculty status and serve on a variety of university committees, collaborations and coalitions are usually initiated outside the library. This was an excellent opportunity to build support, alliances, and confidence in the libraries’ role on campus and in students’ intellectual growth.
The final project team includes two instructional technologists; an administrator in student affairs, who also teaches; a faculty member from the university’s teaching center, who works with graduate teaching assistants and teaches in the College of Science; and the four task force members. The four individuals chosen for the advisory committee include: a member of the library advisory council, who is also a visiting professor at the university; the dean of the First Year of Studies; the director of the first-year writing program; and the director of the Teaching and Learning Center.
Members of the project team were asked to assist with the construction and implementation of the program. The advisory committee would advise us on the best path toward implementation.
Student questionnaires and focus groups
Efforts in fall 2000 focused on gathering information from students enrolled in the required first-year composition class (FYC). To get a variety of responses and ideas from students, we decided to use two different methods of data collection: a questionnaire and focus groups. Participants were limited to first-year students (FYS) who attended at least one library instruction session in their FYC class during fall 2000.
All FYC instructors were asked if they would allow their classes to participate in this study. Of the 44 FYC sections available, 14 sections taught by 11 different instructors participated, and a total of 230 students responded to the questionnaire. Questionnaires were distributed and collected during a regular class period. The questionnaire, developed by task force members, was based on the three question sets mentioned earlier. Surveys conducted by other institutions and sources outlining survey design were consulted. The final questionnaire consisted of 11 questions, all with at least three response options.1
Students participating in a focus group were offered a meal and a $10 copy card as compensation. Fourteen students participated in three focus groups. Nine questions were formulated to enhance information gathered from the questionnaires.2 Questionnaires were distributed and answers were tabulated in October 2000, and focus groups were held in November. The task force recorded and compiled responses, questions, and discussion from the focus groups. We gained valuable information about students’ perceived difficulties with using the library and its Web pages, as well as their feelings regarding online help and tutorials.
Student findings
The task force was able to draw a number of conclusions based on students’ responses to questionnaires and focus group discussion. We used the combination of responses and comments to help us answer two of the three questions we had posed at the beginning of the process.
1. What can students tell us about their information-seeking behavior? How, when, and where do they prefer to focus their information-seeking efforts?
• they rely on the Web for accessing general information;
• they rely on librarians for starting research and developing research strategies, keyword selection, and information about electronic and other resources;
• they do not wait until the last minute to begin the research process;
• they use electronic resources for most of their research; and
• they use the library’s electronic resources from locations outside the library.
2. How will the knowledge gathered from students affect our approach to teaching library skills and information literacy? In which formats should information be presented to reach the intended audience most effectively?
• they do not use the available online help to any great extent;
• they experience great difficulty when attempting to develop effective keywords and search strategies;
• they are intimidated by the size of the library;
• they feel that locating print materials in the library is daunting and confusing;
• they believe a single library instruction session is inadequate exposure to library resources and research strategies; and
• they are not interested in using an online library instruction tutorial.
The questionnaires and focus groups did not provide us with any direct answers to our third question of how to develop, test, and evaluate library instructional programs that most effectively enable Notre Dame undergraduates to access and use information resources and services selected by the libraries.
The information gathered did, however, provide a foundation for expanding our instructional program and formulating a set of recommendations. We had clearer objectives for the advisory committee and the project team and would rely on their collective expertise to develop assessment tools and testing mechanisms.
Recommendations
Along with a report on the information-gathering process, the following recommendations were sent to the project team and the advisory committee.
All FYS should complete at least two library instruction sessions. Students value the personal contact with librarians and time for hands-on experience with library resources. Additionally, a Web-based exercise should be created to assess the needs of the students and their understanding of research libraries.
The path from the university’s Web pages to the libraries’ should be more direct. Better explanations of links are needed, as well as the development of prominently placed “How do I . . .” links or menus.
We also recommended the development of an information literacy course for all incoming FYS. Problem-solving skills, critical evaluation of information, and the ability to apply information efficiently and effectively would be key components of this course.
Implementation
The task force’s report and recommendations were given to the library administration, the advisory committee and project team, and library faculty. A subsequent meeting with the advisory council revealed the difficulties in adding new required course work to an already crowded first-year studies program and the challenges of altering course objectives on an institutional level. Nevertheless, the task force report brought the need for more extensive library instruction to the attention of key members of the campus community.
As a result, the director of the writing program recommended that all FYC instructors schedule two library sessions during fall 2001. Likewise, positive student and FYC instructor response to hands-on experience justified the purchase of additional laptop computers for use in two wireless library classrooms. In response to student concerns, easy access to basic research assistance was added to the libraries’ Electronic Resources Gateway Web page.
Discussions with the project team focused on the kind of online assessment tool that should be developed, tested, and implemented. Although the original goal of the task force was to develop an online tutorial, student responses and discussions with the members of the project team indicated that a better starting point would be a tool to assess student skills at the beginning of their first year.
The project team identified several campus entities that would be able to assist with the development and dissemination of the assessment tool. Project team members had been asked to serve based on their knowledge of students’ learning patterns, pedagogy, and technology. With their support and advice, the task force formulated a proposal for interested members of the libraries’ advisory council.
The proposal recommends an online skills assessment tool rather than an online tutorial. A projected budget details the cost of development, implementation, maintenance, and analysis. Concurrently, the task force is developing a list of possible competencies to be evaluated with the online assessment tool.
Conclusion
The University Libraries’ administration and the libraries’ advisory council shared a perception that a Web-based library instruction tutorial was needed. The library instruction task force was formed in part to research and develop this instructional technology. Results of our initial investigations indicated that students’ instructional needs differ from the suggestions of the libraries’ advisory council: students do not show tremendous interest in online Web tutorials.
Due to students’ varied experiences with libraries and information-retrieval methods prior to arriving on campus, a logical first step would be a Web-based library skills assessment test. Members of the advisory committee and the project team support this idea. The latter group would assist in developing the test and the venues needed for its administration to students. To ensure objectivity, future questionnaire development and focus groups would be conducted by professionals outside the library, in consultation with the task force.
The coalition resulting from the first year’s work has been an unanticipated benefit. By communicating with members of the libraries’ advisory council, we were able to build support at important administrative levels. We met with students and instructors from the FYC program in larger numbers than ever before. The project team will work with librarians in new and innovative ways.
In the coming year, we hope that the task force’s proposal for the assessment process will be accepted and funded by interested members of the advisory council so that work can proceed. While the most tangible result is the commitment to at least two instructional sessions for FYS, the formation of strategic alliances with external groups is equally important.
Bibliography
1. Brandt, D. Scott. “The multiple personalities of delivering training via the Web.” Computers in Libraries 17 (1997): 51-53.
2. Dewald, Nancy H. “Web-based library instruction: What is good pedagogy?” Information Technology and Libraries 18, 1 (1999): 26-31.
3. Dewald, Nancy H. “Transporting good library instruction practices into the Web environment: An analysis of online tutorials.” Journal of Academic Librarianship 25, 1 (1999): 26-31.
4. Greenbaum, Thomas L. Moderating focus groups. Thousand Oaks, California: Sage Publications, 2000.
5. Merton, Robert King. The focused interview. New York: Free Press; London: Collier Macmillan, 1990.
6. Michel, Stephanie. “What do they really think? Assessing student and faculty perspectives of a Web-based tutorial to library research.” College and Research Libraries 62, 4 (2001): 317-32.
7. Morgan, David, ed. Successful focus groups. Newbury Park, California: Sage Publications, 1993.
8. Stewart, David W. Focus groups. Newbury Park, California: Sage Publications, 1990.
9. Thomas, Susan J. Designing surveys that work. Thousand Oaks, California: Corwin Press, 1999.
10. University of Texas at Austin. Libraries for the Future (Survey & Focus Group Outline). Austin, Texas: Graduate School of Library and Information Science, 2000.
Notes
1. Please see Web site for full details: http://www.nd.edu/~refdept/instruction/librarians/litf/index.shtml.
2. Visit http://www.nd.edu/~refdept/instruction/librarians/litf/focus_response.shtml.