Assessing library instruction sessions: A pilot project at the University of Connecticut Libraries

Jennifer Lanzing and Anna Kijas


The Library Research Services (LRS) program area at the University of Connecticut is integral to the instruction of both undergraduate and graduate students, and LRS members spend much of their time planning and teaching classes designed to increase students’ information literacy and research skills. LRS members are subject specialist librarians, and they teach single- or multi-session classes geared toward research in a specific discipline. Sessions may include an introduction to library resources, database instruction, citation instruction, an overview of library services, and other appropriate topics.

In the summer of 2010, the LRS team implemented a pilot program that produced two surveys to assess the effectiveness and value of these instruction sessions: one for students and the other for faculty. Data from each survey were compiled and distributed to the appropriate subject librarians following their instruction sessions. The goal of this program was not to test the information literacy of the students; rather, the results were given to each librarian who taught a session so that they could use them as they saw fit to tweak their instruction sessions and provide better service to their constituencies.

This was the first attempt by the LRS area to evaluate our instruction sessions in a standardized way. In order to develop an understanding of assessment programs for library instruction, we conducted an environmental scan to examine the instruction assessment tools developed by other institutions, as well as a literature review to familiarize ourselves with current methods and findings. Researchers have stressed that before an actual survey or other assessment tool can be created, the librarians involved must first decide exactly what they hope to accomplish by implementing such a tool.1

We took this to heart and began by meeting with the LRS subject librarians to discuss what questions they would like to see included and what questions they did not think this survey would be able to answer in a meaningful way.

Method

We entered our finalized survey questions into Survey Monkey. For both the student survey and faculty survey, we developed a set of statements followed by a five-point Likert scale, with 1=strongly disagree and 5=strongly agree. We also included several open-ended questions, where respondents were asked to elaborate on issues such as what they felt was the most valuable part of the session, what they would like to see included in a future session, and whether they felt the session was worthwhile and appropriate to their course.

The surveys were ready for distribution via e-mail at the end of September 2011, after classes had already begun. Following an instruction session, the liaison would send links to both surveys to the faculty member, with a request to fill out the faculty survey and to share the student link with the class. Responses were accepted until the close of the fall semester. We gathered the data in Survey Monkey and used Excel to sort and analyze it, then shared aggregate results with the liaisons as a group, as well as individual survey results with the appropriate liaison.

The goals of the analysis included learning where students and faculty felt that more instruction was needed and discovering whether there were variations or similarities among the different class levels, from freshmen to graduate students.
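
As a rough illustration of this sorting and summarizing step, the sketch below shows how a CSV export of the student responses could be summarized per statement and compared across class levels. The file name, the column names ("Class level," "Q1" through "Q5"), and the 1-5 Likert coding are assumptions for the example; the actual analysis was done in Excel.

```python
# Minimal sketch of summarizing exported survey responses, assuming a CSV export
# with hypothetical column names; the real workflow used Excel.
import pandas as pd

responses = pd.read_csv("student_survey_fall2011.csv")
likert_cols = ["Q1", "Q2", "Q3", "Q4", "Q5"]

# Aggregate view shared with the liaisons as a group: mean rating per statement.
print(responses[likert_cols].mean().round(2))

# Comparison across class levels (freshman through graduate), one of the stated goals.
print(responses.groupby("Class level")[likert_cols].mean().round(2))
```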

At the end of the fall semester, we decided to continue the pilot program through the spring semester, partly because of our delay in making the surveys available in the fall and partly to maximize our data pool. We made some minor edits to the surveys. For example, the original surveys did not ask respondents to provide the name of the librarian who had taught the session, which made matching surveys with the correct librarian difficult and time-consuming. To protect the liaisons’ confidentiality, we wanted to share survey results only with the specific librarian who had taught the session, so asking for the librarian’s name on the survey would greatly simplify this matching. This change was made in the revised survey.

Also, because the answers varied widely when respondents were asked to provide the course name, we clarified this question in the revised survey by including an example of a course number (e.g., ENGL 1010). The distribution method for the spring semester also differed slightly, giving the liaisons two options for getting the links to students and faculty: where possible, give students five minutes at the end of a library session to fill out the survey, or continue to e-mail the links to the faculty member. The surveys closed at the end of the spring semester in May 2012.

Results

The results from both the student and faculty surveys were overwhelmingly positive in both fall 2011 and spring 2012. Almost all respondents found the instruction sessions to be useful and worthwhile. When we analyzed the results, a few patterns became clear. The most common answer to “Question 7: What would you like to see in a future session?” was “Nothing,” or some variation on that idea. However, in fall 2011, 13.1% of respondents said that they would like more focus on RefWorks instruction.

This was surprising to us because we offer several workshops throughout the year that are specifically geared toward RefWorks and teaching students its various features. This finding may suggest that the RefWorks workshops are not adequately advertised and students are unaware that they are available, or that additional workshops need to be scheduled.

For spring 2012, this number was down to 6.3%. Based on the results of the survey from fall 2011, some subject specialists decided to hold RefWorks workshops specifically for the academic departments with which they worked. This may account for part of the decrease in respondents requesting more RefWorks assistance in the spring.

Several students expressed a desire to see more databases covered in the instruction session. Often a librarian will focus on several databases important to the specific discipline or relevant to a course assignment or project. In fall 2011, 10.7% of student respondents wished that a greater number of databases had been covered in the session.

A surprising number of students (9.8% in fall 2011, 9.4% in spring 2012) said that they wished the session had included a section on the simple logistics of using a university library, such as how to find books in the stacks, what tutoring services the library provides, and basic computer skills. Freshmen usually receive basic library orientation in their English classes, but many students do not retain those lessons by the time they need to apply them to their own research. After seeing the survey results, some subject specialists have incorporated basic library information into their one-shot instruction sessions.

Graduate students were the only group to ask for longer sessions in the future. They were also less likely than undergraduates to need additional help with basic library skills, such as finding books on the shelves or navigating the library Web site. Instead, their responses indicated that they were mostly concerned with learning about relevant databases and where to find information specific to their research; they wanted more in-depth, higher-level research help.

Most liaison librarians discuss the session with the faculty member in advance so that both parties understand the intended instructional goals. The faculty responses were overwhelmingly positive and included only a few suggestions for content that could be covered in a future session.

One comment expressed an interest in having someone from the Writing Center co-teach a session with the liaison librarian, while another said that next time he or she would want the session held after a paper is assigned rather than before, so that students can better envision how to apply what they are learning.

One request that came up in about 12.5% of the spring 2012 surveys, but was virtually absent in fall 2011, was for more focus on search techniques. In addition, 10.2% of students asked for more instruction on the basic research paper process, another topic that did not come up in the fall 2011 surveys.

One factor that might account for these differences between semesters is a higher proportion of respondents from the humanities and social sciences in spring 2012, departments that typically require term papers, at least at the upper undergraduate levels.

The surveys from fall 2011 had a higher percentage of respondents from the hard sciences and engineering, where an end-of-semester research paper is not often assigned. One way individual subject specialists addressed this request was to include more interactive exercises in their sessions, giving students hands-on experience with different search strategies and databases.

Conclusion

We encountered some problems in analyzing the survey results. One issue was a lack of clarity in some of our survey questions: students gave wildly different answers to questions such as “What is this session for?” and “Where did this session take place?” In the next iteration of the survey, we knew we needed to be clearer about exactly what we were asking of the students, so we added an example of a course name to the former and an example of a classroom name to the latter.

Other problems revolved around the limitations of the Survey Monkey software. We had to go into the results manually and pull the data for each individual liaison, which we realized was far too time-consuming for anyone to make part of their day-to-day workflow. We needed a product that would allow each liaison to log on and retrieve only his or her own survey results. Survey Monkey does not offer that functionality, and relying on a third party to sort and distribute results compromised the promise of confidentiality we had made to the liaisons and the respondents.
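
For illustration only, the manual per-liaison sorting described above could in principle be scripted. The sketch below assumes a hypothetical CSV export with a “Librarian” column and writes one file per liaison; in practice this work was done by hand, and the confidentiality concern remained because a third party still had to handle every librarian’s results.

```python
# Hedged sketch of splitting a survey export into per-liaison files, assuming a
# hypothetical "survey_export.csv" with a "Librarian" column; the actual process
# at the time was manual.
import csv
from collections import defaultdict

by_liaison = defaultdict(list)
with open("survey_export.csv", newline="") as f:
    reader = csv.DictReader(f)
    fieldnames = reader.fieldnames
    for row in reader:
        by_liaison[row["Librarian"]].append(row)

# One output file per liaison, so each librarian sees only his or her sessions.
for librarian, rows in by_liaison.items():
    outfile = f"results_{librarian.replace(' ', '_')}.csv"
    with open(outfile, "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
```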

To get the functionality we required, we decided to try a different survey tool, Qualtrics, which is now the standard tool our subject specialists use to evaluate their instruction sessions. Qualtrics allowed us to create an account with the survey template for each librarian. Each librarian can send a URL link to the appropriate faculty member, asking him or her to fill out the faculty survey and to share the student version with the class, and each librarian can log in and see only his or her own survey results. The director of the LRS area is able to view all results.

This protects anonymity and eliminates the need for a third party to determine the appropriate librarian and distribute results. Each librarian is responsible for keeping track of his or her survey results and making changes to his or her instruction sessions when necessary.

The survey provides another tool for instruction librarians to use in evaluating their sessions and improving the overall quality of the library instruction program.


Note
1. L. H. Merz and B. L. Mark, Assessment in College Library Instruction Programs (Chicago: Association of College and Research Libraries, 2002); D. D. Shonrock, Evaluating Library Instruction: Sample Questions, Forms, and Strategies for Practical Use (Chicago: American Library Association, 1996).
Copyright © 2013 Jennifer Lanzing and Anna Kijas
