College & Research Libraries News

Those immersed resurface: A follow-up with Track 2 participants of the first Information Literacy Immersion

by Michelle Toth

About the author

Michelle Toth is instruction librarian at the State University of New York-Plattsburgh, e-mail:

So what happens to all those great ideas and all that motivation that we get when we attend conferences and professional development opportunities? In the case of the first Track 2 participants of ACRL’s Institute for Information Literacy’s Immersion program, quite a lot. Two years after the first Immersion program, a follow-up survey pursued this question and found where great ideas and motivation are taking librarians and the institutions they work for.

In July 1999, ACRL’s Institute for Information Literacy held its first Information Literacy Immersion program at the State University of New York (SUNY)-Plattsburgh. The Immersion experience offers two distinct tracks to provide “intensive training and education for academic librarians”1 in information literacy. Track 1 immerses participants in a curriculum focused on understanding information literacy and developing and improving individual teaching and assessment skills. Track 2 participants, on the other hand, delve into the construction of programmatic plans and strategies for incorporating information literacy at libraries and institutions.

The idea for this follow-up study emerged from conversations with the dean of library and information services at SUNY-Plattsburgh, Cerise Oberman, following the 1999 Immersion program. While there are multiple aspects of the immersion experience that can be explored, the Track 2 goals of designing and implementing action plans for information literacy integration were particularly intriguing. By examining the efforts of Track 2 participants after the Immersion program, a study could provide a picture of progress on the part of librarians, as well as an informal evaluation of the Immersion program itself.

On the second anniversary of the first Immersion program, in the summer of 2001, a follow-up survey was sent to the first set of Track 2 alumni. The purpose of this survey was to see which information literacy initiatives were being pursued at the institutions of these Track 2 participants. In addition, it sought to measure the progress institutions were making with these initiatives and to see how valuable the Immersion program had been in preparing participants for these tasks.

After the 1999 Immersion program, an electronic discussion list was set up so participants could continue to share and discuss information literacy issues, and the call for participation in this survey went out on that list. Of the 51 Track 2 participants who attended the Immersion, 35 replied that they would be willing to participate in the study.

The survey

The survey consisted of three parts. The first section gathered brief demographic information, while the second section outlined information literacy initiatives and asked respondents to rank their progress towards achieving those they selected. Respondents also rated the value of the Immersion program in working on these initiatives. The final section closed the survey with a few open-ended questions. The demographic section, which asked for information such as type and size of institution and number and status of librarians, was designed to aid the analysis of the initiatives section and its rankings.

The identification and ranking of initiatives section was generated after reading action plans written by the participants and reviewing elements covered in the Immersion program and in the literature. While every attempt was made to come up with a list of initiatives that covered as many areas as possible, it would be impossible to cover them all. To address this, a blank box was left at the end of each category to allow for write-in initiatives that were not otherwise listed. Figure 1 lists the initiatives included in the survey.

In the survey, participants were asked to identify initiatives they and their institutions have worked on since the Immersion program. After identifying initiatives, participants proceeded to rank their progress and indicated how useful they found the Immersion experience in preparing them for these tasks.

For ranking purposes, a scale of one to five was used: one indicated the smallest amount of progress or usefulness, and five indicated the most. The open-ended questions at the end of the survey asked librarians for their opinions on the Immersion experience and on the impact of the initiatives on their campuses.

Of the 35 surveys sent out, 22 were returned, for a return rate of 62.8 percent; 20 of those were usable, so 57.1 percent of the surveys entered the analysis. Statistics were run on all 20 surveys, which were then broken down into three self-identified categories: Community and Technical Colleges, Four-Year Colleges, and Ph.D.-Granting Universities. Seven of the 20 fell into the Community and Technical College category, seven into the Four-Year College category, and six into the University category.
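The reported rates can be reproduced from the raw counts; note that the article's figures (62.8 and 57.1 percent) appear to be truncated rather than rounded to one decimal place. A minimal sketch (the helper name `pct` is my own):

```python
def pct(part: int, whole: int) -> float:
    """Percentage truncated (not rounded) to one decimal place,
    matching the convention the reported figures appear to use."""
    return int(part / whole * 1000) / 10

print(pct(22, 35))  # return rate: 62.8
print(pct(20, 35))  # usable rate: 57.1
```

Rounding conventionally to one decimal place would instead give 62.9 percent for the return rate, which is why truncation is assumed here.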

Analysis of the data

This first analysis of the data revealed that four of the 28 initiatives were being pursued on 17 or more of the 20 campuses reported. These four initiatives are: #7 “Gaining administrative support for information literacy initiatives and programs,” #8 “Developing strategic collaborations with campus groups and services to reach faculty about information literacy,” #9 “Having librarians recognized as the information literacy experts/consultants on campus,” and #23 “Setting learning objectives and goals for information literacy instruction.”

It was not surprising to find that three of the four most common initiatives identified by this study fell into the Campus Outreach and Support category of the survey. Gaining attention and support on campus is certainly a crucial part of moving information literacy programs and goals forward. The strategies of identifying librarians as the experts in this area, gaining administrative support, and collaborating with groups and services to inform the campus community present a well-rounded approach to reaching one's campus.

“Setting learning objectives and goals for information literacy instruction” was the one top initiative that fell outside the Campus Outreach and Support category of the survey. While the importance of setting goals and objectives cannot be denied, the popularity of this initiative may have been a product of its time.

During the first Immersion program in 1999, the draft of the “Information Literacy Competency Standards for Higher Education” was being widely circulated and discussed. In addition, the 1987 “Model Statement of Objectives for Academic Bibliographic Instruction” was under revision by an ACRL task force, and a draft of the “Objectives for Information Literacy Instruction: A Model Statement for Academic Librarians” was available in the spring of 2000. It will be interesting to see in additional studies whether this initiative remains as frequently pursued among participants in the 2000, 2001, and other Immersion programs.

The progress librarians have made with their initiatives was perhaps the most difficult thing the survey attempted to measure, and it could do so only in a limited way. A number of variables, such as the priority given an initiative and the date work on it began, were not measured. This may account for the wide spread in average progress scores, which ranged from 2.40 to 3.85. Respondents also indicated that they and their institutions had worked on anywhere from seven to 23 different initiatives since the Immersion program. The data do not adequately show the actual rate of progress, but they do demonstrate the work and effort being made to move information literacy forward.

Measuring the usefulness of the Immersion program in preparing participants to work on their information literacy goals was more straightforward. The usefulness of the Immersion program in preparing Track 2 participants to work on the top four initiatives was rated highly, with average scores ranging from 3.66 to 4.0 on the five-point scale.

What is remarkable about these numbers is that while we are typically enthusiastic about our professional development experiences in the first few weeks after we return from a conference or workshop, this set of data shows that even after two years the Immersion experience was still being valued and used.

While the data from this study may not contribute to identifying “best practices,” the study does identify common approaches for integrating information literacy and documents the value of the Immersion program. This information can be used to refine the curriculum for future Track 2 participants of the Immersion program and can inform library organizations and committees developing workshops on integrating information literacy. It may also be useful to institutions identifying starting points for their own information literacy efforts. Finally, it can serve as a starting point for continuing research and discussion; particularly useful would be a longitudinal approach for tracking these and other Track 2 participants’ efforts.

As a starting point, this work has begun the rewarding task of documenting the efforts and progress of librarians working on information literacy at their institutions and the impact of the Immersion program in helping those academic librarians to achieve their goals.


  1. “Invitation to Apply,” Information Literacy Immersion Web page, 17-Sep-2001 [cited May 13, 2002]. Available from http://
Copyright © American Library Association
