
Developing an assessment plan for information literacy learning outcomes

Process and planning

Lyda Fontes McCartin is interim director of the Center for the Enhancement of Teaching and Learning, email: lyda.mccartin@unco.edu, Brianne Markowski is head of Information Literacy and Undergraduate Support, email: brianne.markowski@unco.edu, and Stephanie Evers is information literacy librarian, email: stephanie.evers@unco.edu, at the University of Northern Colorado

The introduction of the ACRL Framework for Information Literacy for Higher Education provided an opportunity for libraries to revisit student learning and instruction practices. At the University of Northern Colorado Libraries, we embarked on a process of revising our shared student learning outcomes (SLOs) for all 100-level information literacy credit courses.1 The credit courses, taught by librarians, are offered in conjunction with programs on campus, like the Honors program and Center for Human Enrichment, or as a major requirement for Criminal Justice, History, or Audiology and Speech Language majors. The courses are 1-credit, 8-week classes that introduce students to the research process and focus on the following SLOs:

SLO 1: Students will be able to develop a research process.

SLO 2: Students will be able to demonstrate effective search strategies.

SLO 3: Students will be able to evaluate information.

SLO 4: Students will be able to develop an argument supported by evidence.

Once our SLOs better reflected concepts and practices from the ACRL Framework, we began to develop an assessment plan for our credit course program in order to determine if students were meeting our SLOs. As the importance of assessment in library instruction has gained attention, many have developed SLOs and assessments for individual sessions, but program-wide evaluation can seem daunting. This article covers the steps we took to develop a program-wide assessment plan, including the lessons we learned along the way.

The process

Assessing student learning outcomes across courses and instructors requires significant buy-in from all who will teach in the library’s instruction program. Instructors must give up some of the freedom they enjoy in the classroom in order to systematically collect usable assessment data, which can provide evidence of learning and inform curriculum changes. Thus, involving all teaching librarians in the development of the assessment plan is essential. Our assessment plan for the shared 100-level SLOs was collaboratively developed by the libraries’ Curriculum Committee, composed of all librarians who teach in the information literacy credit course program.

In the first year, we agreed we would use signature assignments to assess the 100-level SLOs. Signature assignments are course-embedded assignments, activities, projects, or exams that are collaboratively created by instructors to collect evidence for a specific learning outcome. These assignments are then embedded in all sections of a course, regardless of instructor.2 We worked on developing our assessment plan one SLO at a time using the following process:

Step 1: Develop signature assignment. Each librarian was asked to bring an idea for an assignment that would assess the SLO to a Curriculum Committee meeting. The proposals ranged from assignments we were already using to assess the SLO in our courses to ideas drawn from the literature. At the meeting, everyone briefly described their proposed assignment to the group. We then discussed the merits of the various assignments as tools for assessing student learning. Discussion prompts included:

  • What do you like about a proposed assignment?
  • Do you feel the assignment would assess the outcome in a meaningful way?
  • How would the assignment work in your course?

After vigorous debate, we agreed upon a signature assignment to pilot. Because the originally proposed assignment often evolved based on our group discussions, a librarian was tasked with revising the assignment to reflect the changes we had discussed, as well as developing instructions for students and instructors.

Step 2: Pilot signature assignment. The following semester, two librarians volunteered to pilot the proposed signature assignment. The librarians revised their course schedules to integrate the assignment as either an in-class activity or take-home assignment. Student work was evaluated by the course instructor for grading purposes and collected for subsequent analysis. At the end of the semester, the librarians shared their experiences at a Curriculum Committee meeting by reflecting on the following questions:

  • How did the assignment fit in your course schedule? How and when was the assignment introduced? What instructional activities did students engage with before completing the assignment?
  • Were the assignment instructions clear to students? What revisions may be necessary?
  • Based on your experience, what would you recommend to other instructors?

Piloting allowed us to gather valuable information on how the proposed signature assignment would work in the classroom before embedding the assignment in all 100-level courses.

Step 3: Determine analysis procedure. While the pilot was underway, we discussed how we would analyze the data (student work) once collected. The analysis procedures included descriptive statistics on the percent of students answering a test question correctly, rubrics, and qualitative analysis. Depending on the anticipated complexity of the analysis procedure, one or two librarians were tasked with looking for examples of how others had assessed similar learning outcomes and modifying those procedures to fit our needs.3 At the end of the semester, these librarians presented their recommendations to the rest of the Curriculum Committee. Then we tested the recommended analysis procedure on data collected from the pilot using the following questions:

  • Does this assignment and analysis procedure assess the desired learning outcome?
  • Does this assignment and analysis procedure produce information you feel is useful for improving teaching and learning?
  • What, if any, changes should we implement to the assessment?

Step 4: Set achievement benchmark. After discussing the results from the pilot, we agreed upon an initial benchmark level of achievement for the SLO. This would be used to measure our success in achieving the SLO once we fully implemented the signature assignment. For example, we use a standards benchmark for SLO 3: 75% of students will correctly identify each source type (primary research, secondary research, non-research/opinion). There are various types of performance benchmarks, and it is important to consider what type of benchmark you want to set for your SLOs.4
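For readers who tally results in a script rather than by hand, the sketch below shows one way a standards benchmark like this could be checked. It is a minimal illustration only: the file name, column names, and answer format are assumptions, not a description of our actual data or analysis procedure.

  import csv
  from collections import defaultdict

  # Hypothetical sketch: check the SLO 3 standards benchmark (75% of students
  # correctly identify each source type). The file "slo3_responses.csv" and the
  # columns "source_type" and "student_answer" are illustrative assumptions.
  BENCHMARK = 0.75
  SOURCE_TYPES = ["primary research", "secondary research", "non-research/opinion"]

  correct = defaultdict(int)
  total = defaultdict(int)

  with open("slo3_responses.csv", newline="") as f:
      for row in csv.DictReader(f):                          # one row per student response
          source_type = row["source_type"].strip().lower()   # what the source actually is
          answer = row["student_answer"].strip().lower()     # what the student said it is
          total[source_type] += 1
          if answer == source_type:
              correct[source_type] += 1

  for source_type in SOURCE_TYPES:
      if total[source_type] == 0:
          continue
      pct = correct[source_type] / total[source_type]
      status = "meets" if pct >= BENCHMARK else "falls below"
      print(f"{source_type}: {pct:.0%} correct, {status} the {BENCHMARK:.0%} benchmark")

Whether the tallying happens in a spreadsheet or a script, the point is the same: the benchmark gives the committee a shared, predetermined threshold for judging whether students are meeting the SLO.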

Step 5: Document the assessment method for the SLO. Two documents help us keep the assessment process on track: the Summative Assessment Overview and the Assessment Plan.5 The Summative Assessment Overview provides instructors with the information they need to embed the signature assignment in their course and collect data for assessment. It also includes a detailed description of the analysis procedures and reporting practices. The Assessment Plan provides a big-picture summary of the assessment methods for all program SLOs and documents who is responsible for data collection, analysis, and reporting. It also describes when and how often assessment occurs.

In the second year, we implemented the first signature assignment in all sections of the 100-level courses and began the process again for the next SLO. Now in the fifth year, we have successfully embedded signature assignments for each SLO into all sections of 100-level information literacy courses taught at the University of Northern Colorado. At the end of the 2019–2020 academic year, we completed one full cycle of our assessment plan. We continue to meet regularly to analyze data collected from the signature assignments and to discuss how we can use what we’ve learned from the assessment data to improve teaching and student learning.

Conclusion

Similar to the process we used to develop our SLOs, this process for developing a program-wide assessment plan is meant “to be flexible and work within the context of any instruction program.”6 While the process described here was used to assess shared credit course SLOs, it can also be used by libraries looking to assess library-wide SLOs for one-shot or embedded information literacy instruction programs. Below are some suggestions for successfully implementing this process at your library.

Be prepared to rewrite SLOs. During this process, we revised multiple SLOs as we discovered the SLO language did not truly reflect what we wanted to know about student learning. We recommend reviewing each SLO before you begin developing signature assignments to make sure you are satisfied with the language and that the SLO describes what you want to assess. Then, regularly revisit the SLO while developing the signature assignment. Sometimes it is the process of creating the assessment that reveals an issue with the learning outcome being assessed. It is also a good idea to review the SLO as you review the pilot data; this is often where we realized that the data we were collecting did not capture what we really wanted to assess in our program.

Do not skip the pilot phase. The pilot phase is critical to project success. The pilot reveals issues with the signature assignment, the SLO, and the data collection and analysis process. We piloted each SLO in multiple courses, which allowed us to look at a small sample of data and get feedback from the instructors about the logistics of implementing the assignment. The pilot allows you to make changes to the assessment and start over, if needed, before full-scale implementation of the signature assignment.

Work on one SLO at a time. When we first began developing our assessment plan, we believed we could develop it in two years by working on one SLO each semester. We underestimated the amount of time it would take to develop, pilot, analyze data, and revise the signature assignment for each SLO. We slowed down our process and focused on one SLO each academic year. This made the process longer, but easier to manage. As we made progress, we were often able to have one signature assignment in pilot phase and another in development phase. A timeline will help keep the process on track, but be prepared to adjust as needed.

Set aside time for this process. Our Curriculum Committee meets monthly. We were able to accomplish a lot during these meetings, but we also had other business to attend to. At the end of the first year, we held our first daylong assessment retreat, which gave us more time for in-depth discussions. We have continued to hold yearly assessment retreats to analyze data and discuss changes to the curriculum for the following year.

Document every decision. Early on we did not take extensive notes during meetings, which we regretted. With all the changes you will make to your SLOs, assessments, and curriculum throughout this process, it is important to document each decision you make. Assign a notetaker for each discussion, and keep notes in a central location for easy access.

Determine a project manager, but share the work. There are a lot of moving parts to this process, and a project manager is key. Our project manager was the committee chair. This person determined the timeline, set meetings, kept track of each phase, and assigned tasks. While it is important to have a project manager, the work must be shared. For one, it is too much work for one person. More importantly, sharing the work builds buy-in and ownership of the process for all instruction librarians.

Developing an assessment plan for shared information literacy learning outcomes is a time-intensive process, but it is possible with teamwork, organization, and a willingness to revisit past decisions.

Notes

  1. Andrea Falcone and Lyda McCartin, “Be Critical, But Be Flexible: Using the Framework to Facilitate Student Learning Outcome Development,” C&RL News 79, no. 1 (2018): 16–19.
  2. Office of Institutional Research, Planning and Effectiveness, “Signature Assignment: Quick Reference Guide,” Pratt Institute, June 17, 2013, https://www.pratt.edu/uploads/signature-assignment-resources.pdf.
  3. Brianne Markowski, Lyda F. McCartin, and Rachel Dineen, “Using Signature Assignments to Assess Information Literacy Outcomes” (poster presentation, Library Assessment Conference, Chicago, IL, forthcoming).
  4. Linda A. Suskie, Assessing Student Learning: A Common Sense Guide, 2nd ed. (San Francisco: Jossey-Bass, 2009): 233–52.
  5. See Lyda McCartin et al., “Assessment: 100-Level LIB Course Assessment Plan,” University of Northern Colorado, 2020, https://digscholarship.unco.edu/infolit/24.
  6. Falcone and McCartin, “Be Critical,” 19.
Copyright Lyda Fontes McCartin, Brianne Markowski, Stephanie Evers
