Making Assessment Matter
Start at the End
Strategies for Actionable Assessment Results
© 2025 Megan Oakleaf and Becky Croxton
Making Assessment Matter is a four-part C&RL News series focused on maximizing the impact of academic library assessment. This first article outlines four key strategies for launching assessment projects designed for action and impact. Future articles will explore how to anticipate decision-making pathways to encourage follow-through, engage participants early to increase the likelihood of actionable outcomes, and craft communications that present compelling results to key stakeholders. Together, the series equips librarians to use assessment to drive meaningful change.
Introduction
Academic library assessment requires knowledge, time, effort, and a commitment to reflection and change. Assessment projects hinge on a willingness to listen and continuously improve. However, many library assessments that are carefully conceptualized, designed, and deployed never “go” anywhere in terms of resulting in beneficial decision-making and action-taking. Assessment librarians and higher education assessment experts have long articulated the importance of “closing the loop” in assessment—that is, using assessment results for improvement. Even so, library assessment projects often never quite reach that point. To realize the benefits of assessment, librarians can design assessments from the outset with an eye to the decisions and actions that may result from them. In short, starting assessments with the end in mind is a key strategy for ensuring that projects result in positive change.
Four key strategies can increase the likelihood that an academic library assessment will lead to beneficial outcomes.
1. Identify the Big Picture Questions and Problems to Solve
To ensure assessments lead to meaningful action, start by identifying the overarching questions or problems that need to be addressed. Taking a “big picture” approach helps ensure that the library’s assessment efforts are strategic, impactful, and aligned with broader institutional and organizational goals.
To focus your assessment project, ask yourself—or your stakeholders—“What is the primary purpose of this assessment?” Often, the goal is to better understand the library’s users, community, services, resources, or spaces in order to inform continuous improvement. Assessments may also be conducted to generate data to guide decision-making, enhance services, or demonstrate the library’s value.
At the outset, it’s essential to envision what the results will enable you to do. This step adds clarity, purpose, and strategic direction to the project. Knowing how you intend to use the results ensures the assessment is goal-driven rather than an exercise in collecting data for its own sake. It also helps ensure the data collected will lead to specific, actionable improvements rather than general observations. Broadly, answers to the question “What do you hope the assessment results will enable you to do?” might include making a positive difference for users; engaging and communicating with users; improving services, collections, and spaces; aligning expenditures with priorities; or demonstrating alignment with the library’s values and user needs.
An often overlooked but important question to ask before beginning a project is “What consequences might occur if the library does not conduct an assessment?” Choosing not to assess may result in missed opportunities to engage with users, wasted resources, failure to meet stakeholder expectations, continuation of ineffective or harmful practices, or a diminished perception of the library as a valuable contributor, to name just a few.
2. Apply a Listening Framework to Build Understanding
Applying a listening framework to assessment planning increases the likelihood of beneficial outcomes. While a cursory purpose for an assessment project might be easily identified, it’s important to probe deeper to learn more about the complex rationale for the project. At its core, assessment is about listening.1 True listening moves beyond surface-level answers and requires noticing, maintaining attention, and using perception to improve awareness, understanding, empathy, communication, and responsiveness. Indeed, many assessment projects are designed to listen to users and their perspectives. Some are designed to listen and solicit feedback from our library worker colleagues (either as individuals, as members of the library organization, or as secondary sources of user perspectives). Yet other assessments enable us to “listen” to our library services, spaces, collections, or technology, and determine how effective, efficient, or impactful they are or could become.2 Most library assessment projects use listening to close a gap of awareness or knowledge and ultimately create a bridge from what the library is currently doing to what the library could do better in the future.
All assessments should be focused on listening. However, the listening process does not begin with the deployment of an assessment methodology; it starts at the project’s inception. Using a listening framework helps assessment practitioners uncover the deeper motivations behind the assessment project. What do you really want to know? If you’re not the initiator of the assessment, what does the initiator want or need to know? And why do you (or they) want to know it?
There are often multiple underlying purposes for an assessment. You might want to learn what your users value most or how well the library is meeting their needs. A colleague or department may be seeking ways to evaluate their performance or align their actions with stated or implicit values. Library leadership might be interested in understanding—or enhancing—the actual or perceived value of library engagement. Or, perhaps you, as an assessment practitioner, want to explore whether the library is learning from its experiences, improving over time, or making meaningful progress.
As you listen for the deeper motivations behind an assessment, one valuable lens to apply is the concept of triple-loop learning.3 As a model, triple-loop learning recognizes that information gaps exist and acknowledges that to learn (and improve), we can ask ourselves three key questions:
- Are we doing things right? (single-loop learning)
- Are we doing the right things? (double-loop learning)
- How do we know they’re the right things? Are we right about those beliefs? (triple-loop learning)
Listening carefully—to users, colleagues, leaders, and ourselves—as we explore these questions is essential to designing, implementing, and acting on meaningful assessment projects. Triple-loop learning encourages us not only to evaluate actions and strategies, but also to reflect on the values and assumptions that shape them.
3. Articulate the Underlying Impetus for the Assessment
The third step in focusing an assessment with the end in mind is to use what you’ve learned from your investigations to clearly articulate the underlying impetus for the project; this ensures that the assessment responds to the needs of decision-makers, action-takers, and those who will experience the end results of the project. (The articulation process can also help you uncover and challenge any false assumptions revealed in your information gathering.) Most assessments seek to increase understanding; familiarity with four common categories for assessment drivers can help articulate the task ahead.
- Assessments are often rooted in values and priorities—those of users, library workers, administration, institutional leaders, or other communities. In some cases, these values and priorities serve as the guiding principles of an assessment project. They can also shape the attitudes and behaviors of those conducting or participating in the assessment. Choices made during the design, deployment, analysis, and communication of any assessment project reveal the values of those involved.
- Assessments are frequently focused on changes and trends. Some assessments seek to help libraries understand, forecast, or influence trends within or outside the library. These may include institutional, political, economic, cultural, social, technological, environmental, or legal shifts. The goal might be to understand how library users, personnel, services, resources, facilities, or operational workflows currently function—or to anticipate how the library will be impacted in the future.
- Assessments may be used to evaluate whether to sunset a particular service or resource. Because libraries sometimes struggle to discontinue offerings, assessments can help gauge the relevance, impact, political consequences, or the effort required to revitalize them.
- Assessments are usually intended to close a knowledge gap. Most librarians know their libraries and users well but will acknowledge that there are perspectives and insights they have yet to discover. While we can’t always anticipate how an assessment will add to our professional knowledge, it helps to begin with a clear sense of the gaps we’re trying to close while staying open to unanticipated insights.
4. Create User Stories to Guide Assessment
Once you’ve uncovered the purposes of an assessment, creating user stories is a powerful step toward ensuring actionable results. The user story model offers a simple, three-part structure that answers the who, what, and why of assessment.4 By using these three components, user stories link library services, resources, and spaces to intended outcomes. This action-oriented approach keeps solutions open-ended while providing useful constraints, focusing the assessment on outcomes rather than activities.
Each element of a user story emphasizes impact:
- Who needs to make or direct a change?
- What question or need can be acted upon?
- Why are we making the change? Is there an intended purpose or outcome?
User stories can be formatted in various ways, such as:
- As [who], I want [what], so that [why].
- As a [user], I want [goal], so that [reason].
- As [stakeholder], I want [to do something] in order to [achieve outcome].
Here are a few examples:
- As a librarian [who], I want to know whether students who use reference services earn better course grades [what], so that I can advocate for more resources and improve service delivery [why].
- As an administrator [who], I want to understand which engagement activities support student success across populations [what], in order to tailor outreach and services [why].
- As a student [who], I want to know if using library resources saves me money on textbooks [what], so I can reduce debt and stay in school [why].
Conclusion
In short, grounding your library assessment project in these four key strategies helps ensure actionable results right from the start. Together, they enable assessment practitioners to avoid projects that stall or lack direction and to increase the likelihood of generating results that support decision-making and action-taking for positive change.
The next article in the series will highlight key strategies for engaging assessment participants early in the process to increase the likelihood of actionable outcomes. 
Notes
- Shelley E. Phipps, “Beyond Measuring Service Quality—Learning from the Voice of the Customers, the Staff, the Processes, and the Organization,” in Proceedings of the ARL Measuring Service Quality Symposium, Washington, DC, October 20–21, 2000 (Washington, DC: Association of Research Libraries, 2001).
- Joseph R. Matthews, “A Culture of Assessment,” in Strategic Planning and Management for Library Managers (Westport, CT: Libraries Unlimited, 2005), 95–107.
- Mark Holmgren, “Becoming a Learning Organization,” Tamarack Institute, May 14, 2014, https://www.tamarackcommunity.ca/latest/becoming-a-learning-organization.
- Megan Oakleaf, Library Integration in Institutional Learning Analytics (EDUCAUSE, 2018), https://library.educause.edu/resources/2018/11/library-integration-in-institutional-learning-analytics.