Evidence to Action
Communicating for Meaningful Change
© 2026 Becky Croxton and Megan Oakleaf
Making Assessment Matter is a five-part C&RL News series focused on maximizing the impact of academic library assessment. The first article introduces strategies for launching assessment projects designed for action and impact. The second examines how to equip librarians to use assessment results in practice, while the third identifies pathways for converting those results into action. This fourth article focuses on communication and explores how to present assessment results in ways that resonate with key decision-makers, prompt meaningful change, and sustain engagement and investment in assessment as a practice of continuous improvement. The final article will explore how AI can enhance assessment across the entire cycle while keeping expert judgment and responsible practice at the center. Together, the series empowers librarians to use assessment to drive meaningful change.
Introduction
Effective communication is a powerful tool for maximizing the impact of assessment. When assessment results are clearly and thoughtfully shared, they spark meaningful dialogue, guide evidence-based decision-making, and inspire transformational change. Communicating assessment work ensures that the time, effort, and expertise invested in the process lead to improvements that matter to key audiences. Unfortunately, communication is often an overlooked component of assessment practice. Without intentional and well-crafted messaging, assessments may go unnoticed, be misunderstood, or fail to reach decision-makers. In some cases, the absence of communication can erode trust when changes occur without an explanation or rationale.
To amplify positive impact and avoid these pitfalls, librarians should design communications that reach target audiences with clear, compelling, and actionable messages. The ability to tailor assessment findings to the needs and expectations of diverse stakeholders is an essential professional skill that helps ensure assessments contribute to informed decisions and meaningful improvements.
The following sections outline six strategies librarians can use to design and deliver assessment communications that resonate with key audiences and support positive change.
1. Identify Key Communication Audiences
The second article in this series, “From Subjects to Partners: Centering Participants in Library Assessment,” positioned the identification and engagement of internal and external stakeholders as foundational to assessment work.1 This provides a strong starting point for determining who should receive assessment results. If individuals or groups helped inform a study or will be affected by its findings, they should be included in assessment communications.
Beyond study participants and users, key audiences often include decision-makers responsible for allocating time, resources, roles, and responsibilities; individuals who control or influence funding; and others who may be affected by changes resulting from the assessment, even if they were not directly involved. Decisions about key audiences should be guided by the assessment’s purpose, context, and framing, as well as by the decisions or actions the results are intended to inform.
Identifying key audiences is more than compiling a distribution list. Effective assessment communication requires attention to what will make information meaningful and usable for different groups. Consider audiences’ familiarity with assessment methods, their values and perspectives, and the contexts in which results will be interpreted and applied. Checking in with representatives of key audiences helps ensure that communications align with the intended goals of use, employ appropriate language, and reflect preferences for format and scope.
Examining audiences’ needs for summary or detail, levels of engagement with results and subsequent actions or decisions, and preferred modes of communication helps clarify how assessment information can best be positioned for use. As key audiences are identified, step back to consider the role assessment results are expected to play for each group. Rather than asking only who should receive findings, focus on how assessment information will be used and what conditions are necessary to support action.
Who will need access to the results?
- Individuals or groups represented in the data or affected by the resulting changes
- Decision-makers responsible for setting priorities, allocating resources, or approving next steps
- Partners or collaborators involved in implementing recommendations
- Funders or sponsors with responsibility for sustaining efforts over time
How will the results serve each audience?
- Inform decisions or guide strategic direction
- Support operational or service-level improvements
- Build a shared understanding of challenges or impact
- Demonstrate alignment with institutional goals or accountability expectations
With key audiences and purposes defined, assessment communication can shift from identification to design. Figure 1 outlines core documents that support this shift, helping practitioners articulate positioning, clarify key messages, select appropriate evidence, and choose effective communication media. The next sections explore each of these components and how they work together to support audience-centered use of assessment results.
2. Draft a Positioning Statement
The first step in preparing effective communication of assessment results is to create a document known as a positioning statement. Positioning statements are brief (typically 200 to 500 words), internal-facing, and designed for those directly responsible for the assessment project. Because they organize the main results of an assessment project succinctly for an internal audience and articulate the core narrative demonstrated by project results, they may assume a knowledgeable reader and include jargon. The goal of a positioning statement is to summarize the answers to the project’s driving questions and state the main ideas revealed by the assessment results. It should emphasize impact, reflect the values of the project partners, and answer questions such as:
- What do we want others to know about our results?
- What takeaways are important to communicate?
- What results are actionable or help improve understanding of the situation under consideration?
By encapsulating the purpose, results, potential impacts, and likely actions indicated by an assessment for project “insiders,” internal positioning statements are a foundational step for crafting external assessment communications, including key messages customized for individual target audiences. Table 1 provides a useful guide for crafting a positioning statement describing assessment results. You might also consider adding sections on unexpected findings and limitations or caveats to your positioning statement.
Table 1. Positioning Statement Guide
| Component | Goal | Guiding Questions | Your Response |
|---|---|---|---|
| Assessment context and purpose | Establish scope and shared understanding for internal readers. | | |
| User stories and/or driving questions | Center the statement on the original intent of the project. | | |
| Key findings | Summarize the most important results succinctly. | | |
| Core narrative | Synthesize findings into a single, coherent story. | | |
| Impact | Connect results to institutional or partner values. | | |
| Actionable insights | Identify decisions or improvements implied by results. | | |
3. Distill the Positioning Statement into Key Messages for Individual Audiences
Building on the foundation provided by a positioning statement, effective communication shifts outward to crafting key messages tailored to the needs of individual audiences. Key messages are external-facing, ready-to-deliver statements that articulate the most important thing a particular audience needs to know about an assessment. Key messages are not slogans or taglines; they should be purposeful and concise (typically twenty words or fewer) and designed to support understanding and action.
While they are grounded in a shared positioning statement, key messages are customized to reflect what matters most to specific audiences, what decisions or actions they influence, and the context in which they will encounter the information. In practice, this often results in multiple distinct messages for a single assessment project.
One useful approach is to develop a simple chart, as illustrated in Table 2, that pairs each target audience with its corresponding key message. This exercise helps clarify priorities, surface gaps or redundancies, and ensure that messages remain focused and aligned with intended audiences. When clearly articulated, key messages provide a strong foundation for selecting evidence, shaping materials, and choosing appropriate communication formats in subsequent steps.
Table 2. Sample Key Messages for an Information Literacy–Related Assessment
| Target Audience | Key Message |
|---|---|
| Provost | Undergraduate students who participate in library instruction demonstrate higher first-to-second-year retention than non-participants. |
| University faculty (at large) | Courses with high DFW rates that integrate library instruction show lower drop, fail, and withdrawal rates. |
| University faculty who teach entry-level writing (Composition 150) | Students in Composition 150 sections that include library instruction earn higher course grades than students in sections without it. |
4. Identify Proof Points for Each Key Message
Standing alone, a key message may appear to audiences as an unfounded assertion. Proof points are the evidence, data, or other information that backs up key messages and strengthens audiences’ confidence in them. They are essential for assessment communications that are believable and well documented, and they answer initial audience questions such as “How do you know?” or “By how much or to what degree?” Generally, proof points emerge from your assessment results, though in some cases they may be derived from related evidence collection or research. Use proof points to provide evidence of the impact, value, accuracy, and/or validity of your key messages.
When identifying proof points for your key messages, take care not to overwhelm your audience. Stay alert for data points that are not directly relevant, and ensure any data visualizations are clear and accurate. It’s also essential to acknowledge limitations or flaws and not misrepresent or overstate your proof points. If more nuance is necessary, use appendices or links to additional data to ensure that the full picture is accessible to audiences with questions or more detailed information needs. To clearly connect key messages with their supporting evidence, Table 2 can be expanded to include proof points, as illustrated in Table 3.
Table 3. Sample Proof Points Aligned with Audience and Key Messages
| Target Audience | Key Message | Proof Points* |
|---|---|---|
| Provost | Undergraduate students who participate in library instruction demonstrate higher first-to-second-year retention than non-participants. | Retention rates |
| University faculty (at large) | Courses with high DFW rates that integrate library instruction show lower drop, fail, and withdrawal rates. | DFW rates in STEM courses |
| University faculty who teach entry-level writing (Composition 150) | Students in Composition 150 sections that include library instruction earn higher course grades than students in sections without it. | Average course grades for CO 150 |
*Include measures of statistical significance (e.g., ANOVA or t-test results with p-values and effect sizes).
5. Select Communication Media That Match the Target Audience
Preparing a summary report that describes the assessment’s purpose, findings and interpretation, and conclusions is always good practice. Such reports provide an essential record of the work and ensure transparency and rigor. However, reports alone are rarely sufficient to support effective communication that inspires action or informs decision-making. Not all audiences will have the time, interest, or contextual knowledge needed to engage fully with a comprehensive report, even when the findings are relevant to their roles, responsibilities, and decision-making needs.
Selecting communication media that align with audience needs and contexts is a critical step. Alternative formats can help convey key messages more effectively. Depending on the audience, this may include a brief slide deck that combines text and visualizations to highlight major findings; a one-page executive summary that foregrounds implications and recommendations; or infographic-style handouts that make results accessible and scannable. Regardless of format, providing access to the full report remains important for those who wish to explore the assessment in greater detail.
When choosing communication tools, several guiding questions can help inform decisions:
- Does this communication tool address the right audience and speak to their values, priorities, and needs?
- Does it convey the message and image you want to project?
- Will it produce intended results?
- What tasks and resources are required to implement this messaging?
Before finalizing tools and designs or broadly disseminating materials, test them with representatives of the target audience to ensure they resonate and will support intended uses.
6. Construct a Communication Matrix to Organize Messaging
A communication matrix is an effective tool for managing multiple audiences and delivering tailored messages through an array of communication media. A typical matrix specifies both the audiences being addressed and the communication channels through which key messages will be delivered. By organizing audiences and communication venues, a matrix ensures consistent messaging in terms of professional tone and required branding. It also enables efficiencies. For example, if social media messages must undergo an approval process, assessment communicators can “batch” messages designed for diverse audiences through the process systematically, reducing “one-off” workflows that result in slower timelines, additional labor, or omissions. Table 4 depicts a simple communication matrix. A more detailed matrix could include the relevant key message at the intersection of an audience and a communication medium, rather than an “X” or checkmark.
Table 4. Sample Communication Matrix with Audience and Communication Media
| Audience | Email Blast | Social Media | Library Newsletter | Campus Communication | Presentation | Report |
|---|---|---|---|---|---|---|
| Provost | X | X | X | | | |
| Library dean | X | X | X | | | |
| Library public services staff | X | X | X | | | |
| Faculty | X | X | X | X | | |
| Student advisory group | X | X | X | | | |
| First-year students | X | X | | | | |
Conclusion
It’s important to remember that even the most carefully crafted communications may prompt questions, hesitation, or resistance. Be prepared to explain your methods, defend your analyses, and articulate the rationale behind the decisions and actions you recommend. Responding with clarity, patience, and a commitment to improvement reinforces assessment as an ongoing, collaborative process and increases the likelihood that insights will lead to positive change.
Approached thoughtfully, assessment communication serves as a catalyst for understanding, decision-making, and action. When assessment results reach the right people, in the right form, at the right time, audiences are empowered to act on insights and create positive change.
As the fourth article in this series, this piece underscores how communication brings assessment full circle. Library assessments designed with a focus on results, participant and stakeholder engagement, potential impacts, and intentional communication power a values-driven approach that centers people, fosters shared understanding, and strengthens libraries’ capacity to learn, adapt, and evolve.
Note
1. Megan Oakleaf and Becky Croxton, “From Subjects to Partners: Centering Participants in Library Assessment,” College and Research Libraries News 86, no. 11 (2025): 449–54, doi: https://doi.org/10.5860/crln.86.11.449.