
Be aware: Elevate your news evaluation

Emphasizing media literacy, one library’s initiative

The following article outlines the University of California-Merced Library’s unfolding news evaluation campaign,1 shares our strategies, and reflects on our efforts. The impetus for this campaign came when a colleague shared Vanessa Otero’s News Quality Chart, a graphic that places news sources on X and Y axes representing quality and partisan bias.2 Otero’s work, combined with increasing public concern and conversation about the legitimacy of news, propelled my colleagues and me to start discussing how we might emphasize media literacy, especially news evaluation. We started our discussion just prior to the spring semester and launched our campaign a few weeks later. Though this meant limited time for planning, we wanted to capitalize on the opportunity to promote information literacy by initiating and participating in a broader campus conversation about news evaluation.

Campaign components

Exhibit

Due to our interest in sharing the content of Otero’s graphic, we proposed creating an exhibit with both physical posters and an accompanying digital signage script. We hoped to use the exhibit to generate conversation about news sources and news evaluation. We started by creating outcomes to give focus and boundaries to the exhibit content. Specifically, we wanted viewers to think more deeply about their evaluation and consumption of news media sources, become more aware of the range of news sources available, increase their knowledge about news sources and news types, recognize that news article types have specific purposes, and identify resources to assist in news evaluation.

The exhibit included four main parts. We adapted Otero’s graphic and used it as the centerpiece of the exhibit. Our adaptations included visually simplifying the graphic, adding a couple of additional right-leaning sources, modifying labels, and adding a title—“Spectrum of News Sources.” Near this graphic, we ran a digital signage script with content outlining our current news environment, reasons for becoming a critical news consumer, and strategies for news evaluation. The final portion of the exhibit included 12 posters, each representing a specific news source found on the “Spectrum of News Sources” graphic. My colleagues and I deliberately chose sources representing the full partisan spectrum and researched these sources to create poster content. For each news source, we included a brief source summary, founding date, political spectrum placement, level of factual reporting, and a couple of fun facts. The posters were arranged on a wall in order from the most left-biased to the most right-biased source. We also made all exhibit materials available for online viewing.3

Social media

In conjunction with the exhibit, we floated the idea of highlighting the digital signage content through our library’s Twitter and Facebook channels. We generated ideas for content, and our communications coordinator and a student assistant took the lead on posting items that advertised the exhibit, reinforced concepts from our script, and pointed to campaign events. Our followers saw two to three postings related to news evaluation most weeks. In addition to these regular postings, our campus’ University Communications department featured our deputy university librarian in a 20-minute Facebook Live event titled “Ferreting Out Fake News.”4

Course instruction

In addition to our exhibit and social media posts, we included an instructional piece designed to increase students’ ability to evaluate news. We chose to target classes, specifically introductory Writing Composition courses, since voluntary student workshops have, in our past experience, often been limited in attendance and reach.

Librarians Elizabeth McMunn-Tetangco, Sara Davidson Squibb, and Lindsay Davis display posters with responses to exhibit questions.

Our lesson plan involved a jigsaw activity in which students were responsible for reading and evaluating a single news article, without news source information, before being exposed to two other articles on the same topic through their peers. Through guided questions, students discussed each article’s level of accuracy and bias. We ended our lesson with a full-class discussion and shared resources students could consult when unfamiliar with a specific news source (e.g., MediaBiasFactCheck and AllSides). Our lesson plan and associated materials are posted at the Community of Online Research Assignments.5

Faculty workshop

Through our library instruction efforts, we were able to work closely with students, but we also wanted to expand the news evaluation conversation to faculty and staff. To this end, we worked with campus partners to develop a workshop focused on instructional strategies for fostering critical thinking, especially as applied to news evaluation. Though we could have offered this on our own, we approached our campus’ Center for Engaged Teaching and Learning (CETL) with the idea of presenting this workshop as part of a special topics series. While we prepared workshop content in collaboration with two Writing program faculty, CETL supported the workshop by securing the venue and advertising the event. All workshop presenters shared approaches and resources they had used to engage students in critical thinking and to increase students’ ability to understand and evaluate news.

Special events

The final pieces of our campaign included several special events. We were able to organize and host a presentation by Emmanuel Vincent, titled “Fostering More Accurate Science Coverage: Using Science Expertise to Evaluate Journalism.” Vincent, project lead of Climate Feedback,6 spoke about the importance of news evaluation, the challenges of our current news environment, and how climatefeedback.org aims to help readers identify trustworthy sources of information and promote critical thinking.

Climatefeedback.org, hosted by our campus’ Center for Climate Communications, includes a network of more than 200 scientists worldwide who annotate news articles on climate change, rating them for accuracy and credibility with the goal of “distinguish[ing] inaccurate climate change narrative from scientifically sound and trustworthy information.”7 We were fortunate to have his expertise available locally.

In addition to this talk, we hosted tabling events at the exhibit on two consecutive days as a way to draw attention to exhibit content and talk with students more deeply about their own experiences consuming and evaluating news. Our tabling events involved setting up three information stations located at different points of the exhibit. Students who visited each station and had their participation card initialed by a librarian received a snack at the end of the activity. At Station 1, we asked students to respond to two questions on large posters: What are some of the news sources you read most? How much of your news do you get through friends, family, Facebook, or other social media? At Station 2, we explained the Spectrum of News Sources graphic and showed online resources for learning more about news sources’ biases. Students then viewed the 12 news source posters before ending at Station 3 to answer a final question: How do you decide which news sources and news articles are trustworthy?

Exhibit logo and brand; slide one of digital signage script. Credit: Kristopher Kline.

Reflection

In reviewing the campaign, we know that some of our efforts generated more interest and engagement than others. I also came away from this campaign with two primary observations. While these are not new, they reinforce key elements of successful outreach and instructional efforts.

Reflection 1: Partnerships and collaborations broadened our audience and reach. Personally, I consider our instruction efforts, faculty workshop, Facebook Live participation, and special speaker talk to have had the greatest reach because each involved working with a variety of members of our campus community. By partnering with Writing faculty to offer our news evaluation lesson in writing classes, we were able to interact with more students than we would have reached by offering a voluntary workshop. For the faculty workshop, we found value in coordinating with CETL, given its focus on professional development opportunities for faculty and its advertising avenues. In addition, we ensured that the workshop content included other faculty perspectives and not just library voices.

On the social media front, the Facebook Live event initiated by University Communications with our deputy university librarian generated more than 2,200 views. Our own social media postings had a number of views but did not engage our audience to the same extent as the Facebook Live event; part of this could be attributed to University Communications’ larger social media audience. Lastly, we were very fortunate to have Vincent contribute to our larger campaign by sharing his expertise about news evaluation from the perspective of a scientist and lead of climatefeedback.org. We found much value in these collaborations and partnerships. In retrospect, we may have been able to generate even more attention and engagement by leveraging these collaborations further, especially by coadvertising and cohosting events.

Reflection 2: Just as partnerships and collaborations were valuable, we also found that interactivity is important. My colleagues and I really enjoyed the activity in the classroom, the one-on-one conversations with students at tabling, and the spirited discussion of bias at the faculty workshop. As the number of Facebook Live views indicates, that 20-minute event received a lot of attention. In future endeavors, we would look to incorporate more interactive elements, especially for any exhibit.

During planning, we initially toyed with ideas for more exhibit interactivity. For instance, we discussed whether our adaptation of Otero’s graphic could be made interactive, with the ability to move sources around. We did not pursue this route, largely due to our time crunch and, perhaps, a little apprehension about where some news sources might end up on the spectrum. However, I think there is potential to take a risk with an interactive news source spectrum graphic, especially one focused on a single axis, such as the partisan axis. Velcro or clothespins, anyone? Though we tried tabling with our information stations at the actual exhibit, I think we also could have driven more traffic to the exhibit, and had more substantial conversations about critical evaluation of sources, by hosting tabling events in other campus locations during the first couple of weeks of the exhibit.

Conclusion

Overall, we learned much, benefited from collaborations, and found that we have more work to do in terms of assessing our project’s impact. We did not formalize a way to measure the value of this campaign prior to launch and are still wrapping our heads around how to assess the impact of a physical exhibit.

Though we did not formally measure our impact, we have had hallway conversations about source evaluation, rich discussions with students in the classroom and at tabling, and kudos from campus colleagues about the campaign’s offerings. I believe the campaign raised the library’s visibility and reiterated our role as educators who foster students’ ability to think critically and evaluate sources accurately. We encourage others to launch efforts that help library users navigate a complex, and often confusing, information environment. We have provided our exhibit materials online for those who may wish to use or adapt them for their own purposes.8

Acknowledgements

I would like to thank my colleagues Lindsay Davis and Elizabeth McMunn-Tetangco who were deeply involved in planning. They created exhibit content, taught classes, and staffed our tabling events along with Elizabeth Salmon and Joe Ameen. I also want to give a big shout out to our new colleague Breanna Wright who ran with social media posting ideas and ensured that our physical and digital exhibits were created and displayed. She willingly and enthusiastically supported our efforts. In addition, our student assistants made important contributions. Kristopher Kline designed most campaign graphics, while Borna Zandipour made timely and regular social media posts.

Notes

  1. “UC Merced Library Fights ‘Fake News’ with Campaign, Exhibit,” Panorama, March 30, 2017, accessed May 12, 2017, http://panorama.ucmerced.edu/news/uc-merced-library-fights-fake-news-campaign-exhibit.
  2. Vanessa Otero, “High Resolution File Formats for Full Chart and Blank Versions of News Quality Chart,” All Generalizations are False, last modified January 23, 2017, accessed May 12, 2017, www.allgeneralizationsarefalse.com/.
  3. Sara Davidson Squibb, “About the Exhibit,” Elevate Your News Evaluation, accessed May 12, 2017, http://libguides.ucmerced.edu/elevate-news-evaluation/about.
  4. Donald Barclay, “Ferreting Out Fake News,” UC Merced Facebook, last modified March 17, 2017, accessed May 12, 2017, https://www.facebook.com/ucmerced/videos/10154585516649151/.
  5. Sara Davidson Squibb, “News Evaluation—Beyond the Checklist,” Community of Online Research Assignments (CORA), last modified May 9, 2017, accessed May 12, 2017, https://www.projectcora.org/assignment/news-evaluation-%E2%80%93-beyond-checklist.
  6. Climate Feedback, accessed May 12, 2017, http://climatefeedback.org/.
  7. “About,” Climate Feedback, accessed May 12, 2017, http://climatefeedback.org/about/.
  8. Sara Davidson Squibb, “Downloads,” Elevate Your News Evaluation, accessed May 18, 2017, http://libguides.ucmerced.edu/elevate-news-evaluation/downloads.
Copyright Sara Davidson Squibb
