Rethinking CRAAP

Getting students thinking like fact-checkers in evaluating web sources

Jennifer A. Fielding is coordinator of library services at Northern Essex Community College-Lawrence Campus, email: jfielding@necc.mass.edu

For over two decades, librarians have been at the forefront of helping their patrons and students discern which online information is reliable and which may be biased or outright false. As more formal information literacy programs developed at the college and university level (with the attendant inclusion of information literacy in many general education programs), academic librarians have developed curricula and taught students how to evaluate web sources for credibility. In many institutions, this has frequently been achieved via a “one-shot” session built around a checklist of sorts, often some variation of the CRAAP Method (Currency, Relevance, Authority, Accuracy, and Purpose) developed nearly 15 years ago at California State University, Chico.1

The CRAAP method encourages the user to perform an in-depth analysis of the website to determine its credibility, often by finding and analyzing the “About Us” section and thoroughly exploring the site to determine whether there are named authors, what their credentials may be, and what the publishing organization’s stated purpose or mission is. Additional CRAAP assessments include how recently the site has been updated, whether page links are working and lead to other reliable information, and whether the site is a commercial, nonprofit, or educational one. Students focus on the site itself, performing a “deep dive” into what they find at a particular URL. As currently employed, the CRAAP method does not explicitly encourage leaving the site to place any content found there in a wider context.

However, in recent years the dissemination of mis- and disinformation online has become increasingly sophisticated and prolific, so restricting analysis to a single website’s content, without understanding how that site relates to the wider information landscape, now has the potential to facilitate the acceptance of misinformation as fact. Once a site is deemed “credible,” all information on it is frequently trusted and taken at face value. This is clearly problematic, since studies show that once information has been accepted as valid and assimilated by the user, it becomes far more difficult to counter, even with accurate facts.2,3,4

In addition, considering the disciplinary shift away from the static Information Literacy Competency Standards for Higher Education to the more organic and contextual approach of the Framework for Information Literacy for Higher Education, it seems apparent that the pedagogy surrounding the critical assessment of web sources must also evolve.

The Stanford study

A 2017 Stanford working paper by Sam Wineburg and Sarah McGrew throws this needed evolution into stark relief, comparing the web content evaluation skills of students, faculty, and professional fact-checkers. The researchers found that faculty (arguably information-savvy critical thinkers) performed barely better than undergraduates in assessing the credibility of web content, primarily because they used the deep-dive type of assessment endorsed by methods like CRAAP: thoroughly examining the site itself.

Fact-checkers, on the other hand, almost immediately began an independent verification process, a strategy the researchers dubbed “lateral reading”—opening multiple tabs, and searching for independent information on the publishing organization, funding sources, and other factors that might indicate the reliability and perspective of the site and its authors or sponsors.

This lateral reading approach produced significantly better results for the fact-checkers, in both the accuracy of their assessments and the speed of their conclusions, than the “vertical reading” deep dive did for the student and faculty groups.5

Informal trials with lateral searching

To trial this research finding and potentially incorporate it into their teaching methods, a group of community college librarians revised their basic one-shot content to include a “lateral reading” assessment with their first-year students.

Northern Essex Community College (NECC) is a two-year, associate’s degree-granting institution 30 miles north of Boston, Massachusetts. Almost all students in every program take an English 101 and/or English 102 course, during which research and information literacy skills are frequently addressed in the course content and/or with a “one-shot” library visit. In addition, as part of the college’s Core Academic Skills requirements, all students graduating with an associate’s degree must complete a course designated as “intensive” in information literacy within their discipline.

In the fall 2018 semester, several NECC librarians adapted the portion of some one-shot sessions where the CRAAP method would have been used to assess the credibility of a website. In one example of this alternative “lateral reading” activity, students compared two of the top ten results from a Google search on “asthma”: asthma.com versus medlineplus.gov/asthma.html.

While the instructor did note the difference in the domain names as a likely indicator of the purpose of each site (part of the CRAAP method), students were then encouraged to open new tabs and search for information on both GSK (which students had identified as the publisher of asthma.com) and MedlinePlus. Students quickly discovered that GSK is the pharmaceutical company GlaxoSmithKline, while MedlinePlus is published by the National Library of Medicine.

Students then skimmed information on both of these organizations on sites like Wikipedia and news sites, and were asked to assess which would deliver more trustworthy information regarding asthma diagnosis, prognosis, and treatment. At this point, students frequently began spontaneous discussions of what they had found regarding GSK’s legal battles and the ethical implications of drug companies giving health advice, which often led to animated conversations about the responsibilities of information creators.

The simple shift to a lateral reading method not only visibly engaged students more thoroughly in the process, but also directly applied several frames from the ACRL Framework for Information Literacy for Higher Education: Searching as Strategic Exploration, Information Has Value, and Authority Is Constructed and Contextual.

In-class discussions around these concepts were often robust and student-driven, and frequently led to additional explorations of various websites. One student remarked that the activity felt like “detective work” and said they enjoyed that aspect of it. While this feedback is certainly anecdotal, it suggests that the shift in framing helped engage students more actively in the evaluation process.

Planned future classroom exercises will focus on specific outcomes and behaviors articulated in the Framework, particularly exhibiting “mental flexibility and creativity” and developing an “awareness of the importance of assessing content with a skeptical stance and with a self-awareness of their own biases and worldview.”6

Because these early, unstructured efforts have been so promising, the librarians’ discussions of next steps have focused on adapting the approach to differing disciplinary perspectives across the curriculum (i.e., incorporating student activities that evaluate sites specific to history, psychology, criminal justice, etc.) and on developing an assessment method to verify the efficacy of lateral searching versus traditional methods for NECC’s students. Should the shift be incorporated into ongoing teaching, communication with faculty will also be important, as (ironically, due to the diligent efforts of librarians) many faculty use CRAAP as the website assessment “standard” in their courses (librarians frequently receive requests from faculty to “teach the session on CRAAP”).

It is important to note here, however, that the Stanford study attributed the fact-checkers’ success not only to the lateral reading strategy described above, but also to their “robust knowledge of sources to inform their decisions.”7 They understood, for instance, that a nonprofit site does not necessarily connote altruism and that purported news sites can lean heavily left or right in their reporting. “Fact checkers also possessed knowledge of online structures, particularly how search results are organized and presented. They knew that the first result was not necessarily the most authoritative.”8

These findings indicate that any strategies taught regarding information evaluation must also be paired with content on search engine ranking, personalization, and Eli Pariser’s now well-known filter bubble effect.9

Conclusions

It is widely acknowledged that the current information landscape places an increasing burden on the information consumer. The lack of editorial control in the web environment, coupled with personalized search engine results and filter bubbles of mis/disinformation on social media and other platforms,10,11,12 makes obvious the need for librarians to evolve our pedagogy to teach and encourage lateral, fact-checking behaviors and dispositions.

As such, while the CRAAP “deep-dive” examination of a specific web source was very useful for many early manifestations of web content, I would argue that it is no longer wholly adequate in light of the increasing sophistication of the web, the nonexistent barriers to content creation, and the muddling effect of social media on information consumption and sharing. While each of CRAAP’s individual assessments clearly has ongoing value, it has become vitally important to place information into a wider context to adequately evaluate its credibility, as well as to teach how information is ranked and presented on search engines and social media. As consumers become able both to curate information to suit their interests and to propagate it nearly instantaneously, evolving information literacy instruction across the curriculum has broad implications not just for the research process, but for issues such as citizenship, democracy, and social responsibility.

Notes

  1. Sarah Blakeslee, “The CRAAP Test,” LOEX Quarterly 31, no. 3 (2004).
  2. John Cook and Stephan Lewandowsky, The Debunking Handbook (St. Lucia, Australia: University of Queensland, 2011), http://sks.to/debunk.
  3. Elizabeth Kolbert, “Why Facts Don’t Change Our Minds,” The New Yorker (February 27, 2017).
  4. Alice Marwick and Rebecca Lewis, “Media Manipulation and Disinformation Online,” Data & Society (May 15, 2017), https://datasociety.net/output/media-manipulation-and-disinfo-online/.
  5. Sam Wineburg and Sarah McGrew, “Lateral Reading: Reading Less and Learning More When Evaluating Digital Information,” Stanford History Education Group working paper (September 2017), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3048994.
  6. ACRL, “Framework for Information Literacy for Higher Education,” 2015, www.ala.org/acrl/standards/ilframework.
  7. Wineburg and McGrew, “Lateral Reading.”
  8. Ibid.
  9. Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, 2011).
  10. Ibid.
  11. Amy Mitchell, Elisa Shearer, Jeffrey Gottfried, and Michael Barthel, “Pathways to News,” Pew Research Center, July 7, 2016, www.journalism.org/2016/07/07/pathways-to-news/.
  12. Chris Meserole, “How Misinformation Spreads on Social Media—And What To Do About It,” The Brookings Institution (May 9, 2018), https://www.brookings.edu/blog/order-from-chaos/2018/05/09/how-misinformation-spreads-on-social-media-and-what-to-do-about-it/.

Copyright Jennifer A. Fielding
