Perspectives on the Framework

Where does ChatGPT fit into the Framework for Information Literacy?

The possibilities and problems of AI in library instruction

Amy B. James is online librarian for education and information literacy at the Baylor University Libraries, email: amy_b_james@baylor.edu. Ellen Hampton Filgo is director of the liaison program at the Baylor University Libraries, email: ellen_filgo@baylor.edu.

Figure 1. Screenshot from https://chat.openai.com/chat: Introducing ChatGPT for Librarians.

The above screenshot (figure 1) shows what ChatGPT, the generative AI system that has been the subject of a thousand hot takes about how it is disrupting academia-as-we-know-it, generated when we asked it to describe itself for an academic librarian audience. Perhaps it has learned a bit too much from the public relations documents included in the vast amounts of data it was trained on, since it describes itself as “highly relevant,” “invaluable,” and “accurate.” It did not, however, bring up the caveats that greet you when you open ChatGPT itself: that it “may occasionally generate incorrect information,” that it “may occasionally produce harmful instructions or biased content,” and that it has “limited knowledge of the world and events after 2021.”1 Nor did it bring up the reddest of academic red flags: that ChatGPT provides an easy way for students to cheat and plagiarize. The Atlantic has claimed that because of ChatGPT and other AI, “the undergraduate essay [which] has been at the center of humanistic pedagogy for generations . . . is about to be disrupted from the ground up.”2 A writer at Times Higher Education has suggested that allowing AI to replace a student’s creative voice means “abandoning our responsibilities as educators.”3

For every handwringing account of how generative AI will destroy academia, there seem to be twice as many researchers, teachers, technologists, and pundits embracing what AI (and specifically ChatGPT) can do for teaching and learning. They suggest using it for overcoming writer’s block, generating outlines, creating summaries, generating discussion prompts, asking for definitions, or generating flawed examples for critique.4 One compelling argument, from Christopher Grobe in the Chronicle of Higher Education, is that generative AI can “provide new starting points for some of the processes we routinely use to think.”5 We agree with Grobe that ChatGPT can give us a good starting point from which to work. The text generated by ChatGPT in the screenshot at the start of this article is an overly optimistic and idealized view of itself. We hope that in this article we can add the nuance that it lacks.

Academic librarians help their students and faculty navigate the research process. Therefore, when a new technological tool blazes through higher education, as ChatGPT has over the last few months, it becomes important that librarians understand the tool and its uses so that they can continue to serve their students and faculty well. After decades of the ACRL Information Literacy Competency Standards for Higher Education, the ACRL Framework for Information Literacy for Higher Education was established as a much more flexible route for integration into curricula. The Framework gives librarians and disciplinary faculty a customizable way to provide information literacy instruction that meets the needs of students and enables them to become participants in producing information, not just consuming it. Because of the Framework’s flexible nature, librarians can more easily incorporate new technology, like ChatGPT, into their instruction.

We have found that the idea of ChatGPT (and generative AI more broadly) can be connected to many of the knowledge practices and dispositions from the six frames of the ACRL Framework. In some places, the Framework enables us to embrace ChatGPT as an exciting new tool that adds value to information literacy instruction. In other places, the Framework’s discussions of evaluating authority and examining bias shine light on the inherent flaws of ChatGPT. In the sections that follow, we review each of the six frames and discuss how ChatGPT fits into them.

Authority is Constructed and Contextual

The Authority is Constructed and Contextual frame states that learners who are growing their information literate abilities “develop awareness of the importance of assessing content with a skeptical stance and with a self-awareness of their own biases and worldview.”6 On its opening screen, ChatGPT provides a disclaimer letting users know that the information it provides may contain biases and that it has limited knowledge of current events. As the use of ChatGPT increases, it will become even more important that students know how to evaluate whether the information they come across is authoritative within the context of their research. Students need to recognize that bias is everywhere and that ChatGPT draws on information from the open web. Much of the information it produces derives from text written by political organizations, nonprofits, companies, and individuals, and this context shapes the model’s output. ChatGPT is built on predictive language modeling, which means it generates natural-sounding language, not necessarily factual language. Students should always approach information, from ChatGPT or elsewhere, with skepticism.

Information Creation as a Process

As with any new technology, there will be hesitant adopters. In the early 2000s, many librarians were skeptical of students using Wikipedia; now we realize that we need to instruct students in its proper use rather than ban it. One of the dispositions for the Information Creation as a Process frame states that “learners who are developing their information literate abilities accept the ambiguity surrounding the potential value of information creation expressed in emerging formats or modes.”7 To us, this is direct confirmation that ChatGPT has a place in the library instruction classroom, both as a tool for instruction and as a topic for discussion with students. Librarians should absolutely be talking about it with students, trying it out, and discovering together what its problems and promises are. In particular, students should be made aware of the machine learning process: tools like ChatGPT are trained on large amounts of textual data in an iterative process, whereby they “learn” from this data over time. This process is very similar to how information literate scholars learn, research, create, write, and refine their ideas over time, as presented in this frame.

The frame also challenges us to recognize that “the dynamic nature of information creation and dissemination requires ongoing attention to understand evolving creation processes.”8 A technology columnist for The New York Times advised his readers along the same lines: “today’s students will graduate into a world full of generative A.I. programs. They’ll need to know their way around these tools—their strengths and weaknesses, their hallmarks and blind spots—in order to work alongside them. To be good citizens, they’ll need hands-on experience to understand how this type of A.I. works, what types of bias it contains, and how it can be misused and weaponized.”9 It is imperative that librarians bring this kind of ongoing attention to the process by which ChatGPT and similar AI models create information.

Information has Value

The Information has Value frame indicates that learners who are developing their information literacy skills need to “learn the importance of giving credit to the original ideas of others through proper attribution and citation.”10 As more learners use ChatGPT for citation assistance, they will need to be extremely careful to verify the accuracy of every citation. As explained above, ChatGPT does not always provide factual information. For example, if you ask ChatGPT for a list of scholarly articles on a particular topic, it will list articles with full citations, including DOIs, under plausible titles from reputable-sounding journals. However, if you Google each of the articles, you will discover that some of them do not exist: ChatGPT’s predictive language modeling generates article titles that appear genuine, even when they are not. When we asked it for citations of scholarly articles on higher education and information literacy, it gave us several that look very convincing (figure 2):

Figure 2. Screenshot from https://chat.openai.com/chat: False citations.

If a student were to take this AI response and incorporate those citations into a paper or assignment, they would be including nonexistent articles. Students citing articles they have not read is, of course, a problem; students citing fake articles that cannot be read compounds it. As students become creators of information themselves, they will need to accurately cite all information that they use, not only for the ethical and moral good, but also for the sake of their own reputation in their field(s). ChatGPT is a minefield when it comes to citation, and we would not recommend that students use it for that purpose. However, asking ChatGPT to generate citations like this in an in-class exercise and discussing the results with students can help them see the minefield for what it is and navigate around it.

Research as Inquiry

One of the knowledge practices for the Research as Inquiry frame states that “learners who are developing their information literate abilities deal with complex research by breaking complex questions into simple ones, limiting the scope of investigations.”11 We have found that students often come to research consultations or library instruction sessions with broad or vague research questions that they do not know how to simplify or narrow to a scope they can tackle in five to eight pages. While they might be interested in, let’s say, “the problem of poverty” or “the abortion debate,” they cannot digest the enormous amount of research across multiple academic disciplines that has attempted to address these questions. This is where we believe ChatGPT can help students (and even seasoned researchers) break complex problems down. ChatGPT can help refine research questions, determine search terms, come up with synonyms and related terms or phrases for searching, suggest which databases to search, generate textual concept maps, and even help generate citations. For example, figure 3 shows the response it gave when we asked, “What search terms should we use for our hypothetical research question, ‘why aren’t college athletes paid?’”

Figure 3. Screenshot from https://chat.openai.com/chat: Providing search terms.

ChatGPT can also create textual concept maps to help think through various aspects of a research topic, which can be useful for students who need to narrow or refine their topic. Simply asking the AI to help narrow a research topic can be productive, as it will suggest a variety of ways to explore the topic. For example, we asked it to help us narrow our search on why college athletes aren’t paid, and it gave us detailed options in an easy-to-read format (figure 4):

Figure 4. Screenshot from https://chat.openai.com/chat: Narrowing the topic.

You can see clearly how ChatGPT can help students push past that inquiry threshold. Often they don’t even know where to start in their search for information, or how to probe the nuances and facets of a large, complex question to find a scope they can grasp. Instead of typing their entire research question into Google or a database (as we have all seen students do) and having to sift through a mountain of results, they can type it into ChatGPT and ask, “How do I start searching? Where could I go from here? What’s manageable?” As students grow in their information literacy abilities, ChatGPT can help scaffold their skills, enabling them to accomplish this task more confidently in the future.

One caveat always bears repeating: ChatGPT has biases. It is trained on a large dataset of material from the internet, and it may not surface underrepresented or less well-researched aspects of a topic. Because of this, it is important for students to explore topics holistically, with ChatGPT as only one tool in their toolbelt.

Scholarship as Conversation

“Learners who are developing their information literate abilities see themselves as contributors to scholarship rather than only consumers of it.”12 This disposition from the Scholarship as Conversation frame, which shows up similarly in Information has Value, means that students are able to see themselves as part of the scholarly conversation through the ways they critically examine, interact with, and synthesize course and research materials, and through contributing their own ideas and research by writing, presenting, and publishing. Whether or not students intend to pursue further academic study, their voices are valuable in the conversation. This is where we urge caution in the use of ChatGPT, which may undermine the development of their academic voice.

At one extreme, there will be students who use ChatGPT to generate an essay to turn in; at the other, there will be students who use it with great skill to enhance or refine their writing, to find a starting place for scattered thoughts, or to break out of creative blocks. As we teach students how to join the scholarly conversation with their own voice, we need to emphasize the use of ChatGPT as a supportive resource, alongside librarians and other academic mentors.

Searching as Strategic Exploration

The Searching as Strategic Exploration frame describes searching as “nonlinear and iterative, requiring the evaluation of a range of information sources and the mental flexibility to pursue alternate avenues as new understanding develops.”13 This frame is a succinct summary of why we as librarians provide information literacy instruction. We work with students to increase their information literacy skills and, in turn, their awareness of the scholarly conversation taking place around them, something that ChatGPT will not do. ChatGPT will not teach students how to evaluate itself. Librarians are still necessary to encourage learners to evaluate information sources and tools that may not be familiar to them.

For example, many students enter college having never searched for peer-reviewed, scholarly articles, so that skill may be entirely new to them. Their range of information sources grows when, through our instruction, we help them see the value in using academic databases to find the information they need in an effective, efficient manner. They will also most likely have to take another look at their use of Wikipedia; through information literacy instruction, they will gain a new understanding of how to use it effectively in a higher education context. ChatGPT is a brand-new source of information, and our understanding of it is still developing. Our job, as teaching librarians, is not to shun the tool, but to embrace it and guide students toward using it responsibly and ethically. As with any new tool, librarians, as well as students, will need to be flexible and adapt as these new tools and resources become available.

Conclusion

The ACRL Framework for Information Literacy for Higher Education is open-ended enough for us to try new things in our teaching and to explore new tools and new ways of helping students understand the information in the world around them. These are just a few of the ways we see the Framework addressing ChatGPT and other generative AI tools; when we first looked at this issue, we took a highlighter to the Framework document and made several dozen connections, and we’re sure that’s not all. In looking at these tools through the lens of the Framework, we can see both the promise and the pitfalls. Ultimately, there must be instruction about these tools: how they were developed, the ethics surrounding their use, and the specific ways they can both help and hurt students. While we are aware that it can seem burdensome to add “one more thing” onto the plate of librarians who provide information literacy instruction, we urge librarians to have discussions with their faculty partners to share the burden of instruction. We are also confident that instruction librarians will create lesson plans that teach about or use ChatGPT in library instruction and will share them with their peers at places like Project CORA, the Framework for Information Literacy Sandbox, and other lesson plan repositories.

Notes

  1. “ChatGPT,” accessed February 24, 2023, https://chat.openai.com.
  2. Stephen Marche, “The College Essay Is Dead,” The Atlantic, December 6, 2022, para. 5, https://www.theatlantic.com/technology/archive/2022/12/chatgpt-ai-writing-college-student-essays/672371/.
  3. Adrian J. Wallbank, “ChatGPT and AI Writers: A Threat to Student Agency and Free Will?,” Times Higher Education, January 18, 2023, para. 9, https://www.timeshighereducation.com/campus/chatgpt-and-ai-writers-threat-student-agency-and-free-will.
  4. Kevin Roose, “Don’t Ban ChatGPT in Schools. Teach With It,” The New York Times, January 12, 2023, sec. Technology, https://www.nytimes.com/2023/01/12/technology/chatgpt-schools-teachers.html; Matt Miller, “ChatGPT, Chatbots and Artificial Intelligence in Education,” Ditch That Textbook (blog), December 17, 2022, https://ditchthattextbook.com/ai/; Leanne Ramer, “Adapt, Evolve, Elevate: ChatGPT Is Calling for Interdisciplinary Action,” Times Higher Education, February 22, 2023, https://www.timeshighereducation.com/campus/adapt-evolve-elevate-chatgpt-calling-interdisciplinary-action.
  5. Christopher Grobe, “Why I’m Not Scared of ChatGPT,” The Chronicle of Higher Education, January 18, 2023, para. 11, https://www.chronicle.com/article/why-im-not-scared-of-chatgpt.
  6. ACRL, “Framework for Information Literacy for Higher Education,” Association of College & Research Libraries, February 9, 2015, https://www.ala.org/acrl/standards/ilframework.
  7. ACRL, “Framework for Information Literacy for Higher Education.”
  8. ACRL, “Framework for Information Literacy for Higher Education.”
  9. Roose, “Don’t Ban ChatGPT in Schools. Teach With It.”
  10. ACRL, “Framework for Information Literacy for Higher Education.”
  11. ACRL, “Framework for Information Literacy for Higher Education.”
  12. ACRL, “Framework for Information Literacy for Higher Education.”
  13. ACRL, “Framework for Information Literacy for Higher Education.”
Copyright Amy B. James, Ellen Hampton Filgo
