Generative AI tools (like OpenAI's ChatGPT, Google's Gemini, Microsoft's Copilot, etc.) use statistical models to predict which word is most likely to come next in their textual output. These tools are not designed to produce factual output, so they often generate text that contains inaccuracies, biased and debunked claims, and wholesale inventions.
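For readers who want to see the "predict the next word" idea in concrete terms, here is a purely illustrative Python sketch of a word-frequency predictor. The tiny corpus and function names are invented for this example, and real generative AI systems rely on enormous neural networks trained on vast text collections; the point is only that the underlying task is producing plausible-sounding continuations, not verified facts.

```python
# A toy "next word" predictor: count which word follows which in a tiny
# corpus, then always pick the most frequent follower. Real generative AI
# models are vastly more sophisticated, but the core task is the same:
# generate a plausible next word, not a checked fact.
from collections import Counter, defaultdict

corpus = (
    "the library has many books the library has helpful staff "
    "the staff can recommend credible sources"
).split()

# Count the followers of every word in the corpus.
followers = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    followers[current_word][next_word] += 1

def predict_next(word):
    """Return the most common word seen after `word`, or None if unseen."""
    if word not in followers:
        return None
    return followers[word].most_common(1)[0][0]

# Generate a few words starting from "the". The result reads fluently,
# but nothing in this process checks whether the sentence is true.
word, generated = "the", ["the"]
for _ in range(6):
    word = predict_next(word)
    if word is None:
        break
    generated.append(word)

print(" ".join(generated))
```

Notice that the program never consults any source of truth; it only echoes patterns in the text it was given, which is why fluent output and accurate output are not the same thing.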
One kind of inaccurate output that Gen AI applications produce is called a ghost citation. Also known as a hallucinated citation, a ghost citation is produced by a generative AI tool to look like a citation for a real source—like a book chapter, journal article, or documentary film—but the source doesn't actually exist. An AI tool metaphorically "hallucinates" the existence of an information source, resulting in a "ghost" citation. Because these citations look convincing, they can send the unsuspecting researcher on a fruitless hunt for a nonexistent resource.
Even when such a citation looks completely real, the source simply does not exist.
All information retrieved from a Gen AI tool needs to be verified before it is used as a source for learning or academic work. We librarians recommend saving a step and consulting more credible tools instead. Get in touch with library staff for recommended sources!
To learn more about information literacy skills or about challenges created by AI technology, consider these reads from the library's collection! These books (and more) are on display in the library on the second floor.