Truxal Library Exhibit: ChatGPT Makes Stuff Up

Did you know? ChatGPT makes stuff up; ChatGPT is not designed to be factual. (Exhibit illustration: a cyborg Pinocchio with a long nose and Jiminy Cricket.)

Generative AI Is Not Designed to Be Factual

Generative AI tools (such as OpenAI's ChatGPT, Google's Gemini, and Microsoft's Copilot) use algorithms to calculate which word is most likely to follow the last one in their textual output. These applications are not designed to produce factual output, so they often create texts that contain inaccuracies, biased and debunked claims, and wholesale inventions.
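
Curious how that works under the hood? The toy sketch below, written in Python with entirely made-up word lists and counts, shows the basic idea: the program picks a statistically likely next word, one word at a time, without ever checking whether the resulting sentence is true. Real chatbots do this on a vastly larger scale, but the principle is the same.

```python
import random

# A toy "language model": for each preceding word, a table of possible next
# words and how often they follow it. Real models learn billions of such
# statistics from training text; the words and counts here are invented
# purely for illustration.
next_word_counts = {
    "the": {"library": 4, "article": 3, "journal": 2},
    "library": {"offers": 5, "staff": 3, "exhibit": 1},
    "offers": {"research": 4, "workshops": 2, "help": 2},
}

def pick_next_word(previous_word):
    """Choose a next word in proportion to how often it followed the previous one."""
    options = next_word_counts.get(previous_word)
    if not options:
        return None
    words = list(options.keys())
    weights = list(options.values())
    return random.choices(words, weights=weights)[0]

# Generate a short string of text one word at a time. Nothing here checks
# whether the output is accurate; it only checks what word is likely to come next.
word = "the"
output = [word]
for _ in range(3):
    word = pick_next_word(word)
    if word is None:
        break
    output.append(word)

print(" ".join(output))  # e.g. "the library offers research"
```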

Ghost Citations

One kind of inaccurate output that Gen AI applications produce is called a ghost citation. Also known as a hallucinated citation, a ghost citation is produced by a generative AI tool to look like a citation for a real source—like a book chapter, journal article, or documentary film—but the source doesn't actually exist. An AI tool metaphorically "hallucinates" the existence of an information source, resulting in a "ghost" citation. Because these citations look convincing, they can send the unsuspecting researcher on a fruitless hunt for a nonexistent resource.

Anatomy of a Ghost Citation

An APA-style citation for a nonexistent article. Parts of the citation are numbered in order and explained in the text below.

  1. Authors

    • Authors identified as Champlin, E. A., & Sloan, H.
    • The names of real and relevant authors may be included in ghost citations.
  2. Publication Dates

    • Listed date is 2025.
    • Gen AI cannot draw on the most recently published information. Large language models that power Gen AI chat applications have a knowledge cutoff, the last date on which new training data was added, and that cutoff is usually more than a year in the past. A publication date near or after the cutoff is one clue that a citation may be invented.
  3. Article Titles

    • Title is listed as The impact of African American women librarians in shaping community literacy.
    • A hallucinated citation’s article title will very often seem perfectly suited to the research question or information need at hand. It will frequently repeat exact wording, or very similar language, from the prompt that produced it.
  4. Journal Titles

    • Journal title is given as Journal of Diversity in Library Studies.
    • The journal might actually exist, but it’s just as likely that the journal title in a ghost citation is hallucinated; like hallucinated article titles, hallucinated journal titles often contain language that comes directly from the user’s prompt.
  5. Numbers

    • Volume, issue, and article number are given as 45(2), Article 80139.
    • Volume, issue, page, and article numbers in a ghost citation are almost always hallucinated. If the journal exists, the volume and issue numbers may happen to be real for that journal, and the page or article number generated by Gen AI may even correspond to a real article, but the hallucinated article title won't be found in that volume and issue.
  6. DOIs

    • A realistic-looking DOI is given in the format of a URL.
    • The DOI or URL presented in a ghost citation may be hallucinated, or it might be assigned to a real article. If the DOI or URL from a ghost citation does work, most or all of the other information in the citation—the article title, year of publication, volume number, issue number, and page range or article number—won’t match up. (See the verification sketch just after this breakdown.)

Even though the given citation looks real, the source does not exist.
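
If you'd like to check a suspicious DOI yourself, the quickest test is to paste it into a browser after https://doi.org/ and see what, if anything, it resolves to. The short Python sketch below does the same kind of check against Crossref's public lookup service, which indexes most journal-article DOIs; the DOI shown is a made-up placeholder, not taken from any real citation.

```python
import json
import urllib.request
from urllib.error import HTTPError

def check_doi(doi):
    """Ask the Crossref registry what a DOI actually points to.

    Returns the registered article title, or None if the DOI is not
    registered with Crossref at all.
    """
    url = f"https://api.crossref.org/works/{doi}"
    try:
        with urllib.request.urlopen(url) as response:
            record = json.loads(response.read().decode("utf-8"))
    except HTTPError as error:
        if error.code == 404:
            return None  # no such DOI in Crossref: a strong sign of a ghost citation
        raise
    titles = record["message"].get("title", [])
    return titles[0] if titles else ""

# Placeholder DOI for illustration only; substitute the DOI copied from the
# citation you want to verify.
suspect_doi = "10.0000/example.2025.80139"
registered_title = check_doi(suspect_doi)

if registered_title is None:
    print("This DOI isn't registered. The citation is probably a ghost.")
else:
    print("The DOI exists, but make sure it matches the cited article:")
    print(registered_title)
```

Even when a DOI resolves, remember to compare the registered title, authors, and year against the citation the AI gave you; a real DOI attached to the wrong article is still a ghost citation.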

Bottom Line

All information retrieved from a Gen AI tool needs to be verified before it is used as a source for learning and academic study. We librarians recommend saving a step and consulting more credible tools instead. Get in touch with library staff for recommended sources!

Related Reads

To learn more about information literacy skills or about challenges created by AI technology, consider these reads from the library's collection! These books (and more) are on display in the library on the second floor.