ARTFEED — Contemporary Art Intelligence

AI Models Systematically Generate Hallucinated Academic Citations, Study Finds

ai-technology · 2026-04-22

A study investigates generative AI's fabrication and spread of fictitious academic citations, focusing on the non-existent reference 'Education Governance and Datafication,' incorrectly attributed to Ben Williamson and Nelli Piattoeva. Drawing on 137 source papers located through Google Scholar and Google searches, the research analyzes the structure, frequency, and onward citation of this imaginary reference. Results indicate that fabricated citations follow recognizable patterns, recombining real authors, journals, dates, and keywords, and that the same hallucinated citation is duplicated in nearly 30% of cases. The paper also examines ChatGPT 5-mini's citation-generation process, finding that it produces plausible references from learned patterns rather than factual recall. Finally, ten AI-generated essays on datafication and school governance were analyzed; most included these false citations, underscoring the systematic nature of AI-generated misinformation in academic settings.

Key facts

  • The study investigates hallucinated academic references produced by generative AI.
  • It focuses on the non-existent citation 'Education Governance and Datafication' attributed to Ben Williamson and Nelli Piattoeva.
  • Analysis is based on 137 accessible source papers found through Google Scholar and Google searches.
  • Hallucinated citations are patterned recombinations of real authors, journals, dates, and keywords.
  • Duplication of hallucinated citations occurs in nearly 30% of cases.
  • ChatGPT 5-mini was interrogated about how it generates citations.
  • Without verification, the model reconstructs plausible references from learned patterns rather than factual recall.
  • Ten AI-generated essays on datafication and school governance were examined.
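The "patterned recombination" finding above can be illustrated with a minimal sketch: a fabricated citation is not random noise but a mix of plausible components. All component pools below are illustrative placeholders chosen for this example, not data from the study.

```python
import random

# Illustrative pools of bibliographic components. The study reports that
# hallucinated citations recombine real authors, journals, dates, and
# keywords; these specific lists are assumptions for demonstration only.
AUTHORS = ["Ben Williamson", "Nelli Piattoeva"]
TITLES = [
    "Education Governance and Datafication",
    "Datafication and School Governance",
]
JOURNALS = ["Journal of Education Policy", "Learning, Media and Technology"]
YEARS = [2017, 2019, 2021]

def recombine_citation(rng: random.Random) -> str:
    """Assemble a plausible-looking but fabricated citation by mixing
    independently sampled authors, titles, journals, and years."""
    author = rng.choice(AUTHORS)
    title = rng.choice(TITLES)
    journal = rng.choice(JOURNALS)
    year = rng.choice(YEARS)
    return f"{author} ({year}). {title}. {journal}."

if __name__ == "__main__":
    rng = random.Random(0)
    print(recombine_citation(rng))
```

Each component is individually plausible, which is why such references pass a casual read; only checking the assembled citation against a bibliographic database reveals that the combination does not exist.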

Entities

Institutions

  • Google Scholar
  • arXiv

Sources