Study Estimates 146,932 LLM-Hallucinated Citations in 2025 Scientific Papers
A study auditing 111 million references across 2.5 million papers on arXiv, bioRxiv, SSRN, and PubMed Central reveals a sharp rise in citations to non-existent works following widespread LLM adoption. The researchers estimate 146,932 hallucinated citations in 2025 alone, with errors concentrated in fields with rapid AI uptake, in papers showing linguistic signatures of AI-assisted writing, and among small or early-career author teams. Hallucinated references disproportionately credit already prominent and male scholars, suggesting that LLM-generated errors may reinforce existing biases.
Key facts
- 111 million references audited
- 2.5 million papers analyzed
- Sources: arXiv, bioRxiv, SSRN, PubMed Central
- 146,932 hallucinated citations estimated in 2025
- Errors more common in fields with rapid AI uptake
- Errors linked to AI-assisted writing linguistic signatures
- Errors more frequent among small and early-career author teams
- Hallucinated citations favor prominent and male scholars
Entities
Repositories
- arXiv
- bioRxiv
- SSRN
- PubMed Central