HalluCiteChecker: Detecting AI-Hallucinated Citations in Scientific Papers
Researchers have developed HalluCiteChecker, a toolkit that identifies and validates hallucinated citations in academic papers. As AI assistants increasingly suggest citations during academic writing, they sometimes produce references with no real source, which damages credibility and adds to reviewers' workload. The toolkit formalizes hallucinated-citation detection as a natural language processing (NLP) task and can verify citations in seconds on a standard laptop, running entirely offline on CPUs alone. Its goal is to reduce the burden on reviewers and strengthen the integrity of scientific peer review.
Key facts
- HalluCiteChecker is a toolkit for detecting hallucinated citations.
- AI assistants in academic writing can generate non-existent citations.
- Hallucinated citations undermine scientific credibility.
- The toolkit formalizes detection as an NLP task.
- It can verify citations in seconds on a standard laptop.
- It runs entirely offline using only CPUs.
- The goal is to reduce reviewer workload.
- The toolkit is introduced in arXiv paper 2604.26835.
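The offline, CPU-only verification described above could plausibly work by matching each cited title against a locally stored index of known publications. The sketch below illustrates one such approach using fuzzy string matching; the index contents, function names, and similarity threshold are illustrative assumptions, not details from the paper.

```python
from difflib import SequenceMatcher

# Hypothetical local index of known paper titles. In a real system this
# would be loaded from an offline bibliography dump rather than hard-coded.
KNOWN_TITLES = [
    "Attention Is All You Need",
    "BERT: Pre-training of Deep Bidirectional Transformers"
    " for Language Understanding",
]

def verify_citation(title: str, index=KNOWN_TITLES,
                    threshold: float = 0.9) -> bool:
    """Return True if the cited title closely matches an indexed title.

    Uses stdlib fuzzy matching, so it runs offline on a CPU; the 0.9
    threshold is an assumed tolerance for minor casing/punctuation drift.
    """
    norm = title.lower().strip()
    return any(
        SequenceMatcher(None, norm, known.lower()).ratio() >= threshold
        for known in index
    )

print(verify_citation("Attention is all you need"))        # known paper
print(verify_citation("A Totally Fabricated Paper Title"))  # likely hallucinated
```

A production tool would presumably add metadata checks (authors, year, DOI resolution against a cached database), but the core idea, comparing a claimed reference against an offline index, is the same.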
Entities
Institutions
- arXiv