AI Glossary
Hallucination
When AI confidently states something false
Definition
Hallucination refers to instances where a language model generates plausible-sounding but factually incorrect output. Models have no direct access to external ground truth; they predict text based on statistical patterns in their training data, which can produce fabricated citations, wrong dates, or invented facts. Techniques such as retrieval-augmented generation (RAG) and grounding, which supply verified source material alongside the prompt, help reduce hallucination on factual tasks.
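As a rough illustration of the RAG idea, here is a minimal sketch: retrieve the passages most relevant to a question, then build a prompt that instructs the model to answer only from those passages. The keyword-overlap retriever is a toy stand-in (real systems use embedding-based search), and `call_llm` is a hypothetical placeholder, not a real API.

```python
# Minimal sketch of retrieval-augmented generation (RAG).
# Assumptions: a toy keyword-overlap retriever stands in for real
# embedding search, and call_llm (commented out below) is a
# hypothetical placeholder for whatever model API you use.

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    query_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(query_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def grounded_prompt(query: str, documents: list[str]) -> str:
    """Prepend retrieved passages so the model answers from sources
    rather than from patterns in its training data alone."""
    context = "\n".join(f"- {p}" for p in retrieve(query, documents))
    return (
        "Answer using ONLY the context below. "
        "If the answer is not in the context, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )

docs = [
    "The Eiffel Tower was completed in 1889 for the World's Fair.",
    "Mount Everest is 8,849 metres tall as of the 2020 survey.",
]
print(grounded_prompt("When was the Eiffel Tower completed?", docs))
# A real pipeline would pass this prompt to a model, e.g.:
# answer = call_llm(grounded_prompt(question, docs))  # hypothetical
```

The grounding step is the explicit instruction to answer only from the supplied context and to admit ignorance otherwise; this gives the model a verifiable source to cite instead of leaving it to free-associate from training data.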