https://wiki-net.win/index.php/When_Summaries_Lie:_A_Case_study_of_Models_That_Summarize_Well_but_Fail_to_Admit_Ignorance
AI hallucination—the generation of false or misleading information by language models—remains a critical challenge for deploying these systems in real-world applications.