LLMs don’t understand the text they generate, which is how this happens:
“Today, as a test, I entered a criminal judge I knew into Copilot, giving his name and his place of residence in Tübingen: the judge was promptly named as the perpetrator in a case he had himself ruled on a few weeks earlier, a judgment against a psychotherapist convicted of sexual abuse”
https://www.theregister.com/2024/08/26/microsoft_bing_copilot_ai_halluciation/