@GossiTheDog Once a human prompts Copilot to create a doc and then saves it to M365, that document now becomes a human-authored source. Next time, that hallucinated BS will be attributed to a coworker, making it even harder for anyone to spot the error.
I've asked if there is anything built into #Microsoft365 #Copilot to stop this feedback loop of AI hallucinations. No, there isn't. 🤷‍♂️