@andersbiork @peacememories @Cheeseness @aral With that said, I think this is very fair criticism and we should remove it.
Hallucination is an integral part of how LLMs work, and I haven't seen anyone solve it. At this point, I'm not even sure it can be solved, and I don't think LLMs have any place in search until it is.
I'll put it on my todo list to remove it right away.