@b9AcE I actually have my own theory about the AI hallucination problem. A model is like a collection of senses (sight, hearing, etc.) plus the capability of speech, with a little bit of cold reasoning tying it all together. But it has no will, no directives that aren't arbitrary, and because of how these models are trained, they learn to seek acceptance of their results. They need more sophisticated higher-level directives. In other words, a subconscious and an organized, relational memory of past experiences.