Asking LLMs specific questions only carries the risk of them hallucinating and giving a BS answer.
Talking to LLMs as chatbots with no specific goal can result in both sides of the conversation free-associating themselves into la-la land. But you really have to lack self-awareness (typical shitlib) to think you're some prophet because a machine told you so.
On second thought, I wonder if this is some sort of vulnerability of the mental sw/hw that exists in some people. Like how pot can bring out schizophrenia in some people who would be fine without that input. Also reminds me of "BLIT" by David Langford. (Good story, highly recommended.)