Changing the metaphor for LLM unreliability: they're bullshitting, not hallucinating: https://www.psypost.org/scholars-ai-isnt-hallucinating-its-bullshitting/ #AI #LLM