this is incredibly simple, but a lot of people have difficulty grasping it. it's not that llms are stupid, it's not that they're built wrong, it's not that they don't work (tho all of those are also true); it's that what they push out is fundamentally meaningless in a particularly rigorous sense