Minor pet peeve in LLM discussions: claiming that something cannot be correct if the thing producing it doesn’t comprehend it.
1+1 doesn’t stop equaling 2 if you arrive at that result by accident.
There are lots of good reasons to dunk on LLMs; we don’t need to make up bad ones.