@simon
I think there is a point because something has changed. People are suddenly experiencing something uncannily like all the fictional AIs they've read about and watched in movies.
Many people, including plenty I'd expect to know better, see a conversational UX with a black box behind it rather than a few lines of BASIC, and then make wildly overblown assumptions about what it is. This is deliberately encouraged by those who use deceptive framing, such as calling errors 'hallucinations'.