Because of the hype, the way the media and irresponsible salespeople talk about LLMs, and all the allusions to AGI, I know that a decent chunk of the audience will not understand that anthropomorphization is a metaphor.
I know there are real people out there making decisions, not just people on the street but CEOs and engineers, who seriously believe that LLMs can think, reason, and feel.
Anthropomorphizing LLMs feeds that misconception and leads people to make bad decisions based on a misunderstanding of what LLMs can actually do. We all need to take responsibility for accurately presenting what the process is, what it can do, and, perhaps even more importantly, what it can't do.