@jk What I've settled on as my internal understanding of LLMs is that they're like a dog someone trained to take meeting notes. Even if it kept misspelling words and put in random comments about wanting to go for a walk or how great sticks are, it would still be the most amazing thing you've ever seen. It would absolutely blow your mind that this was even possible.
But you wouldn't hire the dog to take meeting notes. Except, in this case, people are trying to.