@thomasfuchs i don’t use LLMs much for anything, so i don’t really have a strong opinion on the matter.
but i’m just curious about this point…
LLMs are potentially faulty. And worse, they sound very confident while being wrong, since they have no idea whether what they’re saying is correct.
But something similar can be said of Stack Overflow and other crowd-sourced info, right?
And to some degree even my own memory sometimes lies to me and tries to convince me it knows something it doesn’t. 😂