@clacke So yeah, no single piece makes AGI. An LLM on its own is just a blithering idiot Markov chain. But at some point, if you wire up enough of these stupid things together, you get something that acts intelligently, whether or not it's "self-aware". And that's very dangerous to us, just as hominids were very dangerous to all other life on this planet.