Here's my take on generative transformers: they're a one-shot deal. GPT-4 was trained largely on a pre-LLM Web, so the 'I' in its 'A.I.' is actual human intelligence and creativity. Later iterations will be trained mostly on the output of earlier iterations. To borrow @jezhiggins's metaphor, the likely long-term outcome is the LLM equivalent of Mad Cow Disease: models fed on their own recycled output.
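A toy sketch of that feedback loop, for the curious (not an LLM, just a single Gaussian, and the generation count and sample size are arbitrary illustrative choices): each "generation" is fit only to a finite sample drawn from the previous generation. The finite-sample variance estimate tends to drift downward, so the distribution narrows over generations, which is the basic "model collapse" dynamic the metaphor points at.

```python
import numpy as np

def collapse_demo(generations=2000, sample_size=100, seed=0):
    """Refit a Gaussian, each time on samples from the previous fit."""
    rng = np.random.default_rng(seed)
    mean, var = 0.0, 1.0  # generation 0: the original "human" data distribution
    history = [var]
    for _ in range(generations):
        # "Train" the next generation on purely synthetic output
        sample = rng.normal(mean, np.sqrt(var), size=sample_size)
        mean, var = sample.mean(), sample.var()
        history.append(var)
    return history

history = collapse_demo()
print(f"variance: generation 0 = {history[0]:.3f}, final = {history[-1]:.3g}")
```

Each refit introduces a small sampling error that compounds multiplicatively, so diversity (variance) erodes rather than being replenished; real training pipelines are vastly more complex, but fresh human data plays the role of the missing replenishment here.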