@altruios @woe2you @tante You really have no understanding of the mechanics behind the LLM fraud and are just repeating the industry's false talking points about "learning" and "training".
LLMs are overfitted models (that's what the "large" is doing), and as such they actually encode copies of large swaths of their training material. The best description is "lossy compression algorithm", and I don't get a free pass for pirating movies just because the compression is lossy.
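If you want the memorization point made concrete, here's a toy sketch in plain Python. It is not how any production LLM works internally, and the corpus, the ORDER value, and the function names are made up purely for illustration. The point it shows: give a model enough capacity (here, context length) relative to its training set and "generation" is just the training data played back.

```python
# Toy illustration of memorization: a character-level n-gram model whose
# context length is large relative to its tiny training corpus ends up
# storing the corpus outright, so "generating" text reproduces it verbatim.
# This is a sketch of the memorization/overfitting idea only, not a claim
# about the internals of any specific LLM.
from collections import defaultdict

# Hypothetical training corpus, chosen so that every 12-character context
# is unique, which makes the memorization total.
TRAINING_TEXT = (
    "The quick brown fox jumps over the lazy dog while "
    "the cat watches from the windowsill in silence."
)

ORDER = 12  # context length; large relative to the corpus -> pure recall


def train(text, order):
    """Record which character follows each length-`order` context."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        context = text[i:i + order]
        model[context].append(text[i + order])
    return model


def generate(model, seed, max_len):
    """Extend the seed by repeatedly looking up the recorded next character."""
    out = seed
    while len(out) < max_len:
        choices = model.get(out[-ORDER:])
        if not choices:
            break  # no continuation recorded: we've hit the end of the corpus
        out += choices[0]
    return out


model = train(TRAINING_TEXT, ORDER)
seed = TRAINING_TEXT[:ORDER]
print(generate(model, seed, 200))
# Prints the training text back verbatim from the seed: the "model" is,
# functionally, a copy of its training data.
```

Real LLMs are vastly bigger and the regurgitation is patchier and noisier, which is exactly why "lossy compression" is the honest description.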