But the effect is the same: a stolen article gets published, then, like word of mouth, it is picked up by other GPT-based AIs and given legitimacy.
In Bing or Google, an algorithm publishes GPT output in response to a request for information. Does that blur the lines? Is there a "human" element in the display or creation of that output?