All it would take for AI to completely collapse is a ruling in the US saying these companies have to license the content they used to train these tools.
They simply would never reach a sustainable business model if they had to fairly compensate all the people who wrote, drew, edited, sang or just created the content they use.
Even being forced to respect attribution and licenses would kill them. Will that ruling ever happen? Maybe not. Should it? I think so.