Nightshade, the AI data poisoning tool, is now available. Artists and photographers can use this defensive tool to discourage their work from being used without permission to train AI models. It's free and easy to use, and it works well on my original art (see attached).
Even if you are personally fine with AI companies using your work without permission, we should all use whatever tools we have until these companies switch to models trained on ethically sourced data.