But the cat is out of the bag now. Predators will create deepfaked images of people they know, and these images can destroy lives. The new deepfake offence in the Online Safety Bill (OSB) tries to tackle the issue, but we also need personal tools such as E2EE to stop our own children's images being obtained in the first place, and to reduce the data sets available for training this material.
James Baker (jamesbaker@social.openrightsgroup.org)'s status on Wednesday, 28-Jun-2023 21:48:15 JST
James Baker (jamesbaker@social.openrightsgroup.org)'s status on Wednesday, 28-Jun-2023 21:48:17 JST

This story about AI-generated CSAM is horrifying: https://www.bbc.co.uk/news/uk-65932372. As a parent, it makes me even more defensive of the need for E2EE technology. If you share family photos on social media, you now risk a paedophile using them to deepfake CSAM.