A few years from now, "deepfake" will be the standard defence of anyone accused.
Jan Wildeboer 😷:krulorange: (jwildeboer@social.wildeboer.net)'s status on Monday, 04-Nov-2024 19:50:41 JST
Bence Varga (vbence@mastodon.social)'s status on Monday, 04-Nov-2024 20:45:00 JST Bence Varga @jwildeboer Maybe I live in a bubble, but my first thought is that people don't consume raw sources; they rely on a news outlet to do the verification and add context - and no deepfake can survive analysis any better than a photoshopped image can.
Alexandre Oliva (lxo@gnusocial.jp)'s status on Tuesday, 05-Nov-2024 05:02:39 JST Alexandre Oliva Can we at least hope this will reduce video surveillance?
翠星石 (suiseiseki@freesoftwareextremist.com)'s status on Tuesday, 05-Nov-2024 09:13:58 JST 翠星石 @lxo @jwildeboer Unfortunately, that will not reduce video surveillance - it will make it worse, as facial recognition, gait recognition, etc. will be added to all cameras to prove that the video is real.