Now class, can anyone tell me why this *might* be a bad idea?
Sasha Costanza-Chock (schock@mastodon.lol)'s status on Thursday, 06-Apr-2023 07:44:47 JST
Børge (forteller@tutoteket.no)'s status on Thursday, 06-Apr-2023 08:53:17 JST
@rysiek @schock Some years ago I had this epiphany that the end goal for companies, especially those who use recommendation engines like Netflix and Amazon, is to emulate their users to perfect their "you might also like X" thingies. And by sucking up all our data they might do it, which is freaky enough by itself, but then what happens if these emulations develop consciousness…? This last point is probably extremely unlikely, but still.
This here seems like a step in just that direction.
Michał "rysiek" Woźniak · 🇺🇦 (rysiek@mstdn.social)'s status on Thursday, 06-Apr-2023 08:53:18 JST Michał "rysiek" Woźniak · 🇺🇦
@schock hey if that means that certain researchers stop doing dumb shit like submitting buggy patches to Linux as "not-human-research" and checking if maintainers notice, or sending legal threats to fedi admins as "not-human-research" and checking if they fall for them, great! :blobcatcoffee: