@rysiek @schock Some years ago I had this epiphany that the end goal for companies, especially those that run recommendation engines like Netflix and Amazon, is to emulate their users to perfect their "you might also like X" thingies. And by sucking up all our data they might actually pull it off, which is freaky enough by itself, but then what happens if these emulations develop consciousness…? That last part is probably extremely unlikely, but still.
This here seems like a step in just that direction.