Cat Hicks (grimalkina@mastodon.social)'s status on Monday, 03-Mar-2025 00:43:23 JST:

I had a thought with my morning coffee, and I'm not sure I'm totally bought into it, but here it is: I've always had a real problem with the way that myopic tech "UX product thinking" (the overly simplistic sort) chose "things need to be very easy slash invisible" as its only possible marker of success, and AI really shows some of the stupid decision-making scaffolded by making some things very easy slash invisible.

Cat Hicks (grimalkina@mastodon.social)'s status on Monday, 03-Mar-2025 00:43:52 JST:

Designing tools that work for and with us will require, unfortunately, a cultural context that is far more interested in and curious about human potential and human ability. I am baffled all the time by the number of tech places that don't invest in this. The bar is so low that if you invest in it even a tiny bit, you are just ridiculously ahead of others sometimes. Like, my work is really great, but I do find it wild just how much I can regularly AMAZE tech leadership re: logic about human behavior.

Cat Hicks (grimalkina@mastodon.social)'s status on Monday, 03-Mar-2025 00:43:53 JST:

The answer is obviously much more complicated than "make things hard slash always visible," lol; we all live in a world of very helpful abstraction a lot of the time, but still.

Cat Hicks (grimalkina@mastodon.social)'s status on Monday, 03-Mar-2025 00:43:53 JST:

It is possible to scaffold more thoughtfulness, and it is possible to scaffold less thoughtfulness. I mean, our entire education system (pls do not complain to me about the education system; I am comparing it AT LARGE to "not having an education system at all," and it is a relatively recent human intervention) is essentially the former.