Literally this update’s got me playing Operation. I find Send. I touch Send; I hear Send. I turn off my screen reader, stab in a hopeful fashion where Send should be, turn VoiceOver back on, and if I’m lucky my prompt is sent. It is just the kind of awkward friction AI promoters claim to be saving us from.
Love all y'all’s discourse about the future of AI and accessibility, but here’s the present: Blind folk can't use the ChatGPT iPhone app without a silly workaround right now because OpenAI shipped an update that makes the Send button untouchable when our screen reader is on. Could someone find an intern who isn’t busy doing fireside chats and ask them to sort of human-check the code for basic accessibility before it ships? Just, like, until the future gets here?
Please stop using "blindness" as a lazy synonym for "ignorance" in your writing. If there’s one thing to know about Blind folks, it’s this: we survive and thrive by being perceptive, observing patterns and responding to cues. Not only is this use of "blindness" rude — it's uninformed. Mistaken. Obtuse. Naive. Counterfactual. There you go — five better ways to say "ignorant" than "Blind."
A restaurant server who knew i was Blind quietly cleared my plate at a medium fancy dinner last week. He did not make a big production of saying "i’m taking this plate away!" nor did he do it as quietly as he could. He just — softly — said "yoink". It has been a week and i am still obsessed with the level of confidence and whimsy this guy can just draw upon whenever
As a Blind person i never thought i would be on social media savoring photos. But the communal Mastodon alt text game is so strong that sweet, poetic or silly descriptions abound on my timeline. Thanks to legions of people who take time to write a meaningful description of the ephemera they post, i learn so much about insects, plants, buildings, memes — all dispatches from a dimension of the world that i otherwise wouldn't experience. If you're wondering whether anybody reads these things: YES.
If you had told my 10-year-old Blind girl self that in 30 years i’d be taking pictures of a touchscreen with another touchscreen to find out from a conversational AI down on land how much longer the flight is; and that this is somehow easier than making the airline fix their website so i can read the damn time for myself; she would have been mad, but maybe not entirely surprised
FML. I left my guide dog Ellie home today because she hates her snow boots and I just — I just called my cane a "good girl!" out loud for anybody to hear when it found the door to Tenichi Mart.
Introducing a different kind of #Accessibility conference at the New York Public Library, October 21-22! Free to attend: focused on maker culture, affordable tech, intersectional / human factors, and emerging technologies. No web accessibility 101 talks and no talks focused on sales. Our proposal window has closed but we’ve left open a crack — fill the form this week and you're good! https://www.nypl.org/blog/2023/07/05/upcoming-nypls-accessible-technology-conference-2023
Today i let a little girl at the library play with my guide dog out of harness as a reward for finishing a Braille lesson her mom wanted her to have. For five minutes kiddo and doggo are happily visiting, wagging, talking and giggling. And then … i hear this rattle. i investigate. Guide dog Ellie has wriggled under her harness and has halfway put it on unassisted. She has now mastered the timeless library worker classic: "I’d love to keep chatting but i need to get back to work!"
I'm a Blind tech educator in New York. I'm passionate about making sure accessible tech isn't harmful or extractive, building digital literacy, and bringing tactile graphics and Braille within reach for all Blind people! I run the Dimensions Project at NYPL — we're the world's only free and public tactile graphics lab and we've got all the equipment and training you need to learn the art of tactile design