Conversation
Alexandre Oliva (lxo@gnusocial.net)'s status on Wednesday, 05-Jun-2024 12:42:55 JST
I recall someone from Brazil, IIRC involved with Big[GNU/]Linux, demoing a system that turned a webcam into a pointing device with eye tracking, with clicks performed by blinking one eye or the other.
this could be usable with an on-screen keyboard to enter text.
another alternative for entering text that occurs to me, which could probably be used with the same kind of eye tracking, was an interface built around an expanding navigation-tree visual: candidate next letters appeared from left to right, each taking up a vertical extent proportional to a computed likelihood of its being the right letter. the further you moved the pointer to the right, the faster the tree scrolled left and expanded, and the selections that scrolled off the left edge were committed to the paste buffer or some such. it sounded like something that, with some user training and a decent predictive language model, would enable far more efficient text entry than a virtual keyboard. unfortunately, I have no recollection of the name of the system whose demo I saw, and I'm not sure my description is enough for anyone to find it :-(
do these ring any bells?
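A minimal sketch of the probability-weighted, zooming selection idea described in the notice above, assuming a toy letter-probability function standing in for a real predictive language model; every name and number here is illustrative and not taken from the demoed system.

```python
from dataclasses import dataclass

# Toy predictive model: probability of each next letter given the text so far.
# A real system would use a trained language model; these weights are made up.
def letter_probabilities(prefix: str) -> dict[str, float]:
    letters = "abcdefghijklmnopqrstuvwxyz "
    weights = {c: 1.0 for c in letters}
    if prefix.endswith("q"):
        weights["u"] = 20.0          # strongly favour 'u' after 'q'
    total = sum(weights.values())
    return {c: w / total for c, w in weights.items()}

@dataclass
class Box:
    letter: str
    top: float      # vertical extent on screen, in the range 0.0 .. 1.0
    bottom: float

def layout_column(prefix: str, top: float = 0.0, bottom: float = 1.0) -> list[Box]:
    """Divide the vertical interval [top, bottom) among candidate next
    letters, each box getting height proportional to its predicted probability."""
    boxes, y = [], top
    for letter, p in letter_probabilities(prefix).items():
        height = (bottom - top) * p
        boxes.append(Box(letter, y, y + height))
        y += height
    return boxes

def pick(boxes: list[Box], pointer_y: float) -> Box:
    """The box under the pointer is the one being zoomed into; once it
    scrolled past the left edge, its letter would be committed."""
    return next(b for b in boxes if b.top <= pointer_y < b.bottom)

if __name__ == "__main__":
    column = layout_column("q")
    chosen = pick(column, 0.75)
    print(f"letter under pointer: {chosen.letter!r}")
```

Run as-is, the prefix "q" makes 'u' occupy a large band of the column, so a pointer near the middle of the screen lands on it; likelier letters become bigger, easier targets, which is the efficiency gain the notice describes.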
Anna e só (anna@friend.camp)'s status on Wednesday, 05-Jun-2024 12:42:56 JST
I'm looking for assistive technologies (software or hardware) that could help someone with limited hand mobility control their computer and/or phone. One of their biggest difficulties at the moment is typing for long periods of time. Bonus points if it's free and open source, but I'd like to offer them as many options as possible. Does anyone have any suggestions?