Interesting post about the "Handmade" programming community: https://www.rfleury.com/p/the-marketplace-of-ideals
Unfortunately, the author isn't on the fediverse, or I'd mention him.
I'm ambivalent about the Handmade ideal. On the one hand, their frustration with the state of modern software, expressed somewhat in the previously linked article and at length in the original Handmade Manifesto (https://web.archive.org/web/20160408150158/https://handmade.network/manifesto), resonates with me.
On the other hand, the "handmade" GUIs I've seen are, without exception, inaccessible with screen readers. Say what you will about software built on towers of abstractions and dependencies; these things do make it more likely that applications will be accessible.
I've tried to put forward a compromise with my AccessKit project, allowing a variety of GUI toolkits, based on different architectures and implementation tradeoffs, to share accessibility infrastructure, in the hope that many more non-bloated, or at least less-bloated, GUIs will be accessible.
It has taken me a while to come to terms with the fact that my solution, being a library and an abstraction layer, and one written in a programming language (Rust) that tends to be polarizing, will not be acceptable to everyone. I've been aware of the Handmade community since shortly before I started AccessKit, and I now believe I've fretted too much about trying to make my library acceptable even to that group, to the point of wondering whether I should have used Rust at all.
Lately, though, it has occurred to me that the true "Handmade" response to GUI accessibility might be that the current approach, as implemented by the platform accessibility APIs and current assistive technologies, is misguided, and that something like AccessKit is merely trying to mask the current complexity rather than eliminate it. I'm reminded of another post by Ryan Fleury (https://www.rfleury.com/p/untangling-lifetimes-the-arena-allocator), particularly the section "An Alternative Approach: Change The Problem’s Nature".
I've often thought that the current mainstream approach to accessibility, i.e. accessibility APIs and external assistive technologies, treats UI modalities unequally. GUI toolkits and applications have full control over, and responsibility for, the visual UI, but for other modalities, the toolkit or application exposes only a generic representation of the UI and leaves the details to an external assistive technology.
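To illustrate the asymmetry, here's a hypothetical sketch (these names are mine for illustration, not AccessKit's or any platform's actual API): the generic representation is essentially a tree of roles and names, and everything about how that tree is spoken, brailled, or navigated is decided by the external assistive technology.

```rust
// Hypothetical sketch of the generic UI representation an application exposes
// to external assistive technologies. All names here are made up.

#[derive(Debug, Clone, Copy)]
enum Role {
    Window,
    Label,
    Button,
}

#[derive(Debug)]
struct AccessibleNode {
    role: Role,
    /// Text the assistive technology may speak or show in Braille.
    name: String,
    children: Vec<AccessibleNode>,
}

fn main() {
    // The application describes its UI generically; unlike the pixel-level
    // control it has over the visual UI, how this tree is presented in other
    // modalities is left entirely to the external assistive technology.
    let tree = AccessibleNode {
        role: Role::Window,
        name: "Settings".into(),
        children: vec![
            AccessibleNode { role: Role::Label, name: "Volume".into(), children: vec![] },
            AccessibleNode { role: Role::Button, name: "OK".into(), children: vec![] },
        ],
    };
    println!("{tree:#?}");
}
```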
I've wondered lately if it would be good to double down on the self-voicing (or more generically, self-outputting?) approach. The current Windows "screen reader APIs" (that's what we call them), including the one I developed myself 10+ years ago, are too simplistic; they don't allow the application to be called back when a speech utterance is complete or when a button is pressed on a Braille display, and they don't allow applications to take over screen reader keyboard commands.
Someone looking at GUI accessibility without knowledge of the current solutions would probably conclude that the obvious way to make a GUI accessible is for the GUI toolkit itself to support alternative input and output methods. For example, the GUI toolkit could directly render the text to be spoken or shown on a Braille display. And in fact, for the most part, the games that have been made accessible to blind people have implemented accessibility this way.
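To make that concrete, here's a hypothetical sketch of the kind of richer self-outputting interface I'm imagining; none of these names correspond to an existing API. The toolkit drives speech and Braille directly, and gets called back about speech completion and Braille display key presses, the things the current "screen reader APIs" can't do.

```rust
/// Events coming back from the alternative output paths (hypothetical).
pub enum OutputEvent {
    /// The utterance with this id has finished speaking.
    SpeechFinished { utterance_id: u64 },
    /// A key was pressed on the connected Braille display.
    BrailleKey { key: u32 },
}

pub trait AlternativeOutput {
    /// Queue text for speech; returns an id so completion can be correlated.
    fn speak(&mut self, text: &str) -> u64;
    /// Render text on the Braille display, if one is connected.
    fn braille(&mut self, text: &str);
    /// Poll for speech-completion and Braille-input events.
    fn poll_events(&mut self) -> Vec<OutputEvent>;
}

/// How a toolkit widget might drive this directly: when a button gains focus,
/// show its label in Braille and speak it, keeping the utterance id so the
/// toolkit can react once speech has finished.
pub fn on_focus(label: &str, out: &mut dyn AlternativeOutput) -> u64 {
    out.braille(label);
    out.speak(label)
}
```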
On Windows, what we often call the self-voicing approach is made more practical by the fact that the major third-party screen readers offer APIs for sending text strings to the screen reader to be spoken and/or shown on a Braille display, so applications that take this approach don't have to use their own text-to-speech engine. None of the platform-provided screen readers offer something like this though, even on Windows.
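For what it's worth, here's roughly what that looks like from Rust, using NVDA's controller client DLL as the example. The function names are the ones NVDA's controller client exposes, but the DLL name and exact signatures should be checked against the headers NVDA ships; treat this as a sketch, not a maintained binding.

```rust
// Windows-only FFI sketch against NVDA's controller client DLL.
#[cfg(windows)]
#[link(name = "nvdaControllerClient64")]
extern "system" {
    fn nvdaController_testIfRunning() -> u32;
    fn nvdaController_speakText(text: *const u16) -> u32;
    fn nvdaController_brailleMessage(message: *const u16) -> u32;
}

#[cfg(windows)]
fn announce_via_nvda(text: &str) -> bool {
    // The controller client expects null-terminated UTF-16 strings.
    let wide: Vec<u16> = text.encode_utf16().chain(std::iter::once(0)).collect();
    unsafe {
        // testIfRunning returns zero when NVDA is running and reachable.
        if nvdaController_testIfRunning() != 0 {
            return false; // NVDA isn't running; fall back to another output path.
        }
        nvdaController_speakText(wide.as_ptr()) == 0
            && nvdaController_brailleMessage(wide.as_ptr()) == 0
    }
}
```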
Disclaimer: I'm thinking out loud here, not announcing that I have the definitive answer.