Recently, I visited my local hackerspace for the “Do it Blind” meetup. Do it Blind is the in-house effort to make the hackerspace itself accessible to people with visual impairments.
My friend, who had invited me, wanted to see how far he could get in the 3D printing process without any sighted assistance, using only his screen reader. While my holographic Eevee coin was printing, I followed along (minus the screen reader, plus the visuals) as he installed OctoPrint on Windows (which seems worth mentioning because the Linux users in the hackerspace believed it impossible).
I want to say “it worked”, but it would be more accurate to say “he made it work”.
There was a sign-up form with no form labels and a link where a button should be. Sure, with context clues, it’s kinda obvious what three consecutive text input fields want you to enter: a username, a password, and a password confirmation.
But the ultimate knock-out: a modal window with two options to close it, and neither was reachable via keyboard navigation. There was no way to get past it without using the mouse.
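For what it’s worth, neither issue requires anything exotic to fix: form fields need associated labels, and a dismissible overlay needs real, focusable controls. Here is a minimal, illustrative sketch of keyboard-reachable modal dismissal (the function and element names are mine, not OctoPrint’s):

```typescript
// Illustrative sketch: making a dismissible overlay usable without a mouse.
// Assumes the modal is a container element with real <button> close controls.
function openModal(modal: HTMLElement, closeButtons: HTMLButtonElement[]): void {
  const previouslyFocused = document.activeElement as HTMLElement | null;

  // Announce the overlay as a dialog and move focus into it,
  // so keyboard and screen reader users actually land inside it.
  modal.setAttribute('role', 'dialog');
  modal.setAttribute('aria-modal', 'true');
  modal.hidden = false;
  closeButtons[0]?.focus();

  const close = (): void => {
    modal.hidden = true;
    // Return focus to wherever the user was before the overlay appeared.
    previouslyFocused?.focus();
  };

  // Real <button> elements are focusable and respond to Enter/Space by default;
  // a click-handling <div> or a bare link is not.
  closeButtons.forEach((button) => button.addEventListener('click', close));

  // Escape is the conventional keyboard way out of a dialog.
  modal.addEventListener('keydown', (event) => {
    if (event.key === 'Escape') close();
  });
}
```

None of this is clever. It is the kind of plumbing the setup flow simply skipped.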
Even setting aside the WCAG 2.1.1 (Keyboard) failure, this setup flow was far from accessible. The whole point of this experiment was to find out whether the 3D printing process could be completed with only a screen reader and a lot of dedication. But not all assistive tech users are power users like that.
Web accessibility best practices focus on assistive tech power users.
Maybe the aforementioned Linux users in attendance were already a clue, but generally speaking, the digital literacy (and 3D printing knowledge) inside a hackerspace is significantly higher than in the general population.
Unlabelled forms and overlay-window softlocks are still common issues in 2025, and not only screen reader users but assistive tech users all over the world are left with no option other than to make it work. Somehow.
Dark Souls Level User Experience
Circling back to Dark Souls: if you don’t know it, all you really need to know is that the game is notoriously hard to beat. That is why I like to call user flows that are only “accessible/compliant in theory” Dark Souls UX.
I have compared using assistive tech to Dark Souls before, and I will say it again: Just because it is possible to do something with assistive tech does not mean that most users will have the nerve to complete it. Yet most accessibility evaluations only look at what’s possible for users with substantial digital skills. Just because your accessibility auditor, another power user who knows the differences between NVDA and JAWS in their sleep, can navigate the page just fine, does not mean all other screen reader users will.
Yes, you can get through it. You can make it work.
But at what cost?
Good Experience or Just… Experience?
"For the love of users, stop calling it a good experience if not everyone gets to have it." - Julia Undeutsch
Assistive tech users just don’t receive the same amount of love when it comes to user flow design.
But why aren’t focus states designed to delight keyboard navigation users the way hover states delight mouse users?
Why isn’t microcopy written to seamlessly guide screen reader users through the flow?
Why don’t brand guidelines include instructions on how to write descriptive alt text in line with the tone of voice?
Accessibility is mostly regarded through the technical lens, not the human one. This makes it all too easy to forget that behind all those standards, regulations, and requirements is nothing but a person trying to make it work with their setup.