DEV Community

Tudor Brad

Posted on • Originally published at betterqa.co

The accessibility lawsuit that almost happened, and what we learned from the audit

In 2024, one of our clients got a demand letter from a law firm specializing in ADA compliance lawsuits. The letter cited specific pages on their e-commerce platform that were inaccessible to screen reader users. The client called us in a panic. They'd never done accessibility testing. Not once in four years of operation.

We ran a full WCAG 2.1 AA audit over three days and found 67 distinct violations: missing alt text on product images, form inputs without labels, a checkout flow that trapped keyboard focus inside a modal with no escape route, and color contrast ratios as low as 2.1:1 on critical action buttons (WCAG AA requires 4.5:1).

The irony? Most of these were fixable. The engineering team cleared 48 of the 67 violations in eight working days. The remaining 19 required design changes that took another three weeks. The legal issue was resolved before it reached a courtroom. But the client had spent four years accumulating this debt, and it took a legal threat to prioritize it.

That engagement changed how we talk to clients about accessibility.

What automated scanners actually catch

We use Auditi, the accessibility platform we built, along with axe-core and Pa11y for automated scanning. Here's what people don't understand about automated accessibility tools: they catch roughly 30-40% of WCAG violations. That's it.

Automated tools are great at finding missing alt text, broken label associations, insufficient color contrast, missing landmark regions, and duplicate IDs. These are structural issues with clear right-and-wrong answers.

What they miss is everything that requires context. An automated tool can verify that an image has alt text. It cannot tell you whether that alt text is useful. alt="image" passes automated scanning. It tells a screen reader user nothing. alt="Blue wool sweater, front view, size medium" is what they actually need.
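A scanner can only apply heuristics to this problem. Here's a minimal sketch of what such a heuristic looks like; the function and its deny-list are illustrative, not lifted from any particular tool:

```javascript
// Heuristic check for alt text that passes automated scans but helps nobody.
// The deny-list and rules below are illustrative, not from a specific scanner.
const USELESS_ALT = new Set(["image", "photo", "picture", "img", "graphic"]);

function isSuspiciousAlt(alt) {
  const text = alt.trim().toLowerCase();
  if (text.length === 0) return true;                     // empty alt on a meaningful image
  if (USELESS_ALT.has(text)) return true;                 // generic filler words
  if (/\.(jpe?g|png|gif|webp)$/.test(text)) return true;  // a filename, not a description
  return false;
}

console.log(isSuspiciousAlt("image"));                                       // true
console.log(isSuspiciousAlt("Blue wool sweater, front view, size medium"));  // false
```

Even a check like this can only flag candidates for a human to review; it can't know whether "Blue wool sweater" actually describes the image.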

Automated tools also can't test keyboard navigation flows. They can check that interactive elements are focusable, but they can't tell you that tabbing through your checkout form goes: name, then submit button, then email, then back to name. That ordering makes no sense to a sighted user and makes even less sense to someone navigating by keyboard alone.
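The mechanics behind that broken ordering are simple: per the HTML spec, browsers visit elements with a positive tabindex first, in ascending tabindex order, then tabindex-0 and default-focusable elements in DOM order, skipping negative values. A simplified model (hypothetical field objects standing in for real DOM nodes):

```javascript
// Sketch of how a browser derives the Tab sequence: positive tabindex values
// first (ascending, DOM order breaks ties), then tabindex 0 in DOM order,
// negative tabindex skipped. `fields` is a simplified stand-in for the DOM.
function tabOrder(fields) {
  const positive = fields
    .map((f, domIndex) => ({ ...f, domIndex }))
    .filter((f) => f.tabindex > 0)
    .sort((a, b) => a.tabindex - b.tabindex || a.domIndex - b.domIndex);
  const zero = fields.filter((f) => f.tabindex === 0);
  return [...positive, ...zero].map((f) => f.name);
}

// A leftover tabindex="1" on the submit button scrambles the whole form:
console.log(tabOrder([
  { name: "name",   tabindex: 0 },
  { name: "email",  tabindex: 0 },
  { name: "submit", tabindex: 1 },
])); // ["submit", "name", "email"]
```

This is why the usual advice is to leave tab order to DOM order and reserve tabindex for 0 and -1.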

This is why we always pair automated scans with manual testing using actual assistive technology. NVDA on Windows, VoiceOver on Mac, TalkBack on Android. Our testers navigate the entire application using only a keyboard and a screen reader. The bugs they find in those sessions are consistently more severe than what the scanner reports.

The color contrast problem across our entire ecosystem

I'll share something embarrassing. In early 2026, we ran our ecosystem scanner across all 13 BetterQA product sites and discovered that almost every site had color contrast violations. Our standard purple (#a855f7, Tailwind's purple-500) had a contrast ratio of roughly 4.0:1 on white backgrounds. WCAG AA needs 4.5:1.

We'd built and deployed a dozen products with a brand color that failed basic accessibility. It took us two weeks to fix across the ecosystem, bumping to purple-600 (#9333ea), which clears the bar at roughly 5.4:1. The point isn't that we're bad at this. The point is that accessibility issues are that easy to introduce and that easy to miss when you're not testing for them.
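Checking this doesn't even require a tool; the contrast formula is defined in the WCAG spec (relative luminance via sRGB linearization, then (L1 + 0.05) / (L2 + 0.05)). A self-contained sketch:

```javascript
// WCAG 2.x contrast ratio between two hex colors, per the spec formula.
function relativeLuminance(hex) {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    // sRGB linearization thresholds/exponent come straight from the spec.
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(fg, bg) {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrastRatio("#a855f7", "#ffffff").toFixed(2)); // 3.96 — fails the 4.5:1 AA bar
console.log(contrastRatio("#9333ea", "#ffffff").toFixed(2)); // 5.38 — passes
```

Twenty lines of code, and it would have caught the problem in every one of those 13 sites before launch.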

What WCAG levels actually mean in practice

WCAG has three levels: A, AA, and AAA. Most regulations and lawsuits reference AA. Here's the practical difference.

Level A is the bare minimum. Your site has alt text, your forms have labels, your content doesn't rely solely on color to convey information. If you fail Level A, screen reader users literally cannot use parts of your site.

Level AA is what most organizations should target. It adds color contrast requirements (4.5:1 for normal text, 3:1 for large text), requires text resizing up to 200% without loss of content, and mandates consistent navigation patterns. This is where the legal bar sits in most jurisdictions.

Level AAA is aspirational for most products. It requires a 7:1 contrast ratio, sign language interpretation for audio content, and reading level accommodations. Very few commercial products achieve full AAA compliance, and it's not typically required by law.

We recommend AA as the target for all clients. It's achievable, legally defensible, and covers the vast majority of user needs.

The retrofit problem

The client with the demand letter spent roughly $45,000 in engineering time fixing accessibility issues retroactively. If they'd included accessibility testing from the start, the incremental cost would have been a fraction of that.

This is the pattern we see consistently. Teams build for two or three years without accessibility testing. Then something forces the issue, either a legal threat, a government contract requirement, or a user complaint that reaches someone with authority. Suddenly it's a priority, but now it's a retrofit.

Retrofitting accessibility is genuinely harder than building it in. When a React component has been in production for two years with no ARIA attributes, adding them often means rethinking the component's DOM structure. A modal that was built without keyboard trap management might need to be rewritten entirely. A custom dropdown that works fine with a mouse but is invisible to a screen reader can't just have role="listbox" slapped on it and work correctly.
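The core of that keyboard trap management is small, which makes it easy to see what a modal built without it is missing. A deliberately simplified model (indices instead of DOM elements; a real modal also handles Escape and restores focus to the trigger on close):

```javascript
// Sketch of the focus-wrap logic a modal dialog needs: Tab from the last
// focusable element wraps to the first, Shift+Tab from the first wraps to
// the last, so focus never escapes into the page behind the dialog.
// Simplified to indices; a real implementation queries the DOM for
// focusable elements and calls .focus() on the result.
function nextFocusIndex(current, count, shiftKey) {
  if (count === 0) return -1;                          // nothing focusable: focus stays on the dialog
  if (shiftKey) return (current - 1 + count) % count;  // Shift+Tab wraps backwards
  return (current + 1) % count;                        // Tab wraps forwards
}

console.log(nextFocusIndex(2, 3, false)); // 0 — Tab off the last element wraps to the first
console.log(nextFocusIndex(0, 3, true));  // 2 — Shift+Tab off the first wraps to the last
```

The client's original modal had no wrap at all, which is how Tab walked straight out of the dialog and into the page behind it, with no way back.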

We had one client where retrofitting a single date picker component took three days because the original implementation used div elements styled to look like inputs. The screen reader announced them as "group" elements with no indication they were interactive. Rebuilding it with proper semantic HTML and ARIA took the developer into areas of the spec she'd never touched.
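The shape of that rebuild, heavily simplified (not the client's actual markup):

```html
<!-- Before: what the screen reader saw as an unlabeled "group",
     with no indication it was interactive -->
<div class="date-input" onclick="openPicker()">Select date</div>

<!-- After: announced as a labeled, focusable date field -->
<label for="delivery-date">Delivery date</label>
<input type="date" id="delivery-date" name="delivery-date" />
```

Semantic HTML gives you the role, the label association, keyboard focus, and the announcement for free; the div version has to reimplement all four.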

The business case nobody wants to make

About 15-20% of the global population has some form of disability. In the US alone, that's roughly 61 million adults. Many of them are potential customers who will leave your site and go to a competitor if they can't use yours.

But here's the thing nobody says out loud: most companies don't invest in accessibility because of the business case. They invest because of the legal risk. The number of ADA-related web accessibility lawsuits has increased every year since 2017. In 2024, over 4,000 were filed in the US alone.

I'd rather people care about accessibility because it's right than because they're afraid of getting sued. But either motivation leads to the same outcome: a more usable product. I'll take it.

What we tell new clients

Start with an automated scan. Use axe-core, Pa11y, or Auditi. It takes an hour and gives you a baseline. Fix the automated findings first, because they're the easiest wins and they're defensible in court.

Then do a manual audit with real assistive technology. This is where the hard bugs surface. Keyboard navigation, screen reader flows, focus management. Budget two to five days depending on the size of the application.

Integrate accessibility checks into your CI pipeline. axe-core has integrations for Playwright, Cypress, and most other test frameworks. This prevents new violations from shipping.
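As one sketch of what that gate can look like, here is the @axe-core/playwright integration; the URL, test name, and tag selection are placeholders, and it assumes @playwright/test is already set up in the project:

```javascript
// CI accessibility gate using @axe-core/playwright. Assumes @playwright/test
// and @axe-core/playwright are installed; the URL below is a placeholder.
const { test, expect } = require("@playwright/test");
const { AxeBuilder } = require("@axe-core/playwright");

test("checkout page has no detectable WCAG A/AA violations", async ({ page }) => {
  await page.goto("https://example.com/checkout");
  const results = await new AxeBuilder({ page })
    .withTags(["wcag2a", "wcag2aa"]) // scope the scan to Level A and AA rules
    .analyze();
  expect(results.violations).toEqual([]); // any violation fails the build
});
```

Remember the 30-40% caveat from earlier: a green run here means no automated findings, not an accessible product.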

Train your developers. Not a one-hour webinar. Actual hands-on sessions where they use NVDA to navigate their own product. We've found this single exercise changes behavior more than any policy document. When a developer hears their own form announced as "edit text, edit text, edit text, button" with no labels, they understand the problem in a way no Jira ticket can convey.

And test with actual users with disabilities if you can. We've run usability sessions where a blind user completed a task in 90 seconds that our testers assumed would take 30. We've also watched a user with motor impairments struggle for four minutes with a drag-and-drop interface that had no keyboard alternative. Those sessions change what you build.

At BetterQA, we include accessibility testing in our standard QA engagements because we've seen what happens when it's left out. It's always more expensive later. And someone always gets left behind.
