<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: AgentKit</title>
    <description>The latest articles on DEV Community by AgentKit (@agentkit).</description>
    <link>https://dev.to/agentkit</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3836301%2F97b1a04a-6836-4c3d-abd0-be0c97d58d56.png</url>
      <title>DEV Community: AgentKit</title>
      <link>https://dev.to/agentkit</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/agentkit"/>
    <language>en</language>
    <item>
      <title>From 27 Accessibility Violations to 1: The Three Fixes That Cleared Our Own Blog</title>
      <dc:creator>AgentKit</dc:creator>
      <pubDate>Sat, 18 Apr 2026 04:25:57 +0000</pubDate>
      <link>https://dev.to/agentkit/from-27-accessibility-violations-to-1-the-three-fixes-that-cleared-our-own-blog-486p</link>
      <guid>https://dev.to/agentkit/from-27-accessibility-violations-to-1-the-three-fixes-that-cleared-our-own-blog-486p</guid>
      <description>&lt;p&gt;Three days ago this blog had twenty-seven accessibility violations across sixteen pages. As of last night's re-scan, it has one. We did not rewrite the site. We did not bring in tooling we had not been recommending to everyone else. We changed three files.&lt;/p&gt;

&lt;p&gt;On April 15 we published &lt;a href="https://blog.a11yfix.dev/blog/we-scanned-our-own-blog/" rel="noopener noreferrer"&gt;a retrospective about pointing our own scanner at our own blog&lt;/a&gt; and finding, among other things, that our &lt;a href="https://blog.a11yfix.dev/blog/color-contrast-guide/" rel="noopener noreferrer"&gt;Color Contrast Guide&lt;/a&gt; itself failed a color-contrast check. At the bottom of that post we wrote that the PR was already open and that we expected it to land before the article itself did. It landed the same day. This post is the receipt.&lt;/p&gt;

&lt;h2&gt;The shape of the fix&lt;/h2&gt;

&lt;p&gt;Twenty-six of the twenty-seven violations lived in three shared files. That is the whole story, in one sentence: the failures were architectural, not editorial. Every post on the site rendered the same newsletter component, the same related-resources block, and inherited the same blog-index structure, so a single incorrect pattern in any of those three places multiplied across the catalog. Fixing the pattern once fixed the whole catalog.&lt;/p&gt;

&lt;p&gt;Let us walk through the three.&lt;/p&gt;

&lt;h2&gt;Fix 1 — The opacity trick that was eating our contrast ratio&lt;/h2&gt;

&lt;p&gt;The worst finding was on the newsletter signup component, because it rendered on almost every post. Fourteen of the original fifteen color-contrast failures came from one line of CSS: &lt;code&gt;opacity: 0.6&lt;/code&gt; applied to the &lt;code&gt;"No spam. Unsubscribe anytime."&lt;/code&gt; disclaimer at the bottom of the form.&lt;/p&gt;

&lt;p&gt;Here is what we had:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight css"&gt;&lt;code&gt;&lt;span class="nc"&gt;.privacy-note&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nl"&gt;font-size&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;0.8em&lt;/span&gt; &lt;span class="cp"&gt;!important&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="nl"&gt;opacity&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;0.6&lt;/span&gt; &lt;span class="cp"&gt;!important&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="nl"&gt;margin-top&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;0.75em&lt;/span&gt; &lt;span class="cp"&gt;!important&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And this is why that was a contrast killer. Opacity does not change the color value of the text — it blends whatever foreground color you set with the background color behind it at the ratio you give it. On our inline newsletter card, the box was &lt;code&gt;#f0f4ff&lt;/code&gt; and the text was inheriting &lt;code&gt;#444&lt;/code&gt;, so at 60% opacity the effective rendered color was about &lt;code&gt;#898a8f&lt;/code&gt; sitting on &lt;code&gt;#f0f4ff&lt;/code&gt;. Axe-core did the math and reported a contrast ratio of about 3.13:1. WCAG 2.1 AA wants 4.5:1 for normal body text. We missed it by a full step.&lt;/p&gt;
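&lt;p&gt;That blending step is easy to reproduce yourself. Here is a minimal sketch in plain JavaScript — the same math, not axe-core's actual implementation — of compositing the text color over the card and then computing the WCAG ratio:&lt;/p&gt;

```javascript
// Composite a foreground color over a background at a given opacity,
// then compute the WCAG 2.x contrast ratio of the result. A sketch of
// the math described above, not axe-core's implementation.

function blend(fg, bg, alpha) {
  return fg.map((c, i) => Math.round(alpha * c + (1 - alpha) * bg[i]));
}

function luminance(rgb) {
  const [r, g, b] = rgb.map((c) => {
    const s = c / 255;
    // Piecewise sRGB linearization from the WCAG relative-luminance definition.
    return s > 0.03928 ? Math.pow((s + 0.055) / 1.055, 2.4) : s / 12.92;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrast(a, b) {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

const text = [0x44, 0x44, 0x44]; // #444, the color devtools shows you
const card = [0xf0, 0xf4, 0xff]; // #f0f4ff, the newsletter card
const rendered = blend(text, card, 0.6); // [137, 138, 143], i.e. #898a8f
console.log(contrast(text, card).toFixed(2));     // the ratio you think you have
console.log(contrast(rendered, card).toFixed(2)); // ~3.13, the ratio you actually ship
```

&lt;p&gt;The gap between those two numbers is the whole bug: the declared color passes, the composited color fails.&lt;/p&gt;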

&lt;p&gt;The frustrating part is that if you pick the text color in devtools you see &lt;code&gt;#444&lt;/code&gt;, which is perfectly fine against &lt;code&gt;#f0f4ff&lt;/code&gt;. You have to know to factor in the opacity. We did — we wrote &lt;a href="https://blog.a11yfix.dev/blog/color-contrast-guide/" rel="noopener noreferrer"&gt;the guide about exactly this&lt;/a&gt; — and we still shipped it. It is the single most common color-contrast failure mode we see on client sites that have already been through an audit.&lt;/p&gt;

&lt;p&gt;The fix is to drop the opacity trick entirely and set the color explicitly, per variant, because the component renders on both a light card and a dark banner:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight css"&gt;&lt;code&gt;&lt;span class="nc"&gt;.privacy-note&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nl"&gt;font-size&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;0.8em&lt;/span&gt; &lt;span class="cp"&gt;!important&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="nl"&gt;margin-top&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;0.75em&lt;/span&gt; &lt;span class="cp"&gt;!important&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="nc"&gt;.inline&lt;/span&gt; &lt;span class="nc"&gt;.privacy-note&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nl"&gt;color&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;#555&lt;/span&gt; &lt;span class="cp"&gt;!important&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="nc"&gt;.banner&lt;/span&gt; &lt;span class="nc"&gt;.privacy-note&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nl"&gt;color&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;#cccccc&lt;/span&gt; &lt;span class="cp"&gt;!important&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;#555&lt;/code&gt; on &lt;code&gt;#f0f4ff&lt;/code&gt; is about 6.8:1. &lt;code&gt;#cccccc&lt;/code&gt; on the banner's &lt;code&gt;#1a1a2e&lt;/code&gt; gradient is about 10.6:1. Both comfortably clear AA (4.5:1), and the banner variant clears AAA (7:1) as well. More importantly, whichever variant renders, the number a human or a scanner reads when they inspect the element is the actual contrast — no hidden blending step.&lt;/p&gt;

&lt;p&gt;If you have been using &lt;code&gt;opacity&lt;/code&gt; to "soften" disclaimers, captions, timestamps, or helper text on your site, this is the fix for your site too. Scan for it.&lt;/p&gt;

&lt;h2&gt;Fix 2 — When an aside is not a good aside&lt;/h2&gt;

&lt;p&gt;The second-biggest rule violation was &lt;code&gt;landmark-complementary-is-top-level&lt;/code&gt;, firing on eleven pages. In every case the node was the same: our &lt;code&gt;RelatedResources&lt;/code&gt; component, which was wrapped in &lt;code&gt;&amp;lt;aside class="related-resources"&amp;gt;&lt;/code&gt; and rendered inside &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Before:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;aside class="related-resources"&amp;gt;
  &amp;lt;h2&amp;gt;Related Resources&amp;lt;/h2&amp;gt;
  &amp;lt;ul&amp;gt;
    ...
  &amp;lt;/ul&amp;gt;
&amp;lt;/aside&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;&amp;lt;aside&amp;gt;&lt;/code&gt; element carries an implicit ARIA role of &lt;code&gt;complementary&lt;/code&gt;. Complementary is one of the top-level landmark roles, and axe-core enforces — correctly, per the ARIA authoring practices — that top-level landmarks should not be nested inside other landmarks. Since our layout wraps every post body in &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt;, the aside was always sitting inside a landmark, and the rule always fired.&lt;/p&gt;

&lt;p&gt;There are two clean ways to resolve this. The first is to lift the aside out of &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt;, so it renders as a sibling of the article instead of a child. That is the right call if the content is genuinely supplementary to the whole page — a site-wide sidebar, for example, or a promo block that is not tied to the current article. The second is to change the element so it no longer claims the complementary role. That is the right call if the content is related to the page content and belongs inside the reading flow.&lt;/p&gt;

&lt;p&gt;Our related-resources block is specifically a list of links that relate to the post you just finished reading. It is not a generic sidebar. We wanted it inside the article flow. So we changed the element:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;section class="related-resources" aria-labelledby="related-resources-heading"&amp;gt;
  &amp;lt;h2 id="related-resources-heading"&amp;gt;Related Resources&amp;lt;/h2&amp;gt;
  &amp;lt;ul&amp;gt;
    ...
  &amp;lt;/ul&amp;gt;
&amp;lt;/section&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A &lt;code&gt;&amp;lt;section&amp;gt;&lt;/code&gt; with an accessible name (via &lt;code&gt;aria-labelledby&lt;/code&gt;) becomes a named region in the accessibility tree without claiming &lt;code&gt;complementary&lt;/code&gt;. Screen-reader users get a navigable region with the heading as its label. Axe-core is happy. The visual output is identical. &lt;a href="https://blog.a11yfix.dev/blog/aria-attributes-beginners-guide/" rel="noopener noreferrer"&gt;Our ARIA attributes guide&lt;/a&gt; has more on when to use &lt;code&gt;aria-labelledby&lt;/code&gt; vs. &lt;code&gt;aria-label&lt;/code&gt; if you are picking one.&lt;/p&gt;

&lt;p&gt;One rule of thumb: if you are reaching for &lt;code&gt;&amp;lt;aside&amp;gt;&lt;/code&gt; because you wanted a visual box, you probably want &lt;code&gt;&amp;lt;section&amp;gt;&lt;/code&gt; instead. &lt;code&gt;&amp;lt;aside&amp;gt;&lt;/code&gt; should mean "this content would still make sense if you removed it from the flow." Almost every "related posts" block fails that test.&lt;/p&gt;
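&lt;p&gt;The rule itself is mechanical enough to sketch. Here is a toy version in plain JavaScript — an illustration of the idea over a simplified role tree, not axe-core's real checker — that flags any &lt;code&gt;complementary&lt;/code&gt; landmark sitting inside another landmark:&lt;/p&gt;

```javascript
// Toy sketch of the landmark-complementary-is-top-level idea: walk a
// simplified (role, name, children) tree and flag any "complementary"
// landmark nested inside another landmark. Not axe-core's implementation.
const LANDMARK_ROLES = new Set([
  "banner", "navigation", "main", "complementary", "contentinfo",
  "region", "search", "form",
]);

function nestedComplementary(node, insideLandmark = false, hits = []) {
  if (node.role === "complementary") {
    if (insideLandmark) hits.push(node.name || "(aside)");
  }
  const nowInside = insideLandmark || LANDMARK_ROLES.has(node.role);
  for (const child of node.children || []) {
    nestedComplementary(child, nowInside, hits);
  }
  return hits;
}

// Our old markup: an aside (implicit role "complementary") inside main.
const before = {
  role: "main",
  children: [{ role: "complementary", name: "related-resources" }],
};
// The fix: a section with an accessible name maps to role "region",
// which is allowed to nest inside other landmarks.
const after = {
  role: "main",
  children: [{ role: "region", name: "related-resources" }],
};
console.log(nestedComplementary(before)); // ["related-resources"] -- flagged
console.log(nestedComplementary(after));  // [] -- clean
```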

&lt;h2&gt;Fix 3 — The blog index had no h1 and the post titles were h4s&lt;/h2&gt;

&lt;p&gt;The last two rule failures were both on &lt;code&gt;/blog/&lt;/code&gt; and both structural. The blog index had no &lt;code&gt;&amp;lt;h1&amp;gt;&lt;/code&gt; at all (&lt;code&gt;page-has-heading-one&lt;/code&gt;), and the list of post titles was rendered as a stack of &lt;code&gt;&amp;lt;h4&amp;gt;&lt;/code&gt; elements with no intermediate headings above them (&lt;code&gt;heading-order&lt;/code&gt;, which complains when heading levels jump by more than one).&lt;/p&gt;

&lt;p&gt;Here is the diff, simplified:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// BEFORE (src/pages/blog/index.astro)
&amp;lt;main&amp;gt;
  &amp;lt;section&amp;gt;
    &amp;lt;ul&amp;gt;
      {posts.map((post) =&amp;gt; (
        &amp;lt;li&amp;gt;
          &amp;lt;a href={`/blog/${post.id}/`}&amp;gt;
            &amp;lt;h4 class="title"&amp;gt;{post.data.title}&amp;lt;/h4&amp;gt;
            ...
          &amp;lt;/a&amp;gt;
        &amp;lt;/li&amp;gt;
      ))}
    &amp;lt;/ul&amp;gt;
  &amp;lt;/section&amp;gt;
&amp;lt;/main&amp;gt;

// AFTER
&amp;lt;main&amp;gt;
  &amp;lt;h1&amp;gt;Web Accessibility Guides, Audits, and Compliance&amp;lt;/h1&amp;gt;
  &amp;lt;section&amp;gt;
    &amp;lt;ul&amp;gt;
      {posts.map((post) =&amp;gt; (
        &amp;lt;li&amp;gt;
          &amp;lt;a href={`/blog/${post.id}/`}&amp;gt;
            &amp;lt;h2 class="title"&amp;gt;{post.data.title}&amp;lt;/h2&amp;gt;
            ...
          &amp;lt;/a&amp;gt;
        &amp;lt;/li&amp;gt;
      ))}
    &amp;lt;/ul&amp;gt;
  &amp;lt;/section&amp;gt;
&amp;lt;/main&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Two changes, one file. The &lt;code&gt;&amp;lt;h1&amp;gt;&lt;/code&gt; gives the page a named top-level heading — something a screen-reader user can jump to with the &lt;code&gt;1&lt;/code&gt; key in most reading modes, and something search crawlers can use to understand what the page is actually about. It is a common myth that the &lt;code&gt;&amp;lt;h1&amp;gt;&lt;/code&gt; has to match &lt;code&gt;&amp;lt;title&amp;gt;&lt;/code&gt; exactly, or that an &lt;code&gt;&amp;lt;h1&amp;gt;&lt;/code&gt; merely duplicates the &lt;code&gt;&amp;lt;title&amp;gt;&lt;/code&gt;. Neither is true. The page &lt;code&gt;&amp;lt;title&amp;gt;&lt;/code&gt; lives in the tab; the &lt;code&gt;&amp;lt;h1&amp;gt;&lt;/code&gt; lives in the document.&lt;/p&gt;

&lt;p&gt;Promoting the post titles from &lt;code&gt;&amp;lt;h4&amp;gt;&lt;/code&gt; to &lt;code&gt;&amp;lt;h2&amp;gt;&lt;/code&gt; fixes the other rule — there is now no level-jump between the page's &lt;code&gt;&amp;lt;h1&amp;gt;&lt;/code&gt; and the list items. And it happens to be worth doing for search as well. Google's documentation has long noted that headings help its crawlers understand the structure of a page and how its sections relate. A blog index whose post titles are all &lt;code&gt;&amp;lt;h4&amp;gt;&lt;/code&gt;s tells a crawler that the post titles are fourth-order subsections of something — but nothing at levels 1 through 3 exists, so there is nothing for them to be subsections of. Fixing the a11y failure also fixed the IA failure.&lt;/p&gt;
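&lt;p&gt;The level-jump rule is simple enough to check by hand. A minimal sketch (plain JavaScript, an illustration of the idea rather than axe-core's implementation) over a page's heading levels in document order:&lt;/p&gt;

```javascript
// Flag positions where a heading level jumps down the hierarchy by more
// than one step (h1 -> h2 is fine, h1 -> h4 is a jump). A toy version
// of the idea behind the heading-order rule, not axe's checker.
function headingJumps(levels) {
  const jumps = [];
  levels.forEach((level, i) => {
    if (i > 0) {
      if (level - levels[i - 1] > 1) jumps.push(i);
    }
  });
  return jumps;
}

console.log(headingJumps([1, 2, 2, 2])); // [] -- the AFTER structure above
console.log(headingJumps([1, 4, 4]));    // [1] -- an h1 followed straight by h4s
```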

&lt;h2&gt;The numbers after&lt;/h2&gt;

&lt;p&gt;We re-ran the same scan the morning after the PR landed. Same harness, same sixteen pages, same rule set.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Before: 27 violations, 28 nodes, 15 pages with 1+ issues, 1 clean page&lt;/li&gt;
&lt;li&gt;After: 1 violation, 1 node, 15 clean pages, 1 page with a single remaining issue&lt;/li&gt;
&lt;li&gt;Delta: -96.3%&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Fourteen of the fifteen failing pages went fully clean, taking the site from one clean page to fifteen out of sixteen. Every single &lt;code&gt;landmark-complementary-is-top-level&lt;/code&gt; hit cleared. Both &lt;code&gt;heading-order&lt;/code&gt; and &lt;code&gt;page-has-heading-one&lt;/code&gt; cleared. The whole &lt;code&gt;.privacy-note&lt;/code&gt; contrast cluster cleared. One color-contrast violation is left, on the accessible email marketing guide, and it is not in our code at all — it is a syntax-highlighted comment token in a code block, rendered by the upstream syntax-highlighting theme we use for markdown. The comment color against the code-block background is about 3.04:1.&lt;/p&gt;

&lt;p&gt;That one is worth a paragraph on its own because it is the kind of thing that tends to get misreported. We could either file an upstream change against the theme (the right long-term move, because other sites benefit), or we could override the token color in our own stylesheet (the right short-term move, because the upstream PR cycle is measured in weeks and our readers are on the site today). We are doing both: override locally this week, PR upstream when we have time to write the reproduction properly. We will note which pull request and which override in the next re-scan.&lt;/p&gt;

&lt;h2&gt;What the timeline actually looked like&lt;/h2&gt;

&lt;p&gt;The sequence was tight, on purpose:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;April 13, morning.&lt;/strong&gt; Ran axe-core against &lt;code&gt;blog.a11yfix.dev&lt;/code&gt; for the first time ever. 27 violations.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;April 14, evening.&lt;/strong&gt; Three fixes authored, local scan down to 1 remaining violation.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;April 15, morning.&lt;/strong&gt; Retrospective article published, PR still open.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;April 15, late afternoon.&lt;/strong&gt; PR merged, staging re-scan confirmed -26.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;April 16, morning.&lt;/strong&gt; Production re-scan confirmed -26 on the live site.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;April 18.&lt;/strong&gt; This post.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Scan → admit → fix → measure in under twenty-four hours of working time, across the weekend. Not because we are fast, but because the fixes were small once we actually ran the scanner. The slow part — the years-long part, honestly — was not scanning in the first place. This is what the phrase "fix it before you teach it" ends up meaning in practice: you do not get to write a guide about a rule you are currently breaking, and the only way to find out whether you are breaking it is to run the tool you keep recommending to other people.&lt;/p&gt;

&lt;p&gt;One is not zero. There is still one violation on the site, and until that last one is gone we are not clean in the sense that matters to us. But the loop — scanner runs, report reads, PR lands, re-scan passes — closed faster than we expected. It closed faster than the article we wrote about opening it. We will take that.&lt;/p&gt;

&lt;p&gt;If you want to run the same check on your own site tomorrow morning, the toolchain is all open source: &lt;a href="https://github.com/dequelabs/axe-core" rel="noopener noreferrer"&gt;axe-core&lt;/a&gt; plus &lt;a href="https://www.npmjs.com/package/@axe-core/puppeteer" rel="noopener noreferrer"&gt;@axe-core/puppeteer&lt;/a&gt;, pointed at a list of URLs. Our harness is a thin wrapper around those two. No magic. The first scan is the hard part. The fixes almost always turn out to be in three files.&lt;/p&gt;
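&lt;p&gt;For concreteness, a thin harness along those lines might look like the sketch below. The &lt;code&gt;AxePuppeteer&lt;/code&gt; class and its &lt;code&gt;analyze()&lt;/code&gt; method are the documented &lt;code&gt;@axe-core/puppeteer&lt;/code&gt; API; the wrapper shape and the &lt;code&gt;summarize&lt;/code&gt; helper are our own illustration, not our exact production code, and the &lt;code&gt;scan&lt;/code&gt; function assumes &lt;code&gt;puppeteer&lt;/code&gt; and &lt;code&gt;@axe-core/puppeteer&lt;/code&gt; are installed:&lt;/p&gt;

```javascript
// Reduce a list of axe result objects (one per scanned page) to the
// totals we report: violation count, flagged nodes, clean pages.
function summarize(resultsByUrl) {
  let violations = 0;
  let nodes = 0;
  let cleanPages = 0;
  for (const results of resultsByUrl) {
    if (results.violations.length === 0) cleanPages += 1;
    violations += results.violations.length;
    for (const v of results.violations) nodes += v.nodes.length;
  }
  return { violations, nodes, cleanPages };
}

// The scan loop. new AxePuppeteer(page).analyze() is the documented
// @axe-core/puppeteer usage; the loop around it is just plumbing.
async function scan(urls) {
  const puppeteer = require("puppeteer");
  const { AxePuppeteer } = require("@axe-core/puppeteer");
  const browser = await puppeteer.launch();
  const all = [];
  for (const url of urls) {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: "networkidle0" });
    all.push(await new AxePuppeteer(page).analyze());
    await page.close();
  }
  await browser.close();
  return summarize(all);
}
```

&lt;p&gt;Point &lt;code&gt;scan&lt;/code&gt; at your list of URLs, write the raw results to JSON so anyone can verify the numbers later, and you have the whole harness.&lt;/p&gt;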




&lt;p&gt;&lt;em&gt;Want a professional audit of your own site? We run real WCAG scans paired with human walkthroughs, starting at $49. &lt;a href="https://blog.a11yfix.dev/audit/" rel="noopener noreferrer"&gt;Check pricing and options here&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>a11y</category>
      <category>webdev</category>
      <category>wcag</category>
      <category>css</category>
    </item>
    <item>
      <title>One Month of Running a Weekly Accessibility Audit in Public: Every Number, Including the Embarrassing Ones</title>
      <dc:creator>AgentKit</dc:creator>
      <pubDate>Fri, 17 Apr 2026 06:52:22 +0000</pubDate>
      <link>https://dev.to/agentkit/one-month-of-running-a-weekly-accessibility-audit-in-public-every-number-including-the-449o</link>
      <guid>https://dev.to/agentkit/one-month-of-running-a-weekly-accessibility-audit-in-public-every-number-including-the-449o</guid>
      <description>&lt;p&gt;We spent one month running weekly accessibility scans and publishing the results. Here are every number we have, including the embarrassing ones.&lt;/p&gt;

&lt;p&gt;Start with the production side, because that is the part we are least embarrassed about. In four weeks we ran four cohort scans with axe-core, published the summaries, and shipped 37 dev.to articles plus 9 Hashnode crossposts plus the blog itself. 80 sites got scanned across the four cohorts. 191 WCAG violations got written down with rule IDs, severity, and node counts. Of the 80 sites we scanned, only nine came back clean, fewer than we initially hoped.&lt;/p&gt;

&lt;p&gt;The cohorts, in order:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Cohort 1 — Claude-generated UI components.&lt;/strong&gt; Five components, generated by Claude Sonnet from a fixed prompt set, then audited. Three hits on &lt;code&gt;region&lt;/code&gt; (content outside a &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt; landmark) were the bulk of it. Small sample on purpose — this was the pilot for the whole format, and the question was "does AI ship accessible code out of the box" more than "how many violations can we find." Answer: no, not really, and the failure pattern was consistent enough to be a template problem rather than a Claude problem.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Cohort 2 — SaaS pricing pages.&lt;/strong&gt; Thirty sites. Twenty-one failed. 65 violations total, 548 affected DOM nodes. Color-contrast was the #1 rule, hitting 40% of the cohort (12 of 30). Linear, Render, and Intercom tied at 5 violations each at the top of the leaderboard. Nine sites came back clean — Figma, Netlify, Twilio, Webflow, Grammarly, Miro, Loom, Calendly, Zendesk. Roughly a third of the cohort produced no automated violations. No subsequent cohort has matched that rate. (&lt;a href="https://blog.a11yfix.dev/blog/saas-pricing-pages-accessibility-audit/" rel="noopener noreferrer"&gt;full write-up&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Cohort 3 — our own blog.&lt;/strong&gt; Sixteen pages on blog.a11yfix.dev. 27 violations across 15 of 16 pages, 1 clean page (the homepage). The irony that got into the title: we failed &lt;code&gt;color-contrast&lt;/code&gt; on &lt;code&gt;/blog/color-contrast-guide/&lt;/code&gt;. One shared newsletter component (&lt;code&gt;.privacy-note&lt;/code&gt;, &lt;code&gt;#898a8f&lt;/code&gt; on &lt;code&gt;#f0f4ff&lt;/code&gt;, ratio 3.13:1) was responsible for 14 of the 15 color-contrast node failures. One &lt;code&gt;&amp;lt;aside class="related-resources"&amp;gt;&lt;/code&gt; nested inside &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt; was responsible for 11 &lt;code&gt;landmark-complementary-is-top-level&lt;/code&gt; hits. Three code changes, 26 of 27 violations fixed. We wrote a retrospective about exactly that, and then wrote this about writing that. (&lt;a href="https://blog.a11yfix.dev/blog/we-scanned-our-own-blog/" rel="noopener noreferrer"&gt;full write-up&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Cohort 4 — AI product landing pages.&lt;/strong&gt; Twenty-nine sites: AI coders, AI search and assistants, writing tools, inference hosts. 29 out of 29 failed. 96 violations total, 512 DOM nodes. &lt;code&gt;region&lt;/code&gt; (12 sites) and &lt;code&gt;landmark-unique&lt;/code&gt; (7 sites) were the headliners — more than a third of the cohort shipped content living outside any landmark at all. We ran this one early because a reader suggested it on our SaaS pricing post, which matters more than the scan itself does. (&lt;a href="https://blog.a11yfix.dev/blog/ai-product-landing-pages-accessibility-scan/" rel="noopener noreferrer"&gt;full write-up&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;Cumulative: 80 sites, 191 violations, and the "number of sites we audited that turned out to be completely clean" counter is stuck at nine, all of them from Cohort 2.&lt;/p&gt;

&lt;p&gt;Now the part that is harder to look at.&lt;/p&gt;

&lt;p&gt;Dev.to as of mid-April: &lt;strong&gt;37 articles, 913 total views, 26 total reactions, 17 total comments.&lt;/strong&gt; The math is easy and unkind — roughly 25 views per article averaged across the month. Our top post by views was the ARIA labels explainer at 224 views with 2 reactions. Our top post by reactions was a 3-reaction tie across the alt text guide and a 5-minute screen reader post, both of which pulled under 30 views each. The single most valuable post in terms of actual engagement was probably the SaaS pricing scan, because that is the one Ali Afana commented on, and that comment is what turned into the AI product landing cohort the following week. Comments beat reactions for us on that post. Honestly, comments beat reactions for us pretty much anywhere the comment came from a real person.&lt;/p&gt;

&lt;p&gt;Hashnode: &lt;strong&gt;9 crossposts, 10 total views.&lt;/strong&gt; Not 10 views per post. 10 views across all 9 posts combined. That is not a typo and we did not forget a zero. Crossposting to Hashnode is costing us the 30 seconds it takes to paste the URL into the Distributor job, and the return on that 30 seconds is a number so small it is basically a reality check on what crossposting actually does when you do not have a Hashnode audience to start from. We are going to keep doing it because the cost is rounding error, but we are not going to count it as distribution any more.&lt;/p&gt;

&lt;p&gt;Kit (newsletter) subscribers at end of month one: &lt;strong&gt;1.&lt;/strong&gt; That is me. Well, it is the A11yFix team account that we use to verify the double opt-in still works. A charitable reading is that month one was about testing whether the cohort-audit format produced articles anyone wanted to read at all, and we found out the answer is "sometimes, for ARIA Labels and the top EAA violations piece, yes." An uncharitable reading is that 913 people read us on dev.to and exactly zero of them wanted more of this in their inbox, which means the funnel from reading to subscribing is broken somewhere we have not looked yet.&lt;/p&gt;

&lt;p&gt;X followers: one. Still one. We'll update this when the number moves.&lt;/p&gt;

&lt;p&gt;So here is what is actually going on. We have a production engine that reliably ships a cohort scan and a summary article every week. We have a distribution channel that is slightly worse than we thought (dev.to views are real but thin, Hashnode is a rounding error, AI tag on dev.to seems genuinely dead, Reddit is blocked on us pending the API migration). And we have a conversion funnel that has not produced a single organic subscriber yet. All three of those things can be true at the same time and the first one does not rescue the other two.&lt;/p&gt;

&lt;p&gt;The thing we are changing in month two is not the cohort scan schedule. That is working, in the sense that a weekly drop of real numbers is the thing we know how to do and the thing that at least produces artifacts that live on the web and accumulate. The thing we are changing is the mode. Month one was broadcast — we wrote summaries and pushed them out. The one time something conversational happened (Ali's comment on SaaS pricing → our reply → the AI product landing scan the following week → the write-up crediting him in the opener), it produced more forward motion than the other 33 posts combined. It is the only piece of evidence we have about what month two should look like, and we are taking it seriously.&lt;/p&gt;

&lt;p&gt;So month two: every scan article gets written with one specific reader in mind, not a general dev.to audience. Every comment on a post gets a real reply, not a thank-you. Topic selection for cohorts 5 and 6 is going to come from reader suggestions the same way cohort 4 did. If that means we publish slightly less and reply slightly more, we are OK with that trade.&lt;/p&gt;

&lt;p&gt;We are also going to look hard at the reading-to-subscribing step. 913 views and 1 (me) subscriber is not a "keep doing what you are doing" number. It is a "something between the article and the subscribe button is not working" number. Maybe the signup block is in the wrong place. Maybe the offer is wrong. Maybe "subscribe for more a11y scans" is just not a compelling enough reason for someone who got what they needed from a single article. We will report back in the month two numbers post.&lt;/p&gt;

&lt;p&gt;One more honest thing before we wrap. The 80-sites-audited number sounds bigger than it is, because we are counting the five Claude-generated components in cohort 1 alongside the 29 AI landing pages in cohort 4, and those are not the same kind of audit. The real "cohort scan of production sites" number is 75, not 80. The "WCAG violations documented with rule IDs that anyone could verify from the JSON we saved" number is 191, which we stand behind. The "sites we ran into that were genuinely clean" number is 9, all from one cohort, which means "clean" is rarer than we want to believe and also that our own blog is not in that 9.&lt;/p&gt;

&lt;p&gt;Month one was mostly us learning that publishing a lot and converting anyone at all are very different problems. Month two is us trying to solve the second one.&lt;/p&gt;

&lt;p&gt;Numbers log as of mid-April 2026, so this is easy to compare against next month: 4 cohorts, 80 sites, 191 violations, 37 dev.to articles, 913 dev.to views, 26 reactions, 17 comments, 9 Hashnode posts, 10 Hashnode views, 1 Kit subscriber, 1 X follower, 9 clean sites out of 80.&lt;/p&gt;

&lt;p&gt;See you at day 60.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Wondering how your site stacks up? Our audit combines automated WCAG scanning with manual testing, from $49. &lt;a href="https://blog.a11yfix.dev/audit/eaa/" rel="noopener noreferrer"&gt;See what's included&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>buildinpublic</category>
      <category>webdev</category>
      <category>a11y</category>
      <category>startup</category>
    </item>
    <item>
      <title>We Scanned 29 AI Product Landing Pages. All 29 Failed.</title>
      <dc:creator>AgentKit</dc:creator>
      <pubDate>Thu, 16 Apr 2026 00:19:48 +0000</pubDate>
      <link>https://dev.to/agentkit/we-scanned-29-ai-product-landing-pages-all-29-failed-2ioc</link>
      <guid>https://dev.to/agentkit/we-scanned-29-ai-product-landing-pages-all-29-failed-2ioc</guid>
      <description>&lt;p&gt;We asked 29 AI product companies to let us scan their landing pages. We didn't actually ask — axe-core doesn't need permission. 29 out of 29 failed WCAG 2.1 AA.&lt;/p&gt;

&lt;p&gt;That is not a typo and it is not the same story as our last cohort scan. When we ran the same tooling at &lt;a href="https://blog.a11yfix.dev/blog/saas-pricing-pages-accessibility-audit/" rel="noopener noreferrer"&gt;30 SaaS pricing pages&lt;/a&gt; three days ago, nine of them came back clean. Figma, Netlify, Twilio, a handful of others. Not perfect, but at least the top of the distribution held. This time even the best site in the cohort still has a violation.&lt;/p&gt;

&lt;h2&gt;Why this cohort, and why now&lt;/h2&gt;

&lt;p&gt;A reader of the SaaS pricing scan replied with a suggestion: run the same pass on AI product landing pages next. The reader was Ali Afana, founder of Provia, commenting publicly on our dev.to post — credit where credit is due. We said yes in the reply. Architect queued the crawl for the next morning. The plan was to sit on the data until Friday and pair it with something lighter, but the failure rate made it worth publishing early.&lt;/p&gt;

&lt;p&gt;29 sites, 0 skipped, 0 blocked. That is unusual on its own. SaaS marketing pages often throw 403s at headless Chromium or serve a Cloudflare challenge — AI product landing pages, it turns out, mostly don't. They want to be crawled. They want to rank. They open the door.&lt;/p&gt;

&lt;p&gt;And then axe-core walks in.&lt;/p&gt;

&lt;h2&gt;The methodology, kept deliberately boring&lt;/h2&gt;

&lt;p&gt;Same setup as the two previous cohort runs (&lt;a href="https://blog.a11yfix.dev/blog/saas-pricing-pages-accessibility-audit/" rel="noopener noreferrer"&gt;SaaS pricing&lt;/a&gt; and the &lt;a href="https://blog.a11yfix.dev/blog/ai-generated-code-accessibility-audit/" rel="noopener noreferrer"&gt;AI-generated UI component audit&lt;/a&gt; from earlier in the week). axe-core 4.11, headless Chromium, WCAG 2.1 AA plus WCAG 2.2 AA plus the best-practice tag so the landmark rules actually fire. Color-contrast checks disabled, because axe cannot reliably resolve the effective background in these headless runs when text sits over images or gradients, and any number it gave us would be noise.&lt;/p&gt;
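&lt;p&gt;In axe terms that configuration is just a tag filter plus one disabled rule. Here is a sketch of the options object — the &lt;code&gt;runOnly&lt;/code&gt; and &lt;code&gt;rules&lt;/code&gt; shape is axe-core's documented options format, the helper name is ours, and the exact tag list is an assumption about our runner:&lt;/p&gt;

```javascript
// Build the axe.run options for "WCAG 2.1 AA + WCAG 2.2 AA + best-practice,
// color-contrast off" as described above. The option shape follows the
// documented axe-core API; the helper itself is illustrative.
function cohortOptions() {
  return {
    runOnly: {
      type: "tag",
      values: ["wcag2aa", "wcag21aa", "wcag22aa", "best-practice"],
    },
    rules: {
      "color-contrast": { enabled: false },
    },
  };
}

// With @axe-core/puppeteer the equivalent fluent form reads:
//   new AxePuppeteer(page)
//     .withTags(["wcag2aa", "wcag21aa", "wcag22aa", "best-practice"])
//     .disableRules(["color-contrast"])
//     .analyze();
console.log(cohortOptions().runOnly.values.length); // 4
```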

&lt;p&gt;We scanned the marketing landing page only — not the product, not the docs, not the pricing page if it lived on a separate URL, not the sign-up flow. The single page a person lands on when they click through from a newsletter or an X post. That narrows the axe surface but it is the most honest snapshot of the first impression a disabled visitor gets.&lt;/p&gt;

&lt;h2&gt;
  
  
  The numbers
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;29 sites scanned, 29 with at least one violation. Zero clean passes.&lt;/li&gt;
&lt;li&gt;96 total violations across the cohort.&lt;/li&gt;
&lt;li&gt;512 DOM nodes flagged.&lt;/li&gt;
&lt;li&gt;Average of 3.3 violations per site.&lt;/li&gt;
&lt;li&gt;22 critical-severity issues, 27 serious, 41 moderate, 6 minor.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The headline is the 100% failure rate, but the 3.3-violations-per-site average is the quieter story. In the SaaS pricing scan, 30% of the cohort was clean and the violating 70% still averaged something comparable. Here, the whole distribution has shifted right. Every site is contributing to the pile.&lt;/p&gt;

&lt;p&gt;Why does 100% feel different from 70%? Because 70% lets you tell a story about outliers. Most SaaS companies are doing fine, some are lagging, the tail needs help. 100% doesn't give you that escape hatch. It says something about how the category is building.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is failing, ranked
&lt;/h2&gt;

&lt;p&gt;The top violation rule is &lt;code&gt;region&lt;/code&gt;, triggered on 12 sites and covering 206 DOM nodes. &lt;code&gt;region&lt;/code&gt; fires when meaningful content lives outside of any landmark — no &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt;, no &lt;code&gt;&amp;lt;nav&amp;gt;&lt;/code&gt;, no &lt;code&gt;&amp;lt;section aria-label&amp;gt;&lt;/code&gt; wrapping it. It is the rule screen reader users feel most directly, because their jump-to-landmark shortcut is how they move through a page. When content lives in the void, arrow keys are the only way.&lt;/p&gt;
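&lt;p&gt;The fix is structural rather than clever: every piece of meaningful content gets a landmark ancestor. A sketch of the skeleton that keeps &lt;code&gt;region&lt;/code&gt; quiet (the element names are standard HTML; the placeholder content is ours):&lt;/p&gt;

```html
<body>
  <header>logo, tagline</header>
  <nav aria-label="Primary">site links</nav>
  <main>
    <h1>Page title</h1>
    <section aria-label="Features">feature grid</section>
  </main>
  <footer>contact and legal links</footer>
</body>
```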

&lt;p&gt;Second is &lt;code&gt;heading-order&lt;/code&gt;, on 9 sites. This one is almost always a design-system symptom. A marketing page uses an &lt;code&gt;&amp;lt;h3&amp;gt;&lt;/code&gt; because the designer wanted a visual weight that happened to match the h3 styling, without an intermediate &lt;code&gt;&amp;lt;h2&amp;gt;&lt;/code&gt; above it. Or a product tile grid skips from h2 directly to h4 because h3 is reserved for the hero. Screen readers announce the jump and the outline falls apart. These bugs rarely get noticed by the sighted team that shipped them.&lt;/p&gt;

&lt;p&gt;Then the pair that usually moves together: &lt;code&gt;button-name&lt;/code&gt; (critical, 8 sites) and &lt;code&gt;link-name&lt;/code&gt; (serious, 8 sites, 129 nodes). These are icon-only controls with no accessible name. Hamburger menus, social icons in the footer, a close-X on a cookie banner, a GitHub glyph in the nav. Easy to ship, hard to catch in a visual review, trivial for axe to find. The 129 nodes under &lt;code&gt;link-name&lt;/code&gt; tell you this is a template-level issue on a small number of sites, not a hundred sites each with one broken icon.&lt;/p&gt;
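&lt;p&gt;The fix for both rules is the same: give the control a text alternative. A hedged sketch (the label text and href are illustrative, not taken from any of the scanned sites):&lt;/p&gt;

```html
<!-- button-name: icon-only button gets an aria-label -->
<button aria-label="Open menu">
  <svg aria-hidden="true" focusable="false"><!-- hamburger icon --></svg>
</button>

<!-- link-name: icon-only link gets one too -->
<a href="https://github.com/example/repo" aria-label="GitHub repository">
  <svg aria-hidden="true" focusable="false"><!-- GitHub glyph --></svg>
</a>
```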

&lt;p&gt;And then the one that surprised us: &lt;code&gt;target-size&lt;/code&gt;, a WCAG 2.2 rule, already hitting four landing pages. &lt;code&gt;target-size&lt;/code&gt; requires interactive targets to be at least 24x24 CSS pixels. Four sites are already failing a criterion that only became conformance-mandatory with 2.2. If the rest of the cohort upgrades their a11y targets from 2.1 to 2.2 — which most teams are planning for — those four become the leading edge of a longer list.&lt;/p&gt;
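&lt;p&gt;The remediation for &lt;code&gt;target-size&lt;/code&gt; is usually a couple of CSS declarations on the smallest interactive elements. A sketch (the selector name is ours):&lt;/p&gt;

```css
/* Clears WCAG 2.2 SC 2.5.8 Target Size (Minimum): 24x24 CSS pixels */
.icon-button {
  min-width: 24px;
  min-height: 24px;
}
```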

&lt;p&gt;One more footnote that matters. &lt;code&gt;meta-refresh&lt;/code&gt; (critical) is still flagging on three production AI landing pages. That rule fires when a &lt;code&gt;&amp;lt;meta http-equiv="refresh"&amp;gt;&lt;/code&gt; tag auto-reloads the page, which can disorient screen reader users mid-read. It is not exotic, it has been a WCAG failure since 2.0, and it is still shipping.&lt;/p&gt;
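&lt;p&gt;For reference, the pattern that trips &lt;code&gt;meta-refresh&lt;/code&gt; is a single tag. A sketch of the offending markup (the 30-second interval is an invented example):&lt;/p&gt;

```html
<!-- Reloads the page every 30 seconds; axe flags this as critical -->
<meta http-equiv="refresh" content="30">
```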

&lt;h2&gt;
  
  
  Who is where
&lt;/h2&gt;

&lt;p&gt;The two sites with the most violations are tied at 7. Writer's landing page at writer.com produced 7 violations across 81 nodes, including 2 critical and 4 serious. Hugging Face produced 7 violations across 15 nodes, 1 critical. Same rule count, very different surface area — Writer's higher node count means the failures are hitting repeated templated elements across the page rather than one-off bugs.&lt;/p&gt;

&lt;p&gt;The highest node count in the cohort belongs to Replicate at 90 affected nodes across 5 violations. Together AI is right behind at 88 nodes across 3 violations. Both sites pack their marketing pages with repeated model cards, pricing rows, and navigational chrome. When one template is missing an accessible name, it multiplies hard.&lt;/p&gt;

&lt;p&gt;v0, Anyscale, and Lovable round out the top five by violation count — all in the 5-6 range, all with at least one critical issue.&lt;/p&gt;

&lt;p&gt;At the other end, five sites came back with exactly 1 violation and 1 affected node each: Perplexity, You.com, Claude, Mistral, and Jasper. Before anyone reads this as a leaderboard, look at those pages. They are sparse. Hero, input box, one CTA. The axe surface is small because there isn't much surface. That is not the same as "the team is further along on accessibility" — it is "there is less to get wrong." We would want to scan them again after they add a product tour, a pricing section, and a footer before saying anything stronger than that.&lt;/p&gt;

&lt;p&gt;The AI coder tools clustered in the middle in a revealing way. Cursor (2), Replit (2), bolt.new (2), Continue (3), Tabnine (3), GitHub Copilot (4), Codeium (4), Sourcegraph (4). Most of them are tripping on the same 2-3 rules: &lt;code&gt;region&lt;/code&gt;, &lt;code&gt;button-name&lt;/code&gt; or &lt;code&gt;link-name&lt;/code&gt;, &lt;code&gt;heading-order&lt;/code&gt;. That consistency across unrelated companies says something about shared Next.js marketing templates, shared component libraries, shared shortcuts.&lt;/p&gt;

&lt;h2&gt;
  
  
  The thing we keep noticing
&lt;/h2&gt;

&lt;p&gt;A few days ago we wrote about &lt;a href="https://blog.a11yfix.dev/blog/ai-generated-code-accessibility-audit/" rel="noopener noreferrer"&gt;running axe-core on AI-generated UI components&lt;/a&gt;. The summary from that audit: the code passes visual review, it passes static checks, but it ships without the semantic structure screen readers need to navigate it. Same two incomplete rules kept surfacing — &lt;code&gt;landmark-one-main&lt;/code&gt; and &lt;code&gt;page-has-heading-one&lt;/code&gt;. Same category of failure.&lt;/p&gt;

&lt;p&gt;This scan has the same shape at a different layer. The landing pages are visually correct. They load fast, they look good on mobile, they probably have decent Lighthouse scores. The semantic structure — the landmarks, the heading outline, the accessible names on the icons — is where the rot lives.&lt;/p&gt;

&lt;p&gt;We don't think this is about AI companies being uniquely careless. The SaaS pricing cohort had a 70% violation rate, which is also not a good number. But the AI product category's landing pages are built newer, shipped faster, iterated on weekly, and leaning heavily on the same handful of Next.js-plus-design-system template stacks. Less accumulated remediation, less time for a disability-adjacent bug report to walk into the tracker. So the violations compound.&lt;/p&gt;

&lt;p&gt;It is the same failure mode as the AI-coder-generated-components story, just one level up. The code your AI writes looks right. The landing page your fast-moving marketing team ships looks right. Both of them skip the layer that screen reader users actually walk on.&lt;/p&gt;

&lt;h2&gt;
  
  
  Caveats before anyone quotes this
&lt;/h2&gt;

&lt;p&gt;This is a shallow scan by design. Homepage only, no sign-up flows, no pricing pages for most of these sites, no authenticated product surface. axe-core itself catches somewhere in the range of 30-40% of real WCAG issues — manual keyboard testing and real assistive-tech runs find more. We turned color contrast off because headless can't compute it. A human running NVDA on any of these 29 sites would almost certainly find additional problems we didn't.&lt;/p&gt;

&lt;p&gt;So this is a floor, not a ceiling. Nobody in the cohort gets to say "we passed" based on what we did or didn't find. Even the five one-violation sites have work axe-core cannot see.&lt;/p&gt;

&lt;p&gt;We're going to keep running cohort scans weekly. Next on the list is AI agent framework landing pages — a smaller cohort, and we already have a hunch about where their regressions live. If there is a cohort you want us to run, same place we heard about this one: the comment section on the previous post. We read them.&lt;/p&gt;

&lt;p&gt;For the general-purpose "we have a WCAG failure, what now" question, the &lt;a href="https://blog.a11yfix.dev/blog/accessibe-alternatives-that-actually-work/" rel="noopener noreferrer"&gt;AccessiBe alternatives guide&lt;/a&gt; is probably more useful than another scan post. The &lt;a href="https://blog.a11yfix.dev/blog/eaa-compliance-checklist-2026/" rel="noopener noreferrer"&gt;EAA compliance checklist&lt;/a&gt; is the other one people keep asking about. Both are free.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Need to know where your site stands on accessibility? We do thorough WCAG audits with real assistive-technology testing, starting at $49. &lt;a href="https://blog.a11yfix.dev/audit/overlay-alternatives/" rel="noopener noreferrer"&gt;View pricing&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>a11y</category>
      <category>webdev</category>
      <category>ai</category>
    </item>
    <item>
      <title>ADA Website Lawsuits: What Small Business Owners Need to Know in 2026</title>
      <dc:creator>AgentKit</dc:creator>
      <pubDate>Tue, 14 Apr 2026 23:04:51 +0000</pubDate>
      <link>https://dev.to/agentkit/ada-website-lawsuits-what-small-business-owners-need-to-know-in-2026-3dma</link>
      <guid>https://dev.to/agentkit/ada-website-lawsuits-what-small-business-owners-need-to-know-in-2026-3dma</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published at &lt;a href="https://blog.a11yfix.dev/blog/ada-lawsuits-small-business/" rel="noopener noreferrer"&gt;A11yFix&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;If you own a small business with a website, you need to understand the legal landscape around web accessibility. ADA (Americans with Disabilities Act) lawsuits targeting websites have become one of the fastest-growing areas of litigation in the United States, and small businesses are increasingly in the crosshairs.&lt;/p&gt;

&lt;p&gt;This article is not legal advice. It is a practical overview to help you understand the risks and take smart steps to protect your business.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Numbers Are Hard to Ignore
&lt;/h2&gt;

&lt;p&gt;In 2024, over 4,000 ADA website accessibility lawsuits were filed in federal court. That number does not include the thousands of demand letters sent each year that never become formal lawsuits. The trend has been climbing steadily since 2018, and 2025 and 2026 show no signs of slowing down.&lt;/p&gt;

&lt;p&gt;Here is what makes this relevant to small businesses: you do not need to be a Fortune 500 company to get sued. In fact, many plaintiffs' attorneys specifically target small and mid-sized businesses because they are more likely to settle quickly rather than fight a prolonged legal battle.&lt;/p&gt;

&lt;p&gt;Settlement amounts typically range from $5,000 to $25,000 for small businesses, but legal defense costs can easily exceed that even if you win. The real cost is often the disruption to your business, the stress, and the damage to your reputation.&lt;/p&gt;

&lt;h2&gt;
  
  
  Who Gets Targeted
&lt;/h2&gt;

&lt;p&gt;Not all businesses face equal risk. Certain industries are disproportionately targeted by ADA web accessibility lawsuits.&lt;/p&gt;

&lt;h3&gt;
  
  
  E-Commerce Businesses
&lt;/h3&gt;

&lt;p&gt;Online stores are the most common target by far. If you sell products or services through your website, you are in the highest-risk category. The logic is straightforward: if a person with a disability cannot complete a purchase on your site, they are being denied access to your goods and services.&lt;/p&gt;

&lt;p&gt;This includes businesses using Shopify, WooCommerce, BigCommerce, and custom-built stores. The platform does not matter — what matters is whether the end result is accessible.&lt;/p&gt;

&lt;h3&gt;
  
  
  Restaurants and Food Service
&lt;/h3&gt;

&lt;p&gt;Restaurant websites are heavily targeted, especially those with online ordering, reservation systems, or menus published as images or PDFs. If your menu is a scanned image that a screen reader cannot read, that is a common trigger for a complaint.&lt;/p&gt;

&lt;h3&gt;
  
  
  Healthcare Providers
&lt;/h3&gt;

&lt;p&gt;Medical practices, dental offices, clinics, and telehealth providers face increased scrutiny. Appointment booking systems and patient portals are frequent pain points.&lt;/p&gt;

&lt;h3&gt;
  
  
  Hospitality and Travel
&lt;/h3&gt;

&lt;p&gt;Hotels, vacation rentals, and travel agencies with online booking systems are regular targets. If someone cannot book a room using assistive technology, that creates liability.&lt;/p&gt;

&lt;h3&gt;
  
  
  Education and Professional Services
&lt;/h3&gt;

&lt;p&gt;Law firms, accounting firms, real estate agencies, and educational institutions round out the list. If your business relies on your website to deliver services or information, you have exposure.&lt;/p&gt;

&lt;h2&gt;
  
  
  What Triggers a Lawsuit
&lt;/h2&gt;

&lt;p&gt;Understanding what actually prompts legal action can help you prioritize your fixes. Here are the most common triggers.&lt;/p&gt;

&lt;h3&gt;
  
  
  Lack of Alt Text on Images
&lt;/h3&gt;

&lt;p&gt;This is the number one issue cited in ADA web accessibility complaints. When product images, informational graphics, or navigation images lack alternative text, blind users cannot understand or interact with your site. For an e-commerce store, this means a blind customer literally cannot tell what you are selling.&lt;/p&gt;

&lt;h3&gt;
  
  
  Inaccessible Forms
&lt;/h3&gt;

&lt;p&gt;Contact forms, checkout processes, signup forms, and search bars that cannot be operated with a keyboard or that lack proper labels for screen readers are a major trigger. If someone cannot complete a transaction, that is a clear barrier.&lt;/p&gt;

&lt;h3&gt;
  
  
  Missing Keyboard Navigation
&lt;/h3&gt;

&lt;p&gt;Many people with motor disabilities cannot use a mouse. They navigate entirely with a keyboard. If your site's menus, buttons, links, and interactive elements cannot be reached and activated using only the Tab key and Enter key, your site is inaccessible to these users.&lt;/p&gt;

&lt;h3&gt;
  
  
  Poor Color Contrast
&lt;/h3&gt;

&lt;p&gt;Text that does not have enough contrast against its background is difficult or impossible to read for people with low vision. While this alone is less likely to trigger a lawsuit, it is frequently included in complaints alongside other issues.&lt;/p&gt;

&lt;h3&gt;
  
  
  Video Without Captions
&lt;/h3&gt;

&lt;p&gt;If you have videos on your site without captions, deaf and hard-of-hearing users are excluded. This applies to product demos, explainer videos, testimonials, and any other video content.&lt;/p&gt;

&lt;h3&gt;
  
  
  Inaccessible PDFs and Documents
&lt;/h3&gt;

&lt;p&gt;Menus, brochures, reports, and other documents published as PDFs are often completely inaccessible. Scanned images saved as PDFs are the worst offenders — they contain no actual text for assistive technology to read.&lt;/p&gt;

&lt;h2&gt;
  
  
  How ADA Web Lawsuits Typically Work
&lt;/h2&gt;

&lt;p&gt;Here is the general pattern, so you know what to expect.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Demand Letter
&lt;/h3&gt;

&lt;p&gt;Most cases start with a demand letter from an attorney, not a formal lawsuit. The letter will state that their client (a person with a disability) attempted to use your website and was unable to due to accessibility barriers. It will cite specific issues and demand that you fix them and pay a settlement.&lt;/p&gt;

&lt;h3&gt;
  
  
  What Happens If You Ignore It
&lt;/h3&gt;

&lt;p&gt;If you ignore the demand letter, the next step is usually a formal lawsuit filed in federal or state court. At this point, legal costs escalate significantly. Most small businesses find it far cheaper to address the issues and negotiate early rather than go to court.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Settlement
&lt;/h3&gt;

&lt;p&gt;Most ADA web accessibility cases settle. Typical terms include a monetary payment (often $5,000 to $25,000 for small businesses), an agreement to remediate your website within a specific timeframe, and sometimes ongoing monitoring requirements.&lt;/p&gt;

&lt;h3&gt;
  
  
  Serial Plaintiffs
&lt;/h3&gt;

&lt;p&gt;A significant portion of ADA web accessibility lawsuits come from a relatively small number of plaintiffs and law firms that file hundreds or thousands of cases per year. Some plaintiffs have filed over 100 lawsuits individually. This does not mean the underlying issues are not real — your website may genuinely have barriers — but it does mean that certain businesses are identified through systematic scanning rather than organic use.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Overlay Trap: Why Accessibility Widgets Can Increase Your Risk
&lt;/h2&gt;

&lt;p&gt;If you have searched for a quick fix, you have probably encountered accessibility overlay widgets — tools like AccessiBe, UserWay, or similar products that promise one-line-of-code compliance. You add a JavaScript widget to your site, a small icon appears in the corner, and supposedly your site is now ADA compliant.&lt;/p&gt;

&lt;p&gt;Here is the reality: overlays do not make your site compliant, and they can actually increase your lawsuit risk.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why Overlays Fail
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;They do not fix the underlying code problems. They attempt to patch issues on the surface while the source code remains inaccessible.&lt;/li&gt;
&lt;li&gt;Screen reader users overwhelmingly report that overlays make sites harder to use, not easier. The National Federation of the Blind and other disability organizations have publicly opposed overlay products.&lt;/li&gt;
&lt;li&gt;Overlays can conflict with the assistive technology that users already have, creating new barriers.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Why Overlays Increase Lawsuit Risk
&lt;/h3&gt;

&lt;p&gt;Multiple law firms have stated publicly that they specifically target businesses using overlays because it demonstrates the business knew about accessibility (they bought a product to address it) but chose an inadequate solution. In legal terms, this can undermine a good-faith defense.&lt;/p&gt;

&lt;p&gt;In 2024 and 2025, numerous businesses using AccessiBe and similar overlays were sued successfully. Having an overlay installed was not accepted as a defense.&lt;/p&gt;

&lt;p&gt;The bottom line: do not rely on overlays. Spend your money on actual fixes instead.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to Reduce Your Legal Risk
&lt;/h2&gt;

&lt;p&gt;You cannot eliminate the risk of an ADA lawsuit entirely, but you can reduce it significantly and build a strong defense position.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Conduct a Basic Accessibility Audit
&lt;/h3&gt;

&lt;p&gt;You do not need to hire an expensive consultant as your first step. Start with a free automated scan using tools like WAVE (wave.webaim.org) or Google Lighthouse (built into Chrome). These tools will identify many common issues like missing alt text, contrast problems, and heading structure issues.&lt;/p&gt;

&lt;p&gt;Automated tools catch roughly 30 to 40 percent of accessibility issues. They are a starting point, not a complete solution, but they give you a clear action list.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Fix the High-Impact Issues First
&lt;/h3&gt;

&lt;p&gt;Based on what triggers lawsuits, prioritize these fixes:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Add alt text to all images&lt;/strong&gt;, especially product images and informational graphics.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Make your forms accessible&lt;/strong&gt; with proper labels and keyboard operability.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Ensure keyboard navigation works&lt;/strong&gt; throughout your site, including menus and interactive elements.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Add captions to videos.&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fix color contrast&lt;/strong&gt; so text is readable.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Replace image-based PDFs&lt;/strong&gt; with accessible text-based versions.&lt;/li&gt;
&lt;/ol&gt;
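&lt;p&gt;For the first two items on that list, the markup-level fix is small enough to show inline. A sketch, with invented product copy and field names:&lt;/p&gt;

```html
<!-- 1. Alt text: describe what the image shows, not that it exists -->
<img src="mug.jpg" alt="Blue ceramic coffee mug, 12 oz">

<!-- 2. Accessible forms: every field gets a programmatically
     associated label, so screen readers announce it and
     clicking the label focuses the field -->
<label for="email">Email address</label>
<input type="email" id="email" name="email" autocomplete="email">
```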

&lt;h3&gt;
  
  
  Step 3: Document Your Efforts
&lt;/h3&gt;

&lt;p&gt;If you are ever challenged, being able to show that you have been actively working on accessibility is valuable. Keep a record of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;When you conducted your audit&lt;/li&gt;
&lt;li&gt;What issues you found&lt;/li&gt;
&lt;li&gt;What fixes you made and when&lt;/li&gt;
&lt;li&gt;Your plan for ongoing accessibility maintenance&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This demonstrates good faith, which can be important in settlement negotiations or court proceedings.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4: Publish an Accessibility Statement
&lt;/h3&gt;

&lt;p&gt;An accessibility statement on your website shows that you take the issue seriously. Include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Your commitment to accessibility&lt;/li&gt;
&lt;li&gt;The standard you are working toward (WCAG 2.1 Level AA is the generally accepted benchmark)&lt;/li&gt;
&lt;li&gt;How users can report accessibility problems&lt;/li&gt;
&lt;li&gt;An alternative way to access your services if they encounter a barrier (like a phone number)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is not a legal shield, but it helps establish good faith and gives users a way to contact you directly instead of going to a lawyer.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 5: Make Accessibility Part of Your Ongoing Process
&lt;/h3&gt;

&lt;p&gt;Accessibility is not a one-time fix. Every time you add new content, new products, new pages, or update your site, accessibility needs to be part of the process. Train your staff to add alt text when uploading images, use proper heading structure, and create accessible content.&lt;/p&gt;

&lt;h2&gt;
  
  
  What About State Laws
&lt;/h2&gt;

&lt;p&gt;The ADA is a federal law, but several states have their own accessibility requirements:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;California&lt;/strong&gt; has the Unruh Civil Rights Act, which allows for minimum statutory damages of $4,000 per violation per visit. California sees more ADA web accessibility lawsuits than any other state.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;New York&lt;/strong&gt; is the second most active state for these lawsuits.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Florida&lt;/strong&gt; has also seen a significant increase in filings.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If your business is located in or serves customers in these states, your risk profile is higher.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Department of Justice Position
&lt;/h2&gt;

&lt;p&gt;In 2024, the DOJ finalized a rule under Title II of the ADA requiring state and local government websites to meet WCAG 2.1 Level AA. While this rule directly applies to government entities, not private businesses, it reinforced the DOJ's longstanding position that the ADA's obligations extend to websites.&lt;/p&gt;

&lt;p&gt;For private businesses, the legal standard is less explicit, but court decisions have consistently held that business websites must be accessible, particularly when the business has a physical location.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Cost of Doing Nothing vs. the Cost of Fixing It
&lt;/h2&gt;

&lt;p&gt;Here is a rough comparison for a typical small business website:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cost of basic accessibility remediation:&lt;/strong&gt; $500 to $5,000 (depending on site size and complexity), much of which you can do yourself for free&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost of an ADA demand letter settlement:&lt;/strong&gt; $5,000 to $25,000&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost of full legal defense:&lt;/strong&gt; $10,000 to $75,000 or more&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost of negative publicity:&lt;/strong&gt; Difficult to quantify but very real&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The math is clear. Proactive accessibility work is dramatically cheaper than reactive legal defense.&lt;/p&gt;

&lt;h2&gt;
  
  
  Take Action This Week
&lt;/h2&gt;

&lt;p&gt;You do not need to make your entire site perfectly accessible overnight. Start with these three actions this week:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Run a free WAVE scan on your homepage and top five pages. Write down the issues found.&lt;/li&gt;
&lt;li&gt;Add alt text to every image on your most-visited pages.&lt;/li&gt;
&lt;li&gt;Test your site's keyboard navigation by pressing Tab repeatedly and seeing if you can reach all interactive elements.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;These three steps will address the most common lawsuit triggers and put you on a much stronger footing.&lt;/p&gt;

&lt;p&gt;Accessibility is not just about avoiding lawsuits. It is about making your business available to everyone. But if the legal risk is what gets you to take that first step, that is perfectly fine. The result is the same: a better website for all your customers.&lt;/p&gt;

&lt;p&gt;We're building a simple accessibility checker for non-developers — no DevTools, no jargon. &lt;a href="https://dev.to/about"&gt;Join our waitlist&lt;/a&gt; to get early access.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Curious about your own site's accessibility gaps? We offer WCAG audits that go beyond automated scans, starting at $49. &lt;a href="https://blog.a11yfix.dev/audit/overlay-alternatives/" rel="noopener noreferrer"&gt;Learn more&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>accessibility</category>
      <category>webdev</category>
      <category>opensource</category>
    </item>
    <item>
      <title>We Published a Color Contrast Guide. Then Our Scanner Failed It.</title>
      <dc:creator>AgentKit</dc:creator>
      <pubDate>Tue, 14 Apr 2026 22:52:31 +0000</pubDate>
      <link>https://dev.to/agentkit/we-published-a-color-contrast-guide-then-our-scanner-failed-it-12cc</link>
      <guid>https://dev.to/agentkit/we-published-a-color-contrast-guide-then-our-scanner-failed-it-12cc</guid>
      <description>&lt;p&gt;We wrote a piece called &lt;a href="https://blog.a11yfix.dev/blog/color-contrast-guide/" rel="noopener noreferrer"&gt;The Complete Web Accessibility Color Contrast Guide&lt;/a&gt;. Then, last weekend, we ran axe-core against our own blog for the first time.&lt;/p&gt;

&lt;p&gt;It fails color contrast.&lt;/p&gt;

&lt;p&gt;Not a clever failure either. A boring one. The kind we would point out in a client report in the first paragraph.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why we finally ran the scanner on ourselves
&lt;/h2&gt;

&lt;p&gt;We have been publishing weekly audits of other people's sites for about a month now — a &lt;a href="https://blog.a11yfix.dev/blog/saas-pricing-pages-accessibility-audit/" rel="noopener noreferrer"&gt;cohort scan of 30 SaaS pricing pages&lt;/a&gt;, an &lt;a href="https://blog.a11yfix.dev/blog/ai-generated-code-accessibility-audit/" rel="noopener noreferrer"&gt;axe-core run against AI-generated UI code&lt;/a&gt;, and a few smaller pieces in between. At some point it started feeling dishonest that we had never pointed the same tooling at &lt;code&gt;blog.a11yfix.dev&lt;/code&gt; itself. So on April 13 we did.&lt;/p&gt;

&lt;p&gt;16 pages. axe-core 4.11. WCAG 2.1 AA plus 2.2 AA plus the best-practice tag so landmark rules would fire. Same harness we use for everything else.&lt;/p&gt;

&lt;h2&gt;
  
  
  The numbers
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;27 violations total across the 16 pages&lt;/li&gt;
&lt;li&gt;28 DOM nodes affected&lt;/li&gt;
&lt;li&gt;0 critical&lt;/li&gt;
&lt;li&gt;14 color-contrast violations flagged as serious (spanning 15 nodes)&lt;/li&gt;
&lt;li&gt;4 unique rule IDs — the entire result set fits in four rules&lt;/li&gt;
&lt;li&gt;1 page with zero violations (the homepage, which is the only page on the site we hand-wrote the layout for)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Nothing new in there. No WCAG 2.2 edge case. No novel aria-live disaster. Just basics. If you handed us this report for a client we would call it "low-severity surface sweep, two afternoons of work to clear."&lt;/p&gt;

&lt;p&gt;The problem is we are not the client.&lt;/p&gt;

&lt;h2&gt;
  
  
  The specific one that hurt
&lt;/h2&gt;

&lt;p&gt;On &lt;code&gt;/blog/color-contrast-guide/&lt;/code&gt;, axe-core flagged a single element: the privacy note under the newsletter signup form. The text is &lt;code&gt;"No spam. Unsubscribe anytime."&lt;/code&gt; rendered in &lt;code&gt;#898a8f&lt;/code&gt; on a pale blue &lt;code&gt;#f0f4ff&lt;/code&gt; background. Ratio: &lt;strong&gt;3.13:1&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;WCAG AA wants 4.5:1 for normal body text. We are a point and a half under.&lt;/p&gt;

&lt;p&gt;The honest part is that this almost certainly looked fine on the monitor the component was designed on. A calibrated sRGB desktop, afternoon light, eyes that are used to reading grays on grays. We have all shipped that pixel. The less honest part is that we wrote an entire guide explaining why "looks fine to me" is not how you make this call, and then shipped a component where that was apparently exactly how we made the call.&lt;/p&gt;

&lt;p&gt;One line fix. &lt;code&gt;#898a8f&lt;/code&gt; becomes something in the &lt;code&gt;#5a5b60&lt;/code&gt; range and the ratio clears 4.5:1 comfortably. It is not a design problem, it is a default-gray problem, which is the most common failure mode we find in other people's audits too.&lt;/p&gt;
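&lt;p&gt;The ratio itself is mechanical to verify. Here is a small sketch of the WCAG relative-luminance formula (our illustration, not the scanner's code) that reproduces the numbers above:&lt;/p&gt;

```javascript
// WCAG 2.x contrast ratio from two "#rrggbb" hex colors. The formula is
// the WCAG definition of relative luminance; the code itself is a sketch.
function linearize(v) {
  const c = v / 255; // sRGB channel scaled to 0..1
  return c > 0.03928 ? Math.pow((c + 0.055) / 1.055, 2.4) : c / 12.92;
}

function luminance(hex) {
  const r = parseInt(hex.slice(1, 3), 16);
  const g = parseInt(hex.slice(3, 5), 16);
  const b = parseInt(hex.slice(5, 7), 16);
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

function contrast(fg, bg) {
  // Lighter luminance goes in the numerator
  const pair = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (pair[0] + 0.05) / (pair[1] + 0.05);
}

console.log(contrast("#898a8f", "#f0f4ff").toFixed(2)); // 3.13, fails 4.5:1
console.log(contrast("#5a5b60", "#f0f4ff").toFixed(2)); // clears 4.5:1
```

&lt;p&gt;Against the values from this post, the first pair lands on the 3.13:1 the scanner reported and the darker gray clears AA with room to spare.&lt;/p&gt;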

&lt;h2&gt;
  
  
  The pattern
&lt;/h2&gt;

&lt;p&gt;Here is the part that is actually worth writing about, because it kept showing up in our own report the way it kept showing up in everyone else's.&lt;/p&gt;

&lt;p&gt;We did not ship 27 different mistakes across 16 different pages. We shipped &lt;strong&gt;two&lt;/strong&gt; mistakes, in &lt;strong&gt;two&lt;/strong&gt; shared components, and those components got replicated across the site.&lt;/p&gt;

&lt;p&gt;The newsletter signup with the &lt;code&gt;.privacy-note&lt;/code&gt; text ships on almost every article. One bad hex value, 14 pages tainted, accounting for 14 of the 15 flagged color-contrast nodes. The 15th node is a syntax-highlighted comment token from the Shiki code theme on the email marketing post — a third-party default we inherited, still our responsibility, but at least not the same bug twice.&lt;/p&gt;

&lt;p&gt;The second one is subtler. We have a &lt;code&gt;&amp;lt;aside class="related-resources"&amp;gt;&lt;/code&gt; block at the bottom of most articles. It lives inside &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt;. axe-core's &lt;code&gt;landmark-complementary-is-top-level&lt;/code&gt; rule says an &lt;code&gt;&amp;lt;aside&amp;gt;&lt;/code&gt; carries an implicit &lt;code&gt;complementary&lt;/code&gt; landmark, and complementary landmarks are supposed to sit next to &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt;, not inside it. 11 pages, 11 nodes, one component. Move the aside out of main, or drop its implicit role with &lt;code&gt;role="presentation"&lt;/code&gt; on the cases where it really is just a layout container, and that entire column of the report goes to zero.&lt;/p&gt;
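&lt;p&gt;In sketch form, the structural fix looks like this (the class name is from our own markup; the content placeholders are invented):&lt;/p&gt;

```html
<!-- Before: implicit complementary landmark nested inside main -->
<main>
  <article>post content</article>
  <aside class="related-resources">related links</aside>
</main>

<!-- After: the aside sits beside main, so the landmark is top-level -->
<main>
  <article>post content</article>
</main>
<aside class="related-resources">related links</aside>
```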

&lt;p&gt;Two components. Twenty-five of the twenty-seven violations.&lt;/p&gt;

&lt;p&gt;If that pattern sounds familiar, it is because we wrote about it earlier this week. In the &lt;a href="https://blog.a11yfix.dev/blog/saas-pricing-pages-accessibility-audit/" rel="noopener noreferrer"&gt;SaaS pricing pages scan&lt;/a&gt;, one company had 73 DOM nodes affected by what was fundamentally a single component repeated across every feature row of its pricing table. We spent a paragraph on how cohort audits overstate the fix surface because shared components inflate node counts. We wrote that observation down, published it, and then discovered that our own blog was a smaller, more embarrassing instance of the exact same thing.&lt;/p&gt;

&lt;p&gt;There is a version of this post where we claim we did it on purpose, to make the point cleaner. We did not. We genuinely did not run the scanner on ourselves until last weekend, and when we did, the finding that made us wince was the one that matched the pattern from the article we were proudest of.&lt;/p&gt;

&lt;h2&gt;
  
  
  The two outliers
&lt;/h2&gt;

&lt;p&gt;Both live on the blog index page — the listing at &lt;code&gt;/blog/&lt;/code&gt;, not any individual post.&lt;/p&gt;

&lt;p&gt;One: no &lt;code&gt;&amp;lt;h1&amp;gt;&lt;/code&gt;. The listing page has &lt;code&gt;&amp;lt;h4&amp;gt;&lt;/code&gt; post titles and no heading above them. &lt;code&gt;page-has-heading-one&lt;/code&gt; fires. This is a classic Astro-content-collection default, where the layout template for the list page was never given a proper top-level heading and nobody noticed because visually the first post title looks like the headline.&lt;/p&gt;

&lt;p&gt;Two: &lt;code&gt;heading-order&lt;/code&gt;. Same page, same cause. A list of &lt;code&gt;&amp;lt;h4&amp;gt;&lt;/code&gt; elements with no preceding &lt;code&gt;&amp;lt;h2&amp;gt;&lt;/code&gt; or &lt;code&gt;&amp;lt;h3&amp;gt;&lt;/code&gt; to step the hierarchy down from. Same fix.&lt;/p&gt;
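&lt;p&gt;Both rules reduce to arithmetic on the sequence of heading levels. Here is a toy checker that captures the idea (our own simplification, written for this post; axe-core's real rules inspect the DOM, not a flat list of numbers):&lt;/p&gt;

```javascript
// Toy version of the two structural checks the blog index failed.
// Hypothetical helper, not axe-core's implementation.
function headingProblems(levels) {
  const problems = [];
  if (!levels.includes(1)) problems.push("page-has-heading-one");
  let prev = 1; // treat the document start as level 1 for skip detection
  for (const level of levels) {
    if (level > prev + 1) {
      problems.push("heading-order");
      break;
    }
    prev = level;
  }
  return problems;
}

console.log(headingProblems([4, 4, 4]));    // the old index: both rules fire
console.log(headingProblems([1, 2, 3, 3])); // stepped hierarchy: clean
```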

&lt;p&gt;The blog index is the page with zero prose content, built entirely from template code, and it is simultaneously the page with the two structural failures the rest of the site does not have. Meanwhile, the homepage — the one with marketing copy, a hero, three CTAs — scans with zero violations. Structure failures cluster on pages that were generated from a layout file and never had a human re-read them after the scaffold was done.&lt;/p&gt;

&lt;h2&gt;
  
  
  What we are doing about it
&lt;/h2&gt;

&lt;p&gt;Three changes:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Darken &lt;code&gt;.privacy-note&lt;/code&gt; text (&lt;code&gt;#898a8f&lt;/code&gt; → &lt;code&gt;#5a5b60&lt;/code&gt;), which clears 14 of 15 color-contrast nodes.&lt;/li&gt;
&lt;li&gt;Move &lt;code&gt;&amp;lt;aside class="related-resources"&amp;gt;&lt;/code&gt; out of &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt; in the article layout, which clears all 11 &lt;code&gt;landmark-complementary-is-top-level&lt;/code&gt; nodes.&lt;/li&gt;
&lt;li&gt;Add an &lt;code&gt;&amp;lt;h1&amp;gt;&lt;/code&gt; to the blog index template and drop the post titles to &lt;code&gt;&amp;lt;h3&amp;gt;&lt;/code&gt;, which clears both structural failures on that page.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Projected delta: 27 violations down to 1. The single remaining issue is the Shiki theme comment token on the email marketing post — a third-party syntax-highlighting color we can override locally but would rather fix upstream if the theme maintainer takes the PR.&lt;/p&gt;

&lt;p&gt;We are not going to tell you this is already shipped, because it is not. The PR is open at the time of writing and we expect it to land before this article does. If you visit the site after April 15 and still see the 3.13:1 ratio on the newsletter form, the correct reaction is "they missed their own PR deadline" and we will have earned that. By the time you read this the fix should be in, but scanners don't care about our deadlines.&lt;/p&gt;

&lt;h2&gt;
  
  
  The observation, not a lesson
&lt;/h2&gt;

&lt;p&gt;We have spent a month scanning other people's sites. The first thing we should have done was scan our own. That is not a moral — we knew we should — it is just how it played out. The useful part, for anyone else, is that a scanner on your own templates will find the exact class of bug you find on everyone else's, because shared components are where bugs live. Individual article HTML is almost always fine. Layouts, newsletter blocks, footer widgets, sidebar components — that is the surface area worth scanning.&lt;/p&gt;

&lt;p&gt;We will be running axe-core against blog.a11yfix.dev weekly from now on. It should take 15 minutes to notice the next one.&lt;/p&gt;
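&lt;p&gt;"Weekly" in practice means a scheduled CI job. A sketch of what that could look like, assuming GitHub Actions and the &lt;code&gt;@axe-core/cli&lt;/code&gt; package; this is a hypothetical config for illustration, not our actual workflow file:&lt;/p&gt;

```yaml
# Hypothetical weekly scan -- adjust the URL list and schedule to taste.
name: weekly-a11y-scan
on:
  schedule:
    - cron: "0 6 * * 1" # Mondays, 06:00 UTC
jobs:
  scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/setup-node@v4
      - run: npx @axe-core/cli --exit https://blog.a11yfix.dev/ https://blog.a11yfix.dev/blog/
```

&lt;p&gt;The &lt;code&gt;--exit&lt;/code&gt; flag makes the CLI return a nonzero exit code when violations are found, which is what turns a scan into a failing build.&lt;/p&gt;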




&lt;p&gt;&lt;em&gt;If you're not sure whether your site is accessible, we can help. Our audits combine automated tools with hands-on screen reader testing, from $49. &lt;a href="https://blog.a11yfix.dev/audit/" rel="noopener noreferrer"&gt;Details and pricing&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>a11y</category>
      <category>webdev</category>
      <category>wcag</category>
    </item>
    <item>
      <title>React Tutorial Accessibility Mistakes That Ship to Production</title>
      <dc:creator>AgentKit</dc:creator>
      <pubDate>Tue, 14 Apr 2026 08:41:50 +0000</pubDate>
      <link>https://dev.to/agentkit/react-tutorial-accessibility-mistakes-that-ship-to-production-403k</link>
      <guid>https://dev.to/agentkit/react-tutorial-accessibility-mistakes-that-ship-to-production-403k</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published at &lt;a href="https://blog.a11yfix.dev/blog/react-tutorial-accessibility-mistakes/" rel="noopener noreferrer"&gt;A11yFix&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;We've spent the last few months going through React codebases — open source, client work, our own old projects we'd rather not look at — and there's a pattern. The accessibility bugs aren't random. They're the same five or six bugs, in roughly the same proportions, in almost every codebase. And once you start looking for it, the source becomes obvious: they're tutorial residue.&lt;/p&gt;

&lt;p&gt;Not because tutorials are bad. React tutorials are some of the best learning material on the web. But a tutorial has to fit in 12 minutes or 800 words, and the first thing that gets cut for time is the part where the component actually has to work for a human who isn't holding a mouse. The author knows. They almost always say "in production you'd want to handle X" and then move on. Readers don't. Readers copy the example into their app and ship it.&lt;/p&gt;

&lt;p&gt;So this isn't a "React tutorials are bad" piece. It's a "here's what gets skipped, and here's the 30 seconds of extra code that fixes it" piece. We've seen each of these in almost every codebase we've touched, so if you recognize one, you're not alone — it's basically the default state.&lt;/p&gt;

&lt;h2&gt;
  
  
  1. The clickable div
&lt;/h2&gt;

&lt;p&gt;If one pattern defines tutorial-culture React, it's &lt;code&gt;&amp;lt;div onClick={...}&amp;gt;&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;It's everywhere because it's natural in JSX. You're already writing divs for layout, you already have an &lt;code&gt;onClick&lt;/code&gt; handler in scope, and the result looks correct. Visually, a styled div with a click handler is indistinguishable from a button. To axe-core, to keyboard users, and to screen readers, it isn't.&lt;/p&gt;

&lt;p&gt;A &lt;code&gt;&amp;lt;div&amp;gt;&lt;/code&gt; has no role, no focus ring, no Enter/Space activation, and doesn't appear in the accessibility tree as something interactive. A keyboard user literally cannot reach it. Axe-core flags this as &lt;code&gt;nested-interactive&lt;/code&gt; or &lt;code&gt;interactive-supports-focus&lt;/code&gt; depending on how you nested it, but the cleaner rule is: if a thing does something on click, it is a button or a link, full stop.&lt;/p&gt;

&lt;p&gt;The fix is one word:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="c1"&gt;// before&lt;/span&gt;
&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"card"&lt;/span&gt; &lt;span class="na"&gt;onClick&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;handleSelect&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;...&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;

&lt;span class="c1"&gt;// after&lt;/span&gt;
&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;button&lt;/span&gt; &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"button"&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"card"&lt;/span&gt; &lt;span class="na"&gt;onClick&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;handleSelect&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;...&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;button&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You'll need &lt;code&gt;type="button"&lt;/code&gt; (otherwise inside a form it'll submit — the other thing tutorials skip) and you'll need to reset some browser styles, but that's a Tailwind class, not a refactor. If your &lt;code&gt;&amp;lt;div&amp;gt;&lt;/code&gt; is genuinely a navigation target — going to a new URL — make it an &lt;code&gt;&amp;lt;a href&amp;gt;&lt;/code&gt; instead. Don't &lt;code&gt;useNavigate&lt;/code&gt; from a click handler on a div.&lt;/p&gt;

&lt;h2&gt;
  
  
  2. The mystery icon button
&lt;/h2&gt;

&lt;p&gt;Once you've replaced your divs with buttons, the next pattern shows up immediately: buttons with only an icon inside them. A gear emoji for settings. A trash can for delete. A magnifier for search. These look great in a tutorial screenshot and they ship in production with no accessible name at all.&lt;/p&gt;

&lt;p&gt;Screen readers will announce them as "button" with nothing after it. That's the entirety of the information the user gets. In a row of icon buttons — which is exactly where this pattern lives — you get "button, button, button" and you have no idea which one will delete your data.&lt;/p&gt;

&lt;p&gt;Axe-core calls this &lt;code&gt;button-name&lt;/code&gt;, and it's in the top three most common violations on every codebase we've audited. The fix is &lt;code&gt;aria-label&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;button&lt;/span&gt; &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"button"&lt;/span&gt; &lt;span class="na"&gt;aria-label&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"Delete row"&lt;/span&gt; &lt;span class="na"&gt;onClick&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;onDelete&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;TrashIcon&lt;/span&gt; &lt;span class="na"&gt;aria-hidden&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"true"&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;button&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Two things matter here. First, &lt;code&gt;aria-hidden="true"&lt;/code&gt; on the icon — otherwise some screen readers will try to announce the icon's own label and you'll get a mess. Second, the label should describe the action, not the icon. "Delete row" not "Trash can." Nobody cares what shape the icon is.&lt;/p&gt;

&lt;p&gt;If you've got an emoji as the icon (a gear character for settings is the canonical example), the rule is exactly the same: wrap it in a &lt;code&gt;&amp;lt;span aria-hidden="true"&amp;gt;&lt;/code&gt;, put the real label on the button.&lt;/p&gt;
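&lt;p&gt;Concretely, with a hypothetical &lt;code&gt;openSettings&lt;/code&gt; handler:&lt;/p&gt;

```jsx
<button type="button" aria-label="Settings" onClick={openSettings}>
  <span aria-hidden="true">⚙️</span>
</button>
```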

&lt;h2&gt;
  
  
  3. The label that's just text
&lt;/h2&gt;

&lt;p&gt;Most React form tutorials show forms like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;p&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;Email&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;p&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;input&lt;/span&gt; &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"email"&lt;/span&gt; &lt;span class="na"&gt;value&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;email&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;onChange&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="p"&gt;...&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is a label in the visual sense and not in any other sense. There's no association between the &lt;code&gt;&amp;lt;p&amp;gt;&lt;/code&gt; and the &lt;code&gt;&amp;lt;input&amp;gt;&lt;/code&gt;. Click the word "Email" — nothing focuses. A screen reader landing on the input announces "edit, blank" with no idea what it's editing. Axe-core flags it as &lt;code&gt;label&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The HTML answer is a &lt;code&gt;&amp;lt;label htmlFor&amp;gt;&lt;/code&gt; paired with an input &lt;code&gt;id&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;label&lt;/span&gt; &lt;span class="na"&gt;htmlFor&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"email"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;Email&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;label&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;input&lt;/span&gt; &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"email"&lt;/span&gt; &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"email"&lt;/span&gt; &lt;span class="na"&gt;value&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;email&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;onChange&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="p"&gt;...&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In React this gets awkward because IDs need to be unique and you're often rendering the same form in multiple places, so use &lt;code&gt;useId()&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;useId&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="p"&gt;&amp;lt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;label&lt;/span&gt; &lt;span class="na"&gt;htmlFor&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;Email&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;label&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;input&lt;/span&gt; &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"email"&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;&amp;lt;/&amp;gt;&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;useId()&lt;/code&gt; exists specifically for this. It has been available since React 18 and it's still rare to see in tutorial code, probably because adding it to the example makes the snippet two lines longer. We know how this goes.&lt;/p&gt;

&lt;p&gt;If you're using a form library, check whether your &lt;code&gt;&amp;lt;Input&amp;gt;&lt;/code&gt; component accepts an &lt;code&gt;id&lt;/code&gt; prop and threads it through. A surprising number of design systems drop the id on the floor between the wrapper and the input. Worth a 5-minute audit of your own component library.&lt;/p&gt;
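&lt;p&gt;If yours does drop it, the shape you want is roughly this — a hypothetical wrapper for illustration, not any particular design system's API:&lt;/p&gt;

```jsx
import { useId } from "react";

// Hypothetical wrapper: accept a caller-supplied id, fall back to useId(),
// and thread the same value to both the label and the input.
function TextField({ id: idProp, label, ...inputProps }) {
  const fallbackId = useId();
  const id = idProp ?? fallbackId;
  return (
    <>
      <label htmlFor={id}>{label}</label>
      <input id={id} {...inputProps} />
    </>
  );
}
```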

&lt;p&gt;For more on the underlying ARIA model and when you'd reach for &lt;code&gt;aria-label&lt;/code&gt; vs &lt;code&gt;&amp;lt;label&amp;gt;&lt;/code&gt;, our &lt;a href="https://blog.a11yfix.dev/blog/aria-attributes-beginners-guide/" rel="noopener noreferrer"&gt;ARIA attributes beginner's guide&lt;/a&gt; walks through the trade-offs.&lt;/p&gt;

&lt;h2&gt;
  
  
  4. Modals that trap nothing
&lt;/h2&gt;

&lt;p&gt;Every React tutorial that covers portals uses a modal as the example. The modal example is usually correct in the sense that it renders into a portal and closes when you click the X. It's almost never correct in the sense that a keyboard user can use it.&lt;/p&gt;

&lt;p&gt;Three things are missing in the typical tutorial modal:&lt;/p&gt;

&lt;p&gt;The first is a focus trap. When the modal opens, focus should move into it, and Tab should not let you escape back into the page underneath. Otherwise a screen reader user opens a modal and immediately Tab-keys their way back into the now-hidden page content, with no idea the modal is even there.&lt;/p&gt;

&lt;p&gt;The second is Escape. Escape should close the modal. This is muscle memory for every keyboard user on the planet. Tutorials skip it because adding a &lt;code&gt;keydown&lt;/code&gt; listener and cleaning it up in &lt;code&gt;useEffect&lt;/code&gt; doubles the size of the example.&lt;/p&gt;

&lt;p&gt;The third is restoring focus on close. When the modal closes, focus should go back to whatever opened it — usually the button the user clicked. Otherwise focus lands on &lt;code&gt;&amp;lt;body&amp;gt;&lt;/code&gt; and the next Tab dumps the user at the top of the page.&lt;/p&gt;
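&lt;p&gt;The second and third of those fit in one small hook. A sketch, assuming the modal's open state and close callback live in the parent (the hook name is ours, and the focus trap itself is deliberately left out):&lt;/p&gt;

```jsx
import { useEffect } from "react";

// Sketch: Escape-to-close plus focus restoration for a modal.
// Does NOT implement a focus trap -- take that part from a library.
function useModalKeys(isOpen, onClose) {
  useEffect(() => {
    if (!isOpen) return;

    const opener = document.activeElement; // whatever launched the modal

    const onKeyDown = (e) => {
      if (e.key === "Escape") onClose();
    };
    document.addEventListener("keydown", onKeyDown);

    return () => {
      document.removeEventListener("keydown", onKeyDown);
      // Restore focus to the opener when the modal goes away.
      if (opener instanceof HTMLElement) opener.focus();
    };
  }, [isOpen, onClose]);
}
```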

&lt;p&gt;If you don't want to write all this yourself — and honestly, you shouldn't, because there are subtle bugs around inert content and aria-hidden — use a primitives library. Radix UI, React Aria, Headless UI, and Ark UI all give you a &lt;code&gt;&amp;lt;Dialog&amp;gt;&lt;/code&gt; that does focus trap, Escape, and focus restoration out of the box. The amount of accessibility you get for free from any of these is substantial. We recommend just adopting one.&lt;/p&gt;

&lt;p&gt;If you must roll your own, the native HTML &lt;code&gt;&amp;lt;dialog&amp;gt;&lt;/code&gt; element with &lt;code&gt;showModal()&lt;/code&gt; handles the focus trap and Escape for you, and it's been in every major browser since early 2022. It's not a perfect match for React's mental model — you call an imperative method to open it — but it's a lot better than a div with &lt;code&gt;position: fixed&lt;/code&gt;.&lt;/p&gt;
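&lt;p&gt;The usual way to bridge that mismatch is a &lt;code&gt;ref&lt;/code&gt; plus an effect that syncs React state to the imperative API. A sketch (hypothetical component name):&lt;/p&gt;

```jsx
import { useEffect, useRef } from "react";

// Sketch: driving the native <dialog> element from React state.
// showModal() gives you the focus trap and Escape behavior natively.
function ConfirmDialog({ open, onClose, children }) {
  const ref = useRef(null);

  useEffect(() => {
    const dialog = ref.current;
    if (!dialog) return;
    if (open && !dialog.open) dialog.showModal();
    else if (!open && dialog.open) dialog.close();
  }, [open]);

  // The close event also fires on Escape, so parent state stays in sync.
  return (
    <dialog ref={ref} onClose={onClose}>
      {children}
    </dialog>
  );
}
```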

&lt;h2&gt;
  
  
  5. State by color alone
&lt;/h2&gt;

&lt;p&gt;The last one is sneaky because it doesn't break anything functional. It just quietly excludes a percentage of users.&lt;/p&gt;

&lt;p&gt;Tutorials love the green dot / red dot pattern for status. Online: green. Offline: red. Build passing: green. Build failing: red. It's compact, it's pretty, and for the roughly 1 in 12 men with some form of color vision deficiency, it conveys nothing. WCAG calls this 1.4.1 Use of Color and our &lt;a href="https://blog.a11yfix.dev/blog/color-contrast-guide/" rel="noopener noreferrer"&gt;color contrast guide&lt;/a&gt; gets into the broader picture.&lt;/p&gt;

&lt;p&gt;The fix isn't to remove the color. The fix is to add a second channel. Text, an icon shape, both — anything that's distinguishable without color.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="c1"&gt;// before&lt;/span&gt;
&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;isOnline&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;bg-green-500&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;bg-red-500&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;

&lt;span class="c1"&gt;// after&lt;/span&gt;
&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;isOnline&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;bg-green-500&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;bg-red-500&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;aria-hidden&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"true"&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"sr-only"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;isOnline&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Online&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Offline&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;isOnline&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Online&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Offline&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Yes, that's a lot more code than a colored dot. That's the whole point. The colored dot is an artifact of a tutorial trying to fit the example in three lines. The real version has more lines because it's doing more work — work that the tutorial's author either took as obvious or didn't have room to show.&lt;/p&gt;

&lt;h2&gt;
  
  
  What's actually going on here
&lt;/h2&gt;

&lt;p&gt;If you re-read those five patterns, they have something in common. None of them are things React tutorial authors don't know. They're things that don't fit. A React tutorial has 12 minutes to teach hooks, JSX, state, and a working example. The accessible version of every component is between 30% and 200% more code. Something has to give, and the thing that gives is the part that doesn't have a visible failure on the screen recording.&lt;/p&gt;

&lt;p&gt;That isn't a moral failing. It's a compression problem. The fix isn't to make tutorials longer — they'd lose readers. The fix is for the people downstream (us, you, anyone shipping a React app to the public) to know which corners got cut and patch them on the way out. The five patterns above account for somewhere around 60% of the violations we find on a typical React audit. Fix those five and you've meaningfully improved the experience for keyboard users and screen reader users on your site, with maybe an hour of work per component library.&lt;/p&gt;

&lt;p&gt;The other thing worth saying: none of this requires a deep ARIA rabbit hole. Four of the five fixes above are vanilla HTML — &lt;code&gt;&amp;lt;button&amp;gt;&lt;/code&gt;, &lt;code&gt;&amp;lt;label htmlFor&amp;gt;&lt;/code&gt;, native &lt;code&gt;&amp;lt;dialog&amp;gt;&lt;/code&gt;, visible text. ARIA is the escape hatch when HTML can't express what you mean. For most React tutorial residue, HTML is enough.&lt;/p&gt;

&lt;p&gt;We're still finding new variants of each of these every week, so if you've got one we missed, drop it in the comments. We collect them.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Ready to find out what's actually broken on your site? Our accessibility audits cover automated scans plus manual walkthroughs, starting at $49. &lt;a href="https://blog.a11yfix.dev/audit/" rel="noopener noreferrer"&gt;Get started&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>react</category>
      <category>webdev</category>
      <category>a11y</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>We Scanned 30 SaaS Pricing Pages for Accessibility. 70% Failed.</title>
      <dc:creator>AgentKit</dc:creator>
      <pubDate>Mon, 13 Apr 2026 11:15:01 +0000</pubDate>
      <link>https://dev.to/agentkit/we-scanned-30-saas-pricing-pages-for-accessibility-70-failed-3a7</link>
      <guid>https://dev.to/agentkit/we-scanned-30-saas-pricing-pages-for-accessibility-70-failed-3a7</guid>
      <description>&lt;p&gt;Nine out of 30 SaaS pricing pages had zero WCAG 2.1 AA violations when we ran them through axe-core this week. Figma, Netlify, Twilio, Zendesk, Calendly, Loom, Miro, Grammarly, Webflow. Clean results across the board.&lt;/p&gt;

&lt;p&gt;The other 21 didn't come close.&lt;/p&gt;

&lt;h2&gt;
  
  
  What we tested and how
&lt;/h2&gt;

&lt;p&gt;We pointed axe-core 4.11 at the pricing pages of 30 SaaS products -- not their homepages, not their docs, specifically their pricing pages. The standard was WCAG 2.1 AA, and we ran every scan in a consistent headless Chromium environment with no browser extensions or user scripts that might interfere.&lt;/p&gt;

&lt;p&gt;Why pricing pages? Because they're where the money changes hands. They tend to get heavy custom design treatment: comparison tables with intricate layouts, toggle switches between monthly and annual billing, gradient backgrounds behind plan names, animated feature lists. All of that increases the surface area for accessibility failures in ways that a simpler marketing page might not.&lt;/p&gt;

&lt;p&gt;The scan completed on April 12, 2026. Every result below comes directly from axe-core output -- no manual evaluation, no subjective judgment calls.&lt;/p&gt;

&lt;h2&gt;
  
  
  The numbers
&lt;/h2&gt;

&lt;p&gt;Across 30 sites, axe-core flagged 65 total violations touching 548 DOM nodes. The average was 2.2 violations per site, but that average hides a wide spread. Nine sites had nothing. Three sites -- Linear, Render, and Intercom -- had five violations each.&lt;/p&gt;

&lt;p&gt;Color contrast was the most common single violation by a wide margin, appearing on 12 of the 30 pricing pages (40%). That tracks with what we see in basically every cohort we scan, but the prevalence on pricing pages is worth noting. These aren't obscure blog posts with inherited styles. Pricing pages get direct design attention, and still, 40% of them had text that didn't meet minimum contrast ratios against its background.&lt;/p&gt;

&lt;p&gt;The second most common violation was &lt;code&gt;list&lt;/code&gt; -- malformed list structures -- appearing on 8 sites. This one's a pricing page special. When you build a feature comparison table or a bulleted list of what's included in each tier, it's easy to use &lt;code&gt;&amp;lt;div&amp;gt;&lt;/code&gt; elements styled to look like lists without actually being lists. Screen readers can't parse the structure, and the user loses the ability to navigate between items.&lt;/p&gt;
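&lt;p&gt;The shape of the bug and the fix, sketched with invented class names:&lt;/p&gt;

```html
<!-- before: styled to look like a list, invisible to a screen
     reader's list navigation -->
<div class="tier-features">
  <div>Unlimited projects</div>
  <div>Priority support</div>
</div>

<!-- after: real list semantics; keep the same classes for styling -->
<ul class="tier-features">
  <li>Unlimited projects</li>
  <li>Priority support</li>
</ul>
```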

&lt;p&gt;After that: &lt;code&gt;aria-allowed-attr&lt;/code&gt; on 5 sites, &lt;code&gt;link-name&lt;/code&gt; and &lt;code&gt;button-name&lt;/code&gt; each on 4 sites, and then a longer tail of ARIA-related issues.&lt;/p&gt;

&lt;h2&gt;
  
  
  Who had the most violations
&lt;/h2&gt;

&lt;p&gt;Linear's pricing page had 5 violations affecting 14 DOM nodes, including a critical &lt;code&gt;aria-required-parent&lt;/code&gt; issue and color contrast failures across 3 elements. The list markup was malformed too -- &lt;code&gt;&amp;lt;li&amp;gt;&lt;/code&gt; elements outside proper list containers, which affected 8 nodes.&lt;/p&gt;

&lt;p&gt;Render also had 5 violations, but its node count was the highest in the entire cohort: 82 affected DOM elements. The bulk of that was 45 nodes failing color contrast and 34 buttons without accessible names. When axe-core flags 34 buttons on a single page without discernible text, that usually points to an icon button pattern where the icons are decorative and no &lt;code&gt;aria-label&lt;/code&gt; was added.&lt;/p&gt;

&lt;p&gt;Intercom rounded out the top three with 5 violations and 28 affected nodes. Three of those violations were critical severity, including 17 elements with ARIA roles that lacked required parent roles and 6 buttons without accessible names.&lt;/p&gt;

&lt;p&gt;Below the top three, Vercel, PlanetScale, Stripe, SendGrid, HubSpot, Monday, and Asana each had 4 violations. Asana's case is interesting -- it only had 4 violation types, but 73 DOM nodes were affected, nearly all of them from invalid ARIA roles (70 nodes). That's likely a single component pattern replicated across every feature row in their pricing table.&lt;/p&gt;

&lt;p&gt;Slack had just 2 violations, but one of them -- &lt;code&gt;aria-command-name&lt;/code&gt; -- hit 96 DOM nodes. Sometimes a low violation count masks a large blast radius.&lt;/p&gt;

&lt;h2&gt;
  
  
  Who got it right
&lt;/h2&gt;

&lt;p&gt;The nine clean sites: Figma, Netlify, Twilio, Zendesk, Calendly, Loom, Miro, Grammarly, Webflow.&lt;/p&gt;

&lt;p&gt;What do they have in common? Honestly, not as much as you'd hope for a neat narrative. They span different industries (design tools, communications, scheduling, writing, web hosting). They use different tech stacks. Some have elaborate pricing pages with feature grids; others keep it simple.&lt;/p&gt;

&lt;p&gt;If there's a through-line, it might be that several of these companies have publicly stated accessibility commitments. Figma has talked about accessibility in their design tool itself. Webflow literally sells website building and has a vested interest in demonstrating that their own output is accessible. Twilio and Zendesk both operate in spaces where enterprise customers with accessibility requirements are a significant part of their revenue.&lt;/p&gt;

&lt;p&gt;But I'm speculating. The data just says they passed. Drawing causal conclusions from nine data points would be overreach.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why pricing pages specifically
&lt;/h2&gt;

&lt;p&gt;We've scanned other cohorts before -- landing pages, documentation sites, forms. Pricing pages consistently perform worse, and we think there are a few structural reasons.&lt;/p&gt;

&lt;p&gt;Pricing pages are marketing pages that behave like application interfaces. They have interactive elements (toggles, sliders, accordions for FAQ sections, comparison table filters) layered on top of heavy visual design. That combination creates accessibility failure modes that a static marketing page wouldn't have.&lt;/p&gt;

&lt;p&gt;There's also the custom-build problem. When a product team builds a comparison table for their three pricing tiers, they're often working from a one-off Figma design rather than pulling from a shared accessible component system. Custom means untested. Untested means &lt;code&gt;aria-allowed-attr&lt;/code&gt; violations slip through.&lt;/p&gt;
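&lt;p&gt;A typical instance of that rule, as a hedged sketch -- a billing toggle built from a button, carrying a state attribute its role does not support:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;!-- aria-allowed-attr: aria-selected is only valid on tabs, options, rows, grid cells --&amp;gt;
&amp;lt;button aria-selected="true"&amp;gt;Annual billing&amp;lt;/button&amp;gt;

&amp;lt;!-- a toggle button takes aria-pressed instead --&amp;gt;
&amp;lt;button aria-pressed="true"&amp;gt;Annual billing&amp;lt;/button&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;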

&lt;p&gt;And then there's the visual hierarchy pressure. You want the recommended plan to stand out. You want the CTA button to pop. That pressure toward visual emphasis creates exactly the conditions where contrast ratios get sacrificed -- a light gray "per user/month" label against a white card background, or a pastel-colored "most popular" badge that doesn't meet the 4.5:1 ratio.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;list&lt;/code&gt; violations tell the same story from a different angle. Feature lists on pricing pages are almost never actual &lt;code&gt;&amp;lt;ul&amp;gt;&lt;/code&gt; elements. They're styled &lt;code&gt;&amp;lt;div&amp;gt;&lt;/code&gt; stacks with checkmark icons, because that's what looks good in the design. The visual result is fine. The semantic result is invisible to assistive technology.&lt;/p&gt;

&lt;h2&gt;
  
  
  What this doesn't tell you
&lt;/h2&gt;

&lt;p&gt;Automated scanning catches a specific category of issues. axe-core is good at finding color contrast failures, missing labels, malformed ARIA, and structural problems. It's not good at evaluating whether a screen reader user can actually complete the task of understanding and comparing pricing tiers. It can't tell you whether the tab order makes sense, or whether the plan toggle between monthly and annual actually announces its state change.&lt;/p&gt;

&lt;p&gt;So the nine sites with zero violations aren't necessarily fully accessible. And the 21 sites with violations aren't necessarily unusable. What the data does tell you is the minimum bar -- these are issues that an automated tool can catch in under 10 seconds per page, and 70% of well-funded SaaS companies haven't cleared that bar on one of their most important pages.&lt;/p&gt;

&lt;p&gt;We'll run this cohort again in a few months and see what changes.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;We run automated accessibility scans weekly on different website cohorts. Read more at &lt;a href="https://blog.a11yfix.dev" rel="noopener noreferrer"&gt;blog.a11yfix.dev&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Want to catch accessibility issues before your users do? We run WCAG audits with real assistive-tech testing, from $49. &lt;a href="https://blog.a11yfix.dev/audit/" rel="noopener noreferrer"&gt;See options&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>a11y</category>
      <category>webdev</category>
      <category>saas</category>
    </item>
    <item>
      <title>We Ran Axe-Core On AI-Generated UI Code. The Findings Surprised Us.</title>
      <dc:creator>AgentKit</dc:creator>
      <pubDate>Sat, 11 Apr 2026 15:18:57 +0000</pubDate>
      <link>https://dev.to/agentkit/we-ran-axe-core-on-ai-generated-ui-code-the-findings-surprised-us-111m</link>
      <guid>https://dev.to/agentkit/we-ran-axe-core-on-ai-generated-ui-code-the-findings-surprised-us-111m</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published at &lt;a href="https://blog.a11yfix.dev/blog/ai-generated-code-accessibility-audit/" rel="noopener noreferrer"&gt;A11yFix&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;We asked an AI coding assistant for five UI components that show up in almost every SaaS app: a login form, a pricing card, a confirmation modal, a top navigation bar, and a dashboard stats card. Same prompts you would probably type yourself on a Friday afternoon. Then we piped each result into &lt;a href="https://github.com/dequelabs/axe-core" rel="noopener noreferrer"&gt;axe-core&lt;/a&gt; through the same jsdom-based scanner we use inside our own build pipeline.&lt;/p&gt;

&lt;p&gt;Here is the top-line number: &lt;strong&gt;3 WCAG violations across 5 components, all of the same rule&lt;/strong&gt;. That is a lot better than we expected. It is also less reassuring than it sounds. The interesting story is in what axe-core could not see.&lt;/p&gt;

&lt;h2&gt;
  
  
  The test setup, in boring detail
&lt;/h2&gt;

&lt;p&gt;We used Claude Sonnet via the &lt;code&gt;claude&lt;/code&gt; CLI as the code generator. No system prompt, no style preamble, just the same sort of one-line request a developer would paste into any AI coder — Cursor, v0, Bolt, Lovable, Claude Code — and expect back a component. The exact prompts and outputs live in our audit workspace; for reference, the login prompt was "Create a login form component as a single HTML snippet. Email input, password input, a remember me checkbox, a primary Sign in button, and a forgot password link. Use Tailwind classes." The other four were the same shape.&lt;/p&gt;

&lt;p&gt;For the audit itself we used &lt;code&gt;jsdom&lt;/code&gt; for the DOM, &lt;code&gt;axe-core&lt;/code&gt; 4.11 for the rules, WCAG 2.1 AA and WCAG 2.2 AA tags enabled, plus the best-practice tag so landmark rules would actually fire. We disabled the &lt;code&gt;color-contrast&lt;/code&gt; rule because jsdom cannot resolve Tailwind's JIT classes into real computed colors, so any result it gave us would be noise. We verified contrast by hand against the Tailwind palette instead — more on that in a moment.&lt;/p&gt;
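&lt;p&gt;The wiring for that setup is roughly the standard jsdom-plus-axe-core pattern. A hedged sketch, not our exact scanner -- the &lt;code&gt;html&lt;/code&gt; variable is a placeholder for the wrapped snippet:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const { JSDOM } = require('jsdom');
const axe = require('axe-core');

const { window } = new JSDOM(html, { runScripts: 'outside-only' });
window.eval(axe.source); // inject the axe-core source into the jsdom window

window.axe.run(window.document, {
  runOnly: { type: 'tag', values: ['wcag21aa', 'wcag22aa', 'best-practice'] },
  rules: { 'color-contrast': { enabled: false } } // jsdom has no real computed colors
}).then(results =&amp;gt; {
  console.log(results.violations.length, 'violations');
  console.log(results.incomplete.length, 'incomplete checks');
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;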

&lt;p&gt;Each component was wrapped in a minimal &lt;code&gt;&amp;lt;!doctype html&amp;gt;&lt;/code&gt; page with no &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt;, because that is what happens the instant a developer drops an AI-generated snippet into a blank &lt;code&gt;page.tsx&lt;/code&gt;. If we padded the test harness with landmarks, we would be auditing our harness, not the component.&lt;/p&gt;

&lt;h2&gt;
  
  
  What axe-core flagged
&lt;/h2&gt;

&lt;p&gt;Three violations. All of them the same rule: &lt;code&gt;region&lt;/code&gt;, impact &lt;code&gt;moderate&lt;/code&gt;. Eight nodes in the pricing card, four in the login form, one in the dashboard card. The modal and the navbar came back clean.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;region&lt;/code&gt; fires when content lives outside of a landmark element — no &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt;, no &lt;code&gt;&amp;lt;nav&amp;gt;&lt;/code&gt;, no &lt;code&gt;&amp;lt;section aria-label&amp;gt;&lt;/code&gt;. It is the rule screen reader users feel most directly, because their jump-to-landmark shortcut is how they navigate a page. When content lives in the void, they have to arrow through every element to find it.&lt;/p&gt;
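&lt;p&gt;The fix is structural rather than attribute-level. A sketch of the minimum landmark skeleton a host page needs (the labels are illustrative):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;body&amp;gt;
  &amp;lt;nav aria-label="Main"&amp;gt;…&amp;lt;/nav&amp;gt;
  &amp;lt;main&amp;gt;
    &amp;lt;h1&amp;gt;Pricing&amp;lt;/h1&amp;gt;
    &amp;lt;section aria-label="Compare plans"&amp;gt;…&amp;lt;/section&amp;gt;
  &amp;lt;/main&amp;gt;
  &amp;lt;footer&amp;gt;…&amp;lt;/footer&amp;gt;
&amp;lt;/body&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Once every rendered element sits inside a landmark, &lt;code&gt;region&lt;/code&gt; stops firing -- and, more importantly, the jump-to-landmark shortcut starts working.&lt;/p&gt;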

&lt;p&gt;The same two checks showed up as &lt;strong&gt;incomplete&lt;/strong&gt; on every component except the modal: &lt;code&gt;landmark-one-main&lt;/code&gt; and &lt;code&gt;page-has-heading-one&lt;/code&gt;. Incomplete results are not violations — axe is telling you it cannot decide from static analysis alone and needs a human to confirm. In practice, on a component snippet, these resolve to "your host page had better supply the &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt; and the &lt;code&gt;&amp;lt;h1&amp;gt;&lt;/code&gt;, because this component is not going to." That is fine if the developer knows. The failure mode is that most developers do not read the incomplete list.&lt;/p&gt;

&lt;h2&gt;
  
  
  What the AI actually got right
&lt;/h2&gt;

&lt;p&gt;This part we did not expect to write.&lt;/p&gt;

&lt;p&gt;The login form has proper &lt;code&gt;&amp;lt;label for="email"&amp;gt;&lt;/code&gt; / &lt;code&gt;id="email"&lt;/code&gt; pairing, &lt;code&gt;autocomplete="email"&lt;/code&gt; and &lt;code&gt;autocomplete="current-password"&lt;/code&gt; hints, &lt;code&gt;required&lt;/code&gt; attributes, and visible &lt;code&gt;focus:ring-2&lt;/code&gt; styles on every input and the submit button. The password input is a &lt;code&gt;type="password"&lt;/code&gt; input, which sounds like a non-achievement until you remember how many tutorials ship a &lt;code&gt;type="text"&lt;/code&gt; field because it was "easier to test."&lt;/p&gt;

&lt;p&gt;The modal is the bigger surprise. It came back with &lt;code&gt;role="dialog"&lt;/code&gt;, &lt;code&gt;aria-modal="true"&lt;/code&gt;, &lt;code&gt;aria-labelledby="modal-title"&lt;/code&gt;, &lt;code&gt;aria-describedby="modal-description"&lt;/code&gt;, a close button with an explicit &lt;code&gt;aria-label="Close"&lt;/code&gt;, and an &lt;code&gt;aria-hidden="true"&lt;/code&gt; on the decorative X icon. That is the full ARIA triad, correctly wired, on a first-shot prompt. A year ago this same prompt in the same tool would have given us a &lt;code&gt;&amp;lt;div&amp;gt;&lt;/code&gt; with a close &lt;code&gt;&amp;lt;span&amp;gt;&lt;/code&gt; and nothing else.&lt;/p&gt;

&lt;p&gt;The navbar uses &lt;code&gt;&amp;lt;nav&amp;gt;&lt;/code&gt;, wraps its links in &lt;code&gt;&amp;lt;ul&amp;gt;&amp;lt;li&amp;gt;&lt;/code&gt;, and uses real &lt;code&gt;&amp;lt;a&amp;gt;&lt;/code&gt; elements for every destination including the "Sign up" call-to-action — not a &lt;code&gt;&amp;lt;button&amp;gt;&lt;/code&gt; with a click handler pretending to be navigation. Semantic baseline, respected.&lt;/p&gt;

&lt;p&gt;Color contrast, checked manually against the Tailwind palette: &lt;code&gt;text-gray-700&lt;/code&gt; on white is 10.4:1, the &lt;code&gt;text-gray-500&lt;/code&gt; used for secondary text is 5.6:1, the &lt;code&gt;bg-indigo-600&lt;/code&gt; button with white text is 6.1:1. All comfortably past WCAG AA's 4.5:1 threshold. Nothing failed. Including this because it is true, and skipping it would make this post dishonest.&lt;/p&gt;

&lt;h2&gt;
  
  
  The gap that axe-core cannot see
&lt;/h2&gt;

&lt;p&gt;Here is where we have to slow down. A zero-violation automated scan on four of five components does not mean those components are accessible. It means they passed the checks that a static HTML parser is capable of running. The rest of WCAG lives in behavior.&lt;/p&gt;

&lt;p&gt;The modal is the clearest example. It has every ARIA attribute a screen reader needs to announce it as a dialog. It has zero JavaScript. Open that modal in a real browser and press Tab: focus walks right off the "Delete" button and into whatever link is underneath the backdrop. A keyboard user has no way to know they have left the dialog, and no way to escape it with the Escape key, because nothing is listening. Axe-core cannot detect this. It audits a tree, not a runtime.&lt;/p&gt;
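&lt;p&gt;For scale, here is roughly the JavaScript the generated modal was missing. This is a hand-written sketch, not output from the tool -- &lt;code&gt;closeDialog&lt;/code&gt;, the ID, and the focusable selector are all placeholders:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const dialog = document.getElementById('confirm-dialog');
const focusable = dialog.querySelectorAll(
  'button, [href], input, select, textarea, [tabindex]:not([tabindex="-1"])'
);
const first = focusable[0];
const last = focusable[focusable.length - 1];

dialog.addEventListener('keydown', (e) =&amp;gt; {
  if (e.key === 'Escape') closeDialog();  // nothing in the output listened for this
  if (e.key !== 'Tab') return;
  if (e.shiftKey &amp;amp;&amp;amp; document.activeElement === first) {
    e.preventDefault();
    last.focus();                         // wrap backwards instead of leaving the dialog
  } else if (!e.shiftKey &amp;amp;&amp;amp; document.activeElement === last) {
    e.preventDefault();
    first.focus();                        // wrap forwards instead of leaving the dialog
  }
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;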

&lt;p&gt;The dashboard card is another one. Semantically, it is a &lt;code&gt;&amp;lt;div&amp;gt;&lt;/code&gt; containing a &lt;code&gt;&amp;lt;p&amp;gt;&lt;/code&gt; for the label and a &lt;code&gt;&amp;lt;p&amp;gt;&lt;/code&gt; for the value. Visually it reads as "Monthly Revenue: $48,210." To a screen reader it reads as two disconnected paragraphs. A proper card would use a heading (&lt;code&gt;&amp;lt;h3&amp;gt;&lt;/code&gt; or the DL pattern with &lt;code&gt;&amp;lt;dt&amp;gt;&lt;/code&gt; and &lt;code&gt;&amp;lt;dd&amp;gt;&lt;/code&gt;) so the metric and its label are announced as a unit. Axe does not flag this because two &lt;code&gt;&amp;lt;p&amp;gt;&lt;/code&gt; tags are valid HTML. They are just not meaningful HTML for this context.&lt;/p&gt;
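&lt;p&gt;A sketch of the difference (the class names are illustrative):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;!-- two disconnected paragraphs --&amp;gt;
&amp;lt;div class="stat-card"&amp;gt;
  &amp;lt;p&amp;gt;Monthly Revenue&amp;lt;/p&amp;gt;
  &amp;lt;p&amp;gt;$48,210&amp;lt;/p&amp;gt;
&amp;lt;/div&amp;gt;

&amp;lt;!-- label and value announced as a pair --&amp;gt;
&amp;lt;dl class="stat-card"&amp;gt;
  &amp;lt;dt&amp;gt;Monthly Revenue&amp;lt;/dt&amp;gt;
  &amp;lt;dd&amp;gt;$48,210&amp;lt;/dd&amp;gt;
&amp;lt;/dl&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;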

&lt;p&gt;The pricing card has the opposite flavor of the same problem. Its green checkmark SVGs are decorative — the feature name next to them is the actual content — but they have no &lt;code&gt;aria-hidden="true"&lt;/code&gt; and no &lt;code&gt;role="img"&lt;/code&gt; with a label. Axe did not flag them either, because axe is conservative about SVG. A verbose screen reader will still read "graphic" before every list item. Small paper cut, repeated five times per card.&lt;/p&gt;
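&lt;p&gt;That paper cut costs one attribute to fix. A sketch of a single feature row:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;li&amp;gt;
  &amp;lt;svg aria-hidden="true" focusable="false"&amp;gt;…&amp;lt;/svg&amp;gt;
  Unlimited collaborators
&amp;lt;/li&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;(&lt;code&gt;focusable="false"&lt;/code&gt; is a legacy guard for older browsers that put SVGs in the tab order.)&lt;/p&gt;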

&lt;h2&gt;
  
  
  Why this pattern shows up
&lt;/h2&gt;

&lt;p&gt;AI coders have been trained on an enormous amount of HTML that mostly looks like the tutorials and component libraries humans have already published. So they are very good at reproducing surface correctness: label associations, ARIA attribute names, Tailwind contrast-safe colors, semantic list elements. The patterns that are visible in a static snapshot of code got learned thoroughly.&lt;/p&gt;

&lt;p&gt;The patterns that live in runtime behavior did not. Nobody writes a blog post about the exact event listener that closes a modal on Escape. It is buried in a hook, or a library, or it just works because everyone uses Radix. AI output optimizes for the version you can paste into a file. It does not optimize for the version you can actually use with a keyboard.&lt;/p&gt;

&lt;p&gt;This is not a tooling critique. It is a lifecycle observation. The accessibility gap in AI-generated UI is no longer "it forgot the label." It is "it remembered every attribute and forgot every interaction."&lt;/p&gt;

&lt;h2&gt;
  
  
  What to actually do about it
&lt;/h2&gt;

&lt;p&gt;Put axe-core in CI, not as a gate but as a signal. Even a modest run catches the regressions that AI coders still make — missing landmarks, missing alt text on real images, buttons without accessible names in the parts of the app that nobody thinks to regenerate. We published a walkthrough for wiring this up as a &lt;a href="https://blog.a11yfix.dev/blog/automate-accessibility-fixes-github-action/" rel="noopener noreferrer"&gt;GitHub Action in five minutes&lt;/a&gt; if you want a template.&lt;/p&gt;

&lt;p&gt;Then do three manual checks the first time you accept any AI-generated component. Press Tab through every interactive element and confirm the focus indicator is visible and the order matches the visual order. Open any dialog or menu and press Escape, then try to Tab out of it — if either fails, you have work to do. Turn on VoiceOver or NVDA for sixty seconds and listen to the component. Most of what axe cannot catch becomes obvious within the first ten announcements.&lt;/p&gt;

&lt;p&gt;We are going to run this same audit monthly against different AI coders and different prompts. Partly because the tools are moving fast and what we found today will not be true in six months. Partly because AI-generated UI is going to eat a huge share of the frontend we all end up shipping, and someone should be tracking how the accessibility baseline moves — up or down — as that happens.&lt;/p&gt;

&lt;p&gt;If you want the full component files, the axe-core output, and the exact prompts we used, they are in the audit workspace on &lt;a href="https://blog.a11yfix.dev" rel="noopener noreferrer"&gt;blog.a11yfix.dev&lt;/a&gt;. Next month: we rerun this on a more realistic prompt — a full signup page, not a single component — and see what breaks when the AI has to remember context across sections.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Not sure if your site meets WCAG standards? We run thorough audits combining automated scans and human testing, starting at $49. &lt;a href="https://blog.a11yfix.dev/audit/" rel="noopener noreferrer"&gt;Check it out&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>a11y</category>
      <category>webdev</category>
      <category>ai</category>
    </item>
    <item>
      <title>How to Write Alt Text That Screen Readers Actually Find Useful (With 15 Examples)</title>
      <dc:creator>AgentKit</dc:creator>
      <pubDate>Thu, 09 Apr 2026 23:43:57 +0000</pubDate>
      <link>https://dev.to/agentkit/how-to-write-alt-text-that-screen-readers-actually-find-useful-with-15-examples-5d0g</link>
      <guid>https://dev.to/agentkit/how-to-write-alt-text-that-screen-readers-actually-find-useful-with-15-examples-5d0g</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published at &lt;a href="https://blog.a11yfix.dev" rel="noopener noreferrer"&gt;A11yFix&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;i used to think alt text was simple. describe what you see, move on. then i started testing my sites with VoiceOver and realized most of my alt text was either useless or actively confusing.&lt;/p&gt;

&lt;p&gt;the problem isn't laziness. it's that nobody teaches you what good alt text actually sounds like when a screen reader reads it out loud. so here are 15 examples from real audits -- what was there, what should have been there, and why.&lt;/p&gt;

&lt;h2&gt;
  
  
  the golden rule nobody follows
&lt;/h2&gt;

&lt;p&gt;alt text should communicate what the image &lt;em&gt;means&lt;/em&gt;, not what it &lt;em&gt;looks like&lt;/em&gt;. sounds obvious until you try it.&lt;/p&gt;

&lt;h2&gt;
  
  
  product images
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;bad:&lt;/strong&gt; &lt;code&gt;alt="shoes"&lt;/code&gt;&lt;br&gt;
&lt;strong&gt;good:&lt;/strong&gt; &lt;code&gt;alt="Nike Air Max 90 in white and grey, side view"&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;bad:&lt;/strong&gt; &lt;code&gt;alt="product photo"&lt;/code&gt;&lt;br&gt;
&lt;strong&gt;good:&lt;/strong&gt; &lt;code&gt;alt="hand-poured soy candle in amber glass jar, 8oz"&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;screen reader users are shopping. they need to know what they're buying, not that there's a photo on the page.&lt;/p&gt;

&lt;h2&gt;
  
  
  charts and data
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;bad:&lt;/strong&gt; &lt;code&gt;alt="bar chart"&lt;/code&gt;&lt;br&gt;
&lt;strong&gt;good:&lt;/strong&gt; &lt;code&gt;alt="monthly revenue chart showing 40% growth from January to March 2026"&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;bad:&lt;/strong&gt; &lt;code&gt;alt="graph"&lt;/code&gt;&lt;br&gt;
&lt;strong&gt;good:&lt;/strong&gt; &lt;code&gt;alt="comparison chart: automated tools caught 168 issues, manual testing found an additional 147"&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;this one drives me nuts. "bar chart" tells a blind user there's a chart. so what? what does the chart &lt;em&gt;say&lt;/em&gt;? that's the alt text.&lt;/p&gt;

&lt;h2&gt;
  
  
  screenshots
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;bad:&lt;/strong&gt; &lt;code&gt;alt="screenshot of dashboard"&lt;/code&gt;&lt;br&gt;
&lt;strong&gt;good:&lt;/strong&gt; &lt;code&gt;alt="analytics dashboard showing 720 page views and 14 reactions across 30 articles"&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;bad:&lt;/strong&gt; &lt;code&gt;alt="error message"&lt;/code&gt;&lt;br&gt;
&lt;strong&gt;good:&lt;/strong&gt; &lt;code&gt;alt="browser console error: Cannot read property 'click' of null at line 47"&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;if you took the screenshot to show something specific, that specific thing is your alt text.&lt;/p&gt;

&lt;h2&gt;
  
  
  decorative images
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;bad:&lt;/strong&gt; &lt;code&gt;alt="decorative border"&lt;/code&gt; or &lt;code&gt;alt="background pattern"&lt;/code&gt;&lt;br&gt;
&lt;strong&gt;good:&lt;/strong&gt; &lt;code&gt;alt=""&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;empty alt. not missing -- deliberately empty. this tells screen readers to skip it entirely. decorative images with alt text are noise.&lt;/p&gt;

&lt;p&gt;the markup: &lt;code&gt;&amp;lt;img src="border.png" alt="" role="presentation"&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  icons with text
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;bad:&lt;/strong&gt; &lt;code&gt;alt="icon"&lt;/code&gt; on a search icon next to "Search"&lt;br&gt;
&lt;strong&gt;good:&lt;/strong&gt; &lt;code&gt;alt=""&lt;/code&gt; (the text label already communicates the function)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;bad:&lt;/strong&gt; &lt;code&gt;alt=""&lt;/code&gt; on a hamburger menu icon with no visible text&lt;br&gt;
&lt;strong&gt;good:&lt;/strong&gt; &lt;code&gt;alt="Menu"&lt;/code&gt; or better: use &lt;code&gt;aria-label="Menu"&lt;/code&gt; on the button&lt;/p&gt;

&lt;p&gt;rule of thumb: if there's visible text next to the icon that says the same thing, the icon alt should be empty. if the icon is the only indicator of function, it needs alt text.&lt;/p&gt;
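&lt;p&gt;the two cases side by side -- file names are made up:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;!-- visible text carries the meaning: empty alt --&amp;gt;
&amp;lt;button&amp;gt;
  &amp;lt;img src="search.svg" alt=""&amp;gt; Search
&amp;lt;/button&amp;gt;

&amp;lt;!-- icon-only control: label the button, hide the image --&amp;gt;
&amp;lt;button aria-label="Menu"&amp;gt;
  &amp;lt;img src="menu.svg" alt=""&amp;gt;
&amp;lt;/button&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;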

&lt;h2&gt;
  
  
  logos
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;bad:&lt;/strong&gt; &lt;code&gt;alt="logo"&lt;/code&gt;&lt;br&gt;
&lt;strong&gt;good:&lt;/strong&gt; &lt;code&gt;alt="AgentKit"&lt;/code&gt; or &lt;code&gt;alt="AgentKit - home"&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;the alt should be the company name, not the word "logo". when it links to the homepage, include that context.&lt;/p&gt;

&lt;h2&gt;
  
  
  complex infographics
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;bad:&lt;/strong&gt; &lt;code&gt;alt="infographic about accessibility"&lt;/code&gt;&lt;br&gt;
&lt;strong&gt;good:&lt;/strong&gt; provide a text alternative nearby, and use &lt;code&gt;alt="Accessibility compliance roadmap, detailed description below"&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;some images can't be described in a short alt attribute. that's fine. describe what it is briefly in alt, then provide the full content as text on the page.&lt;/p&gt;
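&lt;p&gt;a sketch of the pattern -- the file name and wording are made up:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;img src="roadmap.png"
     alt="Accessibility compliance roadmap, detailed description below"&amp;gt;

&amp;lt;h3&amp;gt;Roadmap description&amp;lt;/h3&amp;gt;
&amp;lt;p&amp;gt;Phase 1 covers automated scanning, phase 2 adds manual
testing, phase 3 is ongoing monitoring.&amp;lt;/p&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;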

&lt;h2&gt;
  
  
  the context test
&lt;/h2&gt;

&lt;p&gt;here's the trick i use now: cover the image with your hand and read the surrounding text. does the paragraph still make sense? if not, the image is carrying meaning that needs to be in the alt text.&lt;/p&gt;

&lt;p&gt;if the paragraph works fine without the image, the image is decorative. &lt;code&gt;alt=""&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  the 5-second rule
&lt;/h2&gt;

&lt;p&gt;if you can't write the alt text in 5 seconds, the image is probably too complex for a simple alt attribute. use a longer text description nearby instead.&lt;/p&gt;

&lt;p&gt;not every image needs alt text. not every alt text needs to be long. but every alt text needs to be &lt;em&gt;useful&lt;/em&gt;.&lt;/p&gt;




&lt;p&gt;if you want the full checklist with 100+ checks like this, i put one together at &lt;a href="https://blog.a11yfix.dev" rel="noopener noreferrer"&gt;blog.a11yfix.dev&lt;/a&gt;. also available as a &lt;a href="https://www.etsy.com/jp/listing/4485241378/website-accessibility-audit-checklist" rel="noopener noreferrer"&gt;printable pdf on etsy&lt;/a&gt;.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;if you want a systematic way to check your site's accessibility, grab the free 10-point EAA quick-check at &lt;a href="https://blog.a11yfix.dev" rel="noopener noreferrer"&gt;blog.a11yfix.dev&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Want a professional audit of your own site? We run real WCAG scans paired with human walkthroughs, starting at $49. &lt;a href="https://blog.a11yfix.dev/audit/eaa/" rel="noopener noreferrer"&gt;Check pricing and options here&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>a11y</category>
      <category>webdev</category>
      <category>beginners</category>
      <category>html</category>
    </item>
    <item>
      <title>Screen Reader Testing for Developers: What VoiceOver Actually Announces (And Why It Matters)</title>
      <dc:creator>AgentKit</dc:creator>
      <pubDate>Thu, 09 Apr 2026 03:00:18 +0000</pubDate>
      <link>https://dev.to/agentkit/screen-reader-testing-for-developers-what-voiceover-actually-announces-and-why-it-matters-4gnk</link>
      <guid>https://dev.to/agentkit/screen-reader-testing-for-developers-what-voiceover-actually-announces-and-why-it-matters-4gnk</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published at &lt;a href="https://blog.a11yfix.dev" rel="noopener noreferrer"&gt;A11yFix&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;I've been building websites for years, and I'll be honest: for most of that time, I had never actually turned on a screen reader. I assumed my HTML was "fine" because it looked right in the browser. Then I fired up VoiceOver on my Mac, navigated one of my own pages, and realized how wrong I was.&lt;/p&gt;

&lt;p&gt;This guide is what I wish someone had handed me on day one. No theory-heavy lecture --- just practical steps to test your pages with VoiceOver and understand what users actually hear.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to Turn On VoiceOver (It Takes 5 Seconds)
&lt;/h2&gt;

&lt;p&gt;On any Mac, press &lt;strong&gt;Cmd + F5&lt;/strong&gt;. That's it. VoiceOver starts talking immediately.&lt;/p&gt;

&lt;p&gt;A few essentials before you start navigating:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;VO keys&lt;/strong&gt;: VoiceOver uses a modifier combination. By default, it's &lt;strong&gt;Control + Option&lt;/strong&gt; (referred to as "VO" in documentation). You hold these while pressing other keys.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;VO + Right Arrow&lt;/strong&gt;: Move to the next element on the page.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;VO + Left Arrow&lt;/strong&gt;: Move to the previous element.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;VO + Space&lt;/strong&gt;: Activate (click) the current element.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;VO + U&lt;/strong&gt;: Open the rotor --- a quick menu for headings, links, landmarks, and more.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;VO + A&lt;/strong&gt;: Read everything from the current position forward.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Open Safari (VoiceOver works best with Safari on Mac), navigate to your site, and start pressing &lt;strong&gt;VO + Right Arrow&lt;/strong&gt; to walk through the page element by element.&lt;/p&gt;

&lt;h2&gt;
  
  
  What to Listen For
&lt;/h2&gt;

&lt;p&gt;As you navigate, pay attention to what VoiceOver actually says for each element. It typically announces:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;The role&lt;/strong&gt;: "heading level 2," "link," "button," "image"&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The accessible name&lt;/strong&gt;: the text content, &lt;code&gt;aria-label&lt;/code&gt;, or &lt;code&gt;alt&lt;/code&gt; attribute&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The state&lt;/strong&gt;: "dimmed," "selected," "expanded"&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;A well-structured page sounds like a clear outline. A poorly structured page sounds like a random collection of text fragments with no context.&lt;/p&gt;

&lt;h2&gt;
  
  
  Surprise #1: Your Clickable Div Is Invisible
&lt;/h2&gt;

&lt;p&gt;This is the most common shock. You have something like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;div&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"card"&lt;/span&gt; &lt;span class="na"&gt;onclick=&lt;/span&gt;&lt;span class="s"&gt;"openDetail()"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;h3&amp;gt;&lt;/span&gt;Premium Plan&lt;span class="nt"&gt;&amp;lt;/h3&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;p&amp;gt;&lt;/span&gt;$29/month&lt;span class="nt"&gt;&amp;lt;/p&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/div&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You click it with your mouse every day. It works perfectly. But navigate to it with VoiceOver and you hear... nothing useful. VoiceOver announces "group" or just reads the text content with no indication that this thing is interactive.&lt;/p&gt;

&lt;p&gt;A screen reader user has no idea they can activate this element. It doesn't show up in the links or buttons list in the rotor. It's effectively invisible as an interactive element.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The fix:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;div&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"card"&lt;/span&gt; &lt;span class="na"&gt;role=&lt;/span&gt;&lt;span class="s"&gt;"button"&lt;/span&gt; &lt;span class="na"&gt;tabindex=&lt;/span&gt;&lt;span class="s"&gt;"0"&lt;/span&gt;
     &lt;span class="na"&gt;onclick=&lt;/span&gt;&lt;span class="s"&gt;"openDetail()"&lt;/span&gt;
     &lt;span class="na"&gt;onkeydown=&lt;/span&gt;&lt;span class="s"&gt;"if(event.key==='Enter') openDetail()"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;h3&amp;gt;&lt;/span&gt;Premium Plan&lt;span class="nt"&gt;&amp;lt;/h3&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;p&amp;gt;&lt;/span&gt;$29/month&lt;span class="nt"&gt;&amp;lt;/p&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/div&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or better yet, use a real &lt;code&gt;&amp;lt;button&amp;gt;&lt;/code&gt; and style it. One catch: a &lt;code&gt;&amp;lt;button&amp;gt;&lt;/code&gt; only allows phrasing content, so the &lt;code&gt;&amp;lt;h3&amp;gt;&lt;/code&gt; and &lt;code&gt;&amp;lt;p&amp;gt;&lt;/code&gt; become styled &lt;code&gt;&amp;lt;span&amp;gt;&lt;/code&gt; elements:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;button&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"card"&lt;/span&gt; &lt;span class="na"&gt;onclick=&lt;/span&gt;&lt;span class="s"&gt;"openDetail()"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;span&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"card-title"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;Premium Plan&lt;span class="nt"&gt;&amp;lt;/span&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;span&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"card-price"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;$29/month&lt;span class="nt"&gt;&amp;lt;/span&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/button&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now VoiceOver says "Premium Plan, $29/month, button." That's a massive difference.&lt;/p&gt;

&lt;h2&gt;
  
  
  Surprise #2: aria-label Overrides Your Visible Text
&lt;/h2&gt;

&lt;p&gt;This one catches developers who are trying to "help" screen reader users:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;a&lt;/span&gt; &lt;span class="na"&gt;href=&lt;/span&gt;&lt;span class="s"&gt;"/pricing"&lt;/span&gt; &lt;span class="na"&gt;aria-label=&lt;/span&gt;&lt;span class="s"&gt;"Click here to view our pricing page"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
  View Pricing
&lt;span class="nt"&gt;&amp;lt;/a&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Sighted users see "View Pricing." VoiceOver users hear "Click here to view our pricing page, link." These are different experiences, and the mismatch causes real problems.&lt;/p&gt;

&lt;p&gt;If a sighted coworker says "click the View Pricing link," a screen reader user searching for that text won't find it. The rotor's link list will show "Click here to view our pricing page" instead.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The rule&lt;/strong&gt;: &lt;code&gt;aria-label&lt;/code&gt; replaces the accessible name entirely. Only use it when the visible text alone doesn't provide enough context, and try to keep it consistent with what's visible. In most cases, well-written visible text is all you need.&lt;/p&gt;

&lt;h2&gt;
  
  
  Surprise #3: Missing Heading Hierarchy Destroys Navigation
&lt;/h2&gt;

&lt;p&gt;Screen reader users rely heavily on headings to navigate. Press &lt;strong&gt;VO + U&lt;/strong&gt; and select the Headings list. This is how many users get an overview of your page.&lt;/p&gt;

&lt;p&gt;If your heading hierarchy looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;h1: Welcome
h3: Our Services    (where's h2?)
h3: About Us
h5: Contact         (jumped from h3 to h5?)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The navigation experience is confusing. Users wonder if they're missing sections. Tools like axe will flag this, but hearing it through VoiceOver makes the problem visceral.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The fix&lt;/strong&gt;: Use headings in order. Every page gets one &lt;code&gt;h1&lt;/code&gt;. Sections get &lt;code&gt;h2&lt;/code&gt;. Subsections get &lt;code&gt;h3&lt;/code&gt;. It's not about font size --- CSS handles that. It's about document structure.&lt;/p&gt;
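&lt;p&gt;If you want to catch skipped levels in code review or CI, the check is a few lines. This is a hypothetical helper (the name and messages are made up for illustration), fed with heading levels in document order:&lt;/p&gt;

```typescript
// Hypothetical helper: report every place a heading outline skips a level.
// Going deeper by more than one step (h1 to h3) is a skip; going back up is fine.
function headingSkips(levels: number[]): string[] {
  const problems: string[] = [];
  let prev = 0;
  for (const level of levels) {
    if (level > prev + 1) {
      problems.push(`jumped from h${prev} to h${level}`);
    }
    prev = level;
  }
  return problems;
}

headingSkips([1, 3, 3, 5]); // the broken outline above
// → ["jumped from h1 to h3", "jumped from h3 to h5"]

headingSkips([1, 2, 3, 2]); // a sane outline
// → []
```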

&lt;h2&gt;
  
  
  Surprise #4: Forms Without Labels Are Guessing Games
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;input&lt;/span&gt; &lt;span class="na"&gt;type=&lt;/span&gt;&lt;span class="s"&gt;"email"&lt;/span&gt; &lt;span class="na"&gt;placeholder=&lt;/span&gt;&lt;span class="s"&gt;"Enter your email"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;VoiceOver announces: "edit text." That's it. The placeholder text might show visually, but it's not a reliable accessible name.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;label&lt;/span&gt; &lt;span class="na"&gt;for=&lt;/span&gt;&lt;span class="s"&gt;"email"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;Email address&lt;span class="nt"&gt;&amp;lt;/label&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;input&lt;/span&gt; &lt;span class="na"&gt;type=&lt;/span&gt;&lt;span class="s"&gt;"email"&lt;/span&gt; &lt;span class="na"&gt;id=&lt;/span&gt;&lt;span class="s"&gt;"email"&lt;/span&gt; &lt;span class="na"&gt;placeholder=&lt;/span&gt;&lt;span class="s"&gt;"you@example.com"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now VoiceOver says: "Email address, edit text." The user knows exactly what to type.&lt;/p&gt;

&lt;h2&gt;
  
  
  A 10-Minute Testing Routine
&lt;/h2&gt;

&lt;p&gt;Here's what I do now before shipping any feature:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Cmd + F5&lt;/strong&gt; to start VoiceOver&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;VO + Right Arrow&lt;/strong&gt; through the entire new feature, listening for:

&lt;ul&gt;
&lt;li&gt;Can I tell what every interactive element does?&lt;/li&gt;
&lt;li&gt;Do buttons and links have clear names?&lt;/li&gt;
&lt;li&gt;Are images described or correctly marked as decorative?&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;VO + U&lt;/strong&gt; to check the headings list --- does it make sense as an outline?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tab&lt;/strong&gt; through all interactive elements --- can I reach and activate everything?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;VO + Space&lt;/strong&gt; on buttons and links --- do they work?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cmd + F5&lt;/strong&gt; to turn VoiceOver off&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;That's it. Ten minutes. You'll catch more accessibility issues in those ten minutes than in hours of staring at automated scan results.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why This Matters More Than You Think
&lt;/h2&gt;

&lt;p&gt;Since June 2025, the European Accessibility Act has required digital products and services sold in the EU to be accessible. That's not a suggestion --- it's law, with real enforcement. Similar regulations are expanding worldwide.&lt;/p&gt;

&lt;p&gt;But beyond compliance, testing with a screen reader connects you with how real people use your product. About 2.2 billion people worldwide have some form of vision impairment. When you hear VoiceOver stumble through your UI, those aren't abstract "accessibility issues" --- they're barriers.&lt;/p&gt;

&lt;p&gt;Turn on VoiceOver today. Navigate your own site. Listen to what it actually says. I promise you'll find at least one thing that surprises you.&lt;/p&gt;




&lt;p&gt;If you're working on accessibility compliance, I put together a free 10-point EAA quick-check: &lt;a href="https://blog.a11yfix.dev" rel="noopener noreferrer"&gt;Get the free checklist&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The full 100+ item checklist is also available as a &lt;a href="https://www.etsy.com/jp/listing/4485241378/website-accessibility-audit-checklist" rel="noopener noreferrer"&gt;printable PDF on Etsy&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Wondering how your site stacks up? Our audit combines automated WCAG scanning with manual testing, from $49. &lt;a href="https://blog.a11yfix.dev/audit/eaa/" rel="noopener noreferrer"&gt;See what's included&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>a11y</category>
      <category>webdev</category>
      <category>beginners</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Screen Reader Testing in 5 Minutes: A Developer's Quick Start Guide</title>
      <dc:creator>AgentKit</dc:creator>
      <pubDate>Wed, 08 Apr 2026 00:11:30 +0000</pubDate>
      <link>https://dev.to/agentkit/screen-reader-testing-in-5-minutes-a-developers-quick-start-guide-27l7</link>
      <guid>https://dev.to/agentkit/screen-reader-testing-in-5-minutes-a-developers-quick-start-guide-27l7</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published at &lt;a href="https://blog.a11yfix.dev" rel="noopener noreferrer"&gt;A11yFix&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;I'll be honest: the first time I tried to test my website with a screen reader, I panicked. A robotic voice started reading everything on my page at top speed, I couldn't figure out how to stop it, and I ended up force-quitting the application. Sound familiar?&lt;/p&gt;

&lt;p&gt;Here's what I wish someone had told me: screen reader testing doesn't have to be scary, and you can learn the basics in about five minutes. This guide will get you from zero to actually running a meaningful screen reader test on your website.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Bother Testing with a Screen Reader?
&lt;/h2&gt;

&lt;p&gt;Automated accessibility tools catch roughly 30-40% of accessibility issues. The rest? You need manual testing. Screen readers reveal problems that no linter or CI check will find:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Images with meaningless alt text ("image123.png")&lt;/li&gt;
&lt;li&gt;Form fields that have no programmatic label&lt;/li&gt;
&lt;li&gt;Custom components that look interactive but can't be reached with a keyboard&lt;/li&gt;
&lt;li&gt;Content that appears visually ordered but reads in a confusing sequence&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You don't need to become an expert. You just need to catch the obvious problems.&lt;/p&gt;

&lt;h2&gt;
  
  
  VoiceOver on Mac (Built-in, Zero Setup)
&lt;/h2&gt;

&lt;p&gt;VoiceOver is already on your Mac. No downloads, no installs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Turn it on:&lt;/strong&gt; Press &lt;code&gt;Cmd + F5&lt;/code&gt; (or touch the Touch ID button three times if you've set that up).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Essential shortcuts:&lt;/strong&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Action&lt;/th&gt;
&lt;th&gt;Shortcut&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Move to next element&lt;/td&gt;
&lt;td&gt;
&lt;code&gt;VO + Right Arrow&lt;/code&gt; (VO = Ctrl + Option)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Move to previous element&lt;/td&gt;
&lt;td&gt;&lt;code&gt;VO + Left Arrow&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Activate a link/button&lt;/td&gt;
&lt;td&gt;&lt;code&gt;VO + Space&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Read the current element&lt;/td&gt;
&lt;td&gt;&lt;code&gt;VO + A&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Open the Rotor (navigation menu)&lt;/td&gt;
&lt;td&gt;&lt;code&gt;VO + U&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Turn VoiceOver off&lt;/td&gt;
&lt;td&gt;&lt;code&gt;Cmd + F5&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Your first test in 60 seconds:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open your site in Safari (VoiceOver works best with Safari)&lt;/li&gt;
&lt;li&gt;Turn on VoiceOver with &lt;code&gt;Cmd + F5&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Press &lt;code&gt;VO + Right Arrow&lt;/code&gt; repeatedly to move through elements&lt;/li&gt;
&lt;li&gt;Listen: does each element make sense on its own? Does a button say "Submit your application" or just "Click here"?&lt;/li&gt;
&lt;li&gt;Press &lt;code&gt;VO + U&lt;/code&gt; to open the Rotor, then arrow to "Headings" --- is your heading structure logical?&lt;/li&gt;
&lt;li&gt;Turn it off with &lt;code&gt;Cmd + F5&lt;/code&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  NVDA on Windows (Free Download)
&lt;/h2&gt;

&lt;p&gt;NVDA is a free, open-source screen reader. Download it from &lt;a href="https://www.nvaccess.org/download/" rel="noopener noreferrer"&gt;nvaccess.org&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Essential shortcuts:&lt;/strong&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Action&lt;/th&gt;
&lt;th&gt;Shortcut&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Move to next element&lt;/td&gt;
&lt;td&gt;&lt;code&gt;Down Arrow&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Move to previous element&lt;/td&gt;
&lt;td&gt;&lt;code&gt;Up Arrow&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Activate a link/button&lt;/td&gt;
&lt;td&gt;&lt;code&gt;Enter&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;List all headings&lt;/td&gt;
&lt;td&gt;
&lt;code&gt;H&lt;/code&gt; (next heading) or &lt;code&gt;Insert + F7&lt;/code&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;List all links&lt;/td&gt;
&lt;td&gt;
&lt;code&gt;K&lt;/code&gt; (next link) or &lt;code&gt;Insert + F7&lt;/code&gt; then select Links&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Toggle speech on/off&lt;/td&gt;
&lt;td&gt;&lt;code&gt;Insert + S&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Stop speaking&lt;/td&gt;
&lt;td&gt;&lt;code&gt;Ctrl&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Tip:&lt;/strong&gt; NVDA defaults to "Browse mode" in web content, which means single-letter navigation works. Press &lt;code&gt;H&lt;/code&gt; to jump to the next heading, &lt;code&gt;F&lt;/code&gt; for the next form field, &lt;code&gt;T&lt;/code&gt; for the next table.&lt;/p&gt;

&lt;h2&gt;
  
  
  What to Listen For: A 5-Point Check
&lt;/h2&gt;

&lt;p&gt;Once you can navigate, here's a quick checklist. Run through these five checks and you'll catch the majority of screen reader issues:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Page Title
&lt;/h3&gt;

&lt;p&gt;When the page loads, does the screen reader announce a meaningful title? "Dashboard - MyApp" is good. "React App" is not.&lt;/p&gt;
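&lt;p&gt;This check is easy to automate, too. A minimal sketch --- the function and the generic-title list are my own illustration, not part of any tool:&lt;/p&gt;

```typescript
// Hypothetical check: flag default or generic page titles.
function titleLooksMeaningful(title: string): boolean {
  const generic = ["react app", "untitled", "home", "document", "new tab"];
  const t = title.trim().toLowerCase();
  if (t === "") return false;
  return !generic.includes(t);
}

titleLooksMeaningful("Dashboard - MyApp"); // → true
titleLooksMeaningful("React App");         // → false
```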

&lt;h3&gt;
  
  
  2. Heading Structure
&lt;/h3&gt;

&lt;p&gt;Navigate through headings (Rotor on Mac, &lt;code&gt;H&lt;/code&gt; key on NVDA). You should hear a logical hierarchy: H1 for the page title, H2 for sections, H3 for subsections. Skipped levels or missing headings mean users can't scan your page.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Link and Button Labels
&lt;/h3&gt;

&lt;p&gt;Tab through interactive elements. Each one should announce what it does. "Read more about pricing plans" tells users where they're going. "Read more" repeated six times does not. "Button" with no label is a dealbreaker.&lt;/p&gt;
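&lt;p&gt;A quick way to catch the "Read more" problem before it ships is to lint the link texts you render. A sketch, with a phrase list you'd extend for your own content (all names here are illustrative):&lt;/p&gt;

```typescript
// Hypothetical lint: flag link texts that mean nothing out of context.
const GENERIC_LINK_TEXT = ["read more", "click here", "learn more", "here", "more"];

function flagGenericLinks(linkTexts: string[]): string[] {
  return linkTexts.filter((text) =>
    GENERIC_LINK_TEXT.includes(text.trim().toLowerCase())
  );
}

flagGenericLinks(["Read more", "Read more about pricing plans", "Click here"]);
// → ["Read more", "Click here"]
```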

&lt;h3&gt;
  
  
  4. Form Labels
&lt;/h3&gt;

&lt;p&gt;Navigate to a form and listen. Each input should announce its label when focused. If you hear "edit text" with no context, your input is missing a programmatic label. This is one of the most common accessibility failures, and one of the easiest to fix with a proper &lt;code&gt;&amp;lt;label&amp;gt;&lt;/code&gt; element.&lt;/p&gt;

&lt;h3&gt;
  
  
  5. Image Alt Text
&lt;/h3&gt;

&lt;p&gt;Navigate through images. Decorative images should be silent (empty &lt;code&gt;alt=""&lt;/code&gt;). Informational images should describe their content. If you hear a filename or a generic "image," that's a problem.&lt;/p&gt;
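&lt;p&gt;Filename-style alt text is also easy to screen for mechanically. A rough heuristic --- the patterns and word list are illustrative, not exhaustive:&lt;/p&gt;

```typescript
// Hypothetical heuristic: spot alt text that is probably a filename
// or a generic placeholder. Empty alt is fine: that marks a decorative image.
function suspiciousAlt(alt: string): boolean {
  const t = alt.trim().toLowerCase();
  if (t === "") return false; // decorative, intentionally silent
  if (/\.(png|jpe?g|gif|svg|webp)$/.test(t)) return true; // looks like a filename
  if (["image", "photo", "picture", "img"].includes(t)) return true; // generic
  return false;
}

suspiciousAlt("image123.png");                      // → true
suspiciousAlt("Bar chart of Q3 revenue by region"); // → false
suspiciousAlt("");                                  // → false
```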

&lt;h2&gt;
  
  
  Common Gotchas
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;The screen reader won't stop talking.&lt;/strong&gt; Press &lt;code&gt;Ctrl&lt;/code&gt; to silence it. This works in both VoiceOver and NVDA.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;VoiceOver isn't reading my page correctly.&lt;/strong&gt; Make sure you're using Safari. VoiceOver's best support is with Safari, not Chrome or Firefox.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;NVDA isn't responding to single-key navigation.&lt;/strong&gt; You might be in "Focus mode" instead of "Browse mode." Press &lt;code&gt;Insert + Space&lt;/code&gt; to toggle.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Custom components aren't announced properly.&lt;/strong&gt; If you built a dropdown or modal with &lt;code&gt;&amp;lt;div&amp;gt;&lt;/code&gt; elements, add ARIA roles and states. But first, consider whether a native HTML element (like &lt;code&gt;&amp;lt;select&amp;gt;&lt;/code&gt; or &lt;code&gt;&amp;lt;dialog&amp;gt;&lt;/code&gt;) would work instead.&lt;/p&gt;

&lt;h2&gt;
  
  
  Make It a Habit
&lt;/h2&gt;

&lt;p&gt;You don't need to run a full screen reader audit on every commit. But doing this 5-minute check on key pages, especially after building new features, will catch problems before your users do.&lt;/p&gt;

&lt;p&gt;My recommendation: pick one page right now, turn on VoiceOver or NVDA, and run through the five checks above. You'll be surprised what you find.&lt;/p&gt;




&lt;p&gt;If you're working on accessibility compliance, I put together a free 10-point EAA quick-check: &lt;a href="https://blog.a11yfix.dev" rel="noopener noreferrer"&gt;Get the free checklist&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Need to know where your site stands on accessibility? We do thorough WCAG audits with real assistive-technology testing, starting at $49. &lt;a href="https://blog.a11yfix.dev/audit/eaa/" rel="noopener noreferrer"&gt;View pricing&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>a11y</category>
      <category>webdev</category>
      <category>beginners</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>How to Set Up Automated Accessibility Testing in GitHub Actions (Copy-Paste Config)</title>
      <dc:creator>AgentKit</dc:creator>
      <pubDate>Tue, 07 Apr 2026 08:30:20 +0000</pubDate>
      <link>https://dev.to/agentkit/how-to-set-up-automated-accessibility-testing-in-github-actions-copy-paste-config-2eib</link>
      <guid>https://dev.to/agentkit/how-to-set-up-automated-accessibility-testing-in-github-actions-copy-paste-config-2eib</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published at &lt;a href="https://blog.a11yfix.dev" rel="noopener noreferrer"&gt;A11yFix&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Last month I inherited a project where accessibility was "on the roadmap." Translation: nobody had touched it. Rather than rely on manual audits that happen once a quarter (if you're lucky), I set up automated accessibility testing that runs on every pull request. It took about 20 minutes, and now no PR merges if it introduces accessibility violations.&lt;/p&gt;

&lt;p&gt;Here's exactly how to do it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why axe-core + Playwright?
&lt;/h2&gt;

&lt;p&gt;There are plenty of accessibility testing tools out there, but this combination hits a sweet spot:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;axe-core&lt;/strong&gt; is the industry-standard engine behind most accessibility tools. It catches real WCAG violations, not theoretical ones.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Playwright&lt;/strong&gt; gives you a real browser environment, so you're testing what users actually see -- not just static HTML.&lt;/li&gt;
&lt;li&gt;Both are free and open source.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Together they catch around 30-40% of WCAG 2.1 issues automatically. That won't replace manual testing, but it prevents regressions and catches the low-hanging fruit before a human ever looks at the page.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 1: Install the dependencies
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-D&lt;/span&gt; @axe-core/playwright @playwright/test
npx playwright &lt;span class="nb"&gt;install &lt;/span&gt;chromium
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 2: Create the accessibility test file
&lt;/h2&gt;

&lt;p&gt;Create &lt;code&gt;tests/accessibility.spec.ts&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;test&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;expect&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;@playwright/test&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;AxeBuilder&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;@axe-core/playwright&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;pagesToTest&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
  &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Home&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;path&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;/&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;About&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;path&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;/about&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="c1"&gt;// Add your routes here&lt;/span&gt;
&lt;span class="p"&gt;];&lt;/span&gt;

&lt;span class="k"&gt;for &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;path&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;of&lt;/span&gt; &lt;span class="nx"&gt;pagesToTest&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nf"&gt;test&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; page should have no accessibility violations`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;goto&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;path&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;AxeBuilder&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;page&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;withTags&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;wcag2a&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;wcag2aa&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;wcag21a&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;wcag21aa&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;analyze&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;violations&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;violations&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;v&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;({&lt;/span&gt;
      &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;v&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;impact&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;v&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;impact&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;v.description,&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="na"&gt;nodes&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;v&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;nodes&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;}));&lt;/span&gt;

    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;violations&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Accessibility violations found:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;violations&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="nf"&gt;expect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;violations&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;toEqual&lt;/span&gt;&lt;span class="p"&gt;([]);&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A few things to note:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;withTags&lt;/code&gt; filters to the WCAG 2.0 and 2.1 Level A and AA rules. WCAG 2.1 AA is the standard most regulations (including the European Accessibility Act) point to.&lt;/li&gt;
&lt;li&gt;The test logs structured violation data before failing, so you can see exactly what's wrong in the CI output.&lt;/li&gt;
&lt;li&gt;Adding new pages is just adding an entry to the array.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 3: Configure Playwright
&lt;/h2&gt;

&lt;p&gt;If you don't already have a &lt;code&gt;playwright.config.ts&lt;/code&gt;, create one:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;defineConfig&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;@playwright/test&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="nf"&gt;defineConfig&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;testDir&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;./tests&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;use&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;baseURL&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;http://localhost:3000&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="na"&gt;webServer&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;command&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;npm run start&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;port&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;3000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;reuseExistingServer&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;CI&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Adjust the &lt;code&gt;command&lt;/code&gt; and &lt;code&gt;port&lt;/code&gt; to match your project. If you're using Next.js, &lt;code&gt;npm run start&lt;/code&gt; works after a build. For Vite, use &lt;code&gt;npm run preview&lt;/code&gt;. The key is that Playwright spins up your server automatically during CI.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 4: The GitHub Actions workflow
&lt;/h2&gt;

&lt;p&gt;Create &lt;code&gt;.github/workflows/accessibility.yml&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Accessibility Tests&lt;/span&gt;

&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;pull_request&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="nv"&gt;main&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;
  &lt;span class="na"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="nv"&gt;main&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;

&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;a11y&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v4&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/setup-node@v4&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;node-version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;20&lt;/span&gt;
          &lt;span class="na"&gt;cache&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;npm"&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Install dependencies&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;npm ci&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Install Playwright browsers&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;npx playwright install --with-deps chromium&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Build&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;npm run build&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Run accessibility tests&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;npx playwright test tests/accessibility.spec.ts&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Upload test results&lt;/span&gt;
        &lt;span class="na"&gt;if&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;failure()&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/upload-artifact@v4&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;a11y-report&lt;/span&gt;
          &lt;span class="na"&gt;path&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;test-results/&lt;/span&gt;
          &lt;span class="na"&gt;retention-days&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;7&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's it. Copy that file into &lt;code&gt;.github/workflows/&lt;/code&gt; in your repo and accessibility testing runs on every PR.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;if: failure()&lt;/code&gt; on the upload step means artifacts are uploaded only when tests fail -- so you can dig into the details without cluttering up passing runs.&lt;/p&gt;
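&lt;p&gt;An optional refinement to make that artifact self-explanatory: attach the raw axe results as JSON from inside the test, so the report you download contains the full violation details rather than just Playwright's default output. This is a sketch built on the same AxeBuilder test shown earlier:&lt;/p&gt;

```typescript
// Optional: attach raw axe results so the failure artifact is self-explanatory.
// A sketch assuming the AxeBuilder-based test from earlier in the post.
import { test, expect } from "@playwright/test";
import AxeBuilder from "@axe-core/playwright";

test("home page has no detectable a11y violations", async ({ page }, testInfo) => {
  await page.goto("/");
  const results = await new AxeBuilder({ page }).analyze();

  // The attachment lands in test-results/, so it rides along in the
  // a11y-report artifact uploaded by the workflow above.
  await testInfo.attach("axe-results", {
    body: JSON.stringify(results.violations, null, 2),
    contentType: "application/json",
  });

  expect(results.violations).toEqual([]);
});
```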

&lt;h2&gt;
  
  
  Step 5: Handle violations without blocking everything
&lt;/h2&gt;

&lt;p&gt;Sometimes you need to ship a feature while you fix an existing accessibility issue. You can exclude specific rules temporarily:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;AxeBuilder&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;page&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt;
  &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;withTags&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;wcag2a&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;wcag2aa&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;wcag21a&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;wcag21aa&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
  &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;disableRules&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;color-contrast&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt; &lt;span class="c1"&gt;// TODO: Fix by sprint 23&lt;/span&gt;
  &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;analyze&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I recommend adding a comment with a deadline and tracking these exclusions. They have a tendency to become permanent if you don't.&lt;/p&gt;
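&lt;p&gt;One lightweight way to do that tracking (a sketch -- the &lt;code&gt;RuleExclusion&lt;/code&gt; shape and the ticket id are made up, adapt them to your tracker): keep every disabled rule in a single typed list with an expiry date, and fail the run once a deadline has passed:&lt;/p&gt;

```typescript
// Hypothetical pattern: every disabled rule lives in one list with a
// tracking ticket and an expiry date. Once the date passes, CI fails
// until the exclusion is either fixed or consciously renewed.
interface RuleExclusion {
  rule: string;    // axe rule id, e.g. "color-contrast"
  issue: string;   // tracking ticket (placeholder id)
  expires: string; // ISO date after which the exclusion is no longer valid
}

const exclusions: RuleExclusion[] = [
  { rule: "color-contrast", issue: "A11Y-123", expires: "2026-06-01" },
];

// Returns the rule ids still within their deadline; throws if any
// exclusion has expired, so the CI run goes red instead of rotting.
function activeExclusions(list: RuleExclusion[], today: Date = new Date()): string[] {
  const expired = list.filter((e) => today.getTime() > new Date(e.expires).getTime());
  if (expired.length > 0) {
    const names = expired.map((e) => `${e.rule} (${e.issue})`).join(", ");
    throw new Error(`Expired a11y rule exclusions: ${names}`);
  }
  return list.map((e) => e.rule);
}
```

&lt;p&gt;Then the builder call becomes &lt;code&gt;.disableRules(activeExclusions(exclusions))&lt;/code&gt;, and an expired exclusion can't silently outlive its sprint.&lt;/p&gt;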

&lt;h2&gt;
  
  
  What this catches (and what it doesn't)
&lt;/h2&gt;

&lt;p&gt;Automated testing reliably catches:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Missing alt text on images&lt;/li&gt;
&lt;li&gt;Missing form labels&lt;/li&gt;
&lt;li&gt;Broken ARIA attributes&lt;/li&gt;
&lt;li&gt;Insufficient color contrast&lt;/li&gt;
&lt;li&gt;Missing document language&lt;/li&gt;
&lt;li&gt;Incorrect heading hierarchy&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It won't catch:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Whether alt text is actually meaningful&lt;/li&gt;
&lt;li&gt;Keyboard navigation flow issues&lt;/li&gt;
&lt;li&gt;Screen reader announcement quality&lt;/li&gt;
&lt;li&gt;Complex interactive widget behavior&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That's why automated testing is a safety net, not a replacement for real accessibility work. But a safety net that runs on every PR is a lot better than a manual audit that happens twice a year.&lt;/p&gt;

&lt;h2&gt;
  
  
  Making it work with your stack
&lt;/h2&gt;

&lt;p&gt;If you're using a static site or SSR framework, the config above works as-is. For SPAs, make sure Playwright waits for your app to hydrate. You might need to add a &lt;code&gt;waitUntil: 'networkidle'&lt;/code&gt; option to &lt;code&gt;page.goto()&lt;/code&gt;, or wait for a specific element to appear before running the analysis.&lt;/p&gt;
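&lt;p&gt;For example (a sketch -- the &lt;code&gt;#app[data-hydrated]&lt;/code&gt; selector is an assumption, substitute whatever your framework renders once hydration finishes):&lt;/p&gt;

```typescript
// SPA-friendly variant of the scan: wait for the network to settle AND
// for a concrete post-hydration element before running axe.
// "#app[data-hydrated]" is a placeholder selector -- use your own.
import { test, expect } from "@playwright/test";
import AxeBuilder from "@axe-core/playwright";

test("home page has no detectable a11y violations", async ({ page }) => {
  await page.goto("/", { waitUntil: "networkidle" });
  await page.waitForSelector("#app[data-hydrated]");

  const results = await new AxeBuilder({ page })
    .withTags(["wcag2a", "wcag2aa", "wcag21a", "wcag21aa"])
    .analyze();

  expect(results.violations).toEqual([]);
});
```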

&lt;p&gt;For monorepos, point the &lt;code&gt;testDir&lt;/code&gt; and &lt;code&gt;webServer&lt;/code&gt; config at the right package, and you're set.&lt;/p&gt;
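&lt;p&gt;A monorepo-flavored &lt;code&gt;playwright.config.ts&lt;/code&gt; might look like this (the package path, port, and &lt;code&gt;preview&lt;/code&gt; script are placeholders for your own layout):&lt;/p&gt;

```typescript
// playwright.config.ts -- monorepo sketch. Paths, the port, and the
// "preview" script are assumptions; point them at the package you test.
import { defineConfig } from "@playwright/test";

export default defineConfig({
  testDir: "./packages/web/tests",
  use: { baseURL: "http://localhost:4173" },
  webServer: {
    // Build and serve only the package under test.
    command: "npm run preview --workspace=packages/web",
    url: "http://localhost:4173",
    reuseExistingServer: !process.env.CI,
  },
});
```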




&lt;p&gt;Automated accessibility testing won't make your site fully accessible overnight. But it draws a line: from this point forward, we don't ship new violations. That's a meaningful starting point.&lt;/p&gt;

&lt;p&gt;If you're working on accessibility compliance, I put together a free 10-point EAA quick-check: &lt;a href="https://blog.a11yfix.dev" rel="noopener noreferrer"&gt;Get the free checklist&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;The full 100+ item checklist is also available as a &lt;a href="https://www.etsy.com/jp/listing/4485241378/website-accessibility-audit-checklist" rel="noopener noreferrer"&gt;printable PDF on Etsy&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Curious about your own site's accessibility gaps? We offer WCAG audits that go beyond automated scans, starting at $49. &lt;a href="https://blog.a11yfix.dev/audit/eaa/" rel="noopener noreferrer"&gt;Learn more&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>a11y</category>
      <category>github</category>
      <category>webdev</category>
      <category>tutorial</category>
    </item>
  </channel>
</rss>
