SIKOUTRIS

Building Accessible Websites: A 2026 Checklist with Free Testing Tools

Automated accessibility checkers catch about 30-40% of real-world issues. That's a useful number to keep in mind — not because it makes testing pointless, but because it defines the role of tooling correctly. Tools catch the obvious, measurable failures. The rest requires judgment.

This guide covers all three: a practical WCAG 2.2 quick reference, a testing workflow you can actually run, and an honest look at which tools do what.


Why WCAG 2.2 and Not "Just Use a Checker"

WCAG 2.2 (finalized October 2023, and increasingly the benchmark regulators reference in the EU, UK, and US, even where legislation still formally cites 2.1) added nine new success criteria to the existing 2.1 standard. The most impactful for developers:

  • 2.4.11 Focus Not Obscured (Minimum) (AA): Sticky headers and cookie banners must not fully hide the focused element.
  • 2.4.13 Focus Appearance (AAA): Focus indicators must meet a minimum size and contrast ratio. Not strictly required at AA, but it makes outline: none without a custom replacement that actually passes even harder to defend.
  • 2.5.7 Dragging Movements (AA): Any drag-and-drop interaction must have a pointer alternative (click, tap, keyboard).
  • 2.5.8 Target Size Minimum (AA): Interactive targets must be at least 24x24 CSS pixels. Tiny icon buttons are now a compliance issue, not just a usability preference.
  • 3.2.6 Consistent Help (A): If a contact mechanism appears on multiple pages, it must appear in the same relative position on each.

Few of these are reliably detectable by automated scanners alone — most require human judgment about intent and context.
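Criterion 2.5.7 in particular tends to solve itself once reordering is a plain function that "Move up"/"Move down" buttons can call instead of a drag gesture — a minimal sketch (the function name is illustrative):

```javascript
// Single-pointer alternative for drag-and-drop reordering (2.5.7):
// buttons call this pure function instead of requiring a drag gesture.
function moveItem(items, from, to) {
  if (to < 0 || to >= items.length) return items; // already at the edge
  const next = items.slice();
  const [moved] = next.splice(from, 1);
  next.splice(to, 0, moved);
  return next;
}

console.log(moveItem(['a', 'b', 'c'], 2, 1)); // [ 'a', 'c', 'b' ]
```

The same function serves mouse, touch, and keyboard users, so the drag interaction becomes an enhancement rather than the only path.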


WCAG 2.2 Quick Reference (AA Compliance)

Perceivable

| Criterion | What to Check |
| --- | --- |
| 1.1.1 Non-text Content | Every `<img>` has meaningful `alt`; decorative images use `alt=""` |
| 1.3.1 Info and Relationships | Headings, lists, and tables use semantic HTML — not `<div>`s styled to look like them |
| 1.3.5 Identify Input Purpose | Form fields for name, email, phone have correct `autocomplete` attributes |
| 1.4.3 Contrast (Minimum) | Text contrast ≥ 4.5:1; large text ≥ 3:1 |
| 1.4.4 Resize Text | Page is usable at 200% browser zoom without loss of content or functionality |
| 1.4.10 Reflow | Single-column layout at 320px width (no horizontal scroll) |
| 1.4.11 Non-text Contrast | UI components (borders, icons) contrast ≥ 3:1 against background |
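The 4.5:1 and 3:1 thresholds in 1.4.3 and 1.4.11 come from WCAG's relative-luminance formula, which is simple enough to implement directly — a sketch of the math, not a replacement for a proper checker:

```javascript
// Relative luminance of an sRGB color, per the WCAG 2.x definition.
function luminance([r, g, b]) {
  const [R, G, B] = [r, g, b].map((v) => {
    const s = v / 255;
    return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

// Contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05), range 1–21.
function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrastRatio([0, 0, 0], [255, 255, 255]));               // 21
console.log(contrastRatio([119, 119, 119], [255, 255, 255]).toFixed(2)); // 4.48
```

Note that #777 on white comes out at about 4.48:1 — it *looks* fine and still fails AA for body text, which is exactly why eyeballing contrast doesn't work.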

Operable

| Criterion | What to Check |
| --- | --- |
| 2.1.1 Keyboard | Every interaction reachable and operable by keyboard alone |
| 2.4.3 Focus Order | Tab order follows visual reading order |
| 2.4.7 Focus Visible | Focus indicator visible on all interactive elements |
| 2.4.11 Focus Not Obscured (Minimum) (NEW 2.2) | Focused element is not fully hidden by sticky headers, footers, or banners |
| 2.5.8 Target Size (Minimum) (NEW 2.2) | Click targets ≥ 24x24 CSS pixels |
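The most common 2.1.1 failure is a clickable `<div>`. A native `<button>` gets keyboard support for free; if you truly can't use one, the minimum retrofit looks roughly like this (a sketch — `makeButtonAccessible` is a hypothetical helper, not a library API):

```javascript
// Native buttons activate on both Enter and Space.
function isActivationKey(event) {
  return event.key === 'Enter' || event.key === ' ';
}

// Retrofit a non-semantic element so keyboard users can operate it.
function makeButtonAccessible(el, onActivate) {
  el.setAttribute('role', 'button'); // announce as a button (4.1.2)
  el.tabIndex = 0;                   // reachable via Tab (2.1.1)
  el.addEventListener('click', onActivate);
  el.addEventListener('keydown', (e) => {
    if (isActivationKey(e)) {
      e.preventDefault(); // keep Space from scrolling the page
      onActivate(e);
    }
  });
}
```

Even then, prefer the real element: this sketch still doesn't replicate disabled states, form submission, or :active styling.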

Understandable

| Criterion | What to Check |
| --- | --- |
| 3.1.1 Language of Page | `<html lang="en">` present and correct |
| 3.2.2 On Input | Form fields don't trigger navigation changes without warning |
| 3.3.1 Error Identification | Errors identified in text, not only by color |
| 3.3.2 Labels or Instructions | Every form field has a programmatically associated `<label>` |

Robust

| Criterion | What to Check |
| --- | --- |
| 4.1.2 Name, Role, Value | Custom components expose role, state, and value via ARIA |
| 4.1.3 Status Messages | Dynamic messages (alerts, toast notifications) use `role="status"` or `aria-live` |
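For 4.1.3, the detail that trips people up is that the live region must already exist in the DOM before the message is injected — injecting a new `role="status"` element along with its text often goes unannounced. A sketch, assuming the page contains an empty `<div id="status" role="status"></div>` placeholder:

```javascript
// role="status" implies aria-live="polite"; screen readers announce
// whatever text lands inside the existing region.
function announce(region, message) {
  region.textContent = ''; // clear first so an identical repeat re-announces
  setTimeout(() => { region.textContent = message; }, 50);
  return message;
}

// Usage: announce(document.getElementById('status'), 'Form submitted')
```

The clear-then-set dance is a pragmatic workaround for screen readers that ignore a message identical to the region's current content.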

The Testing Workflow

A reliable accessibility check follows three layers. Each layer catches different things.

Layer 1: Automated Scan (10 minutes)

Run automated tooling first to clear the low-hanging fruit. This catches missing alt text, color contrast failures, missing form labels, and invalid ARIA usage.

axe DevTools (browser extension, free tier):

  1. Open DevTools → axe DevTools tab
  2. Click "Scan All of My Page"
  3. Fix everything marked "Critical" and "Serious" before moving on

Lighthouse (built into Chrome DevTools):

```shell
# Or run from CLI for CI integration
npx lighthouse https://yoursite.com \
  --only-categories=accessibility \
  --output=json \
  --output-path=./a11y-report.json
```

A Lighthouse score of 100 does not mean your site is fully accessible. It means you passed the automated checks. Keep this distinction in mind when communicating scores to stakeholders.

For a comprehensive free scan that checks against the full WCAG 2.2 ruleset across your entire domain (not just a single page), web-accessibility-checker.com crawls multiple pages and aggregates issues by severity and criterion — useful for getting a site-wide picture before you start fixing individual pages.

Layer 2: Keyboard-Only Navigation (20 minutes)

Unplug your mouse. Tab through the entire page flow:

  • Can you reach every interactive element?
  • Is the focus indicator always visible?
  • Does the tab order make sense without visual context?
  • Can you close modals and dropdowns with Escape?
  • Does skip navigation work (Tab → Enter on "Skip to main content")?

Common failures here: custom dropdowns that open on hover but don't respond to Enter/Space, modals that trap focus incorrectly (or don't trap it at all), date pickers that require a mouse.

Layer 3: Screen Reader Testing (30 minutes)

Use at least two combinations, since behavior varies significantly:

| Screen Reader | Browser | Platform |
| --- | --- | --- |
| NVDA (free) | Firefox | Windows |
| VoiceOver (built-in) | Safari | macOS/iOS |
| TalkBack (built-in) | Chrome | Android |

Test the critical user journey: landing page → main action → form submission → confirmation. Listen for:

  • Is the page title meaningful?
  • Are headings announced in logical order?
  • Are images described or skipped (decorative)?
  • Are error messages announced when they appear?
  • Does the form submit confirmation get announced?

Tool Comparison

| Tool | Automated | WCAG Version | Site-Wide Crawl | Free Tier | Best For |
| --- | --- | --- | --- | --- | --- |
| axe DevTools | Yes | 2.2 | No (single page) | Yes | Per-page deep scan |
| Lighthouse | Yes | 2.1 (mostly) | No | Yes | CI integration |
| web-accessibility-checker.com | Yes | 2.2 | Yes | Yes | Full-site audit |
| WAVE | Yes | 2.1 | No | Yes | Visual overlay |
| IBM Equal Access | Yes | 2.2 | No | Yes | Enterprise/ARIA-heavy apps |

Common Failures That Scanners Miss

These require manual review every time:

Meaningful alt text: A scanner confirms that alt exists. Only a human can tell whether alt="image" or alt="DSC_4872.jpg" is useful. Good alt text describes what the image conveys in context, not what it depicts.

Reading order vs. visual order: CSS Grid and Flexbox make it trivial to reorder elements visually while leaving the DOM order unchanged. Screen readers and keyboard users follow DOM order. Test this by disabling CSS in DevTools and checking whether the content still flows logically.

Focus trapping in modals: When a modal opens, focus should move into it and Tab should cycle within it — not escape to the page behind. When the modal closes, focus should return to the trigger element. Automated tools rarely catch broken focus trap implementations.
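A correct trap is small but easy to get subtly wrong. A sketch of the wrap logic — `trapFocus` and the `FOCUSABLE` selector are illustrative, not taken from any particular library:

```javascript
// Where should focus go when Tab is pressed at the edge of the modal?
// Returns the index to wrap to, or null to let the browser tab normally.
function wrapTarget(activeIndex, count, shiftKey) {
  if (!shiftKey && activeIndex === count - 1) return 0;  // Tab on last → first
  if (shiftKey && activeIndex === 0) return count - 1;   // Shift+Tab on first → last
  return null;
}

const FOCUSABLE =
  'a[href], button:not([disabled]), input, select, textarea, [tabindex]:not([tabindex="-1"])';

function trapFocus(modal, trigger) {
  const focusables = [...modal.querySelectorAll(FOCUSABLE)];
  focusables[0]?.focus(); // move focus into the modal on open

  modal.addEventListener('keydown', (e) => {
    if (e.key === 'Escape') {
      trigger.focus(); // return focus to the element that opened the modal
      return;
    }
    if (e.key !== 'Tab') return;
    const i = focusables.indexOf(document.activeElement);
    const target = wrapTarget(i, focusables.length, e.shiftKey);
    if (target !== null) {
      e.preventDefault();
      focusables[target].focus();
    }
  });
}
```

In production, prefer the native `<dialog>` element or a maintained library — this sketch ignores dynamically added focusables, hidden elements matching the selector, and closing the dialog itself.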

ARIA misuse: aria-label overrides visible text for screen reader users. If the visible button says "Submit" and aria-label="send-form" is on it, screen reader users hear "send-form" while sighted users see "Submit". This creates a mismatch that's a WCAG 4.1.2 failure and an author error.
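The precedence is easy to demonstrate. The real accessible-name computation has many more steps (aria-labelledby, alt, associated labels, and so on), but the core of this failure mode reduces to:

```javascript
// Grossly simplified accessible-name resolution: aria-label, when present,
// replaces the visible text for assistive technology.
function accessibleName({ ariaLabel, textContent }) {
  return ariaLabel || textContent.trim();
}

// Sighted users read textContent; screen reader users hear the result.
accessibleName({ ariaLabel: 'send-form', textContent: 'Submit' }); // 'send-form'
accessibleName({ ariaLabel: null, textContent: 'Submit' });        // 'Submit'
```

A safe rule of thumb: if an element has visible text, either omit aria-label entirely or make it start with the same words the user can see.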


Making It Stick: Accessibility in CI

The most effective way to prevent regressions is to make accessibility part of the definition of "done":

```javascript
// jest-axe example for component testing
import { axe, toHaveNoViolations } from 'jest-axe'
import { render } from '@testing-library/react'
import ContactForm from './ContactForm'

expect.extend(toHaveNoViolations)

test('ContactForm has no accessibility violations', async () => {
  const { container } = render(<ContactForm />)
  const results = await axe(container)
  expect(results).toHaveNoViolations()
})
```

This won't catch everything — but it ensures that color contrast failures, missing labels, and invalid ARIA don't make it to production undetected.


The Practical Minimum for 2026

If you're starting from scratch or doing a quick audit, here's the minimum viable workflow:

  1. Run a site-wide automated scan to get a prioritized issue list
  2. Fix all Critical and Serious violations
  3. Do a 20-minute keyboard-only walkthrough of your main user flow
  4. Test with VoiceOver or NVDA on that same flow
  5. Add axe to your component test suite for ongoing coverage

Accessibility done well is not a separate work stream. It's the same discipline as writing semantic HTML, keeping markup clean, and building interfaces that behave predictably. The testing just makes the quality visible.


What's the most surprising accessibility issue you've found through manual testing that automated tools missed? I'm genuinely curious — share it in the comments.
