Assistive technology testing is conducted

ab-000132 · accessibility-wcag.robust.assistive-tech-testing
Severity: info · Status: active

Why it matters

Automated tools catch roughly 30–40% of WCAG failures. The remainder — focus management, ARIA live region behavior, reading order, screen reader announcement timing — only surface during testing with real assistive technology. Without documented screen reader testing, an app can pass all automated checks and still fail VoiceOver on Safari or NVDA on Chrome in ways that make core features unusable. For any product claiming WCAG compliance or seeking Section 508 contracts, AT testing evidence is a prerequisite.

Severity rationale

Info because the absence of AT testing is a process gap rather than a direct failure of any single WCAG criterion. It does mean, however, that the remaining 60–70% of potential violations go undetected, and any compliance claim cannot be substantiated.

Remediation

Create a documented AT test plan and record results in docs/accessibility-testing.md at the repo root.

# Accessibility Testing Log

## Test combinations
- NVDA 2024.1 + Chrome 123 (Windows)
- VoiceOver + Safari (macOS)
- VoiceOver + Safari (iOS)

## Tested routes
- [ ] / (homepage)
- [ ] /login
- [ ] /dashboard
- [ ] Main user workflow: create → review → submit

## Last tested
2026-04-01 — no critical issues found

Minimum: test the critical user path (sign up, core feature, billing) with VoiceOver + Safari and NVDA + Chrome. File tickets for any issues found. Re-test after major UI changes.

Detection

  • ID: accessibility-wcag.robust.assistive-tech-testing

  • Severity: info

  • What to look for: Documentation of testing with screen readers or voice control. Check whether VoiceOver (macOS), NVDA (Windows), JAWS, or similar tools have been used to test the application.

  • Pass criteria: Evidence that the application has been tested with at least one screen reader and browser combination (e.g., NVDA + Chrome or VoiceOver + Safari).

  • Fail criteria: No evidence of assistive technology testing.

  • Skip (N/A) when: Never — AT testing is recommended for all projects.

  • Detail on fail: Example: "No documentation of testing with screen readers. Only automated tools used"

  • Remediation: Add assistive technology testing to your QA process. Create a test plan file at docs/accessibility-testing.md (as shown above) or add to your project's existing testing documentation:

    1. Test with NVDA (Windows) + Chrome on src/app/ routes
    2. Test with VoiceOver (macOS) + Safari
    3. Test keyboard navigation on all pages
    4. Document findings and create tickets for issues found
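
The route checklist in the test plan template can also be machine-read, for example to remind reviewers which routes still lack AT coverage. A sketch assuming the `- [ ]` / `- [x]` checkbox convention used in the template; `untested_routes` is a hypothetical helper, not part of any tool named here.

```python
import re

def untested_routes(markdown: str) -> list[str]:
    """Return checklist items under '## Tested routes' still marked '- [ ]'."""
    # Capture the section between '## Tested routes' and the next heading (or EOF).
    m = re.search(r"## Tested routes\n(.*?)(?:\n## |\Z)", markdown, re.DOTALL)
    if not m:
        return []
    return re.findall(r"- \[ \] (.+)", m.group(1))

plan = """\
## Tested routes
- [x] / (homepage)
- [ ] /login
- [ ] /dashboard

## Last tested
2026-04-01
"""
print(untested_routes(plan))  # ['/login', '/dashboard']
```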
