From harness-claude
Tests web applications with screen readers (NVDA, VoiceOver, JAWS) to verify ARIA attributes, accessible navigation, announcements, focus management, forms, and dynamic content like modals and SPAs.
```
npx claudepluginhub intense-visions/harness-engineering --plugin harness-claude
```

This skill uses the workspace's default tool permissions.
| Screen Reader | Platform | Browser | Market Share |
|---|---|---|---|
| NVDA | Windows | Firefox, Chrome | ~40% |
| JAWS | Windows | Chrome, Edge | ~30% |
| VoiceOver | macOS/iOS | Safari | ~25% |
| TalkBack | Android | Chrome | ~5% |
| Narrator | Windows | Edge | ~3% |
At minimum, test with NVDA + Firefox and VoiceOver + Safari to cover the majority of screen reader users.
Enable/Disable: Cmd + F5
VoiceOver key (VO): Ctrl + Option
Navigate next: VO + Right Arrow
Navigate previous: VO + Left Arrow
Activate element: VO + Space
Read all: VO + A
Open rotor: VO + U (navigate landmarks, headings, links, forms)
Next heading: VO + Cmd + H
Next landmark: no direct shortcut; use the rotor (VO + U)
Enable: Ctrl + Alt + N
NVDA key: Insert (or Caps Lock if configured)
Navigate next: Down Arrow (in browse mode)
Navigate previous: Up Arrow
Activate element: Enter
Read all: NVDA + Down Arrow
Elements list: NVDA + F7 (headings, links, landmarks, form fields)
Next heading: H
Next landmark: D
Next form field: F
Toggle browse/focus: NVDA + Space
Testing checklist:
- Landmarks and structure
- Navigation (including the page `<title>`)
- Forms
- Dynamic content
- Images and media
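For the dynamic-content item above, an ARIA live region is the standard mechanism for making screen readers announce updates without moving focus. A minimal sketch; the helper name and message are illustrative, not part of this skill:

```typescript
// A polite live region: screen readers announce text inserted into it
// without interrupting the user or moving focus. role="status" already
// implies aria-live="polite"; setting both explicitly is a common
// belt-and-braces pattern. aria-atomic="true" reads the whole region,
// not just the changed text node.
function liveRegionMarkup(message: string): string {
  return `<div role="status" aria-live="polite" aria-atomic="true">${message}</div>`;
}

console.log(liveRegionMarkup("3 results found"));
```

Note that the region must exist in the DOM before the message is inserted; injecting a pre-filled live region is a common reason announcements are silently dropped.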
Test common user flows, not just individual elements. Complete a full task: sign up, make a purchase, change settings. Screen reader issues often emerge in transitions between pages and states.
Listen, do not just read the accessibility tree. Actually listen to the screen reader output. The accessibility tree shows what is exposed, but the spoken output reveals timing, ordering, and verbosity issues that the tree does not show.
Test in both the screen reader's browse (virtual) mode and focus (forms) mode. NVDA and JAWS have two modes: in browse mode the arrow keys navigate the page, while in focus mode keystrokes pass through to form controls. Custom widgets must work in both modes or correctly trigger mode switching.
Document screen reader bugs with reproduction steps. Include: screen reader + version, browser + version, OS, exact steps, expected announcement, actual announcement.
VoiceOver Rotor: Press VO + U to open the rotor — a navigation menu that lists all headings, links, landmarks, and form controls on the page. This is how screen reader users get an overview of page structure. If your headings and landmarks are poor, the rotor is useless.
Browse mode vs. focus mode: In browse mode, the screen reader intercepts keystrokes (H jumps to next heading, F jumps to next form field). In focus mode, keystrokes go to the web page (arrow keys navigate within a dropdown). Interactive widgets (comboboxes, grids) need role attributes that trigger automatic mode switching.
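As a sketch of the kind of markup that triggers that automatic switching, here is a hypothetical custom combobox following the ARIA combobox pattern (the id, option text, and helper name are placeholders):

```typescript
// role="combobox" is one of the roles NVDA and JAWS recognize as an
// interactive widget, so they switch to focus mode when the input is
// focused. The listbox popup is hidden until expanded, and
// aria-controls ties the input to it.
function comboboxMarkup(listboxId: string, expanded: boolean): string {
  return [
    `<input type="text" role="combobox" aria-expanded="${expanded}"`,
    `       aria-controls="${listboxId}" aria-autocomplete="list">`,
    `<ul id="${listboxId}" role="listbox"${expanded ? "" : " hidden"}>`,
    `  <li role="option">First option</li>`,
    `</ul>`,
  ].join("\n");
}
```

When testing such a widget, verify the announcement includes the role, the expanded/collapsed state, and the active option; a missing state usually points at `aria-expanded` not being updated.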
Common issues found during testing:
- Images missing alternative text (`alt`)
- Form fields without programmatically associated labels (`htmlFor` association)
- Modals that do not keep focus and reading order inside the dialog (`aria-modal` or `inert`)

Automated testing complements but does not replace manual testing. Automated tools catch roughly 30-40% of accessibility issues (missing alt text, missing labels, contrast violations). Manual screen reader testing catches the remaining 60-70% (poor announcements, confusing navigation, broken focus management).
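Broken focus management in modals is often just a boundary bug: Tab at the last focusable element escapes the dialog. A minimal, testable sketch of the wrap-around logic, assuming the dialog's focusable elements have been collected into an array (the function name is illustrative):

```typescript
// Given the index of the currently focused element inside a modal and
// the number of focusable elements, return where Tab (or Shift+Tab)
// should move focus so it wraps at the dialog's edges instead of
// escaping into the background page.
function nextFocusIndex(current: number, count: number, shiftKey: boolean): number {
  if (count === 0) return -1;                          // nothing focusable
  if (shiftKey) return current === 0 ? count - 1 : current - 1;
  return current === count - 1 ? 0 : current + 1;
}
```

In a real dialog this would run in a keydown handler for Tab, combined with `aria-modal="true"` (or `inert` on the background) so the screen reader's virtual cursor also stays inside the modal.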
https://www.w3.org/WAI/test-evaluate/