design · engineering

Accessibility Audit Report

Write structured accessibility audit reports with findings mapped to WCAG criteria, severity levels, affected components, remediation steps, and a prioritized fix timeline.

accessibility · a11y · WCAG · audit-report · remediation · compliance

Works well with agents

Accessibility Auditor Agent · Code Reviewer Agent · UX Researcher Agent

Works well with skills

Component Design Spec · Ticket Writing
$ npx skills add The-AI-Directory-Company/(…) --skill accessibility-audit-report
accessibility-audit-report/
  • signup-flow-audit.md (5.0 KB)
  • SKILL.md (6.7 KB)
SKILL.md

# Accessibility Audit Report

## Before you start

Gather the following from the user. If anything is missing, ask before proceeding:

1. **Scope** — Which pages, flows, or components are being audited? (e.g., "checkout flow", "marketing site", "design system")
2. **Target conformance level** — WCAG 2.1 AA (most common), WCAG 2.2 AA, or AAA?
3. **Testing methods used** — Automated tools (axe, Lighthouse), manual keyboard/screen reader testing, user testing with disabled participants?
4. **Tech stack** — Framework, component library, and any existing a11y tooling in the pipeline
5. **Audience context** — Is there a legal compliance deadline, a known user complaint, or is this a proactive audit?

If the user says "just audit everything," push back: "Which user flows are highest traffic or highest risk? Start there — a focused audit produces actionable results; a broad audit produces a backlog nobody reads."

## Audit report template

### 1. Executive Summary

3-5 sentences. State the overall conformance status, the number of issues found by severity, and the single most critical finding. This section is for stakeholders who will not read the full report.

```
This audit evaluated 12 screens across the checkout flow against WCAG 2.1 AA.
We identified 23 issues: 4 critical, 7 major, 9 minor, and 3 best-practice
recommendations. The most critical finding is that the payment form is entirely
inaccessible to keyboard-only users, blocking approximately 8% of users from
completing purchases.
```

### 2. Scope & Methodology

List what was tested, what was not, and how. Be explicit about tools and versions.

```
Tested: /cart, /checkout/shipping, /checkout/payment, /checkout/confirmation
Not tested: Account settings, admin dashboard
Methods: axe-core 4.9, NVDA 2024.1 + Chrome, VoiceOver + Safari, manual
keyboard navigation, color contrast analyzer
```

### 3. Summary of Findings

Provide a scannable table. Every issue in the report must appear here.

| # | Issue | WCAG Criterion | Severity | Component/Page |
|---|-------|---------------|----------|----------------|
| 1 | Payment form not keyboard accessible | 2.1.1 Keyboard | Critical | /checkout/payment |
| 2 | Images missing alt text | 1.1.1 Non-text Content | Major | /cart |
| 3 | Insufficient color contrast on helper text | 1.4.3 Contrast (Minimum) | Minor | Global |
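
Contrast findings like #3 are measurable, not judgment calls. The WCAG relative-luminance and contrast-ratio formulas are simple enough to embed in an audit script; a minimal sketch in Python (function names are ours, constants from the WCAG 2.x definition):

```python
def _linear(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light, per the WCAG definition."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(hex_color: str) -> float:
    """Relative luminance of a '#RRGGBB' color."""
    h = hex_color.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linear(r) + 0.7152 * _linear(g) + 0.0722 * _linear(b)

def contrast_ratio(a: str, b: str) -> float:
    """Contrast ratio between two colors, from 1:1 up to 21:1."""
    hi, lo = sorted((luminance(a), luminance(b)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

# 1.4.3 AA requires >= 4.5:1 for normal text, >= 3:1 for large text
print(round(contrast_ratio("#767676", "#ffffff"), 2))  # 4.54, barely passes AA
```

Recording the measured ratio in the finding ("3.1:1, needs 4.5:1") makes the remediation target unambiguous.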

### 4. Detailed Findings

Write one section per issue. Every finding must include all six fields below — no exceptions.

#### Finding #[N]: [Short description]

- **WCAG Criterion**: [Number + Name] (e.g., 2.1.1 Keyboard)
- **Severity**: Critical / Major / Minor / Best Practice
- **Affected Element**: [CSS selector, component name, or page URL]
- **Description**: What the problem is, who it affects, and what the user experiences. Be specific — "screen reader users cannot determine the purpose of this button," not "button is inaccessible."
- **Remediation**: Step-by-step fix. Include the specific ARIA attribute, HTML element, or CSS property needed.
- **Code Example**:

```html
<!-- Before (inaccessible) -->
<div onclick="submit()">Pay Now</div>

<!-- After (accessible) -->
<button type="submit">Pay Now</button>
```

Severity definitions:

| Severity | Meaning |
|----------|---------|
| **Critical** | Blocks a user group from completing a core task. Fix immediately. |
| **Major** | Causes significant difficulty but a workaround exists. Fix within the current sprint. |
| **Minor** | Causes inconvenience but does not block functionality. Fix within the quarter. |
| **Best Practice** | Not a WCAG violation but improves the experience. Schedule as capacity allows. |

### 5. Remediation Priority Matrix

Group findings by effort and impact to help the team sequence work.

| | Low Effort | High Effort |
|---|-----------|-------------|
| **High Impact** | Fix first (e.g., missing alt text, missing labels) | Plan next (e.g., rebuild inaccessible custom widget) |
| **Low Impact** | Quick wins (e.g., skip-to-content link) | Backlog (e.g., ARIA live region for non-critical notifications) |
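
The matrix is mechanical enough to encode if findings carry impact and effort ratings. A tiny sketch (bucket labels lifted from the table above; the function name is illustrative):

```python
# Map an (impact, effort) pair to the matrix quadrant's action.
QUADRANTS = {
    ("high", "low"): "fix first",
    ("high", "high"): "plan next",
    ("low", "low"): "quick win",
    ("low", "high"): "backlog",
}

def sequence(impact: str, effort: str) -> str:
    """Look up the remediation bucket for one finding."""
    return QUADRANTS[(impact.lower(), effort.lower())]

print(sequence("High", "Low"))  # fix first
```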

### 6. Recommended Timeline

Map findings to concrete timeframes tied to severity.

```
Week 1-2: All Critical findings (#1, #4)
Week 3-4: All Major findings (#2, #5, #7, #8, #10, #11, #12)
Week 5-8: Minor findings, prioritized by the matrix above
Ongoing: Best-practice recommendations integrated into the component library
```
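
If findings are tracked as structured data, this timeline can be generated rather than hand-written. A sketch under that assumption (the severity-to-week mapping copied from the sample above):

```python
from collections import defaultdict

# Week buckets per severity, mirroring the sample timeline above.
BUCKETS = {
    "Critical": "Week 1-2",
    "Major": "Week 3-4",
    "Minor": "Week 5-8",
    "Best Practice": "Ongoing",
}

def build_timeline(findings):
    """Group (number, severity) pairs into timeframe buckets by severity."""
    timeline = defaultdict(list)
    for number, severity in findings:
        timeline[BUCKETS[severity]].append(number)
    return dict(timeline)

print(build_timeline([(1, "Critical"), (2, "Major"), (3, "Minor"), (4, "Critical")]))
# {'Week 1-2': [1, 4], 'Week 3-4': [2], 'Week 5-8': [3]}
```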

Include a note on regression prevention: "Add axe-core to CI. Every new component must pass automated a11y checks before merge."
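
One way to wire that gate is to parse axe's JSON output and fail the build on severe violations. A sketch, not a drop-in: it assumes the standard axe-core results shape (a top-level `violations` list whose entries carry `id`, `impact`, and `nodes` fields).

```python
# Fail CI when axe-core reports serious or critical violations.
# Assumed input shape: {"violations": [{"id": ..., "impact": ..., "nodes": [...]}, ...]}
BLOCKING = {"serious", "critical"}

def gate(axe_results: dict) -> int:
    """Return an exit code: 1 if any blocking violation exists, else 0."""
    blocking = [v for v in axe_results.get("violations", [])
                if v.get("impact") in BLOCKING]
    for v in blocking:
        print(f"[a11y] {v['id']}: {v['impact']} ({len(v['nodes'])} affected nodes)")
    return 1 if blocking else 0

sample = {"violations": [{"id": "button-name", "impact": "critical", "nodes": [{}]}]}
print(gate(sample))  # 1
```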

## Quality checklist

Before delivering the report, verify:

- [ ] Every finding maps to a specific WCAG success criterion, not just a general principle
- [ ] Severity levels are consistent — same type of issue gets the same severity throughout the report
- [ ] Remediation steps are specific enough for a developer to implement without further research
- [ ] Code examples show before AND after, not just the correct version
- [ ] The summary table matches the detailed findings exactly — no orphaned or missing entries
- [ ] The executive summary includes a concrete number of issues and highlights the worst one
- [ ] The timeline is realistic given the severity distribution and team capacity
- [ ] Testing methodology is documented — another auditor could reproduce the findings
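
The table-matches-findings check is easy to get wrong by hand in a long report. A small script can verify it; this sketch assumes the summary-row and finding-heading formats shown in this template:

```python
import re

def cross_check(report_md: str) -> set[int]:
    """Return finding numbers that appear in only one of the two places:
    summary table rows ("| 3 | ...") or detailed headings
    ("#### Finding #3: ..."). An empty set means the report is consistent.
    """
    table = {int(m) for m in re.findall(r"^\|\s*(\d+)\s*\|", report_md, re.M)}
    detail = {int(m) for m in re.findall(r"^#### Finding #(\d+):", report_md, re.M)}
    return table ^ detail  # symmetric difference = orphaned entries

report = """\
| 1 | Missing alt text |
| 2 | Low contrast |

#### Finding #1: Missing alt text
"""
print(cross_check(report))  # {2}
```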

## Common mistakes to avoid

- **Listing the tool output as the report.** axe-core output is raw data, not a report. Every automated finding needs human interpretation: who is affected, how badly, and what to do about it.
- **Skipping the keyboard test.** Automated tools catch roughly 30% of accessibility issues. If you only ran axe and Lighthouse, say so — and note that keyboard and screen reader testing is still needed.
- **Vague remediation.** "Make this accessible" is not a remediation step. "Add `aria-label="Close dialog"` to the `<button>` element at `.modal-close`" is.
- **Inconsistent severity.** If missing alt text on a decorative image is "Critical" but missing alt text on a product image is "Minor," your severity framework is broken. Define it once, apply it uniformly.
- **No prioritization.** A flat list of 40 findings paralyzes teams. The priority matrix and timeline exist to prevent this — always include them.