axle · guide

Web accessibility audit — the practical guide

A web accessibility audit is the formal process of evaluating a website against WCAG 2.1 AA (and increasingly 2.2 AA) and documenting the findings in a form that regulators, legal teams, and engineering teams can act on. This guide covers what's in scope, what a credible audit looks like, what it typically costs, how to use axe-core pre-scans to reduce billable hours by 40-60%, and when to commission one.

When to commission an audit

Commission an audit when you need one of:

  • Regulatory disclosure — EAA 2025 requires a conformance statement with assessment methodology. A human audit is the credible version of “third-party evaluation”.
  • Demand-letter response — if you received an ADA Title III demand letter, a human audit is usually part of the settlement deliverable. See the first-48-hours playbook.
  • Procurement requirement — many enterprise buyers, especially in government (Section 508), healthcare, and education, require a VPAT (Voluntary Product Accessibility Template) with a dated third-party audit.
  • Major redesign or rebrand — post-launch audits catch issues before regulators or plaintiffs do.
  • Annual re-certification — continuous compliance programs typically fund one audit per year on top of continuous CI.

Scope — what's actually in

A well-scoped audit covers:

  • Representative page sample: 15-25 URLs covering homepage, key landing pages, product / service pages, checkout / conversion flows, account / profile, search, forms, and error states. A full-site crawl is rarely necessary; sampling covers ~85% of issues.
  • Multi-viewport evaluation: desktop (1280px+), tablet (768px), mobile (375px). WCAG 1.4.10 (Reflow) and WCAG 2.2's 2.5.8 (Target Size, Minimum) require mobile evaluation.
  • Assistive-tech testing: NVDA on Windows, VoiceOver on macOS / iOS, TalkBack on Android, and JAWS where the audience includes JAWS users (a large share of US enterprise), plus keyboard-only navigation testing.
  • Automated scan coverage: axe-core 4.x, optionally Lighthouse or Pa11y as a cross-check. Automated scanning catches ~57% of issues; human review covers the rest.
  • Colour / contrast analysis: body text, large text, UI components, focus indicators, overlaid text on imagery.
  • Document accessibility (if PDFs or Office files are consumer-facing): tagged-PDF review, alt text, heading order, reading order.
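The contrast analysis above follows WCAG 2.x's contrast-ratio formula, which is simple enough to compute directly. A minimal sketch (function names are illustrative, not from any particular library):

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance from 0-255 sRGB channels."""
    def linearize(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Ratio between 1:1 and 21:1; AA needs 4.5:1 for body text, 3:1 for large text and UI components."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible contrast: 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Automated scanners apply exactly this formula to rendered colours; the human part of the audit is judging overlaid text on imagery and gradients, where a single background colour doesn't exist.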

Deliverables — what you should get

A credible audit produces:

  1. Executive summary — conformance status (full / partial / non-conformant), high-level risk profile, business-impact framing.
  2. Per-URL findings — every violation tied to a specific WCAG success criterion, severity (critical / serious / moderate / minor), affected assistive tech, and reproduction steps.
  3. Prioritised remediation plan — realistic dates, effort estimates per finding (hours / story points), recommended sequencing.
  4. VPAT (where required) — formal conformance declaration in the ITI VPAT 2.x template, ready for procurement documentation.
  5. Screenshots / recordings of key issues — especially useful for internal stakeholders who aren't WCAG fluent.
  6. Re-audit scope — what will be rechecked after remediation and the expected timeline.

If the deliverable is a one-page PDF “certificate” or a listicle of generic issues with no success-criterion mapping, it's not a real audit. Walk away.

Cost — realistic ranges

Market ranges as of 2026, for a WCAG 2.1 AA audit:

  • Small site (5-10 page sample, marketing brochure): $3,000-$7,000.
  • Medium site / SaaS (15-25 page sample, auth flows, key product surfaces): $8,000-$18,000.
  • E-commerce or large SaaS (25-50 URLs, multi-role, checkout, multiple locales): $18,000-$40,000.
  • Enterprise / multi-product (multiple apps, VPATs, re-audits bundled): $40,000-$120,000+.

Regional variation is significant. US auditors skew higher; EU auditors (especially Nordic / Iberian) are often 30-50% less for comparable rigour. Overlay-vendor “audits” bundled with their widget are not independent and are not a substitute.

Pre-scan: cut billable hours with automation

Auditors charge by the hour. Every machine-detectable violation a human has to find, log, and write up is an hour you're paying a senior accessibility specialist to do clerical work. The most effective preparation:

  1. Run axle against every page in the audit sample. Output the JSON report.
  2. Hand the auditor the JSON plus the commit SHA that produced it. They now skip the ~57% of issues that are already documented and focus on the ~43% that require human judgement.
  3. Fix the critical and serious findings from the pre-scan before the audit starts. Your audit cost drops by 40-60% and the deliverable is actionable for the remaining issues.
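Once the pre-scan JSON is in hand, triaging it by severity takes only a few lines. A sketch, assuming the report follows axe-core's standard result shape (a top-level "violations" list with an "impact" per entry); the sample data is hypothetical:

```python
from collections import Counter

def triage(report):
    """Count violations by severity in an axe-core-style report dict."""
    return Counter(v["impact"] for v in report["violations"])

# Hypothetical pre-scan output, trimmed to the fields used here.
sample = {
    "violations": [
        {"id": "image-alt", "impact": "critical"},
        {"id": "color-contrast", "impact": "serious"},
        {"id": "region", "impact": "moderate"},
    ]
}
counts = triage(sample)
print(dict(counts))  # {'critical': 1, 'serious': 1, 'moderate': 1}
```

Sorting the fix queue by these counts is what lets you clear the critical and serious findings before the auditor's clock starts.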

Good auditors prefer this — it lets them focus where their expertise actually adds value (semantic correctness, heading structure, cognitive load, screen-reader experience quality). Auditors who resist pre-scans are billing for the clerical work; reconsider the engagement.

Audit template — what to ask for

A minimal audit statement-of-work should specify:

  • WCAG version and level (2.1 AA, 2.2 AA, or AAA if relevant).
  • Referenced technical standard: EN 301 549 (EU), Section 508 (US federal), or the WCAG spec directly.
  • URL sample list (attach explicitly).
  • Viewports to test.
  • Assistive technologies to test (list specific screen readers + versions).
  • Deliverable format (PDF + CSV + VPAT if applicable).
  • Severity taxonomy (map to critical / serious / moderate / minor or equivalent).
  • Re-audit inclusion scope and timeline.
  • Direct access to the auditor for engineering clarifying questions during remediation.
  • NDAs and data-handling expectations for any sensitive pages.

A VPAT is not the same as an audit; it's an output format. Ask for both if procurement requires VPAT.

Choosing an auditor

Credibility markers:

  • IAAP certification — CPACC (foundational) or WAS (specialist) — on the individual auditors, not just the firm.
  • Published work — conference talks, books, active public writing. Accessibility is a craft; active craftspeople do work in public.
  • Independence — not affiliated with an overlay-widget vendor. Independence is a dealbreaker; overlay-vendor “audits” have conflict of interest.
  • Reference engagements — specifically in your industry (fintech, healthcare, e-commerce, etc.) and regulatory context (EAA / ADA / Section 508 / Israel's Regulation 35, תקנה 35).
  • Tooling transparency — willing to name the automation they use (axe-core, Pa11y, Lighthouse) rather than a proprietary black box.

Notable firms (not endorsements, not exhaustive): Deque, TPGi, Level Access (US); Funka, AnySurfer, Hassell Inclusion, Useit (EU); Fundacja Widzialni, Integracja (Poland); Instituto Sonae, ACAPO (Portugal).

After the audit — remediation and CI

The audit report is a point-in-time snapshot. Without a CI gate, the fixes regress on next quarter's feature work and you pay for another audit. The sustainable model:

  1. Remediate the audit findings, prioritising critical / serious.
  2. Stand up CI accessibility scanning on every PR. Fail PRs on serious-severity regressions.
  3. Publish the accessibility statement with the audit date, methodology, and named contact.
  4. Budget for an annual re-audit — less expensive the second time because the regressions that accumulated between audit #1 and audit #2 were caught by CI.
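Step 2 above can be sketched as a small gate script run as a CI step. The report shape assumed is again axe-core's standard output; the file name and the choice of blocking severities are illustrative:

```python
import json
import sys

# Severities that should fail the PR; tune to your risk tolerance.
BLOCKING = {"critical", "serious"}

def gate(report):
    """Return the blocking violations from an axe-core-style report dict."""
    return [v for v in report["violations"] if v.get("impact") in BLOCKING]

if __name__ == "__main__" and len(sys.argv) > 1:
    # e.g. `python a11y_gate.py scan-results.json` after the scan step
    with open(sys.argv[1]) as f:
        blocking = gate(json.load(f))
    for v in blocking:
        print(f"{v['impact']}: {v['id']}")
    sys.exit(1 if blocking else 0)
```

A non-zero exit code fails the PR check, which is what keeps the remediated findings from regressing between annual audits.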