Now that you've learned the four rules of communicating results, you're probably wondering what you should talk about.
Let's start with what you shouldn't talk about:
- how many issues you've fixed
- how many WCAG violations you've eliminated
- what your automated test score is
- what percentage compliant you are
These are all vanity numbers. Activity metrics. Quantity doesn't necessarily mean impact. Stakeholders don't care about these numbers; they care about what the numbers mean for them. They care about progress.
Instead, swap activity metrics for outcome statements that answer at least one of two questions:
- How does this protect or reward the business or the team?
- What can users do now that they couldn't before?
Here are six metrics that matter and how to present them to your stakeholders.
1. User impact
- Bad: "Fixed 15 screen reader issues on the checkout page"
- Good: "Reduced checkout blockers from seven to two this quarter"
- Best: "Users with disabilities now complete purchases independently, potentially recovering X€/month in abandoned carts"
2. Process efficiency
- Bad: "Created new accessibility review checklist"
- Good: "Devs review code faster with the checklist in hand"
- Best: "Devs now catch 50% more accessibility bugs in PR reviews, reducing the need for hotfixes by 20%"
3. Technical health
- Bad: "Redesign of the form field component eliminated 12 duplicate issues"
- Good: "Improved form accessibility for screen reader users"
- Best: "Checkout completion rose 12% among screen reader users"
4. Business efficiency
- Bad: "Reduced the number of new accessibility tickets"
- Good: "Accessibility-related tickets are down 40% since last release"
- Best: "Cut assistive-tech-related support calls by X%, freeing up support staff and developer time"
5. Compliance
- Bad: "Resolved 8 WCAG AA failures"
- Good: "Level AA violations reduced from 12 to 4 according to scan reports"
- Best: "Addressed most of the high-risk violations avoiding X€ in potential fines"
6. Team efficiency
- Bad: "Added 25 new accessibility tests"
- Good: "New accessibility tests improve release code stability"
- Best: "New features ship with 60% fewer accessibility regressions, saving X engineering hours per month to rework and fix bugs"
All of these follow one golden rule: always link the change to time and money saved, or to user benefits delivered.
Let's call vanity metrics what they are. Noise. They don't tell stakeholders what changed for users or what's in it for the business and team.
Good metrics force you to answer why you spent time and money on the work and how you're better off now.
The result?
Decisions get easier, budgets get approved, and real impact happens.