In an earlier email about good and bad accessibility metrics, I argued that one way to tell them apart is to ask whether the chosen metric gives you early warnings about potential issues, or whether it merely confirms what you already know.
That email also sparked a discussion: what are some examples of accessibility metrics that give you early warnings?
Say no more!
Here are 8 examples of metrics that can give you early warnings that your website is becoming inaccessible.
- Automated testing results. Look at the number of WCAG violations detected and how those issues are distributed by severity (there's a sketch of this after the list).
- Keyboard navigation. For all the key user journeys on your website, how many can be successfully completed with just a keyboard? And how long, on average, does it take a tester to get through one? (See the Playwright sketch below.)
- Link text quality. What's the percentage of links with descriptive text vs generic link text like "click here" or "read more"? (This one shares a sketch with page titles below.)
- Page titles. How many pages have unique and descriptive titles?
- Readability score. Use something like the Flesch-Kincaid grade level to assess content complexity. You can automate this too (formula sketch below).
- Designer-developer handoff. How many inconsistencies are there between what the designer wanted and what the developer implemented? Think specifically of heading structure, colour contrast, tab order, touch target size and focus states.
- Accessibility score trend over releases. Do you notice an increase in the number of accessibility issues across product releases? (Sketch below.)
- Accessibility acceptance criteria. How many user stories or tickets have accessibility as part of their acceptance criteria?
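
To make a few of these concrete, here are some rough sketches. First, automated testing results: a minimal severity tally, assuming you run axe-core through Playwright via the @axe-core/playwright package, with example.com standing in for your site.

```ts
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('homepage: WCAG violations by severity', async ({ page }) => {
  await page.goto('https://example.com');

  // Run axe-core against the page, limited to WCAG 2.0 A/AA rules.
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa'])
    .analyze();

  // Tally violations by axe's impact levels:
  // minor, moderate, serious, critical.
  const bySeverity: Record<string, number> = {};
  for (const violation of results.violations) {
    const impact = violation.impact ?? 'unknown';
    bySeverity[impact] = (bySeverity[impact] ?? 0) + 1;
  }
  console.table(bySeverity);

  // Fail the build on critical issues; tune the threshold to taste.
  expect(results.violations.filter((v) => v.impact === 'critical')).toHaveLength(0);
});
```

Run this in CI and the severity table becomes a number you can watch over time instead of a one-off audit finding.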
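For keyboard navigation, you can script a journey with Playwright's keyboard API. The #site-search selector and the search journey here are made up; swap in one of your real user journeys. The press counter doubles as a rough proxy for how long the journey takes.

```ts
import { test, expect } from '@playwright/test';

test('search journey works with keyboard only', async ({ page }) => {
  await page.goto('https://example.com');

  // Tab until the (hypothetical) #site-search field gets focus.
  // The cap makes a broken tab order fail fast instead of looping forever.
  let presses = 0;
  let focusedId = '';
  while (presses < 25 && focusedId !== 'site-search') {
    await page.keyboard.press('Tab');
    focusedId = await page.evaluate(() => document.activeElement?.id ?? '');
    presses++;
  }
  expect(focusedId).toBe('site-search');
  console.log(`Reached the search field in ${presses} Tab presses`);

  // Complete the journey without touching the mouse.
  await page.keyboard.type('accessibility');
  await page.keyboard.press('Enter');
  await expect(page).toHaveURL(/search/);
});
```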
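Link text quality and page titles can share one crawl. Here's a sketch using the cheerio HTML parser on Node 18+ (for the global fetch); the URL list and the list of "generic" phrases are placeholders, so feed in your sitemap and your own offenders.

```ts
import * as cheerio from 'cheerio';

// Placeholder page list; in practice, read this from your sitemap.
const urls = ['https://example.com/', 'https://example.com/about'];
const GENERIC = /^(click here|read more|learn more|more|here)$/i;

async function audit(): Promise<void> {
  const titleToPages = new Map<string, string[]>();
  let totalLinks = 0;
  let genericLinks = 0;

  for (const url of urls) {
    const html = await (await fetch(url)).text();
    const $ = cheerio.load(html);

    // Page titles: group URLs by title so duplicates stand out.
    const title = $('title').text().trim();
    titleToPages.set(title, [...(titleToPages.get(title) ?? []), url]);

    // Link text: count links whose visible text is generic filler.
    $('a').each((_, el) => {
      totalLinks++;
      if (GENERIC.test($(el).text().trim())) genericLinks++;
    });
  }

  const pct = ((genericLinks / totalLinks) * 100).toFixed(1);
  console.log(`Generic link text: ${genericLinks}/${totalLinks} (${pct}%)`);
  for (const [title, pages] of titleToPages) {
    if (!title || pages.length > 1) {
      console.log(`Missing or duplicate title "${title}": ${pages.join(', ')}`);
    }
  }
}

audit();
```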
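The readability metric is just arithmetic once you can count sentences, words, and syllables. This sketch uses the published Flesch-Kincaid grade-level formula with a deliberately naive syllable heuristic (counting vowel groups); a dictionary-based counter would be more accurate.

```ts
// Rough syllable count: number of vowel groups, minimum one.
function syllables(word: string): number {
  const groups = word.toLowerCase().match(/[aeiouy]+/g);
  return Math.max(1, groups?.length ?? 1);
}

// Flesch-Kincaid grade level:
// 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
function fleschKincaidGrade(text: string): number {
  const sentences = Math.max(1, (text.match(/[.!?]+/g) ?? []).length);
  const words = text.split(/\s+/).filter(Boolean);
  const syllableTotal = words.reduce((sum, w) => sum + syllables(w), 0);
  return 0.39 * (words.length / sentences) + 11.8 * (syllableTotal / words.length) - 15.59;
}

console.log(fleschKincaidGrade('Use short sentences. Prefer familiar words.').toFixed(1));
```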
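And for the trend across releases, the simplest version is to have CI write the violation count from your automated run into a small JSON file per release, then diff them. The reports/ layout here is an assumption; adapt it to wherever your pipeline stores artifacts.

```ts
import { readFileSync, readdirSync } from 'node:fs';

// Assumes CI writes one file per release, e.g.
// reports/v1.4.0.json -> { "release": "v1.4.0", "violations": 12 }
const files = readdirSync('reports').sort();
let previous: number | undefined;

for (const file of files) {
  const { release, violations } = JSON.parse(
    readFileSync(`reports/${file}`, 'utf8'),
  );
  const delta =
    previous === undefined
      ? ''
      : ` (${violations - previous >= 0 ? '+' : ''}${violations - previous})`;
  console.log(`${release}: ${violations} violations${delta}`);
  previous = violations;
}
```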
Some of these are more complex to track, like the accessibility score across releases. Others are as simple as a "Find in folder" search in the developer's editor. And others require a bit of upfront work, but then you reap the benefits for basically the life of the product.
And you don't have to track them all, or even any of these. It's really up to you and your team to decide what you need to move the needle in the right direction.