5 things automated tests miss that break real experiences

Your automated tests say everything's fine. But real users are stuck.

Here are five things that slip through:

1. Keyboard traps.

Automated tests check if elements are focusable. They don't check if you can actually leave them. Users get trapped in navigation menus with no way out except refreshing the page.
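A rough sketch of the escape hatch (the element IDs here are invented):

```ts
// Sketch only: a menu overlay that you can actually leave again.
const menu = document.querySelector<HTMLElement>("#menu")!;
const menuButton = document.querySelector<HTMLButtonElement>("#menu-button")!;

function openMenu(): void {
  menu.hidden = false;
  // Move focus in so keyboard users land inside the menu.
  menu.querySelector<HTMLElement>("a, button")?.focus();
}

function closeMenu(): void {
  menu.hidden = true;
  menuButton.focus(); // hand focus back instead of stranding it
}

menuButton.addEventListener("click", openMenu);

// The part automated checks never press: Escape to get out.
menu.addEventListener("keydown", (event) => {
  if (event.key === "Escape") closeMenu();
});
```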

2. Invisible focus indicators.

Your button's focus indicator passes the 3:1 contrast check. Except on your gradient background. No one can actually see it there. Tests measure pixels. They don't measure "can I tell where I am?"
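One common fix, sketched here as a constructable stylesheet (plain CSS works the same): a two-tone focus ring, so at least one layer contrasts with whatever sits behind it, gradients included.

```ts
// Sketch only: white inner ring, dark outer ring, applied to keyboard focus.
const focusRing = new CSSStyleSheet();
focusRing.replaceSync(`
  :focus-visible {
    outline: 3px solid #fff;       /* shows up on dark backgrounds */
    outline-offset: 0;
    box-shadow: 0 0 0 6px #1d1d1d; /* shows up on light backgrounds */
  }
`);
document.adoptedStyleSheets = [...document.adoptedStyleSheets, focusRing];
```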

3. Technically correct ARIA labels.

A screen reader announces "button, button, button, link, button" because every icon button is labelled "button." It's valid code, but it's useless.
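The gap between valid and useful is one string. A sketch with a made-up save button:

```ts
// Sketch only: an icon-only button (the inline SVG is just a stand-in shape).
const saveButton = document.createElement("button");
saveButton.innerHTML =
  '<svg viewBox="0 0 16 16" aria-hidden="true" focusable="false"><rect width="16" height="16"/></svg>';

// Passes every validator, announced as "button, button":
saveButton.setAttribute("aria-label", "button");

// Same attribute, announced as "Save draft, button":
saveButton.setAttribute("aria-label", "Save draft");

document.body.append(saveButton);
```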

4. Reading order.

Your CSS makes things look right visually. But screen readers follow the HTML order, and yours puts the sidebar first, then the header, then the main content. Tests don't navigate the page like humans do, so only humans notice.
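Roughly what that mismatch looks like (class names made up): CSS can move the boxes on screen, but a screen reader still reads the source.

```ts
// Sketch only: the source order assistive tech follows, whatever the CSS does.
document.body.innerHTML = `
  <div class="layout">
    <aside>Sidebar</aside>    <!-- announced first  -->
    <header>Header</header>   <!-- announced second -->
    <main>Main content</main> <!-- announced last   -->
  </div>
`;

// Something like ".layout { display: flex } aside { order: 3 }" fixes the
// visuals only. The fix that helps everyone is reordering the HTML itself:
// header, then main, then aside.
```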

5. Gesture-only interactions.

Drag-to-reorder passes automated checks because the element exists. But there's no keyboard alternative. Mouse and touch users are fine. Everyone else is stuck.
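A rough keyboard alternative, assuming a made-up list with the id "queue" whose items have tabindex="0": arrow keys move the focused item.

```ts
// Sketch only: ArrowUp/ArrowDown reorder list items without a pointer.
const queue = document.querySelector<HTMLUListElement>("#queue");

queue?.addEventListener("keydown", (event) => {
  const item = (event.target as HTMLElement).closest("li");
  if (!item) return;

  if (event.key === "ArrowUp" && item.previousElementSibling) {
    item.previousElementSibling.before(item); // move up one place
  } else if (event.key === "ArrowDown" && item.nextElementSibling) {
    item.nextElementSibling.after(item); // move down one place
  } else {
    return;
  }

  event.preventDefault();
  item.focus(); // keep focus on the item that just moved
});
```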

Automated tests catch the obvious stuff. Real testing catches what actually breaks for people.
