Real talk: A/B testing obsession

2-minute read

I remember reading somewhere that Google tested about 40 shades of blue for its search result links. I thought that was amazing! The lengths some will go to to make sure they squeeze every bit of revenue from a website.

Of course, links are only as good as the mouse that clicks on them. Or the keyboard. But only if the link works with a keyboard.

And that's where my gripe starts.

We have marketing teams that will split-test 40 shades of blue to squeeze out a conversion lift of a fraction of a percent, while completely ignoring whether their buttons work for keyboard users at all.

I've also seen marketing push tests for "high-impact" headlines without using semantic heading elements in those headlines. Or caring all that much about colour contrast ratios.
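That contrast point isn't a matter of taste; it's mechanically checkable. Here's a minimal sketch of the WCAG 2.x contrast-ratio calculation (relative luminance of each colour, then the ratio of lighter to darker), assuming sRGB colours given as 0–255 tuples:

```python
def _channel(c: int) -> float:
    """Linearise one sRGB channel (0-255) per the WCAG 2.x definition."""
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """Weighted sum of the linearised red, green and blue channels."""
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(a: tuple[int, int, int], b: tuple[int, int, int]) -> float:
    """Contrast ratio between two colours; WCAG AA wants >= 4.5:1 for body text."""
    la, lb = relative_luminance(a), relative_luminance(b)
    lighter, darker = max(la, lb), min(la, lb)
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# A fashionable light grey (#999999) on white fails AA for body text.
print(contrast_ratio((153, 153, 153), (255, 255, 255)) < 4.5)  # True
```

Ten lines of arithmetic, and it would catch the failing headline variant before it ever reaches a split test.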

The irony is we're hyper-optimising for an imaginary average user. Someone with perfect vision, steady hands, a mouse in hand and endless patience for inaccessible pop-ups. And we ignore real people who can't even navigate the site.

I ask myself what's the point of a "high-converting" call-to-action (CTA) if 25% of our audience can't click it? Why obsess over headline variants when the contrast ratios are well below what you'd call acceptable?

A/B testing isn't growth hacking if we're shrinking our own audience.

We'd best stop pretending tweaking pixels is "data-driven" if we ignore the data that matters: how many users are we locking out?

Accessibility first. A/B test shades of blue later.

Prioritise properly.

Either that or just admit we'd rather lose customers than do the work.


Did you enjoy this bite-sized message?

I send out short emails like this every day to help you gain a fresh perspective on accessibility and understand it without the jargon, so you can build more robust products that everyone can use, including people with disabilities.

You can unsubscribe in one click and I will never share your email address.