"How about asking ChatGPT to make it accessible?"

I've heard this one and couldn't help but snicker.
As if slapping "make it accessible" into your AI prompt will magically summon perfectly labelled buttons, semantic structure, and keyboard navigation.
Listen up!
AI doesn't understand accessibility. It mimics it. It looks at what you want, looks at what it has seen before, and predicts the most likely thing that'll fit. Sometimes it does a good job. Sometimes it sucks.
Give it a mundane problem with a clear prompt and instructions, and chances are you'll get something quite usable out the other end. Something you'll still have to massage a little, but workable.
The minute. No. The second you give it a more complex accessibility issue, it'll spit out an abomination and sound confident while doing it.
And if you don't know what to look for, how to test it, how to massage it afterwards, then you'll just shrug and ship it because the robot confidently said it's fine.
Adding "and it should be accessible" to the prompt won't make it accessible if it's been trained on code that reeked of accessibility violations. The AI just gives you the same old garbage. But now it comes with a side of false confidence.
Accessibility isn't a sprinkle of prompt seasoning. It's work. Real, human, thinking work.
So is there no use case for AI in accessibility?
I think there is. And I'll end on this happy note.
I think AI is really good at catching silly stupid bugs that our tired brains and worn-out eyes easily overlook. But that's where it stops. It can't handle more complex problems, and certainly not patterns it hasn't encountered before, without lots of guidance from a human. As long as we humans accept that, I don't see a problem.
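To put a finer point on "silly stupid bugs": they're the mechanical kind. As a sketch (a hypothetical checker, not a real tool), even a few lines of parsing catch a missing alt text or a clickable div with no role — which is exactly why AI handles this class well and the deeper, contextual problems badly.

```python
from html.parser import HTMLParser

class SillyBugFinder(HTMLParser):
    """Hypothetical sketch: flags the mechanical accessibility slips
    that tired eyes skim past. Not a substitute for real testing."""

    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # An image with no alt text is invisible to screen readers.
        if tag == "img" and "alt" not in attrs:
            self.issues.append("<img> missing alt text")
        # A clickable <div> without a button role has no keyboard
        # support and no semantics — the classic AI-garbage pattern.
        if tag == "div" and "onclick" in attrs and attrs.get("role") != "button":
            self.issues.append("clickable <div> with no button role")

finder = SillyBugFinder()
finder.feed('<div onclick="save()">Save</div><img src="logo.png">')
print(finder.issues)
# → ['clickable <div> with no button role', '<img> missing alt text']
```

Mechanical checks like these are table stakes; the judgment calls — does the label make sense, does focus land somewhere sane, does the pattern work at all — are still on us.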