Within information technology, there is a digital accessibility industry. Its consultants, companies, nonprofits, trade associations, and government agencies know how to make websites, mobile apps, PDF files, email messages, and kiosks accessible—that is, functional for a wide range of users in a wide range of situations, especially atypical ones arising from disabilities.
Almost invariably, these accessibility experts have websites. If you wanted to find an accessible website, where would you look? Obviously, you would choose the website of an accessibility expert. It would be an exemplar of accessibility excellence.
That conjecture, though reasonable, would be a mistake. It has previously been reported that many diversity-and-inclusion podcasts are inaccessible. It would be even more ironic if organizations in the accessibility industry themselves had websites with impaired accessibility. Yet, according to the results of a battery of tests, that is exactly the case.
An automated accessibility testing procedure (version 7 of a11y in Autotest) was executed on the home pages of 80 digital-accessibility experts. The websites were assembled mainly from lists published by the International Association of Accessibility Professionals and Raghavendra Satish Peri.
The procedure generated a score for each page. The lower the score, the better. A score of 0 would indicate that a page passed all the tests.
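The scoring idea can be illustrated with a small sketch. The function, page names, and numbers below are invented for illustration; the actual Autotest procedure combines its 427 tests in a more elaborate way.

```python
# Hypothetical sketch of per-page score aggregation: each test
# contributes its deficiency count, so a page that passes every
# test scores 0, and lower is better.

def page_score(test_results):
    """Sum deficiency counts across all tests; 0 means every test passed."""
    return sum(test_results.values())

# Made-up results for two illustrative pages
results = {
    "example-consultant.com": {"axe": 4, "equal_access": 2, "wave": 1},
    "example-nonprofit.org": {"axe": 0, "equal_access": 0, "wave": 0},
}

scores = {page: page_score(r) for page, r in results.items()}
print(scores)  # a 0 indicates a clean pass on every test
```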
The tests were conducted from October 2021 to January 2022.
In the table below:
- Each name in the Page column is a link to the page that was tested.
- Each number in the Score column is a link to a detailed report.
The table above shows that none of the 80 pages got a perfect score of zero. Why?
Of course, the tests might be to blame. All tests are fallible. Different tests would produce different results. There are disagreements on exactly what makes a web page accessible. And some of these tests produce recommendations or alerts, not claims of deficiency. So a high score does not constitute proof of inaccessibility.
A high score does, however, justify concern and investigation. Most of the 427 accessibility tests performed by the procedure belong to widely used test packages (Axe, Equal Access, and WAVE) developed by industry leaders. Only 13 of the 80 pages achieved a perfect (zero) score on even one of those packages.
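The per-package tally can be sketched as follows. The page names and numbers are fabricated for illustration; the real data set covers 80 pages and three packages.

```python
# Hypothetical sketch: given per-package scores for each page, count how
# many pages achieved a perfect (zero) score on at least one package.

package_scores = {
    "example-a.com": {"axe": 0, "equal_access": 3, "wave": 1},
    "example-b.org": {"axe": 2, "equal_access": 5, "wave": 4},
    "example-c.net": {"axe": 0, "equal_access": 0, "wave": 0},
}

perfect_on_some_package = [
    page for page, scores in package_scores.items()
    if any(score == 0 for score in scores.values())
]
print(len(perfect_on_some_package))  # → 2
```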
The pattern here reminds me of George Bernard Shaw’s maxim, “He who can, does. He who cannot, teaches.” Many consultants claim they can make your website accessible, but you could reasonably ask them to demonstrate that competence on their own websites before trusting them with yours. If they claim their sites are accessible and some of the tests are invalid, there may be good reasons; ask what they are.