Verifying the Disability Equality Index
Accessibility scores for 272 web pages of DEI winners
Summary
A new analysis reveals wide variation in conformance to web-accessibility standards among organizations that have earned high scores on the Disability Equality Index (DEI). That is not a surprise, since the DEI is based on self-reporting by organizations, not on measured performance.
Introduction
The Disability Equality Index (DEI) is a measure of disability inclusion and equality in organizations in the United States with 500 or more employees.
Organizations that choose to participate in the DEI complete a questionnaire and, on the basis of their answers, receive scores. An organization that scores 80% or higher has its score published and is considered a DEI winner.
Some of the questions in the questionnaire deal with digital accessibility: the design and implementation of websites and other digital content to avoid barriers to use. Digital accessibility is covered by questions 3 through 7 of the 7 weighted questions in the Enterprise-wide access section of the DEI questionnaire, and that section contributes 10 points toward the total of 100 points. For example, question 4c asks the organization: "Please estimate the percentage of your entire externally facing digital products that are accessible."
Accessible content is essential for many people with disabilities. It also benefits users dealing with challenges such as poor lighting, small screens, and moving vehicles. Given this connection, do DEI winners actually have accessible websites?
Findings
In August 2022 I used the software packages Testaro and Testilo to measure the accessibility of the home pages of all 272 of the DEI winners for 2021.
The 1230 tests performed on those pages were created by Deque, IBM, Utah State University, Level Access, Siteimprove, Squiz Labs, Tenon, and individual contributors. Human testing was not performed, so the tests could not detect every accessibility problem; even so, they identified thousands of problems.
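Testaro orchestrates many test engines; the sketch below is not Testaro's actual invocation, but it illustrates the kind of automated check involved by running one of those engines, Deque's axe-core, against a single page through Puppeteer. The URL is a placeholder.

```ts
// Minimal sketch: run Deque's axe-core (one of the engines Testaro
// orchestrates) against a single page with Puppeteer.
import puppeteer from 'puppeteer';
import { AxePuppeteer } from '@axe-core/puppeteer';

async function auditPage(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const results = await new AxePuppeteer(page).analyze();
  // Each violation names a rule, its impact, and the offending elements.
  for (const violation of results.violations) {
    console.log(`${violation.id} (${violation.impact}): ${violation.nodes.length} element(s)`);
  }
  await browser.close();
}

auditPage('https://example.com').catch(console.error);
```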
The results are shown in the following table.
Test results
In this table, the lower the score, the better. A score of 0 would indicate that a page passed all the tests. A perfect score is unlikely, though, because even suspected accessibility problems add small amounts to a score.
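Testilo performs the score computation; its actual procedure and weights are more elaborate, but the idea can be sketched with hypothetical weights, where a confirmed defect adds full weight and a merely suspected one adds a smaller amount:

```ts
// Hypothetical illustration only; Testilo's real procedure and weights differ.
// A "suspected" issue is one a test flags for human review rather than confirms.
type Issue = { count: number; suspected: boolean };

const CONFIRMED_WEIGHT = 4; // hypothetical weight per confirmed defect
const SUSPECTED_WEIGHT = 1; // hypothetical smaller weight per suspected defect

function pageScore(issues: Issue[]): number {
  return issues.reduce(
    (total, { count, suspected }) =>
      total + count * (suspected ? SUSPECTED_WEIGHT : CONFIRMED_WEIGHT),
    0
  );
}
```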
The first link on each row names an organization and its DEI score. That link will take you to the page that was tested. The second link gives the page’s accessibility score. That link goes to a detailed report explaining the page’s discovered or suspected defects and how the accessibility score was computed.
The range of accessibility scores is too large for proportionally sized bars to be practical. So the bars in the table are sized in proportion to the square roots of the accessibility scores, taken as fractions of the largest accessibility score. If one bar is 2 times as long as another, the first score is 4 times as large (i.e., as bad) as the other.
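Concretely, the sizing rule amounts to a one-line formula (a sketch; the function name and pixel widths are illustrative):

```ts
// Square-root bar sizing: width is proportional to sqrt(score / maxScore),
// so a bar twice as long represents a score four times as large.
function barWidth(score: number, maxScore: number, maxWidthPx: number): number {
  return maxWidthPx * Math.sqrt(score / maxScore);
}

// Example: with a largest score of 1600 and a 200px maximum width,
// a score of 400 yields a 100px bar and a score of 1600 a 200px bar:
// a 2:1 length ratio for a 4:1 score ratio.
```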
Comments
Not a single home page was found that could not be made more accessible. For example, the page with the best accessibility score, belonging to Fifth Third Bank, contains several embedded web pages (in iframe elements). Each such embedded page needs to declare the language its content is written in, so that blind and other users listening to the page with a screen reader hear the correct pronunciation. But these embedded pages do not declare their language.
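A check of this kind is easy to script. Here is a minimal sketch, using Puppeteer, that lists the frames of a page whose documents declare no language; the URL is a placeholder.

```ts
import puppeteer from 'puppeteer';

// Sketch: report every frame on a page whose document lacks a lang declaration.
async function findFramesWithoutLang(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  for (const frame of page.frames()) {
    // Read the lang attribute of the frame's root <html> element.
    const lang = await frame.evaluate(
      () => document.documentElement.getAttribute('lang')
    );
    if (!lang) {
      console.log(`No lang declared: ${frame.url() || '(inline frame)'}`);
    }
  }
  await browser.close();
}

findFramesWithoutLang('https://example.com').catch(console.error);
```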
Moreover, among the DEI winners, the DEI score is not a good predictor of the accessibility score. The 80%, 90%, and 100% winners are found throughout the table. In fact, the 90% winners have the best median accessibility score. The medians (computed as in the sketch after this list) are:
- 2688 among the 80% DEI winners
- 1567 among the 90% DEI winners
- 2242 among the 100% DEI winners
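Here is a minimal sketch of that median computation, assuming the results are available as records pairing each page's DEI tier with its accessibility score (the record shape is hypothetical):

```ts
// Sketch: group accessibility scores by DEI tier (80, 90, or 100)
// and compute the median score within each tier.
type Result = { deiTier: number; accessibilityScore: number }; // hypothetical shape

function median(values: number[]): number {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

function mediansByTier(results: Result[]): Map<number, number> {
  const tiers = new Map<number, number[]>();
  for (const { deiTier, accessibilityScore } of results) {
    const group = tiers.get(deiTier) ?? [];
    group.push(accessibilityScore);
    tiers.set(deiTier, group);
  }
  const medians = new Map<number, number>();
  for (const [tier, scores] of tiers) {
    medians.set(tier, median(scores));
  }
  return medians;
}
```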
This weak association makes sense, given that:
- Digital accessibility constitutes only about 7 points out of the 100-point DEI score.
- The DEI score is based on self-reports, not measured performance.
Conclusions
This analysis discovered that the DEI winners publish web content with accessibility defects. Automated tests like these are not perfect and not complete, but they can discover and diagnose many accessibility defects at trivial cost.
DEI scores are not accurate predictors of digital accessibility. We therefore have reason to question whether organizations should reap reputational benefits from their DEI scores. Inaccessible websites create substantial risk of litigation, loss of reputation, and loss of market share.
DEI scores are based on what organizations claim. If they claim digital accessibility, they earn DEI points. But it is feasible to subject their public websites to accessibility testing and award points for real performance. So, why not do that, instead of believing the claims?