Karl Groves

Data Mining Accessibility

Pages tested after deployment have more than twice as many accessibility issues as pages tested before deployment. This talk presents the data-driven case for shifting accessibility testing left.

#1 about 4 minutes

How large-scale data was collected for accessibility research

The research methodology involved analyzing 14.6 million errors across 6 million URLs, a dataset large enough to support statistically meaningful conclusions.
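
The talk describes the pipeline's scale rather than its code. As a rough sketch of what collecting per-category error counts at that scale might look like, assuming a hypothetical runAutomatedChecks function in place of the actual checker:

```typescript
// Rough sketch of a large-scale collection pipeline, not the talk's code.
// runAutomatedChecks is a hypothetical stand-in for the actual checker used.

interface A11yError {
  url: string;
  category: string; // e.g. "missing-alt-text", "navigation"
}

async function runAutomatedChecks(url: string): Promise<A11yError[]> {
  // Placeholder: a real pipeline would fetch the page here and run an
  // automated checker such as axe-core against it.
  return [];
}

// Aggregate per-category error counts across a list of URLs.
async function collectErrorCounts(urls: string[]): Promise<Map<string, number>> {
  const counts = new Map<string, number>();
  for (const url of urls) {
    for (const error of await runAutomatedChecks(url)) {
      counts.set(error.category, (counts.get(error.category) ?? 0) + 1);
    }
  }
  return counts;
}
```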

#2 about 8 minutes

Common accessibility issues found through automated testing

Automated analysis reveals that 90% of issues fall into five categories, with missing alt text and navigation problems being the most frequent.
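
The talk reports the categories rather than the detection logic. As a minimal illustration (not the tool used in the research), flagging the most frequent issue can start with a simple DOM query:

```typescript
// Minimal illustration of an automated check for the most frequent issue:
// <img> elements with no alt attribute. An empty alt="" is valid for
// decorative images, so only a missing attribute is flagged here.
function findImagesMissingAlt(root: Document | Element): HTMLImageElement[] {
  return Array.from(root.querySelectorAll<HTMLImageElement>('img:not([alt])'));
}

// Usage, e.g. in a browser console:
// findImagesMissingAlt(document).forEach(img => console.warn('missing alt:', img.src));
```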

#3 about 9 minutes

Analyzing the results of manual accessibility audits

Manual audits of representative samples show that keyboard accessibility and color contrast are the most frequent issues, with nearly half of all findings rated high severity.
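
Color contrast, one of the top manual findings, has an exact definition in WCAG 2.x. The sketch below implements that published contrast-ratio formula; it is an illustration, not code from the talk:

```typescript
// WCAG 2.x contrast ratio between two sRGB colors, each given as [r, g, b]
// with channel values 0-255. WCAG AA requires at least 4.5:1 for normal text.

function relativeLuminance([r, g, b]: [number, number, number]): number {
  const linearize = (channel: number): number => {
    const c = channel / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  };
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

function contrastRatio(fg: [number, number, number], bg: [number, number, number]): number {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: #777777 text on a white background just fails AA for normal text.
console.log(contrastRatio([119, 119, 119], [255, 255, 255]).toFixed(2)); // ~4.48
```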

#4 about 8 minutes

The strong correlation between automated and manual testing

The data show substantial overlap between automated and manual test findings: seven of the top ten failing success criteria are identical across the two methods.

#5 about 3 minutes

Why testing before deployment is twice as effective

Pages tested before deployment have less than half as many issues as those tested after, strong evidence that a reactive audit-and-fix cycle is ineffective.

#6 about 8 minutes

Applying extreme programming principles to accessibility

Sustainable accessibility is achieved by integrating practices like early automation, specific acceptance tests, pair programming, and treating accessibility as a core quality problem.
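
The talk argues for acceptance tests without prescribing a framework. One common way to express such a test, sketched here with the jest-axe library and a hypothetical signup form:

```typescript
// Sketch of an accessibility acceptance test using jest-axe (one possible
// tool; the talk does not prescribe a framework). Requires jest with a
// jsdom environment.
import { axe, toHaveNoViolations } from 'jest-axe';

expect.extend(toHaveNoViolations);

test('signup form has no detectable accessibility violations', async () => {
  document.body.innerHTML = `
    <form>
      <label for="email">Email</label>
      <input id="email" type="email" />
      <button type="submit">Sign up</button>
    </form>
  `;
  expect(await axe(document.body)).toHaveNoViolations();
});
```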

#7 about 17 minutes

Q&A on developer tooling and testing best practices

The discussion covers developer resistance to in-IDE linting, the causes of false positives in tools, and the need for a layered testing strategy.
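
As a sketch of one layer in such a strategy, a CI-level end-to-end scan using @axe-core/playwright might look like the following; the URL and test are placeholders, not examples from the talk. Faster, earlier layers (in-IDE lint rules, unit-level tests) would run before this one.

```typescript
// One layer of a layered testing strategy: an end-to-end accessibility scan
// run in CI with @axe-core/playwright.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('home page has no detectable accessibility violations', async ({ page }) => {
  await page.goto('http://localhost:3000/'); // placeholder URL
  const results = await new AxeBuilder({ page }).analyze();
  expect(results.violations).toEqual([]);
});
```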
