Seduced by automated testing?

There's a wee bit of this year's Better Connected that escaped my attention on first reading, but happily an item in Headstar's very worthy E-access Bulletin this week led me back to the report. It concerns the disparity between the WAI conformance claims made on councils' websites and their real level of conformance.

Of the 296 sites in the transactional (T) and content plus (C+) categories that claimed a particular level of conformance, only 69 - just 23% - were found to achieve that level in reality. There are really only two explanations for such an alarming disparity: either the councils in question are deliberately overstating their conformance level, or, more likely in my opinion, they are being led to believe that their sites achieve a higher conformance level than they really do.

Better Connected suggests that the culprit might be automated testing:

There is no doubt that achieving Level A is hard work and that measuring it is a complex business. Many might also be lulled into thinking that passing the automated tests of Level A (and Level AA and AAA) means that you have achieved conformance at those levels.

If this is true, it's fair to say that the use of automated testing is effectively damaging the accessibility of the sites in question rather than improving it. Given the gravity afforded to the SiteMorse league tables in some quarters, it's easy to understand why councils might be seduced into developing and measuring their sites using the company's tool alone. But as has been said before, the number of WAI guidelines that can be reliably tested with automated software is very small indeed, and the only way to really know whether your site is accessible is to have people use it - preferably disabled people using a range of assistive technologies.
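To make that concrete, here's a minimal sketch - in Python, and not the code of any real testing tool - of the kind of check automated software can reliably perform. WCAG 1.0 checkpoint 1.1 requires a text equivalent for every image; a machine can tell you whether an alt attribute exists, but it has no way of knowing whether the text inside it is actually meaningful:

from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Flag any img element that has no alt attribute at all."""
    def __init__(self):
        super().__init__()
        self.failures = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.failures.append(dict(attrs).get("src", "unknown image"))

# Three hypothetical snippets. Only the first is genuinely accessible,
# but the automated check passes the second one just as happily.
snippets = {
    "meaningful alt": '<img src="chart.png" alt="Sales rose 40% between 2004 and 2005">',
    "useless alt": '<img src="chart.png" alt="chart.png">',
    "missing alt": '<img src="chart.png">',
}

for label, html in snippets.items():
    checker = AltChecker()
    checker.feed(html)
    print(label, "->", "FAIL" if checker.failures else "PASS")

Run it and the "useless alt" example passes along with the meaningful one; only a human looking at the page can tell the difference, which is exactly the gap between claimed and real conformance that the report describes.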

An analogy I like to use with non-technical, non-web managers is that of a car's MOT test (for non-UK readers, the MOT is a comprehensive safety test that cars have to pass every year). A full accessibility audit is like an MOT test: it delves into aspects of your site's performance and accessibility that you can't reach yourself, and that you really aren't qualified to judge. An automated test, on the other hand, is like emailing a photograph of your car, with a note of the make, model and year, to a bloke who knows a bit about cars, and having him judge on that evidence alone whether your car is road-worthy and safe to travel in. Which car would you rather your passengers travelled in?

PS: I know this is a familiar refrain, and I know that I bang on about it all the time, but I've been convinced of the value of repetition by Jeff Atwood.
