Better Connected & web accessibility

SOCITM's Better Connected 2007 is published next week. Almost a year ago to the day I posted somewhat critically about the report's use of SiteMorse, and its reliance on automated testing for some of its findings. This year I've become rather more personally interested in the report - I was disappointed to learn earlier this week that ClacksWeb is not one of the two sites found to conform to WCAG level AA.

I'll cut to the chase. BC's assessment of the accessibility of local authority websites is fundamentally flawed. Admittedly this is a reflection of the use of the Web Content Accessibility Guidelines 1.0 as the instrument of measurement, but it's flawed all the same.

The single most important aspect of that flaw is this: syntactically valid HTML is not a primary indicator of web accessibility, and by the same token syntactically invalid HTML does not categorically indicate an inaccessible website.

Valid HTML is at best a proxy indicator of web accessibility - that is, an indicator that doesn't have a causal link with the outcome (in this case an accessible website), but rather is something that is likely to be found where the outcome exists. Simply put, web developers who appreciate the issues around accessibility are more likely to be informed professionals who also appreciate the benefits of adopting and adhering to web standards. However, just as with SiteMorse's much-maligned league tables, using HTML validity as an initial filter to identify "more accessible" sites is wholly invalid.
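To make the point concrete, here is a contrived sketch (hypothetical URLs and filenames) showing that validity and accessibility can pull in opposite directions:

```html
<!-- Valid HTML, yet inaccessible: the markup validates, but the
     alt text tells a screen reader user nothing useful about
     where the link goes. -->
<a href="/contact"><img src="phone.gif" alt="image"></a>

<!-- Invalid HTML, yet accessible: the unescaped ampersands fail
     validation, but browsers and assistive technology read the
     link perfectly well. -->
<a href="/leisure?area=parks&rec=1">Parks & recreation</a>
```

Neither snippet proves anything on its own, but a validator would wave the first through and flag the second - exactly the wrong way round from an accessibility perspective.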

For the purposes of Better Connected an arbitrary threshold of 50 errors across 200 tested pages was used. Any sites reporting fewer than 50 errors went forward to be considered for WCAG AA conformance; those reporting more than 50 errors did not. Leaving aside this arbitrary limit, this also shows a gross failure of logic - to conform to level AA of WCAG a site must surely report zero errors across its 200 pages? A single error breaches checkpoint 3.2 of the guidelines, rendering the site unable to conform to level AA.

The Web Content Accessibility Guidelines are 8 years old this year. In web terms they are at the very least pensionable, and quite probably pushing up the daisies. And remember, they are guidelines; as time passes it becomes ever more important that those using them as guidance recognise this.

Education is the key to improving the state of web accessibility, whether we're talking about government or any other sector. Web developers and managers, content editors, suppliers of applications that produce web-based output - all of these people require a sound understanding of the accessibility issues in their respective areas of operation to achieve and sustain an accessible online presence, and that understanding can only come through learning.

A good start would be to make the findings of the automated tests for BC available to the local authorities themselves. I was disappointed to discover 158 validation errors had been found on ClacksWeb - was that one error on each of 158 pages, or one really bad page? The two scenarios have quite different implications for me as a manager, but to date I've been unable to elicit the details, and the errors are no longer apparent on the site.

Little fault, if any, should be attributed to the RNIB for this state of affairs - there is no practical way 468 websites can be adequately tested for accessibility on an annual basis without a significant financial and resource commitment.

The solution, however unpalatable it might be to the bean counters who seem to have a desperate need to rank and score us all, is to abandon the concept of ranking 468 websites for accessibility, and to stop testing them against an 8 year-old set of guidelines. Instead SOCITM would be much wiser to employ the expertise of the highly skilled and knowledgeable staff at the RNIB to identify, highlight and promote best practice in web accessibility, both in the local government sector and beyond. I'm certain the WAC staff could come up with some fantastic educational resources if they were given free rein with SOCITM's financial contribution for BC. The current state of affairs is like asking the Michelin Guide to judge restaurants on the quality of their cutlery.

The question I keep coming back to is this - what does the Better Connected reporting of web accessibility achieve? Last year it painted a fairly depressing picture, and this year that picture is almost identical. If SOCITM wants to be an agent for change it needs to do more than just report that a problem exists, and start putting its members' best interests first by helping them to address the problem.


Hi Dan,

I have to agree with you. These reports always focus on the negative while *mostly* relying on automated tests.

What I'd prefer to see from anyone commissioning these kinds of reports is a switch to the positive, from the quantitative to the qualitative.

Something like, "Accessibility Heroes: Who's getting it right and why?" or "Beyond the Guidelines: Best Practice Accessibility in the Real World".

The scope could be to take 10 really well-implemented websites, like ClacksWeb, and highlight how they work for real people, perhaps including user testing.

In terms of planning and writing the report, it would be far more interesting than compiling automated test data.

Posted by: Laurence Veale at March 3, 2007 11:06 AM

Thanks for this Laurence, those are precisely the types of things I had in mind. As you allude to, quantitative analysis and accessibility don't really mix - accessibility is not a binary state, it's a process and a continuum. We need to see a bit of creativity to help authorities move on.

Posted by: Dan at March 3, 2007 1:45 PM


Obviously I agree with the majority of what you say but I do have to query a couple of things.

SOCITM's Better Connected 2007 is published next week. Almost a year ago to the day I posted somewhat critically about the report's use of SiteMorse,...

How certain are you that they use SiteMorse? My sources indicate otherwise.

I personally don't like pointing people at the RNIB - as their main concern is for the blind. As such they may miss things, accessibility wise, for other differently abled people.

Posted by: Rich Pedley at March 4, 2007 9:56 PM

I'd just like to clarify a couple of points.

First of all, the testing we do is far more than just compiling automated test results. There's also a significant amount of manual testing done.

Secondly, yes, SOCITM use Sitemorse, but not for Accessibility testing (and for further clarification - they've never done any accessibility testing for SOCITM), that's done by us at RNIB.

Lastly, RNIB as an organisation may be focused on the needs of blind and partially sighted people, but I can assure you that the Web Accessibility Consultancy is and always has been pan-disability. We couldn't, and wouldn't, operate any other way.

Posted by: pixeldiva at March 4, 2007 10:35 PM

@Rich: As pix says Better Connected uses Sitemorse for some of the other areas of the report but not accessibility. I wasn't trying to suggest they did, apologies if that wasn't clear. Not that she needs me to, I'd also back up what pix says about WAC being pan-disability.

@Pix: My concern is the distinction between level A and level AA conformance for the purposes of BC. I do appreciate that a lot of manual testing is done to set that baseline for level A and identify the sites that may achieve it, but IMHO the method for distinguishing between A and AA is highly questionable.

Laurence pretty much sums up what I'd like to see from SOCITM in the future - more qualitative analysis, more focus on what people are getting right, and more education. As it stands the report does little to help those who are failing to meet the minimum standards.

Posted by: Dan at March 5, 2007 7:44 AM

Thanks for the clarification pixeldiva, and thanks also for setting me straight :grin:

Posted by: Rich Pedley at March 5, 2007 1:50 PM

"syntactically invalid HTML" may display fine in user agents that adopt a quirks- mode but not all will untangle it & those that don't may be used by the very people we claim to be making the sites accessible to.

Make the code valid so that any inaccessibility is the fault of the user agent, not your site.

Posted by: Nigel at March 7, 2007 2:05 PM

"Make the code valid so that any inaccessibility is the fault of the user agent, not your site."

Nigel, if only it was that easy. A site with thousands of pages produced and maintained by dozens of editors is tricky to keep valid with limited resources.

The vast majority of validation errors on sites like ours do not cause accessibility problems. The current crop of errors on ClacksWeb are things like invalid attributes (I've got name attributes on a number of forms) and name attributes on anchors that start with #. The latter is an example of an education issue for our editors - they know how to link to a named anchor, but don't appreciate that the target element's name or id value shouldn't be preceded by the #.
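A minimal sketch of that named-anchor mistake (the anchor names here are hypothetical):

```html
<!-- Incorrect: the # belongs only in the link's href, not in the
     target. The browser looks for a fragment named "section2",
     never "#section2", so the link fails to jump - and the
     validator flags the target too. -->
<a href="#section2">Jump to section 2</a>
<a name="#section2">Section 2</a>

<!-- Correct: the target's name (and id) value omits the # -->
<a href="#section2">Jump to section 2</a>
<a name="section2" id="section2">Section 2</a>
```

It's an easy mistake for editors to make precisely because the linking side of the pair does require the #.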

Sure these are easy to fix, but we have to make a judgement about how often we trawl for and correct such errors.

Posted by: Dan at March 7, 2007 3:58 PM

Interesting debate. Whilst we are constantly told that the website has to be accessible, we have to innovate all the time to keep things accessible.

One thought that comes to mind: where is the innovation in the tools that enable people with disabilities to deal better with web technologies?

Surely the web conversation that we have with our browser visitors is two way?

Is there any “league” table of their abilities?

Posted by: renagade at March 14, 2007 3:19 PM
