Shaw Trust Report Overview

On Wednesday last week I received the audit report on ClacksWeb from the Shaw Trust Web Accreditation Service. Some of you may be aware that by Friday I had completed the necessary remedial work, and the site was accredited and given the Trust's "accessible plus" award. This was partly because I had already addressed some of the issues in the report, both from the user testing I observed and from feedback I'd received from the Trust during the audit process. Nevertheless, I thought it would be useful to provide an overview of the report itself, and at a later date some of the more interesting and esoteric audit findings.

Our 56-page written report consists of five sections:

  1. Executive summary
  2. Background & methodology
  3. User testing
  4. Technical audit
  5. Automated testing

Executive summary

A nicely written couple of pages, perfect for me to copy to my management, this section provided a high-level overview of the findings together with the remedial action required before accreditation could be awarded. In our case the site didn't require a full repeat audit, but in some cases this is needed to ensure the audit report has been understood and the issues addressed adequately.

Background & methodology

This section provides an introduction to the report, an explanation of what web accessibility is and why it's important, and information about adaptive technologies. It also provides an overview of the Trust's testing methodology, describing what user testing is carried out and why, along with brief details of the technical audit process.

User testing

This is the meat of the report, covering the tests performed by each group of users (low vision, blind, mobility impaired, dyslexic and deaf) and the problems encountered, together with URLs, screen shots and comments from testers. A summary of the users' overall impressions is given at the end of the section.

Having observed one of the days of user testing myself, I had already addressed some of the issues in this section, but reading it through with the full range of tester experiences listed reminded me of the value it added.

Technical audit

The contents of the technical audit are based on automated tests, carried out using InFocus, but extensively interpreted and moderated by human checks. It's worth stressing that these results are a long, long way from the raw automated testing results produced by services such as SiteMorse, which brings home how little of the WCAG can be tested by software alone.

Instead of covering each WCAG checkpoint in turn, the technical audit is broken into functional areas such as 'tables', 'images', 'site map' and so forth, and within each of these areas a good deal of narrative is provided explaining the relevant issues as they relate to the site in question. Where action is required this is clearly stated, together with URLs of offending pages, or where necessary broader guidance, for example for the handling of PDFs.

This section surprised me - I was expecting a dry, less useful summary of the technical shortcomings of the site, but it proved to be much more than that.

Automated testing

I received the report by email, and was intrigued to see a zip file attached as well as the report document itself. It contained the 900+ HTML files generated during the InFocus automated testing, along with a screen shot of the options used when the software was run. The last section of the written report provides information about these InFocus files, pointing out false positives and issues to be addressed that are not covered elsewhere.

At first it was a bit of a slog nailing down individual problems from all those files, but once I'd understood the structure of the reports, using the extended search functions in HomeSite made it a breeze to get a quick list of pages which were affected by a particular problem.
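If you don't have an editor with multi-file search to hand, the same trick can be scripted. The sketch below is a hypothetical example, not the actual InFocus report format: it assumes each generated HTML file mentions the audited page's URL somewhere in its body, and simply lists which reports contain a given problem string.

```python
import re
from pathlib import Path

def pages_with_problem(report_dir: Path, problem: str) -> list[str]:
    """Scan a directory of generated HTML report files for a problem
    string and return the audited page URLs they mention.

    The URL-extraction regex is a guess at the report layout; if no URL
    is found, the report's own file name is returned instead.
    """
    affected = set()
    for report in report_dir.glob("*.html"):
        text = report.read_text(errors="ignore")
        if problem in text:
            match = re.search(r"https?://[^\s\"<>]+", text)
            affected.add(match.group(0) if match else report.name)
    return sorted(affected)
```

Run against the unzipped report directory, this gives the same "quick list of affected pages" for any issue you care to search for.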

My view

All in all the report is impressive. It provides information for those in the organisation who need a summary but don't need to know the technical nitty-gritty. It provides softer, humanised user testing feedback which reminds you that the work we do is for people, not validators and automated accessibility testing software. And it provides expert technical advice which, if followed and understood, can only elevate the accessibility of the site in question. Above all it's usable: not a tome filled with technical references and jargon, but a practical, real-world guide to improving the audited site.
