Accessibility Archive

October 18, 2007

FOI enquiry - withdrawal of domains

@ 12:01 PM

As part of the preparation for my talk at Techshare earlier this month I made an FOI enquiry to the COI on 15th September about the conditions of use for domains.

I have a question about the conditions of use for names, which appear on the Cabinet Office at

These conditions state in section 4 that websites which do not comply with current UK disability legislation will have their domain names withdrawn. Other circumstances when domain names may be withdrawn are also detailed.

Could you please tell me:

  1. How many domains have been withdrawn during the lifetime of these conditions for failure to comply with disability legislation.
  2. How many domains have been withdrawn for other reasons listed in the conditions.

To give a little more context, Section 4a of the conditions of use state:

The applications (Web, email, etc) using a domain name must comply with current UK legislation and support channels that provide accessibility for disabled people, members of ethnic minorities and those at risk of social/digital exclusion. Legislation includes Copyright, Data Protection Act and Disability Discrimination Act. Abuse of will result in the name being withdrawn.

Today I received this response from the COI:

Firstly, it is important to understand the legislative context and the guidance for compliance with it. In your letter you state in relation to the conditions of use of a domain name that, 'websites which do not comply with current UK disability legislation will have their domain names withdrawn.' This isn't the case for a number of reasons. Firstly, we are not just looking at whether a website complies with disability legislation, we are also looking at whether a website meets the minimum standards for accessibility. The guidelines state that websites should 'comply with the accessibility recommendation for public sector sites, that is, W3C WAI Level AA.'

Secondly, it should be stressed that disability legislation is in place to protect the rights of the individual, not to detail the specific requirements on websites (both within and outside the namespace). Even if websites meet the recommended minimum standard, this is no guarantee that the user experience for people with disabilities will be problem free. This fact was highlighted in the formal investigation carried out by the Disability Rights Commission and re-iterated in the 2005 survey of public service eAccessibility commissioned by the Cabinet Office.

Thirdly, the guidelines go on to say that, 'Failure to comply with this may result in the name being withdrawn.' This implies that websites will be considered for withdrawal if they fail to meet the minimum standard, not automatically withdrawn.

Having said that, the Government has been working with industry, academia and the third sector to build a robust approach to delivering inclusive websites. To ensure that government pays due regard to current disability legislation (the Public Sector Disability Equality Duty) and in order to meet European objectives for inclusive e-government (Riga Ministerial Declaration 2006), COI has updated Chapter 2.4 of the Guidelines for UK Government Websites and proposes that all government websites must meet Level Double-A of the W3C guidelines by December 2008. The updated guidance has recently been sent out for formal consultation and is attached for your information.

In answer to your question,

  1. No website domain names have been withdrawn for failure to comply with disability legislation; and
  2. No website domain names have been withdrawn for the other reasons set out in the conditions of use.

However, the government has taken a proactive approach to reducing the overall number of websites it owns and in the future we can expect to see increased focus on raising the standards for government websites, including inclusivity and accessibility. This is part of the Transformational Government Strategy to converge websites around audience channels including Directgov and BusinessLink. This will be reflected in the updated policy on naming and registering websites, which will go for formal consultation later this month.

The most telling part of the response for me is that the conditions of use for domains have never been enforced. Given that section 4a explicitly states that sites not complying with current legislation will have their name withdrawn, one can only conclude that the agency responsible for upholding the conditions considers every website ever to be published under a domain to be compliant with current UK legislation around accessibility. (Or am I being unduly harsh and literal?) Is this further support for the view that the proposal in Delivering Inclusive Websites, to use WCAG as a measure and domain withdrawal as a consequence, is a waste of time and needs to be replaced with a more positive and realistic scheme?

Perhaps more revealing of the government's future strategy is the final paragraph, which refers to the reduction in the number of government websites and an increased focus on raising standards. It's likely that the recently issued consultation around web accessibility is a thinly disguised ploy to encourage as many sites as possible to move into the loving arms of DirectGov, or at least onto the technical platform it now shares with several other large, high profile departmental sites. If it happened it wouldn't be a bad thing.

October 5, 2007

Techshare presentation

@ 5:54 PM

Today I had the pleasure of presenting at Techshare (external link). I saw some great talks, made a lot of new friends, saw a lot of old friends and my presentation seemed to go down well.

My presentation, "Influencing government web accessibility policy: advocacy vs. militancy", is now on the site and available to download with speaker notes:

New UK government web accessibility consultation

@ 5:47 PM

On Tuesday the COI (Central Office of Information) released a consultation document titled "Delivering inclusive websites: user-centred accessibility". The document isn't yet available online, but I'm told it should be on the Cabinet Office site from some time next week.

The main thrust of the document is that all existing UK government websites should be accessible to WCAG level AA by end December 2008, while all new sites should be conformant before being launched. The main difference between this policy target and the endless procession of missed targets we've seen over the past few years is the explicit threat of the withdrawal of the domain for sites which fail to meet the standard.

The COI (and the Cabinet Office before them) already have this power under the Code of Practice for domains (see Where are the gatekeepers, March 2006) but as far as I know have never exercised it. Whether the threat is real this time remains to be seen.

But it is only a consultation document at this stage, and it's a prime opportunity to lobby the government to adopt a robust stance over non-compliance. If anyone wants a copy of the document please email me and I'll be happy to pass it on.

Here's the text that accompanied the document:

The Central Office of Information (COI) would like to invite you to provide feedback on the attached document, Delivering Inclusive Websites (TG102), by end of business November 13, 2007.

This guidance is an update of Chapter 2.4 of the Guidelines for UK Government Websites

In order to meet European objectives for inclusive e-government and so that the UK public sector meets its obligations with regards to disability legislation, we have stipulated that all government websites must meet Level Double-A of the W3C guidelines by December 2008. Failure to satisfy this requirement will result in initiation of the process to withdraw the domain name used by the website.

Government websites are strongly recommended to develop an accessibility policy to aid the planning and procurement of inclusive websites. This includes building a business case, analysing user needs, developing an accessibility test plan and procuring accessible content authoring tools. The guidance covers some of the design solutions to common problems faced by users but is mainly aimed at strategic managers and project managers to assist with planning and procurement.

Please send comments to

August 13, 2007

Speaking at Techshare 2007

@ 8:36 PM

I was extremely chuffed today to receive confirmation that I'll be speaking at this year's Techshare conference in October in London. The Techshare conference is organised by the RNIB and "highlights the role of technology in the everyday life of people with disabilities, looking not just at the web but also software, mobiles, standards, compliance and much more." Lots more info at the Techshare site (external link).

There was a danger, I feared, that the paper I proposed might prove to be a bit contentious - "Government web accessibility policy - advocacy vs. militancy" - but fortunately the committee deemed it appropriate, so roll on 5th October.

Real World Accessibility London

@ 8:02 PM

We had a re-run of our Real World Accessibility event last week, this time at the Barbican in London. The day went off extremely well, and the feedback has been excellent.

The presentations are available to download from the Public Sector Forums site (external link). Also worth a mention are Pat Lauke's photos of the event (external link).

For those of you who attended, here are links to some of the resources that were mentioned during the day's proceedings; if I've missed anything please let me know and I'll add them:

In a slight departure from the Birmingham event we set aside 30 minutes for exhibitor presentations, and an hour before lunch to discuss the possibility of forming a public sector web management group.

This latter idea has been met with a generally positive response, and a forum's been set up at (external link) to facilitate discussions at this very early stage. If you're involved in the web in the UK public sector in any way at all please drop by, register and let us know what you think.

July 2, 2007

Real World Accessibility

@ 7:11 PM

After a successful outing in Birmingham in May, we're bringing this one day accessibility workshop to London on 8th August. The main thrust of the day is to get away from a dry, box-ticking approach to web accessibility, and closer to what you really need to think about and do to produce accessible sites.

The same cast of speakers - Bruce Lawson (external link), Ann McMeekin (external link), Pat Lauke (external link), Grant Broome, Ian Lloyd (external link) and myself - will each present a 40 minute session, and sit as a panel for open questions. If Birmingham was anything to go by it should be another great day.

The event is being organised by my company, Champion IS, in association with Public Sector Forums. Despite their monicker, and unlike last time around, this event is open to all and sundry, not just public sector delegates.

Full details of the day are available on the CIS website (external link), with online booking via the PSF site (external link).

May 20, 2007

Review: RiverDocs Converter

@ 1:14 PM

Disclosure: This is a paid review. RiverDocs Limited have had no influence on the tone or content of this review.


An essential tool for any organisation which publishes Microsoft Word or PDF files online, RiverDocs Converter is vastly superior to any other conversion software currently available. There's now no reason for publishers not to offer accessible, high quality HTML versions of documents previously published only in proprietary formats. The parser even compensates for poorly authored source documents, previously a significant barrier to producing accessible, semantic HTML versions of Word and PDF documents.

It's not a magic bullet though - every conversion requires human checking, and documents with any degree of complexity need input from an experienced web editor - but despite a slightly weak editor it's still well worth the price and will only get better in future versions given the publisher's focus on research and development.


RiverDocs Converter is a software package for the Microsoft Windows operating system which claims to convert documents designed for print into structured, accessible HTML documents for online delivery. In short this means it'll take PDFs and Microsoft Word files and attempt to convert them into a format more suitable for delivery and consumption over the web.

PDF and MS Word are beloved of government and corporations who often need to publish large documents quickly, but these formats are primarily designed for printing, not for delivery online, and have serious accessibility issues associated with them. So the potential benefits from effective conversion software are enormous - being able to offer HTML versions of these documents cost effectively is something that hasn't been possible before.


Installation was straightforward, taking a couple of minutes on my workhorse desktop PC.

RiverDocs Installation

The software does require the Microsoft .NET 2.0 Framework; if this isn't already installed and available you will be prompted to download and install it.

Getting started

Starting the software for the first time you are presented with a quick guide to converting your first document, and the clean, functional RiverDocs interface.

RiverDocs interface

Test 1 - my first conversion

To test the software for the first time I used a PDF document regarding chimney stack removal I found on Cambridge City Council's website at:

It's a 4 page document containing a cover sheet, and a mix of different levels of heading, bullets and images. The PDF document was not tagged.

Opening the file displays it in the main RiverDocs window:

RiverDocs showing PDF file

Clicking the Convert button started the conversion, which took less than a second using the default settings. The interface changes to a split-screen affair, with the original document in the left pane, and the converted document in the right pane:

RiverDocs PDF / HTML conversion split panes

To give an idea of the quality of conversion and mark-up the software can produce automatically I wanted to save the document immediately. Admittedly this is not the intended real-world usage of the product, but it does provide an idea of the quality of the baseline conversion prior to manual editing.

Big River had provided me with a one page crib-sheet covering the major interface elements, so I knew that the Save function was for saving a RiverDocs project, and the Publish function was for saving the converted document as HTML, CSS and images.

Clicking the Publish button presents the Publish dialogue box:

RiverDocs publish dialogue box

In addition to publishing as HTML, the software also supports output in CHM (Microsoft Compiled HTML Help) format.

To keep things tidy I wanted to publish this version into a new folder, but this is not a standard Windows file dialogue box, and doesn't provide the facility to create a new folder, so I had to switch out to Windows Explorer to do this before publishing the document in RiverDocs.

But, it turns out the file name entered into this dialogue box is actually used as a folder name, which will be created for you and into which the document is published. These sorts of interface issues are symptomatic of the software's relative youth, and will no doubt be ironed out as the product matures.

The publishing of this document took less than a second; here are the results:

The default settings produce HTML documents with an XHTML 1.0 Transitional doctype, generating a separate HTML file for each page of the source PDF, an index HTML document containing a generated table of contents, a single CSS file and an images folder containing converted images. The CSS is valid, and attempts to mimic the style of the original document as closely as possible.
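A short script makes it easy to sanity-check a published folder against that layout. This is an illustrative sketch only: the file names (index.html, per-page pageN.html files, a single stylesheet, an images folder) are assumptions based on the description above, not documented RiverDocs output, so adjust them to whatever the software actually emits.

```python
import os

def check_published_layout(folder, page_count):
    """Return a list of expected files missing from a published folder.

    File names here are hypothetical stand-ins for RiverDocs' real output.
    """
    expected = ["index.html", "styles.css"]  # assumed names
    expected += [f"page{n}.html" for n in range(1, page_count + 1)]
    missing = [f for f in expected
               if not os.path.exists(os.path.join(folder, f))]
    # The converter also writes extracted images into their own folder.
    if not os.path.isdir(os.path.join(folder, "images")):
        missing.append("images/")
    return missing
```

An empty return value means every expected file is present; anything else lists what's missing.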

As a comparison I ran the same file through Abbyy's PDF Transformer, another PDF to HTML conversion tool. The results were vastly less impressive:

The Abbyy software makes no attempt to produce structured HTML, instead presenting every single line in the document as a paragraph and styling them to appear as closely as possible to the original PDF.

In general the quality of the default output from RiverDocs is extremely impressive. In this case there were just two validation problems: an unclosed list item in the generated table of contents, and missing alt attributes for the images on the final page. Since the default output is "section based" the parser moved the words "GUIDANCE NOTES" onto a page of their own despite displaying them as part of the title page in the preview pane, which was the only deviation from the page layout of the original.

But this isn't a fair test of the software, which wasn't designed to be operated in this manner. While the results are good, they aren't good enough to publish without manual editing, so let's try again, only this time using some good old human judgement.

Test 2 - getting serious

For the second test I wanted to take the same document but publish it to a single HTML document of the highest quality as close to the original format as possible. The process is the same - open the file to be converted, and click Convert.


Before getting stuck into the document itself I wanted to specify some metadata for it. Fortunately RiverDocs makes this very easy to do (just click the Metadata button), and provides a default set of Dublin Core elements for completion:

Metadata editor

It appears that additional user-defined elements can be created, so publishers in UK government for example can easily add eGMS metadata to converted documents:

eGMS metadata entry

Unfortunately these additional elements didn't make it to my published document, a bug I've reported to Big River.


RiverDocs options

RiverDocs offers a number of options to customise the output of the converted document. The most important are:

The editor

For many users the area of the application where most time will be spent is the HTML editor, where the converted output can be modified and fine-tuned. In most cases this will be to either match the original document or to conform to a house web publishing style.

The editor always presents the output document in a page-by-page format, regardless of the publish mode that's currently set. It would be nice to be able to preview the single page and section-based options.

The editor can be used visually in preview mode, or in source mode which provides a simple text editor view of the document page you're working on. As I wanted a single file output and had set the options accordingly there was something of a disconnection between working on a separate HTML file for each page, and the intended output. As far as I can see there is no way to preview the single file output prior to publishing.

RiverDocs toolbar

The toolbar provides the standard editing tools you'd expect to find in a simple HTML editor. These generally work as expected, although there are some quirks - for example, undo only remembers changes made since you last switched modes: if you make a change, switch to source mode and then back to visual mode, you'll need to correct any errors manually in source mode.

Once you've got used to the way the editor functions it's a reasonably comfortable working environment, but don't expect it to have the functionality of DreamWeaver. I can foresee many users doing the initial conversion in RiverDocs and taking the published output into the editor of their choice to complete the process: indeed if I was using RiverDocs on a daily basis to convert a large number of files this is the way I'd work - the software's value lies in its conversion capabilities, not its editing capabilities.

One of the most common problems that will arise from automatic conversion is that of images and appropriate alt attributes. Editing images is easy - select the image in the editor, and click the image icon:

Editing image properties

The id is a temporary value used by the software during conversion and editing, and is removed on publishing.
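Missing alt attributes are worth re-checking after any bulk edit. As a minimal sketch (not part of RiverDocs), Python's standard html.parser module can flag img elements that lack the attribute entirely, which is the kind of check the software's issues tool performs:

```python
from html.parser import HTMLParser

class MissingAltScanner(HTMLParser):
    """Collect the src of every <img> with no alt attribute at all."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # An empty alt="" is legitimate for decorative images,
        # so only a wholly absent attribute is flagged.
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "(no src)"))

def find_missing_alt(html):
    scanner = MissingAltScanner()
    scanner.feed(html)
    return scanner.missing
```

Run it over each published page and an empty list means every image at least has the attribute; whether the text is appropriate still needs a human.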

Screen capture

One very nice feature of RiverDocs is the screen capture tool. On the final page of the original PDF is a diagram showing a cross-section of a wall, with some labels indicating particular features of the diagram. Since the PDF was generated from Adobe Pagemaker, the diagram consists of an image object and a series of text objects for the labels. In the automatic conversion RiverDocs quite rightly converted these separately, which can be seen on the last page of the output of test 1.

In my final version I want the image and labels as a single image, and this is where the screen capture tool comes in:

Screen capture tool in action

It operates like any screen capture tool you've used before - highlight the area to be captured and click an icon. In RiverDocs the highlighted area will be inserted into your HTML document as an image.

You've got issues

Issues icon

The software provides assistance to help you identify and correct potential issues with the converted document. The Issues icon gives a quick idea of the number of issues identified by the software at any stage after automatic conversion. Clicking the icon opens a third pane with details of the issues:

Three pane view - original version, conversion and issues

The potential issues highlighted include missing alt attributes on images. I was disappointed to note that alt text from objects in tagged PDFs wasn't carried across to the converted HTML document. Otherwise the guidance provided by the issues is sound, based as it is on HTML Tidy - those of you familiar with the Tidy extension for Firefox will know what to expect.

For non-expert users this provides an extremely useful indication of where there are potential problems in the converted document, and the separation of current page issues and whole document issues guides such users through the document with ease. Personally I was more comfortable editing the document first before using the issues tool - picking up the issues I could see, modifying structure, adding or correcting alt attributes, generally tidying the document up - but that's probably no more than a reflection of my workflow habits.

Test 2 results

Here's the output:

It took 10 minutes from opening the original PDF to publishing this version - very impressive results in such a short space of time.

Test 3 - getting more complex

To really test the software we need something a little more complex than a single-column, text and images document. On the Clackmannanshire Council website I found a 24 page consultation document laid out in 2 columns, which included multiple levels of headings and a data table:

The untouched output from RiverDocs shows its limitations, but is still an impressive result:

It took me about 30 minutes to tidy the document up in RiverDocs, but I was still left with a lot of redundant classes with names like "font19" and all those named anchors generated for the table of contents. Cleaning up the mark-up in RiverDocs proved to be a bit of a chore, so I tried again, this time dumping the output immediately into DreamWeaver.

15 minutes later I had this clean, structured version of the PDF:

My conclusion - if your document is anything more complex than single-column text then forego the RiverDocs editor for your favourite HTML editor.

Test 4 - Microsoft Word

So what about Microsoft Word conversion? Well, this review was produced in a simple Word document, so I ran it through the RiverDocs Converter for publishing online. Here is the untouched conversion:

This was a 12 page Word document, and conversion took noticeably longer than PDF conversion, at about 20 seconds. The only real issues with the conversion were the failure to convert Word bullets to HTML lists and the failure to pick up alternative text on images. Other than this the structure was accurately represented and the images correctly positioned.

The converter doesn't appear to parse the styles used in Word documents. I converted a test document which was styled throughout as paragraphs, with headings faked using bold text and larger font sizes; RiverDocs analysed the font size and weight and assigned heading levels accordingly. This means the converter accommodates poorly authored, unstructured source documents - a great feature given the preponderance of incorrectly produced Word documents in many organisations.


Given the immaturity of the package there are some inevitable annoyances with the interface and output:

possible in source mode. In a long document this can quickly become tedious. It would also be an improvement if the TOC used ids rather than named anchors.

<p class="font9"><span class="font9"><strong>NOTE: Some chimneys act as a buttress and provide support to long walls.</strong></span> <strong>Please check with Building Control or a structural engineer</strong><span class="font9"><strong>, before</strong></span> <span class="font9"><strong>proceeding, to determine if this is the case.</strong></span></p>

None of these are major problems though, and I would expect the interface to improve as the software is developed further. The key feature of the product is the conversion algorithm, which is extremely impressive.
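For what it's worth, the redundant "fontNN" classes in the fragment above are regular enough that a crude script can strip them before the markup goes anywhere near a hand editor. This is a hypothetical sketch; a regex pass is only defensible here because the generated markup is so uniform, and hand-written HTML would need a proper parser:

```python
import re

def strip_font_classes(html):
    """Remove generated class="fontNN" attributes and unwrap the
    spans left behind with no attributes. Crude, but adequate for
    machine-generated markup of this shape."""
    # Drop the class="fontNN" attributes the converter generates.
    html = re.sub(r'\s+class="font\d+"', "", html)
    # Unwrap <span> elements that are now attribute-free.
    html = re.sub(r"<span>(.*?)</span>", r"\1", html, flags=re.S)
    return html
```

Applied to the fragment above it reduces the paragraph to plain strong/p markup.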


RiverDocs is an impressive product and an essential tool for any organisation which has a need to publish more than a small number of PDF and Word documents online. Simple documents take no time at all to convert and tidy using the RiverDocs editor, while I found more complex documents are best converted in RiverDocs and then edited in a more powerful and functional dedicated HTML editor such as DreamWeaver.

The true value of RiverDocs lies in its ability to turn unstructured, multi-column PDF documents into structured HTML documents, whilst maintaining the correct reading order. Critically, the intelligent parsing engine compensates for low-quality source documents, previously a real barrier to producing HTML versions of PDF and Word documents.

Future versions of RiverDocs are very likely to offer significant improvements, both in terms of quality of conversion and the application interface. As well as being a single-product company, concentrating solely on the development of the RiverDocs Converter, they fund applied research at Queens University Belfast and at other universities engaged in the fields of accessibility, artificial intelligence and character recognition.

About the reviewer

Dan Champion has worked in the web industry since 1995 through his company Champion Internet Solutions Limited, with clients in the private and public sectors. Between 1999 and 2007 he was responsible for Clackmannanshire Council's multi-award winning websites.

He is a regular speaker on the subjects of web accessibility, web standards and web strategy at conferences and workshops throughout the UK, has written on the subjects of e-government and web accessibility for the Guardian, and featured on national BBC Radio in various guises.

April 26, 2007

DTI update - good money after bad?

@ 12:57 PM

Loyal readers will remember that in May last year I started to post about the DTI's new yet inaccessible website, developed at a cost of £200,000 of taxpayers' money. If this is news to you please start with the summary I posted in August when the story ran in Private Eye, and at my co-conspirator Bruce Lawson's website, where he has a special category just for the DTI.

In February the DTI posted an accessibility update to their website, detailing a three-step improvement plan, to be completed in early summer 2007.

Being a curious bloke (so I've been told) I was keen to learn more about this plan, and in particular how much more of our money it was going to take to clean up the mess produced by the previous contractors. So I sent the DTI this FOI enquiry on 22nd March:


I'd be grateful if you could provide details of the work described on this page related to the accessibility of the DTI website:


  1. The cost of employing Nomensa to audit the website.
  2. The cost of step one of the remedial work to be undertaken, described as "Make the necessary accessibility improvements to the core DTI website. This will consist of both technical and content work streams. They will address the underlying design, code and content issues that have been identified as requiring attention to meet the appropriate standards."
  3. The name of the contracted company undertaking the work in step one.
  4. The budgeted cost of completing the entire process described on the page referred to above.

Please contact me if you require any clarification of this request.

Kind regards,

Dan Champion

The DTI responded on 23rd April:


I am writing in response to your email of 22 March 2007 in which you requested information relating to the DTI website.

  1. The total budget for the website accessibility audit was £10,000. This cost covered:
    • An accessibility audit of the DTI website templates;
    • Guidance on creating and maintaining accessible pdf documents;
    • Presentation of the results at DTI;
    • Copies of the reports;
    • A workshop on issues raised by the audit;
    • Central Office of Information procurement and project management costs.
  2. On the website Step One is described as "consist[ing] of both technical and content work streams". The technical workstream has two elements:
    • Building the website templates. Cost: £59,837
    • Upgrading the Content Management System software. This work, unrelated to the accessibility issue, was due and it was prudent to combine it with the template work. Cost: £45,666
    The content workstream is being developed; cost is not yet confirmed.
  3. The company contracted to undertake the technical workstream in Step One is Fujitsu.
  4. The budgeted costs for the technical workstream is £59,837, with a further £45,666 for the CMS software upgrade. The content workstream for Step One is being developed; cost is not yet confirmed. Steps Two and Three of the accessibility project will follow the content workstream.

To summarise:

I have two perspectives on this. On the one hand, this is positive action, and with Nomensa's help the DTI will most likely emerge with an accessible, usable website. Hoorah. On the other hand, it is a marvellous example of how not to procure, develop and deliver an accessible website. The cost of the remedial work looks likely to approach if not exceed the cost of the original development. A perfect illustration of why you build accessibility in from the very start. No doubt more to come about the DTI in the near future.

March 2, 2007

Better Connected & web accessibility

@ 9:35 PM

SOCITM's Better Connected 2007 is published next week. Almost a year ago to the day I posted somewhat critically about the report's use of SiteMorse, and its reliance on automated testing for some of its findings. This year I've become rather more personally interested in the report - I was disappointed to learn earlier this week that ClacksWeb is not one of the 2 sites which were found to conform with WCAG level AA.

I'll cut to the chase. BC's assessment of the accessibility of local authority websites is fundamentally flawed. Admittedly this is a reflection of the use of the Web Content Accessibility Guidelines 1.0 as the instrument of measurement, but it's flawed all the same.

The single most important aspect of that flaw is this: syntactically valid HTML is not a primary indicator of web accessibility, and by the same token syntactically invalid HTML does not categorically indicate an inaccessible website.

Valid HTML is at best a proxy indicator of web accessibility - that is an indicator that doesn't have a causal link with the outcome (in this case an accessible website), but rather is something that is likely to be found where the outcome exists. Simply put, web developers who appreciate the issues around accessibility are more likely to be informed professionals who also appreciate the benefits of adopting and adhering to web standards. However, just as with SiteMorse's much maligned league tables, using HTML validity as an initial filter to identify "more accessible" sites is wholly invalid.

For the purposes of Better Connected an arbitrary threshold of 50 errors across 200 tested pages was used. Sites reporting fewer than 50 errors went forward to be considered for WCAG AA conformance; those reporting more did not. Leaving aside the arbitrary limit, this shows a gross failure of logic - to conform to level AA of WCAG a site must surely report zero errors across its 200 pages. A single error breaches checkpoint 3.2 of the guidelines, rendering the site unable to conform to level AA.

The Web Content Accessibility Guidelines are 8 years old this year. In web terms they are at the very least pensionable, and quite probably pushing up the daisies. And remember, they are guidelines; as time passes it becomes ever more important that those using them as guidance recognise this.

Education is the key to improving the state of web accessibility, whether we're talking about government or any other sector. Web developers and managers, content editors, suppliers of applications that produce web-based output - all of these people require a sound understanding of the accessibility issues in their respective areas of operation to achieve and sustain an accessible online presence, and that understanding can only come through learning.

A good start would be to make the findings of the automated tests for BC available to the local authorities themselves. I was disappointed to discover 158 validation errors had been found on ClacksWeb - was it a single error across 158 pages, or one really bad page? The two scenarios have quite different implications for me as a manager, but to date I've been unable to elicit the details, and the errors aren't apparent on the site any longer.
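To put some numbers on why an aggregate error count tells a site manager so little, here's a quick Python sketch contrasting the two scenarios - the per-page distributions are invented for illustration, not BC's actual findings:

```python
# Two hypothetical ways a site could accumulate 158 validation errors
# across 200 tested pages - the numbers are made up for illustration.
scattered = [1] * 158 + [0] * 42      # one error on each of 158 pages
concentrated = [158] + [0] * 199      # all 158 errors on a single bad page

def summarise(errors_per_page):
    """Report the aggregate total alongside how many pages actually fail."""
    failing = sum(1 for n in errors_per_page if n > 0)
    return {"total": sum(errors_per_page), "failing_pages": failing}

print(summarise(scattered))      # {'total': 158, 'failing_pages': 158}
print(summarise(concentrated))   # {'total': 158, 'failing_pages': 1}
```

Both scenarios produce an identical total, yet the remedial work they imply is completely different - which is exactly why the raw findings, and not just a score, should go back to the authorities.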

Little fault, if any, should be attributed to the RNIB for this state of affairs - there is no practical way 468 websites can be adequately tested for accessibility on an annual basis without a significant financial and resource commitment.

The solution, however unpalatable it might be to the bean counters who seem to have a desperate need to rank and score us all, is to abandon the concept of ranking 468 websites for accessibility, and to stop testing them against an 8 year-old set of guidelines. Instead SOCITM should much more wisely employ the expertise of the highly skilled and knowledgeable staff at the RNIB to identify, highlight and promote best practice in web accessibility, both in the local government sector and beyond. I'm certain the WAC staff could come up with some fantastic educational resources if they were given free rein with SOCITM's financial contribution for BC. The current state of affairs is like asking the Michelin Guide to judge restaurants on the quality of their cutlery.

The question I keep coming back to is this - what does the Better Connected reporting of web accessibility achieve? Last year it painted a fairly depressing picture, and this year that picture is almost identical. If SOCITM wants to be an agent for change it needs to do more than just report that a problem exists, and start putting its members' best interests first by helping them to address it.

February 8, 2007

USAJOBS vs Section 508

@ 2:24 PM

It's never truly comforting, but the aphorism "there's always someone worse off than you" can at least make you take a more circumspect view of your own place in the world, if only for a short time. In the past I've been critical of the UK government for producing inaccessible websites, both because it's the wrong thing to do, and because it fails to meet the government's own standards. But this morning I came across USAJOBS (external link), "the official job site of the United States Federal Government". For once the UK government almost shines in comparison.

Can someone please tell me if I'm missing something here? It's like going back in time. I can hear Prince singing 1999 in the background, dotcoms are burning their way through $100 million, and IT contractors the world over are filling their pockets as business panics about the millennium bug. And USAJOBS is being built. At least that's what it looks like to me.

Just how can the Federal Government, responsible for Section 508 of the Rehabilitation Act (external link), procure and promote such a poor service? I'm not going to list the problems the site has, just go and have a quick look yourself. Treat it like one of those puzzles where they give you a target score for making words out of a bundle of letters. Your target is 6,744.

In my opinion this type of failure does more damage to the accessibility cause than anything else. USAJOBS should be an exemplar of accessible design. How can we expect businesses and web developers to take accessibility seriously when there is such a fundamental lack of adherence to standards by the very body that sets those standards?

Comment on USAJOBS vs Section 508 (2)

November 21, 2006

Petition the PM

@ 7:34 PM

Ian Fenn (I'm assuming of Chopstix (external link) fame) has set up an e-petition at the 10 Downing Street site (external link) asking the PM to ensure that websites launched by government comply with WCAG.

Let's face it, Tony needs some good news before he buggers off on the lecture circuit, so it's definitely worth a shot, if only to highlight the continuing non-compliance of the DTI and other high profile government sites.

There are 517 open petitions at the site at the time of writing, and we only need 81 signatures to make the first page, and 9,123 to displace the top petition, so get over there now and sign it, you know it makes sense.

October 24, 2006

Web Accessibility Google

@ 6:42 PM

Google have launched customised search engines via Google Co-op, a new (beta of course) corner of the Google empire.

It's a potentially powerful tool. Here's a quick web accessibility search engine I set up to test it:

It's restricting searches to just 4 domains, and yet produces pretty decent results for any accessibility issue you care to throw at it. With a bit of time, care and attention it's going to be possible to create fantastically targeted search engines. Even better, there's a community element: anyone can suggest sites to add to the search engine, and if the engine owner deems them suitable they can easily be added.

Throw some accessibility-related terms at it and see how it fares. And if you've got suggestions for sites to add to the list just follow the 'Volunteer to contribute to this search engine' link (if you've got a Google account) and do your stuff.

You can also access this engine at its homepage.

September 19, 2006

DTI Internal Review

@ 9:15 PM

I've finally had a response from the DTI on the internal review I requested into their decision not to honour the follow-up FOI request I made in July. In short, the Chief Operating Officer at the DTI has decided that the department were justified in declining to answer the questions set out in that request. Can't say I'm surprised.

If I wanted to pursue the matter the next course of action would be to appeal to the Information Commissioner's Office. I'm not satisfied with the DTI's answer, which (apart from being poorly written) consists of little more than platitudes and half-arsed excuses, but for now I'm going to give them the benefit of the doubt and sit back and wait to see the fruits of their labour.

Here's the full text of the response I received by email, retyped by my own fair hand since the PDF attachment it was contained in consisted of a scanned letter. Good job I'm not blind.

Dear Mr Champion,

Thank you for your email of 26 July 2006 requesting an internal review of the Department's decision to decline your request of 25 June made under the Freedom of Information Act (FOIA). In line with Departmental policy this matter was passed to me for consideration as Director-General with responsibility for the website policy area.

I have now had the opportunity of reviewing this matter. After giving full consideration to your request and the information provided by the E-Communications team I am content that the s12 and s5(1) justification for declining was valid in this instance. Having reached that conclusion, I set out later in this letter the steps we will take that may help with your enquiry.

The nine requests totalled 51 separate questions, 27 questions were asked in the first tranche of requests (which we answered in full) and 24 were asked in the second tranche of requests. In calculating the cost of answering these questions the following elements were taken into consideration:

The volume of questions, combined with the breadth of information held, would have taken us over the £600 threshold.

Having considered your request for review, I have also considered the other action we can take. We are preparing statements that we can publish on the website explaining the current position on accessibility and the background to the procurement. That may help you to refine your request to bring it below the financial threshold. The statement will explain that we are carrying out an audit and we would aim to be in a position subsequently to explain the action we are taking on accessibility.

I recognise that the position on accessibility of the site is unsatisfactory - it is unfortunate that this was the outcome of the procurement. There was undoubtedly a failure to ensure that it was compliant with the accessibility guidelines. The development of the site was a long and complicated process that took place over several years and involved numbers of different people in DTI and its suppliers. We certainly aim to learn lessons from it but a detailed investigation of it would be time-consuming and probably would not provide complete answers to the questions you have raised. Our focus at present is on the action required to bring the site to an appropriate standard of accessibility.

Hilary Douglas, Chief Operating Officer

The DTI site's accessibility page (external link) does now have a notice which includes this passage:

An accessibility audit is being carried out by a specialist independent agency. The audit will identify where the site fails to comply with relevant accessibility standards. The recommendations will be used to draft an implementation plan.

If there's anyone willing to submit an FOI enquiry asking which specialist independent agency they've employed I'd be very interested to hear the answer.

August 28, 2006

United Nations E-Accessibility Day

@ 8:17 PM

3rd December 2006 will be the International Day of Disabled Persons, and this year's theme is accessibility to information technologies.

Read all about it at the UN Enable site (external link). There are few details of what will be happening on the day, but one would imagine that there will be a series of co-ordinated events much like World Usability Day (external link) (which is 14th November this year).

August 17, 2006

Private Eye for the DTI

@ 1:30 PM

A warm welcome to anyone who has been led here by that esteemed publication Private Eye. Just so you don't have to rake around the less interesting corners of the site, you can find the bits about the DTI here:

My co-conspirator Bruce Lawson (clever bloke) has a handy category just for the DTI (external link) on his site, which will complete the picture.

For visitors who don't subscribe to the Eye (shame on you) Bruce has a transcript of the Eye piece (external link) on his site.

Current position

At the moment I'm waiting for a response to the internal review I requested after the department dodged our follow-up enquiry. They've told me:

The Department is carrying out an Internal Review into the decision not to disclose the information you request. The review will be undertaken by the Director General within the DTI who is responsible for the policy area within which your original request falls.

The target for conducting an internal review is 20 working days from receipt of your letter. We will write to you again following the review.

I made my request for an internal review on 26th July, so should have a response by 23rd August, which will be published here. If the result of the review is unsatisfactory the final course of action is an appeal to the Information Commissioner's Office.

The tip of the iceberg?

It's probably worth pointing out that the DTI is by no means the only government department to procure and publish such a low quality website, and in doing so ignore the government's own guidelines on web development. There's a fundamental flaw in the current e-government set-up at Whitehall, where the eGovernment Unit issues some very good (if now dated) guidance on producing accessible, usable websites, based on best practice, which is subsequently ignored by departments. There's no threat of sanctions from the government itself, so the only risks the departments are taking by ignoring the guidance are of legal action (minuscule) and bad publicity (hello!).

August 2, 2006

RNIB Web Access Centre Courses

@ 7:34 AM

Whether web accessibility is a scary, monsters-under-the-bed-just-don't-want-to-look thing for you, or you've already got an inkling of what it's all about and want to learn more, it seems the fine folks at the RNIB Web Access Centre can help.

They've announced two new courses: the first is for people who need to understand the big issues - why accessibility is vital and what happens when sites aren't accessible; the second is a more technical course, with examples, for people who need to understand the detail of how it's done, so that delegates are equipped to rip out the bad practice and bolt in the best on their return to work.

So if London's an option for you, and you can stretch to the very reasonable £195 for each full-day course, there are few better people to learn about web accessibility from. They know their stuff inside out, and are very nice people to boot. And no, I don't get commission.

Full details from the Web Access Centre Blog (beta) (external link).

July 31, 2006

Redesign: Rucksack Readers

@ 2:12 PM

I've had a web design business since 1995, and although over the past few years I've gradually run it down and let clients go, largely to spend more time doing other things, there's one client I've kept on and have no plans of letting go.

Rucksack Readers (external link) was founded in 2000 by a good friend of ours, Jetta Megarry, and it's been great to watch the development of and be involved with a successful, growing business. The company produces guide books for long distance walks, treks and now the seven summits (the highest mountain in each continent). Their books are second to none in their sector - superior design and killer content combine to make them essential for those actually undertaking the routes covered in the books, and the photography is enough to make them ideal coffee table reading or gifts.

Anyway, today the Rucksack Readers website (external link) was relaunched, following a standards-based, accessible redesign which has taken shape over the last month. I was responsible for everything at the web end - the scripts, the CMS, and the front-end code and CSS. The site is visually designed by Ian Clydesdale at Workhorse Design (external link), who also does the design for the books, with Jetta producing the content. It's been a real team effort, and hopefully the results will speak for themselves.

July 26, 2006

Google Accessible Search in a pickle

@ 8:20 AM

Like many others I was very interested to see Google's foray into the accessibility arena this week in the shape of Google Accessible Search (external link), "Accessible Web Search for the Visually Impaired". A nice idea, and great to see accessibility on their agenda, but after a few quick tests my confidence in its utility is a tad shaky.

Search regular old Google for Pickled Eggs (external link) and at number 5 you'll find a recipe for this delicacy on The Accidental Smallholder (external link), my other personal site.

Take Google Accessible Search, enter the same query, Pickled Eggs (external link), and we're there again, only at number 44.

Now, I know that The Accidental Smallholder has its accessibility problems, but that recipe page is valid XHTML, has total separation of content from style, contains one image with appropriate alt text, and uses semantic markup. The top five accessible results are a mixture of table-based layout, tag soup and doctype-less FrontPage monstrosities, and I fail to see in what way they are more accessible than my pickled eggs recipe. Harrumph.

The Accessible Search FAQ (external link) makes all the right noises, with statements like:

In its current version, Google Accessible Search looks at a number of signals by examining the HTML markup found on a web page. It tends to favor pages that degrade gracefully --- pages with few visual distractions and pages that are likely to render well with images turned off.

...but it clearly needs some fine-tuning, otherwise how are the visually impaired ever going to enjoy my superior pickled eggs?

July 21, 2006

DTI update - FOI shenanigans

@ 10:10 PM

I've been on my hols this week, and on my return I discovered, as predicted, that the DTI had replied to the follow-up Freedom of Information (FOI) enquiry Bruce Lawson and I submitted to them on 26th June. Unfortunately the DTI has rather neatly, but not necessarily fairly, side-stepped the entire issue, and used a technicality to avoid fulfilling our enquiry.

Here's the email from the DTI in full, I'll leave you to have a quick read and draw your own conclusions before reading mine:

Dear Mr Champion

Thank you for your request for information on the accessibility of the Department of Trade and Industry's website which we received on 26/06/06. I regret that we cannot provide this information, as the cost of administering your request would exceed the limit prescribed under Section 12 of the Freedom of Information Act. This is £600, which represents the estimated cost of spending 24 hours in determining whether the Department holds the information, and locating, retrieving and extracting the information. Where the cost of compliance with a request would exceed the appropriate limit, we are not obliged to comply with that request.

We have received nine separate FOI requests regarding the accessibility of the DTI website. All nine requests appear to have been generated by contributors to the website and discussion forum:

Regulation 5(1) of the Freedom of Information and Data Protection (Appropriate Limit and Fees) Regulations 2004 provides that, where two or more requests for the same or similar information are made to a public authority by different persons who appear to be acting in concert or in pursuance of a campaign, those requests may be aggregated for the purposes of estimating whether compliance with the requests would exceed the appropriate limit.

We have aggregated the nine requests received on this subject, and estimate that the cost of compliance with them would exceed the appropriate limit. We are therefore not obliged to provide the information requested.

However, the DTI is aware of the accessibility issues with the new website. An accessibility audit is planned and the recommendations from the audit will identify accessibility improvements.

If you have any queries about this letter, please contact the DTI Response Centre quoting the FOI reference number above.

Appeals procedure

If you are unhappy with the way the Department of Trade and Industry has handled your request you may ask for an internal review. If you wish to complain, you should contact us at:

Department of Trade and Industry
Response Centre
1 Victoria Street
London SW1H 0ET

If you are not content with the outcome of the internal review, you have the right to apply directly to the Information Commissioner for a decision. The Information Commissioner can be contacted at:

Information Commissioner's Office
Wycliffe House
Water Lane
Wilmslow
Cheshire SK9 5AF


DTI Response Centre
Tel: 020 7215 5000

For those of you not familiar with the Freedom of Information Act (FOIA), public authorities subject to its provisions are not obliged to respond to requests where the estimated cost of determining what information the authority holds, locating the information, retrieving it, and, if necessary, editing or redacting it exceeds £600, calculated at a notional rate of £25 per hour.
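As a quick sanity check of those figures, the £600 limit at a notional £25 per hour works out at 24 hours of staff time. A minimal sketch of the calculation, using only the figures quoted above:

```python
# The FOIA "appropriate limit" figures as described in this post:
# £600 for the department, costed at a notional £25 per hour of staff time.
APPROPRIATE_LIMIT_GBP = 600
NOTIONAL_RATE_GBP_PER_HOUR = 25

def exceeds_limit(estimated_hours):
    """True only if the estimated cost goes over (not merely reaches) £600."""
    return estimated_hours * NOTIONAL_RATE_GBP_PER_HOUR > APPROPRIATE_LIMIT_GBP

print(APPROPRIATE_LIMIT_GBP / NOTIONAL_RATE_GBP_PER_HOUR)  # 24.0 hours
print(exceeds_limit(24))   # False - exactly at the limit, not over it
print(exceeds_limit(25))   # True - one extra hour tips it over
```

So an authority wanting to refuse a request only has to estimate that the work would take more than three working days for one member of staff.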

In addition, under regulation 5(1) of the Fees Regulations, made under section 12(4)(b) of the Act, authorities can aggregate multiple enquiries for information from different individuals where it appears to the authority that the requests have been made in concert or as part of a campaign.

In short, the DTI is refusing to answer legitimate questions about the processes it followed in procuring certain services, and about processes it may or may not have in place for future procurement.

The FOIA is regulated by the Information Commissioner's Office (ICO). The ICO issues guidance to public authorities about their responsibilities under the legislation, and to the public about their rights. The ICO offers this guidance (PDF) to authorities refusing requests due to excessive costs:

In this case the DTI has chosen to ignore the first four of these guidelines, merely providing information about appeals procedures. (Ignoring guidelines is clearly something of a habit for the DTI.) They also seem to have missed the point entirely by stating that:

...the DTI is aware of the accessibility issues with the new website. An accessibility audit is planned and the recommendations from the audit will identify accessibility improvements.

It would be a tad distressing if they weren't aware of the issues by now, but of course what we're trying to discover is how they missed the issues in the first place, and what they are doing to prevent it happening again.

The upshot is that I will be requesting an internal review from the DTI, and if I find that unsatisfactory I'll be perfectly happy to appeal to the ICO.

If anyone reading this made an FOI enquiry about the DTI site please get in touch, since we might as well make it a real campaign if the DTI are going to treat it as one. And if you didn't, and you feel strongly about this, please take Bruce's advice and write to your MP.

June 29, 2006

PAS 78 set free

@ 6:04 PM

PAS 78, a guide to good practice in commissioning accessible websites, is now available free of charge from the Disability Rights Commission website (external link).

This is excellent news - if you're the slightest bit responsible for procuring or specifying websites do yourself and your organisation a favour and go grab a copy now. It'll help you tell the difference between the snake-oil salesmen (external link) and bona-fide top-notch web development companies, which can only be a good thing.

June 28, 2006

DTI fail again

@ 3:28 PM

Launched yesterday by Jim Fitzpatrick, Parliamentary Under-Secretary of State for the Department of Trade & Industry, (external link) is described in its launch statement (external link) thusly:

There are now more than 7500 Government contracts advertised on a new business portal which removes the barriers faced by many small businesses to access public sector contracts.

Sadly it doesn't remove any barriers if you browse without javascript enabled, or with a device which doesn't support javascript. And they aren't shy about telling you that you need it: a big, obtrusive warning message alerts users to the need for javascript.

Unfortunately the link to the instruction page is broken.

The site's accessibility statement (external link) includes the most confused definition of WCAG conformance I've ever seen:

Additionally the site meets with Bobby (opens in a new window) approved Conformance Level AA in association with the Web Content Guidelines (opens in a new window).

And of course it fails to meet the claimed level of conformance, for various reasons: it requires scripting to be available, uses deprecated markup in a few places, has non-contiguous headings, lacks labels for form elements, misuses the apostrophe, and best of all introduces a brand new HTML element to the world, <h7> (it's right there on the home page).
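For what it's worth, a bogus element like <h7> and skipped heading levels are trivially machine-detectable. Here's a minimal sketch using Python's standard library html.parser - not the tooling the site's testers used, and the sample markup is invented:

```python
from html.parser import HTMLParser

class HeadingChecker(HTMLParser):
    """Flags bogus heading tags such as <h7> and records heading levels seen."""
    def __init__(self):
        super().__init__()
        self.bogus = []    # heading-like tags outside h1..h6
        self.levels = []   # valid heading levels, in document order

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag names; match h<digits> only.
        if len(tag) > 1 and tag[0] == "h" and tag[1:].isdigit():
            level = int(tag[1:])
            if 1 <= level <= 6:
                self.levels.append(level)
            else:
                self.bogus.append(tag)

checker = HeadingChecker()
checker.feed("<h1>Home</h1><h3>Skipped a level</h3><h7>Not a real element</h7>")
# A jump of more than one level (e.g. h1 straight to h3) is non-contiguous.
skips = [(a, b) for a, b in zip(checker.levels, checker.levels[1:]) if b - a > 1]
print(checker.bogus)  # ['h7']
print(skips)          # [(1, 3)]
```

A check like this won't make a site accessible on its own, but it would have caught <h7> long before launch.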

This is actually a real shame, because the site is quite nicely built in general, but falls down on the detail. There's also evidence of a rush job in the markup, where the (js dependent) search form is commented out, and a beta statement still resides. I'll email them with these observations, and hopefully in due course it will be made more accessible and we can celebrate a decent DTI website for a change.

Blaze Aware - fire safety for sighted people

@ 12:25 PM

The Scottish Executive launched Blaze Aware (external link) yesterday, a new website aimed at raising children's awareness of fire safety precautions. According to the press release (external link) from the Exec:

The Blaze Aware website is designed to be a fun, creative and interactive way of getting the message across.

Shame it wasn't designed to be accessible. There are countless problems with this site: it's unusable without images (no alt attributes on many of them), heavily dependent on javascript for navigation and functionality, and barely keyboard navigable (although to be fair there are some bits of Flash with good keyboard support); it contains audio with no textual equivalent, has a table-based layout from the dark ages, doesn't use a single heading across the site, has pages without titles, and so on.

I love the "low graphics" version, which basically turns off the 460k background image. Except of course it defaults to the high graphics version, so your browser's probably downloaded and cached the image by the time you turn it off. Seems like the DTI have some competition after all.

It's hosted by a company called Civic (external link), who proclaim themselves to be expert in web and digital communication. It's not clear if they were responsible for the development. [Edit: I've been contacted by someone at Civic who has stated that they were not responsible for the design and development of the site.] Bizarrely it's hosted on a subdomain, (external link) - visit the root for a giggle, but be prepared to disconnect immediately!

Oh, and it cost £36,000. I'll be making enquiries...

June 27, 2006

Da Vinci Code Trail - not big or clever

@ 4:45 PM

[Image: Mona Lisa says be accessible]

Imagine for a moment that you can't use a mouse or other pointing device. Maybe you're a screenreader user. The reason is immaterial, but you're dependent on your keyboard or voice recognition software to use your computer. You're also a big fan of the Da Vinci Code (the book that is - the chances of someone being a fan of the film and a keyboard user are too tiny to contemplate).

When Sony Ericsson and O2 announce "The Da Vinci Trail" (external link), an entire site of phone offers, content and secrets from The Da Vinci Code, you're pretty damned excited. Apparently you can get free downloads, win a car, and best of all take on the challenge of The Da Vinci Code Trail for your chance to win The Da Vinci Code experience of a lifetime. And to top it all, the Da Vinci Code Trail website is Segala certified - see the press release on the Segala site (external link) (if you didn't receive the same unsolicited email I did from them) for the full details.

Ah, life is sweet. You visit the site, select the html version over the Flash version (feeling a little like a second class citizen, but that's okay, at least the content is accessible), download some stuff, enter a competition to win a car (even though the markup on the competition entry form is still horribly broken, 48 hours after I reported it), and prepare for the big one, the Da Vinci Code Trail itself.

As you may have already guessed, that's when it all goes pear-shaped. See, the good folk at Sony Ericsson and O2 have seen fit to provide a pretty accessible alternative to some of the content on the site, like the downloads and the car comp (about 8 pages in total), but the Trail itself is a multi-stage Flash game, wholly unusable with anything other than a mouse, for anyone other than a sighted user.

I contacted Segala about this bizarre situation - after all the whole campaign is called "The Da Vinci Code Trail", and all that free audio book, download and win a car competition stuff is only secondary to the main competition - and this is the reason they gave for the Segala certification not including the Trail itself:

The Flash game on the site is actually hosted on Sony Ericsson's domain and was developed independently.

Normally I wouldn't bat an eyelid at this sort of setup, but in this case Segala, O2 and Sony Ericsson are shouting about this half-arsed effort at accessibility as:

a great example of how organisations who are now starting to take accessibility more seriously are not building sites that might just look good and have some really great interactive features but that don't comply with accessibility requirements.

No-one should be under the impression that the discriminatory Da Vinci Code Trail website is acceptable, and it certainly isn't anything to shout about.

More questions for the DTI

@ 1:20 PM

Being unhappy with the DTI's response to our recent enquiries regarding the development of their new website, Bruce Lawson and I have put our heads together and asked them some further questions. You can see the full list on Bruce's site (external link).

When we get answers, probably on or around 21st July, we'll post them here and at Bruce's place, where you'll be able to comment.

June 21, 2006

Next stop - East Midlands Conference Centre

@ 9:45 AM

If you work in the public sector and are interested in web accessibility and broader web development issues for government sites, you might be interested in this event:

Currently in production and back by popular demand is a forum to showcase the leading edge of public sector website development highlighting innovation, usability and compatibility. Government Websites 2.0 - The Next Generation (external link) will be held on the 15th August at the East Midlands Conference Centre (Nottingham).

Not sure about "leading edge", but you'll get to hear me bang on about accessibility for half an hour or so, and participate in the panel, discussing the question "What is the purpose and function of a local authority website?". If anyone knows the answer please email me, otherwise I'll just have to talk bollocks and hope no-one notices...

June 18, 2006

The DTI Responds

@ 1:26 PM

Last month I posted about the disquiet I felt about the DTI's new website (external link). Subsequently I emailed an enquiry to the department, requesting 6 pieces of information about the development of the website. My main concerns were that the site was inaccessible, ignored almost wholesale the government's own guidelines on the development of websites, and consequently was an example of the misuse of public funds.

Exactly 20 working days later on Friday June 16th (boy do they know their rights under the FOI legislation) I received a reply. It makes for interesting reading, but for me raises more questions than it answers.

The Q&A

For the record here are the 6 things I asked for and the responses I received, verbatim (with my added links):

The total budget and actual spend for development of the new website.
There was a budget of approximately £200,000 for the development of the new website. The spend on website development is estimated at £175,000 which includes costs from Fresh01 (external link) and the Department's main IT supplier, Fujitsu (external link).
Whether the website was developed by a team at the DTI or by a private company. If the latter please provide the name of the company.
The website was designed under contract by Fresh 01 (external link). The design was then implemented by the Department's main IT supplier, Fujitsu (external link), into a Content Management System (external link).
A copy of the requirements document for the production of the new website.
A copy of the requirements document is attached. This formed part of the 'Invitation to Tender for rebuild of the website, brief for customer research, design & information architecture, and usability testing phases'. [Download the document - dti.pdf (118kb, PDF format); dti.doc (87kb, MS Word format)]
A copy of any tender documentation related to the production of the new website.
Unfortunately the DTI considered their answer to the previous question to also answer this, despite the mention of a more comprehensive 'Invitation to Tender' document. I'll attempt to secure a copy of this in my next information request.
The basis for this statement on the DTI website: "This website meets the World Wide Web Consortium (W3C) Web Accessibility Initiative (WAI) AA-level standard."
The statement relating to accessibility was an error. It was removed from the website on 19 May 2006 when we reviewed the site in the light of questions raised.
Details of what quality assurance procedures were followed to ensure the new website met the requirements of the department and satisfied the relevant legal requirements for websites.
Two main rounds of User Assurance Testing were carried out on each template of the Content Management System, using test scripts. There was no formal User Assurance Testing for accessibility.


Let's take a look at the accessibility issues. The requirements document, echoing the government's own standards, specifies that:

10. Companies should note that the final website must comply with the Government Website Guidelines and Level AA of the Web Accessibility Initiative.

It also states that one of the key objectives of the DTI website rebuild is:

To be a leading example of usable, accessible web design

Finally, in Annex I, the objectives of the Usability Testing Phase are stated in these terms:

We need to ensure that we provide high quality, usable templates for incorporation into the Percussion CMS. To do this we need a robust programme of usability testing carried out during the design and build phase. This is important in ensuring that the site meets accessibility guidelines for the disabled and other groups, but it is also intended to improve the experience for all users. We need to ensure that users can find what they need to quickly and easily on the site.

We will expect the successful tenderer for this phase of the project to work closely with Percussion and also the company responsible for developing the templates. The results of usability tests will feed into the process of new page templates as they are developed.

Testing should be carried out with representative groups of the DTI site's users and potential users. The final website must comply with the Government Website Guidelines and Level AA of the Web Accessibility Initiative.

All laudable stuff, but sadly the reality doesn't match the rhetoric. Why did the DTI:

This seems to be a classic case of a gap between the standards of accessibility a commissioner stipulates in a requirements document and their ability to verify that those standards have been met by suppliers. What concerns me is that no-one in the DTI, or in any other central government department (the eGU anyone?), seems to have taken it upon themselves to fulfil that quality assurance role; as I've said many times before, it's a major failing of many current government web projects.

Who's to blame?

It's hard to see who comes out of this with any credit at all:

What happens now?

It's fascinating to me how a government department can spend £200,000 in 2006 on such a poor website. There's no shortage of guidance, advice and support for those seeking to produce and commission quality websites today, so why did the DTI and its suppliers fail to take pretty much any of it on board? What steps will the DTI be taking now they know the website doesn't meet the objectives and requirements they stipulated? I'll be trying to learn more over the coming weeks - I've been directed to make further enquiries about the website to the DTI's Response Centre, and of course I'll post what I do learn here.

June 14, 2006

Website Accessibility 2006 thoughts

@ 11:26 AM

Just a very quick post to say a big thank you to the organisers, speakers and delegates at the Website Accessibility 2006 conference which was held in Edinburgh yesterday. As I mentioned previously I signed up to give a talk about the lessons I learned when redeveloping ClacksWeb.

I had a great time, enjoyed giving my talk, met a lot of very nice people who were very enthusiastic about and committed to web accessibility, and I learned a lot too. It seemed that most other people had a good time too, and found it worthwhile (but then I didn't see the feedback forms!). There was good, informed participation from the audience, and a very wide range of organisations represented, including government, charities, the BBC and large corporations. It was very encouraging to see that accessibility was clearly on their radar.

I'm still in Edinburgh, preparing to travel down to London for @media - I'll post my presentation on this site or on ClacksWeb early next week, and will catch up with those people who requested a copy of my development plan.

June 2, 2006

Beer drinker's guide to WCAG2

@ 2:15 PM

Bruce Lawson expresses in plain, accessible language what I'm sure many of us are feeling about WCAG2.0:

WCAG 2.0: when I want a beer, don't give me shandy (external link)


May 17, 2006

DTI achieves new low

@ 10:56 PM

[Image: layout tables galore on the DTI website]

The launch of yet another government website is usually accompanied by a feeling of disappointment, resignation and perhaps mild surprise. This week though I'm truly shocked by the mind-numbing, soul-crushing, bile-inducing awfulness of a new UK central government website. I've checked the date on this news release (external link) at least half a dozen times in the hope that it says May 2000 and not May 2006, or will reveal itself to be a sick joke. But no luck, it's a fact, the DTI's newly revamped website (external link) is about as shit as it's possible for a large, corporate website to be.

To make matters worse it's clear that they either don't know how shit it is, or don't care. Take their accessibility page (external link) for example, which boldly claims AA-level standard (sic), and provides a mine of useful information such as how to change the size and colour of text in Netscape. The entire site (thousands of pages at a guess) appears to be devoid of a single heading. It uses a javascript pop-up to provide a printable version of pages.

This time though I'm not just going to whinge about it here, I've been galvanised into action. I'm determined to do some digging to find out just what process was followed to produce this monstrosity, how much it cost and why the eGovernment Unit (external link), whose mission according to the PM is ensuring that IT supports the business transformation of Government itself so that we can provide better, more efficient, public services, are failing so miserably in their responsibility to promote best practice across government.

May 15, 2006

OPSI daisy

@ 1:51 PM

Apparently our favourite automated accessibility testing company, SiteMorse, has been working with the Office of Public Sector Information (external link) (OPSI) to make the OPSI website accessible. The impressive press release at e-consultancy (external link) tells us that the site is "accessible to all", that OPSI are aiming for AAA compliance, and that SiteMorse is part of the ideal solution for them to achieve it.

John Sheridan, who heads up the OPSI, goes so far as to say:

Automated testing was the obvious answer as it can check thousands of pages and site journey permutations in minutes, saving time and resources compared to manual testing. Of course there is still a need for manual testing for areas that cannot be checked automatically, e.g. images matching alternative text tags.

After reading the press release you'd be forgiven for thinking that the OPSI site must be a paragon of accessibility, representing the very best current practice and thinking around web accessibility. Sadly you'd be wrong. Sure it's better than many central government websites, but as I've documented here in the past that's not a very difficult thing to achieve.

I only spent 10 minutes on the OPSI site, but here is a list of the serious accessibility problems I identified in that short time (and which were obviously not picked up by SiteMorse):

These are all basic errors which any developer with an understanding of accessibility and web standards issues would have avoided during the design and build phases of site development.

The intention here isn't to pillory the OPSI. They've got a vast range of information across thousands of pages which they are trying to make as accessible as possible. The problem is that, like many local authorities (external link), they appear to have been seduced into thinking that the way to achieve accessibility is to run automated tests, then pick up the pieces. This approach is fundamentally flawed. Fixing the things found by automated software does not make an inaccessible site accessible.
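The point about the limits of automated software bears illustrating. Here is a minimal sketch (in Python, purely for illustration; the class and markup are my own hypothetical example): a tool can verify that an img element has an alt attribute at all, but it has no way of judging whether the text is actually meaningful to a screen reader user.

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Count img elements that have no alt attribute at all."""
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the start tag
        if tag == "img" and "alt" not in dict(attrs):
            self.missing += 1

checker = AltChecker()
# The first image fails the automated check; the second passes it,
# even though alt="DSC00042.JPG" is useless to a screen reader user.
checker.feed('<img src="logo.gif"><img src="photo.jpg" alt="DSC00042.JPG">')
print(checker.missing)  # → 1
```

Only the structural omission is detectable by machine; judging the quality of the alternative text still needs a human.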

Accessibility must be built in from the start, and that obviously requires an understanding of what makes an accessible site. The answer is to invest in your own knowledge of accessibility (buy some books, visit some forums, subscribe to some mailing lists) and to apply that knowledge and understanding to the design and build of your website. Then use the W3C validator (external link) and a free tool like TAW3 (external link) (both extremely helpful, though more for catching typos than fundamental grammatical errors), and finally get some users to test it. Just don't believe the SiteMorse hype.

Thanks to Isolani (external link) for the OPSI/SiteMorse link.

April 4, 2006

GAWDs Meet 2006

@ 7:18 AM

On Saturday I popped through to Glasgow to meet with fellow members of the Guild of Accessible Web Designers  (external link) (GAWDs) including the founder of the guild, Jim Byrne. While there were only 9 in attendance (7 members, one spouse and a kindly minute-taker) the discussion was lively and good-humoured. Apart from myself the members who made it were:

It's always nice to put faces to names, and the thumbnail portraits on the GAWDs site just don't do some members justice! Kudos to Gareth and Sense Scotland for hosting the event and for providing refreshments. I'll not go into any great detail here about the discussion that ensued, for fear of misrepresentation - hopefully there will be a minute produced and published, thanks to Anne-Marie, again of Sense Scotland.

Discussion during the morning session concentrated on the background of GAWDs, how it is constituted, its aims and objectives, and how it might develop in future. A few personal highlights (or things I remember), but please note these are my recollections and may not reflect the views of the other members who were in attendance or GAWDs itself:

Unfortunately I had to leave shortly after lunch due to domestic commitments, but I assume that the afternoon discussion was as lively as the morning, and that the evening activities were undertaken responsibly! Apologies for the lack of photos, my camera battery died after two shots, but there were other, better-prepared snappers there so we should see some pictures soon.

March 30, 2006

Seduced by automated testing?

@ 6:39 PM

There's a wee bit of this year's Better Connected that escaped my attention on first reading, but happily an item in Headstar's very worthy E-access Bulletin (external link) this week led me back to the report. It concerns the disparity between the WAI conformance claims on councils' websites and their real level of conformance.

Of the 296 sites in the transactional (T) and content plus (C+) categories which claimed a particular level of conformance, only 69 were found to achieve that level in reality, or just 23%. There are only really two explanations for such an alarming disparity - either the councils in question are deliberately over-stating their conformance level, or, more likely in my opinion, they are being led to believe that their sites are achieving a higher conformance level than they really are.
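For clarity, the 23% figure follows directly from the numbers reported (a quick sanity check, using only the figures quoted above):

```python
claimed = 296   # T and C+ sites claiming a WAI conformance level
verified = 69   # sites found to actually achieve the level claimed
print(round(100 * verified / claimed))  # → 23
```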

Better Connected suggests that the culprit might be automated testing:

There is no doubt that achieving Level A is hard work and that measuring it is a complex business. Many might also be lulled into thinking that passing the automated tests of Level A (and Level AA and AAA) means that you have achieved conformance at those levels.

If this is true it's fair to say that the use of automated testing is effectively damaging the accessibility of the sites in question, rather than improving it. Given the gravity afforded to the SiteMorse league tables in some quarters, it's easy to understand why councils might be seduced into developing and measuring their sites using the company's tool alone. But as has been said before (external link), the number of WAI guidelines that can be reliably tested with automated software is very small indeed, and the only way to really know if your site is accessible is to have people use it, preferably disabled people using a range of assistive technologies.

An analogy I like to use with non-technical, non-web managers is that of a car's MOT Test (for non-UK readers the MOT Test is a comprehensive safety test that cars have to pass every year). A full accessibility audit is like an MOT Test - it delves into aspects of your site's performance and accessibility that you can't reach yourself, and that you really aren't qualified to judge. An automated test on the other hand is like emailing a photograph of your car, with a note of the make, model and year, to a bloke who knows a bit about cars, and having him judge on that evidence alone if your car is road-worthy and safe to travel in. Which would you rather your passengers travelled in?

PS: I know this is a familiar refrain, and I know that I bang on about it all the time, but I've been convinced of the value of repetition (external link) by Jeff Atwood.

March 29, 2006

Guardian Inside View

@ 12:50 PM

When Better Connected (External link) was published at the beginning of the month I was asked if I'd write a short piece for the Guardian newspaper detailing our approach to web development at Clackmannanshire Council, with a slight emphasis on accessibility. I was very happy to do so, and it was finally published in today's paper, in the Epublic supplement. You can also read it online at the Guardian site - Why size doesn't matter in setting web standards (External link).

Also worth a read is the headline article in the supplement, Online, but out of touch (External link).

March 24, 2006

Where are the gatekeepers?

@ 10:53 PM

I'm a great fan of standards. They provide a constant point of reference, an ideal to measure yourself and others against. Not just the standards that are set for us by the W3C and their ilk, that have far-reaching and universal benefits, but also those we set for ourselves. Without standards how can we know for sure that we're achieving the levels of quality we aspire to?

It seems to me to be a lack of standards that has led Feather to post about the (mis)use of significant wads of public money  (external link) for the sponsorship of a conference, some of which will have gone towards developing an inaccessible website  (external link). He asks:

What if our provincial and federal governments made web accessibility a requirement for actually receiving the sponsorship money? What if organizations that get any funding from the government had to have accessible web sites? Would any of that help awareness? Would it make a difference? Is it simply that accessibility wasn't a requirement on the project, and so it just didn't happen?

My gut feeling is that awareness of web accessibility issues is still next to zero outside of the small but steadily expanding web standards clique. We are getting there, slowly, but when the websites of the agencies Derek cites, the Ontario Media Development Corporation  (external link), Canadian Heritage  (external link) and the Canadian Broadcasting Corporation  (external link), aren't accessible, what hope is there that the websites of projects they are sponsoring are going to be held to higher standards?

Here in the UK we're no less guilty. In recent months I've covered the brand new and in many cases horribly inaccessible websites of agencies either partially or wholly funded by our national government, and expressed my frustration at their disregard for (or ignorance of) the government's own standards  (external link).

The question I keep asking is where are the gatekeepers in these scenarios? Setting standards is only effective when someone is doing the measuring at the sharp end, fulfilling the quality assurance role that gives practical foundation to the commitment made in the standards. Or you could call it putting your money where your mouth is.

I'd like to see an enforcement of standards for all new domains, with domains only allocated to projects once they have demonstrated the necessary commitment and follow-through on accessibility and other standards. Currently the conditions for use of domains (external link) state:

When you are using a domain name to deliver a web presence you are reminded that websites should comply with the e-Government Interoperability Framework, the Guidelines for UK Government websites and Framework for Local Government particularly on such issues as use of metadata, PICS labelling, accessibility and security.

Excuse my language, but screw "reminding" them, that's all just a bit too afternoon tea and bowler hats for my liking. If they don't comply then don't allocate the domain, or if it's already been allocated then withdraw the domain, after a warning shot if you want to be soft. I'm sure it would concentrate the mind wonderfully.

March 12, 2006

I can't complain, but sometimes I still do

@ 7:08 PM

Blair asks a searching question (external link) about inaccessible websites over at The Letter - just who do you complain to?

A user's first instinct might be to contact the site owner, but my personal experience is that very few even acknowledge such complaints, let alone act on them. Last year I (anonymously) emailed over a dozen local authority websites which had serious accessibility problems despite claiming AA or AAA conformance, but only one responded (kudos to East Renfrewshire Council (external link) who quickly addressed some of the problems and edited their accessibility statement).

Blair's suggestion is a link to the DRC's website inviting users who have found an inaccessible website to report it to the DRC. A fine idea in my opinion, which might pave the way to a dedicated reporting facility should the DRC see the volume of complaints increase.

March 8, 2006

PAS 78

@ 1:50 PM

It's a big day for web accessibility in the UK with the launch and publication of Publicly Available Specification (PAS) 78 (external link) in London. The specification was developed by the Disability Rights Commission in collaboration with the British Standards Institute, providing guidance to those commissioning websites to help them to understand the issues around web accessibility, and to ensure that the work they commission results in accessible sites.

It's attracted plenty of publicity today, including a good piece on BBC News (external link), so here's hoping that it makes a material difference in the months and years to come.


February 21, 2006

Visionary Design Awards

@ 6:05 PM

[Image: Robert Llewellyn playing Kryten]

I found out today that ClacksWeb (external link) has been shortlisted in the Public Sector category of the Visionary Design Awards (external link), and needless to say I'm chuffed. So I'm off to London next week for the awards, presented by old ice cube head himself, Robert Llewellyn.

There are some interesting sites on the shortlists for all the categories, but what particularly caught my eye was the Inaccessible Website Award category. What a fantastic idea that is - it should get some good publicity, and while I'm sure that Kate Bush and Blays Net Ratings deserve their nominations, I can't help but feel that if Disney World were to win it would bring the greatest benefit to the accessibility cause.

Update 23rd February: Last night BBC Radio Scotland (external link) ran a short piece on the awards on their Newsdrive programme. It's nice to hear web accessibility discussed on national radio, especially at prime time (this went out at about 5:30pm). Kudos to Radio Scotland. You can download it here: radioscotland.mp3 (2mb, 4 minutes 27 seconds, mp3).

Update 4th March: Well, we didn't win, but I had a great night nonetheless, thanks in no small part to the lovely Judy Friend and Pat Beech between whom I sat. The speakers were extremely good, as was the food and wine. Congratulations to Great Sampford Primary School (external link) and all the other winners, except Kate Bush that is. :O)

February 15, 2006

User-defined accesskeys - update

@ 7:59 PM

In response to a post by Mike Cherim to the GAWDs (external link) mailing list today, I've updated my accesskeys script and re-evaluated the way accesskey defaults should be handled. Mike was contacted by a user whose name contains an accented letter, which he enters using the keystrokes Alt+0228. If a site implements the UK Government accesskey recommendations, Alt-0 is the accesskey for the accesskey page, and a conflict arises which prevents the user from producing his or her accented character. In fact any site implementing accesskeys 0-9 is creating potential conflicts for users who need to input extended characters.

The solution is to implement no default accesskeys. To make it easier for a user to set standard keys I've extended the script to allow the site owner to provide suggested keys, which the user can set with a single form button (an idea borrowed from the implementation by Gez and Rich). I've also fixed a bug which was outputting empty accesskey attributes in some instances (thanks to Gez for the heads-up).

The extended script is running at ClacksWeb (external link) and will be running at (external link) in due course.

February 7, 2006

User-defined accesskeys

@ 10:09 PM

The drawbacks of accesskeys are well documented (external link), but one way of mitigating those drawbacks is to allow the user to define their own accesskeys for a site. In the absence of an established standard this is the best compromise - those who do make use of the functionality can do so, those who have problems with application or OS conflicts can disable them.

The implementation I'll be describing here is a server-side solution, using PHP. Before I started work on this I was aware of the work done by Rich Pedley and Gez Lemon (external link), but hadn't looked at their scripts. I wasn't aware of the work done by Thierry Koblentz (external link), and only found his implementation when searching for Rich and Gez's. Rich and Gez are clearly much classier coders than I, having provided an OOP solution - my version is totally procedural.

So why bother producing yet another script when there are already at least two out there? Two primary reasons - firstly I wanted to provide user-defined accesskeys at (external link) and at my day job  (external link), and that meant I needed to be totally comfortable and intimate with the code and the way it worked; secondly it's a great learning exercise to try to reproduce something someone else has already produced, and then to compare and contrast.

Key features

The script

The script is very easy to install and use:

  1. Download the code (1k), or copy and paste it from below and save it as
  2. Edit the $accesskeypages array. This array contains the URLs of the pages you wish to provide accesskeys for. Each URL has an associative array defining the token (internal name), default accesskey (can be left empty), and label (displayed on the user form).
  3. Include the script on every page where you want accesskeys to be available, ideally via a global or header include file. As it sets a cookie it must be included before output is sent to the user's browser.
  4. Pass a URL to the output_accesskey() function and it will return a string containing the accesskey for that URL if one has been set. For example:
    $navigation_links = array("Home" => "/index.php", "Contact" => "/contact/", "Accessibility" => "/accessibility/");
    echo '<ul>';
    foreach ($navigation_links as $label => $url) {
      echo '<li><a href="' . $url . '"' . output_accesskey($url) . '>' . $label . '</a></li>';
    }
    echo '</ul>';

To display the form for a user to set their accesskeys call the output_accesskeys_form() function. For example:

include "";
include "";
echo output_accesskeys_form();
include "";

The script contains no styling, so you'll probably want to add some classes or ids to the output_accesskeys_form() function and apply CSS accordingly.

Possible enhancements

There are a few easy enhancements which could be made - providing suggested keys and a button to implement these, and sanity-checking a user's choice of key (for example detecting and warning of duplicates) to name two.
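The duplicate check in particular is straightforward. Here is a minimal sketch of the logic (in Python for illustration only; the script itself is PHP, and the function name is my own):

```python
def find_duplicates(keys):
    """Return the set of accesskey values assigned more than once, ignoring blanks."""
    seen, dupes = set(), set()
    for key in keys:
        if key and key in seen:
            dupes.add(key)
        seen.add(key)
    return dupes

# Two pages both set to "1" - the clash would be reported back to the user
# before the cookie is saved.
print(find_duplicates(["1", "", "9", "1"]))  # → {'1'}
```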

In the wild

The script can be seen in action in two places at the time of writing:

If you do use the script please let me know and I'll add the site to the list.

The script

<?php
// Pages to offer accesskeys for: each URL maps to a token (internal name),
// default key (may be empty), label (shown on the user form) and suggested key.
$accesskeypages = array(
    "/index.php" => array("token" => "home", "default" => "", "label" => "Home", "suggested" => "1"),
    "/accessibility/" => array("token" => "accessibility", "default" => "", "label" => "Accessibility", "suggested" => "0"),
    "/contact/" => array("token" => "contact", "default" => "", "label" => "Contact", "suggested" => "9")
);

// Handle a form submission: store the chosen keys in a cookie, then redirect.
// This must run before any output is sent to the browser.
if ($_POST["accesskeys"]) {
    $setaccesskeys = array();
    if ($_POST["submit"]) {
        // Keys typed in by the user
        for ($x = 0; $x < count($_POST["accesskeys"]); $x++) {
            $setaccesskeys[$_POST["token"][$x]] = $_POST["accesskeys"][$x];
        }
    } else if ($_POST["suggested"]) {
        // User accepted the site's suggested keys with a single click
        foreach ($accesskeypages as $accesskeypage) {
            $setaccesskeys[$accesskeypage["token"]] = $accesskeypage["suggested"];
        }
    }
    setcookie("accesskeys", base64_encode(serialize($setaccesskeys)), 2147483647, "/");
    header("Location: " . $_SERVER['PHP_SELF'] . "?ak=1");
}

// Load the user's stored keys, falling back to the defaults.
if ($_COOKIE["accesskeys"]) {
    $useraccesskeys = unserialize(base64_decode($_COOKIE["accesskeys"]));
} else {
    foreach ($accesskeypages as $akarray) {
        $useraccesskeys[$akarray["token"]] = $akarray["default"];
    }
}

// Return an accesskey attribute string for the given URL, if one is set.
function output_accesskey($url) {
    global $accesskeypages;
    global $useraccesskeys;
    if ($useraccesskeys[$accesskeypages[$url]["token"]] != "") {
        return ' accesskey="' . $useraccesskeys[$accesskeypages[$url]["token"]] . '"';
    }
}

// Build the form which lets the user set or clear their accesskeys.
function output_accesskeys_form() {
    global $accesskeypages;
    global $useraccesskeys;
    $akform = '';
    if ($_GET["ak"]) {
        $akform .= '<p>Your accesskeys settings have been saved.</p>';
    }
    $akform .= '<form action="' . $_SERVER['PHP_SELF'] . '" method="post"><fieldset><legend>Current settings</legend>';
    foreach ($accesskeypages as $akarray) {
        $akform .= '<div><label for="' . $akarray["token"] . '">' . $akarray["label"];
        if (isset($akarray["suggested"])) $akform .= ' <em>Suggested key: ' . $akarray["suggested"] . '</em>';
        $akform .= '</label> ';
        $akform .= '<input type="text" maxlength="1" size="3" name="accesskeys[]" id="' . $akarray["token"] . '"';
        if (isset($useraccesskeys[$akarray["token"]])) $akform .= ' value="' . $useraccesskeys[$akarray["token"]] . '"';
        $akform .= ' /><input type="hidden" name="token[]" value="' . $akarray["token"] . '" /></div>';
    }
    $akform .= '</fieldset><div><input type="submit" value="Set Accesskeys" name="submit" /> <input type="submit" value="Clear Accesskeys" name="reset" /></div></form>';
    return $akform;
}
?>

January 22, 2006

@ 9:02 PM

This is a somewhat tardy announcement, but I'm delighted to be involved with (external site), a new site showcasing the best of accessible websites and proving that accessible != rubbish design. Sort of a StyleGala for accessibility, it's the brainchild of Mike Cherim (external site), and an excellent job he's done, both of the site and of the scheme of ranking (external site) sites.

I'm still toying with the idea of creating an accessibility hall of shame - a showcase of those sites that singularly fail to live up to the standards they claim to have achieved. There's still no shortage of sites that proudly display their WAI badges and provide flowery accessibility statements rendered in 8px text, and they should be made an example of if you ask me. So, AAA Shame - good idea or not?

November 28, 2005

No surprises in EU report

@ 7:43 PM

As reported on Out-law (external link) and many other sites, a report published by the European Parliament has found that just 3% of public sector websites are achieving WCAG Conformance Level AA. It should be noted that 'public sector' in this report equates to central government and not local government, where the picture is slightly rosier.

It's no surprise to me that conformance levels are so low, at least in the UK. Following my recent mini-rant about visas4UK winning a major award, I contacted the eGovernment Unit (eGU) to ask what their response was to such an inaccessible site being lauded as best practice. They responded (eventually) thus:

The Cabinet Office does provide guidance to both central and local government on the eAccessibility of websites... However, this is guidance only and it is the practical, and indeed legal, responsibility of individual departments and their web management teams on how they interpret and apply such guidelines in order to comply with, eg, the Disability Discrimination Act.

My response:

What pressure or sanctions do departments face if they fail to adopt the guidelines issued by the eGU? It appears that you are not interested at all in whether departments actually follow the guidance, you're simply concerned with making sure the guidance is robust. If the only pressure is the threat of legal action then it's effectively no pressure at all.

Who is playing the essential quality assurance role in this scenario, to ensure that poorly designed, inaccessible sites aren't emerging from departments?

I doubt I'll ever get a reply, but until there is significant pressure from within government, the situation isn't going to improve in the near (or possibly even distant) future.


November 9, 2005

GC Accessibility Award

@ 9:02 AM

As reported previously, ClacksWeb was shortlisted for the Accessibility prize at the Good Communication Awards. On Monday I was lucky enough to be at the Bafta building in London to hear the site announced as the winner, and to receive the award from Garry Richardson, of BBC Sports Report fame. The first eight awards were presented by Phil Woolas, Minister for Local Government, but he had to dash off to another place, so Garry stepped into the breach for the later awards, including ours. Can't say I was gutted. ;O)

How much an award means depends on a lot of things, such as the number, breadth and quality of entries, and one of the most important factors is the authority of the judges. It was great then to see that the judging panel were all people in the industry I respected - Patrick Lauke  (external link), Derek Featherstone  (external link), Richard Conyard  (external link) and Donna Smilie  (external link) - and it made it a bit more special to know that the judges knew what they were talking about.

After breakfast at the fantastic Patisserie Valerie  (external link) on Piccadilly we also got to do a bit of shopping, hitting Fortnum & Mason  (external link), Hamleys  (external link) and Molton Brown  (external link). I don't miss living in London, but it would be nice if I could have chocolates from F&M's more than once a year!

November 6, 2005

Shaw Trust Report Overview

@ 11:31 AM

On Wednesday last week I received the audit report on ClacksWeb from the Shaw Trust Web Accreditation Service. Some of you may be aware that by Friday I had completed the necessary remedial work, and the site was accredited and given the Trust's "accessible plus" award. This was partly due to me having already addressed some of the issues in the report both from the user testing I observed, and from feedback I'd received from the Trust during the audit process. That notwithstanding I thought it would be useful to provide an overview of the report itself, and at a later date some of the more interesting and esoteric audit findings.

Our 56-page written report consists of 5 sections:

  1. Executive summary
  2. Background & methodology
  3. User testing
  4. Technical audit
  5. Automated testing

Executive summary

A nicely written couple of pages which were perfect for me to copy to my management, this section provided a high-level overview of the findings together with the remedial action required before accreditation could be awarded. In our case the site didn't require a full repeat audit, but in some cases this is needed to ensure the audit report has been understood and the issues addressed adequately.

Background & methodology

This section provides an introduction to the report, an explanation of what web accessibility is, and why it's important, and information about adaptive technologies. It also provides an overview of the Trust's testing methodology, describing what user testing is carried out and why, and brief details of the technical audit process.

User testing

Into the meat of the report: this section covers tests performed by each group of users (low vision, blind, mobility impaired, dyslexic & deaf), and the problems encountered, together with URLs, screen shots and comments from testers. A summary of the users' overall impressions is given at the end of the section.

Having observed one of the days of user testing myself, I had already addressed some of the issues in this section, but reading it through with the full range of tester experiences listed reminded me of the value it added.

Technical audit

The contents of the technical audit are based on automated tests, carried out using InFocus, but extensively interpreted and moderated by human checks. It's worth stressing that these results are a long, long way from the raw automated testing results produced by services such as SiteMorse - which brings home how little of the WCAG can be tested by software alone.

Instead of covering each WCAG checkpoint in turn, the technical audit is broken into functional areas such as 'tables', 'images', 'site map' and so forth, and within each of these areas a good deal of narrative is provided explaining the relevant issues as they relate to the site in question. Where action is required this is clearly stated, together with URLs of offending pages, or where necessary broader guidance, for example for the handling of PDFs.

This section surprised me - I was expecting a dry, less useful summary of the technical shortcomings of the site, but it proved to be much more than that.

Automated testing

I received the report by email, and was intrigued to see a zip file attached as well as the report document itself. It contained the 900+ HTML files generated during the InFocus automated testing, along with a screen shot of the options used when the software was run. The last section of the written report provides information about these InFocus files, pointing out false positives and issues to be addressed that aren't covered elsewhere.

At first it was a bit of a slog nailing down individual problems from all those files, but once I'd understood the structure of the reports, using the extended search functions in HomeSite made it a breeze to get a quick list of pages which were affected by a particular problem.

My view

All in all the report is impressive - it provides information for those in the organisation who need a summary but don't need to know the technical nitty-gritty. It provides softer, humanised user testing feedback which reminds you that the work we do is for people, not validators and automated accessibility testing software, and finally it provides expert technical advice which if followed and understood can only elevate the accessibility of the site in question. And above all it's usable - not a tome filled with technical references and jargon, but a practical, real-world guide to improving the audited site.

October 26, 2005

Shortlisted for GC Award

@ 6:07 PM

It was very gratifying to learn on Monday that ClacksWeb is one of three government sites shortlisted for the 'Accessibility Award' in the e-Government category of the Good Communication Awards 2005. I'll be popping down to London on 7th November for the awards event, hosted by Phil Woolas MP, Minister for Local Government, to learn the decision of the judging panel, and even if we don't win it's great exposure for the Council.

October 25, 2005

Shaw Trust Prologue

@ 6:02 PM

A couple of enquiries from readers of Blether about my relationship with the Shaw Trust have made me realise that a vital part of the story has been missed - the bits that happened before the ClacksWeb audit even began.

Development on the site started in earnest in autumn 2004, with a target completion date of March 2005. The site's development plan had provision for contracting an external expert auditor, preferably covering both usability and accessibility. I recognised at an early stage that the testing I could arrange myself was inadequate in depth, breadth and quality. In the end the budget didn't exist to get any professional testing done at all before the site's launch. That doesn't mean that no testing was done, however.

I'd imagine that like many developers in large organisations, my main usability testing pool consisted of colleagues with differing levels of web experience, selected to try different bits of the site as it progressed. Forms, navigation, colour schemes - these things and many others all got the informal review treatment, and predictably the results were never conclusive. It was a useful format for finding some server-side bugs, but beyond that it was hard to separate subjective preferences from true usability issues.

For accessibility testing I had access to one blind screenreader (Jaws) user, who was relatively new to computers and new to the web. Whilst his input was valuable at the time, in retrospect I'm certain it wasn't representative of the average Jaws user. He lacked the experience to have developed strategies for using the software to overcome the barriers that exist in even the most carefully constructed site, so many of the problems he experienced weren't necessarily site design issues. Apart from this I read the books (Joe Clark's and Jim Thatcher's) and tested and retested against the WCAG myself.

And that was it for testing. The site launched on schedule, but as with many sizeable web projects (the site has about 1,000 static pages and tens of thousands of dynamic pages), for a couple of months immediately afterwards I fire-fought problems that would have been picked up by better testing, but that I wouldn't have had the time to fix even if I had known about them. It's called a public beta I suppose!

A new financial year brought the possibility of getting some proper testing done. I contacted a number of usability and accessibility testing service providers and received a few quotes. It soon became clear that the option of separate usability and accessibility testing was beyond my meagre budget, so a decision was made to focus on accessibility, and on getting as much as I could for the Council's money. I looked at a number of accessibility audit providers, but the Shaw Trust's pan-disability user testing was the deciding factor. No other provider I found came close to offering the same breadth of user testing.

I also appreciated the Trust's decision to work with a commercial partner, CDSM, in providing the technical audit. The Trust's primary business isn't the web, but CDSM's is - the combination of expert awareness of disability and expert knowledge of the technical web was unique in my experience.

With the decision made it was a matter of getting more details of the audit process, satisfying myself that it was what we needed, establishing cost, and formally contracting with the Trust. A pre-audit questionnaire provided the Trust with full details of the site.

That completed, the audit was ready to start.

October 20, 2005

Shaw Trust User Testing

@ 8:29 PM

The testers: Linda, Steve, Malcolm, Jamie and Mark

Accessibility was a prominent feature of ClacksWeb's development plan, reflecting its status as a local authority site and my increasing awareness of accessibility as a critical issue. As work on the site progressed in early 2005 I undertook regular checks to make sure it was on track to deliver, both by checking against the WCAG and by using very simple tests like increasing the text size, browsing the site with Lynx, and navigating with the keyboard. Yesterday it became very clear to me just how inadequate that sort of testing was in truly determining whether or not the site was accessible.

I had the pleasure of spending the day at the Shaw Trust's regional headquarters in Llandarcy, Wales, experiencing first-hand some of the user testing of ClacksWeb. Accompanied by Andrea Kennedy, the Trust's Web Accessibility Services Officer, and Grant Broome of CDSM, who co-devised the audit programme and conducts the technical audit, I met, observed and learned from 6 users with various disabilities and levels of web experience. They were:

The users are provided with scripts by Andrea - basically a series of tasks; yesterday, for example, the users were required to register with MyClacksWeb - and are asked to record their positive and negative experiences while working their way through them. They all use PCs running Windows XP. The atmosphere is relaxed and informal - this isn't a sterile, clinical testing lab, but somewhere the testers seem to come to socialise as well as make their services available to the Trust. They are all volunteers, although there are plans to create a social enterprise offering user testing on a commercial basis - an excellent idea in my opinion.

I spent some time with each user, observing and asking questions, or in the case of Mark listening in to his use of Jaws. This was an invaluable experience for me, seeing and hearing how these users navigated the site, what barriers they were facing and what strategies they each used to overcome them. It only took a couple of minutes of observing Mark, and having him talk me through his perception of the site, to discover a serious problem with the site's contextual navigation menus. Basically they are placed just above the destination of the site's "skip to content" link, so he never knew that contextual navigation existed, and was forced to use alternative methods such as search or the A to Z to complete the tasks.
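To see why the menu was invisible to Mark, a simplified, hypothetical sketch of the markup ordering helps (this is not ClacksWeb's actual code; the class names and links are invented for illustration):

```html
<!-- Hypothetical sketch of the problem, not ClacksWeb's actual markup.
     The contextual menu sits between the skip link and its target, so a
     screenreader user who follows the link jumps straight past it and
     never knows the menu exists. -->
<body>
  <a href="#content">Skip to content</a>

  <ul class="contextual-nav">
    <li><a href="/council-tax/">Council Tax</a></li>
    <li><a href="/bins/">Bins and recycling</a></li>
  </ul>

  <!-- The skip link lands here, just below the menu -->
  <div id="content">
    <h1>Page title</h1>
  </div>
</body>
```

Placing the menu after the skip target, or offering a separate link to it, would put it back on the path that skip-link users actually follow.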

Another serious problem was encountered by Steve, the keyboard user - I hadn't specified any focus styling on links, so when tabbing around the site it wasn't obvious which link had the focus. The browser default, a faint dotted border, just wasn't enough for Steve to perceive the current focus. These two examples show just how important pan-disability user testing is. Neither problem would have been discovered by automated testing (99% of pages on the site are valid XHTML and satisfy WCAG to at least AA), and neither user was affected by the problem the other discovered.

More testing was scheduled for today, by another group of users with different disabilities, with the technical audit to follow soon thereafter. I expect to receive the initial report late next week, and based on my experiences yesterday have no doubt that it will prove to be an extremely valuable resource.

October 18, 2005

DTI e-commerce award winner horror

@ 4:41 PM

The winners of the DTI E-commerce awards for 2005 were announced last week. I must admit it passed me by (I was on my hols) but I was interested to see who had won the 'eGovernment National ICT Innovators Award' on my return. According to the award site:

These awards will recognise best practice in the development of information and communication technologies (ICT) by UK business and public sector communities.

Heady stuff, and there's more:

Awards will be presented to individuals and/or project teams that have demonstrated clarity of thinking and development/deployment of an approach that has the capacity to change the ICT paradigm in the particular technology, market sector.

And the winner was... visa4UK which allows you to apply for a UK visa online (no surprises there). What is surprising is just how far from e-commerce best practice the winning site is, and just how inaccessible. There's not enough time for a full-fledged dissection of the site's technical shortcomings in terms of accessibility and web standards, but the real shocker is the fact that you can't apply for a visa without javascript.

While it's not a surprise - there are still a vast number of sites out there that don't degrade gracefully - the fact that this site won a national award is evidence of the long, long way we still have to go as an industry before accessibility and web standards are established as de facto good practice.
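By way of contrast, graceful degradation for a form like this is straightforward. The sketch below is hypothetical (the field names and URL are invented, not taken from visa4UK): the form submits to the server whether or not JavaScript is available, and script only adds a convenience on top.

```html
<!-- Hypothetical sketch of graceful degradation: the form works with a
     plain server-side submission; JavaScript merely enhances it. -->
<form action="/visa/apply" method="post">
  <label for="surname">Surname</label>
  <input type="text" id="surname" name="surname">
  <input type="submit" value="Continue">
</form>
<script type="text/javascript">
  // Enhancement only: catch empty submissions before a server round-trip.
  // With scripting off, the server performs the same check instead.
  document.forms[0].onsubmit = function () {
    return this.surname.value !== '';
  };
</script>
```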

October 17, 2005

A Shaw start

@ 3:35 PM

Today marks the start of the Shaw Trust web accessibility audit of ClacksWeb, and is hopefully the first step on the way to Shaw Trust Accreditation for the site.

For those of you who aren't aware of the Trust and its web accessibility services, they offer perhaps the most comprehensive pan-disability audit and accreditation scheme available in the UK (and possibly further afield). As well as automated tests, the Trust uses a panel of real users, with a range of disabilities.

There's a lot more information on the Shaw Trust website, and I'll be recording my experiences of the process on this site as we progress. Although I'm experiencing some high anxiety at the moment (what will they find wrong with the site?!) it's an exciting prospect to have quality feedback from real users, and going through the process can only help to improve the site.