Web Standards Archive

June 8, 2007

@media 2007 London: day one retrospective

@ 8:10 AM

A quick take on the talks I saw yesterday at @media:

Jesse James Garrett, Beyond Ajax

He's clearly a very smart and successful bloke, and it was a tough ask to follow Jeffrey Zeldman and Eric Meyer in delivering the keynote, which is maybe why I was somewhat disappointed. Some great snippets of content and ideas, but a little lacking in focus, no clear central theme, and no real conclusion or summary. The lack of audience questions may have been a result of shock and awe, or more likely a lack of challenge.

Jason Santa Maria, Diabolical Design

Enjoyed this one a lot. Jason provided some excellent insights for non-designers like myself into the processes he and other visual designers go through when working on a design.

Nate Koechley, High Performance Web Pages

Probably my favourite talk of the day. I'm a sucker for quick-fire, technique- and knowledge-based presentations, and this fitted the bill exactly. The great thing about the material Nate presented was that it was backed up by the might of Yahoo! and had clearly been extensively tested. So when they say that the optimum number of hosts per page is between 2 and 4, you should listen.

YSlow looks like a fantastic tool (integrating it into Firebug is a master stroke), so here's hoping it will be released into the wild later in the year as Nate intimated.

Dan Cederholm, Interface Design Juggling

Dan's a great presenter: relaxed, funny, and personable. This started very well, with the genius ToupeePal (external link), but then suddenly we were into 20 minutes of Microformats. Shame - if I'd wanted to hear the Microformats pitch again I'd have gone to see Tantek's talk (and heckled about the abuse of abbr). So A+ for entertainment in the first half, D- for interest in the second half.

Mark Boulton, Five Simple Steps to Better Typography

Great stuff. It had a strong learning element, interwoven with a martial arts story culminating in a one-inch punch (external link). Delivered in a conversational style that worked brilliantly.

Joe Clark, When Web Accessibility Is Not Your Problem

Announced the release of the WCAG Samurai (external link) errata for WCAG 1.0, along with two peer reviews. Go read it now; you have 3 weeks to email comments that may be considered before the final version.

Made mildly controversial statements about certain matters of accessibility, most of which I agreed with, and at least one of which I didn't, at least in the context in which it was illustrated.

The point I disagreed with was that we shouldn't trouble ourselves to ensure link texts are unique on a page, even when they lead to different destinations, or to ensure they make sense out of context. The link list is a commonly used feature in JAWS, and it doesn't take much effort to accommodate it, even if you want to present visually identical link texts.
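
One low-effort way of doing that is to keep the visible link text identical but add a hidden qualifier for screen reader users. A minimal sketch, assuming a hypothetical "offscreen" class and made-up URLs (this is one approach, not the only one):

<a href="/planning/chimneys.htm">Read more<span class="offscreen"> about chimney stack removal</span></a>
<a href="/planning/extensions.htm">Read more<span class="offscreen"> about house extensions</span></a>

.offscreen { position: absolute; left: -999em; }

The links look the same on screen, but each appears with a distinct, meaningful text in the JAWS link list.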

Can't remember what the other point was, didn't take notes for some reason.

Unfortunately Joe over-ran: he should have been prompted to wind up at least 10 minutes earlier, so that he could have made the judgement call himself about whether to keep delivering or to wrap up and let the questions come. As it was there was no time for questions, which caused a little consternation in some quarters.

In summary

Despite the atmosphere feeling a bit flat (and it might just have been me, although speaking to some other third-timers it didn't seem to be), looking back now at my notes it was generally as high-class as in previous years. Maybe my expectations of some of the speakers were unrealistic?

The food was delicious (no mean feat on that scale), the free bar lasted a lot longer than anyone except the most tight-fisted could have hoped for, and there were plenty of networking opportunities all day and night. Music was still too loud at the party though.

Sadly due to other commitments I couldn't make any of the sessions today, so I'll need to rely on the slides and hopefully podcasts.

May 20, 2007

Review: RiverDocs Converter

@ 1:14 PM

Disclosure: This is a paid review. RiverDocs Limited have had no influence on the tone or content of this review.

Summary

An essential tool for any organisation which publishes Microsoft Word or PDF files online, RiverDocs Converter is vastly superior to any other conversion software currently available. There's now no reason for publishers not to offer accessible, high quality HTML versions of documents previously published only in proprietary formats. The parser even compensates for poorly authored source documents, previously a significant barrier to producing accessible, semantic HTML versions of Word and PDF documents.

It's not a magic bullet though - every conversion requires human checking, and documents of any complexity need input from an experienced web editor - but despite a slightly weak editor it's still well worth the price, and it will only get better in future versions given the publisher's focus on research and development.

Introduction

RiverDocs Converter is a software package for the Microsoft Windows operating system which claims to convert documents designed for print into structured, accessible HTML documents for online delivery. In short this means it'll take PDFs and Microsoft Word files and attempt to convert them into a format more suitable for delivery and consumption over the web.

PDF and MS Word are beloved of government and corporations who often need to publish large documents quickly, but these formats are primarily designed for printing, not for delivery online, and have serious accessibility issues associated with them. So the potential benefits from effective conversion software are enormous - being able to offer HTML versions of these documents cost effectively is something that hasn't been possible before.

Installation

Installation was straightforward, taking a couple of minutes on my workhorse desktop PC.

RiverDocs Installation

The software requires the Microsoft .NET Framework 2.0; if this isn't already installed and available you will be prompted to download and install it.

Getting started

Starting the software for the first time you are presented with a quick guide to converting your first document, and the clean, functional RiverDocs interface.

RiverDocs interface

Test 1 - my first conversion

To test the software for the first time I used a PDF document regarding chimney stack removal I found on Cambridge City Council's website at:

It's a 4 page document containing a cover sheet, and a mix of different levels of heading, bullets and images. The PDF document was not tagged.

Opening the file displays it in the main RiverDocs window:

RiverDocs showing PDF file

Clicking the Convert button started the conversion, which took less than a second using the default settings. The interface changes to a split-screen affair, with the original document in the left pane, and the converted document in the right pane:

RiverDocs PDF / HTML conversion split panes

To give an idea of the quality of conversion and mark-up the software can produce automatically, I wanted to save the document immediately. Admittedly this is not the intended real-world usage of the product, but it does show the quality of the baseline conversion prior to manual editing.

Big River had provided me with a one page crib-sheet covering the major interface elements, so I knew that the Save function was for saving a RiverDocs project, and the Publish function was for saving the converted document as HTML, CSS and images.

Clicking the Publish button presents the Publish dialogue box:

RiverDocs publish dialogue box

In addition to publishing as HTML, the software also supports output in CHM (Microsoft Compiled HTML Help) format.

To keep things tidy I wanted to publish this version into a new folder, but this is not a standard Windows file dialogue box, and doesn't provide the facility to create a new folder, so I had to switch out to Windows Explorer to do this before publishing the document in RiverDocs.

But, it turns out the file name entered into this dialogue box is actually used as a folder name, which will be created for you and into which the document is published. These sorts of interface issues are symptomatic of the software's relative youth, and will no doubt be ironed out as the product matures.

Publishing this document took less than a second; here are the results:

The default settings produce HTML documents with an XHTML 1.0 Transitional doctype, generating a separate HTML file for each page of the source PDF, an index HTML document containing a generated table of contents, a single CSS file and an images folder containing converted images. The CSS is valid, and attempts to mimic the style of the original document as closely as possible.
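
Purely to illustrate the shape of that output (the file names, title and contents here are my own sketch, not the converter's actual naming), the generated index document might look something like this:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>Dealing with chimney stacks - contents</title>
<link rel="stylesheet" type="text/css" href="styles.css" />
</head>
<body>
<h1>Dealing with chimney stacks</h1>
<ul>
<li><a href="page1.html">Page 1</a></li>
<li><a href="page2.html">Page 2</a></li>
<li><a href="page3.html">Page 3</a></li>
<li><a href="page4.html">Page 4</a></li>
</ul>
</body>
</html>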

As a comparison I ran the same file through Abbyy's PDF Transformer, another PDF to HTML conversion tool. The results were vastly less impressive:

The Abbyy software makes no attempt to produce structured HTML, instead presenting every single line in the document as a paragraph and styling them to appear as closely as possible to the original PDF.

In general the quality of the default output from RiverDocs is extremely impressive. In this case there were just two validation problems: an unclosed list item in the generated table of contents, and missing alt attributes for the images on the final page. Since the default output is "section-based", the parser moved the words "GUIDANCE NOTES" onto a page of their own, despite displaying them as part of the title page in the preview pane; this was the only deviation from the page layout of the original.

But this isn't a fair test of the software which wasn't designed to be operated in this manner. While the results are good, they aren't good enough to publish without manual editing, so let's try again, only this time using some good old human judgement.

Test 2 - getting serious

For the second test I wanted to take the same document but publish it to a single HTML document of the highest quality as close to the original format as possible. The process is the same - open the file to be converted, and click Convert.

Metadata

Before getting stuck into the document itself I wanted to specify some metadata for it. Fortunately RiverDocs makes this very easy to do (just click the Metadata button), and provides a default set of Dublin Core elements for completion:

Metadata editor

It appears that additional user-defined elements can be created, so publishers in UK government for example can easily add eGMS metadata to converted documents:

eGMS metadata entry

Unfortunately these additional elements didn't make it to my published document, a bug I've reported to Big River.
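
For readers unfamiliar with these schemes, Dublin Core and eGMS metadata end up as plain meta elements in the head of the published document, along the lines of the sketch below (the element names and values are illustrative, not taken from my converted document):

<head>
<title>Dealing with chimney stacks</title>
<meta name="DC.title" content="Dealing with chimney stacks" />
<meta name="DC.creator" content="Cambridge City Council" />
<meta name="DC.date" content="2007-05-20" />
<meta name="DC.subject" scheme="eGMS.IPSV" content="Planning permission" />
</head>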

Options

RiverDocs options

RiverDocs offers a number of options to customise the output of the converted document. The most important are:

The editor

For many users the area of the application where most time will be spent is the HTML editor, where the converted output can be modified and fine-tuned. In most cases this will be to either match the original document or to conform to a house web publishing style.

The editor always presents the output document in a page-by-page format, regardless of the publish mode that's currently set. It would be nice to be able to preview the single page and section-based options.

The editor can be used visually in preview mode, or in source mode which provides a simple text editor view of the document page you're working on. As I wanted a single file output and had set the options accordingly there was something of a disconnection between working on a separate HTML file for each page, and the intended output. As far as I can see there is no way to preview the single file output prior to publishing.

RiverDocs toolbar

The toolbar provides the standard editing tools you'd expect to find in a simple HTML editor. These generally work as expected, although there are some quirks - for example, undo only remembers changes made since you last switched to source mode, so if you make a change, switch to source mode and back to visual mode, you'll need to correct any errors manually rather than undoing them.

Once you've got used to the way the editor functions it's a reasonably comfortable working environment, but don't expect it to have the functionality of DreamWeaver. I can foresee many users doing the initial conversion in RiverDocs and then taking the published output into the editor of their choice to complete the process: indeed if I was using RiverDocs on a daily basis to convert a large number of files this is the way I'd work - the software's value lies in its conversion capabilities, not its editing capabilities.

One of the most common problems arising from automatic conversion is images lacking appropriate alt attributes. Editing images is easy - select the image in the editor, and click the image icon:

Editing image properties

The id is a temporary value used by the software during conversion and editing, and is removed on publishing.
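
As a sketch of the end result (the markup here is illustrative rather than RiverDocs' exact output), an image that comes through the automatic conversion with an empty alt attribute:

<img src="images/page4-image1.png" alt="" />

can be given a meaningful description via this dialogue, so that it publishes as:

<img src="images/page4-image1.png" alt="Cross-section of a wall showing where the chimney breast has been removed" />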

Screen capture

One very nice feature of RiverDocs is the screen capture tool. On the final page of the original PDF is a diagram showing a cross-section of a wall, with some labels indicating particular features of the diagram. Since the PDF was generated from Adobe Pagemaker, the diagram consists of an image object and a series of text objects for the labels. In the automatic conversion RiverDocs quite rightly converted these separately, which can be seen on the last page of the output of test 1.

In my final version I want the image and labels as a single image, and this is where the screen capture tool comes in:

Screen capture tool in action

It operates like any screen capture tool you've used before - highlight the area to be captured and click an icon. In RiverDocs the highlighted area will be inserted into your HTML document as an image.

You've got issues

Issues icon

The software provides assistance to help you identify and correct potential issues with the converted document. The Issues icon gives a quick idea of the number of issues identified by the software at any stage after automatic conversion. Clicking the icon opens a third pane with details of the issues:

Three pane view - original version, conversion and issues

The potential issues highlighted include missing alt attributes on images. I was disappointed to note that alt text from objects in tagged PDFs wasn't carried across to the converted HTML document. Otherwise the guidance provided by the issues is sound, based as it is on HTML Tidy - those of you familiar with the Tidy extension for Firefox will know what to expect.

For non-expert users this provides an extremely useful indication of where there are potential problems in the converted document, and the separation of current page issues and whole document issues guides such users through the document with ease. Personally I was more comfortable editing the document first before using the issues tool - picking up the issues I could see, modifying structure, adding or correcting alt attributes, generally tidying the document up - but that's probably no more than a reflection of my workflow habits.

Test 2 results

Here's the output:

It took 10 minutes from opening the original PDF to publishing this version - very impressive results in such a short space of time.

Test 3 - getting more complex

To really test the software we need something a little more complex than a single-column, text and images document. On the Clackmannanshire Council website I found a 24 page consultation document laid out in 2 columns, which included multiple levels of headings and a data table:

The untouched output from RiverDocs shows its limitations, but is still an impressive result:

It took me about 30 minutes to tidy the document up in RiverDocs, but I was still left with a lot of redundant classes with names like "font19" and all those named anchors generated for the table of contents. Cleaning up the mark-up in RiverDocs proved to be a bit of a chore, so I tried again, this time dumping the output immediately into DreamWeaver.

15 minutes later I had this clean, structured version of the PDF:

My conclusion - if your document is anything more complex than single-column text then forgo the RiverDocs editor for your favourite HTML editor.

Test 4 - Microsoft Word

So what about Microsoft Word conversion? Well, this review was produced in a simple Word document, so I ran it through the RiverDocs Converter for publishing online. Here is the untouched conversion:

This was a 12 page Word document, and conversion took noticeably longer than PDF conversion, at about 20 seconds. The only real issues with the conversion were the failure to convert Word bullets to HTML lists and the failure to pick up alternative text on images. Other than this the structure was accurately represented and the images correctly positioned.

The converter doesn't appear to parse the styles used in Word documents - I converted a test document which was styled throughout as paragraphs, with headings simply made bold in a larger font size, and the headings were still picked up. RiverDocs evidently analyses font size and weight and assigns heading levels accordingly, which means it accommodates poorly authored, unstructured source documents. This is a great feature given the preponderance of incorrectly produced Word documents in many organisations.
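
To sketch what that means in practice (the markup on both sides is invented for illustration, not lifted from actual RiverDocs output), a Word "heading" that is really just a bold paragraph in a larger font:

<p><span style="font-size: 16pt; font-weight: bold;">Removing the chimney stack</span></p>

is recognised by the converter and published as a genuine heading:

<h2>Removing the chimney stack</h2>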

Annoyances

Given the immaturity of the package there are some inevitable annoyances with the interface and output:

possible in source mode. In a long document this can quickly become tedious. It would also be an improvement if the TOC used ids rather than named anchors.

<p class="font9"><span class="font9"><strong>NOTE: Some chimneys act as a buttress and provide support to long walls.</strong></span> <strong>Please check with Building Control or a structural engineer</strong><span class="font9"><strong>, before</strong></span> <span class="font9"><strong>proceeding, to determine if this is the case.</strong></span></p>

None of these are major problems though, and I would expect the interface to improve as the software is developed further. The key feature of the product is the conversion algorithm, which is extremely impressive.

Conclusions

RiverDocs is an impressive product and an essential tool for any organisation that needs to publish more than a small number of PDF and Word documents online. Simple documents take no time at all to convert and tidy using the RiverDocs editor, while more complex documents, I found, are best converted in RiverDocs and then edited in a more powerful and functional dedicated HTML editor such as DreamWeaver.

The true value of RiverDocs lies in its ability to turn unstructured, multi-column PDF documents into structured HTML documents, whilst maintaining the correct reading order. Critically, the intelligent parsing engine compensates for low-quality source documents, previously a real barrier to producing HTML versions of PDF and Word documents.

Future versions of RiverDocs are very likely to offer significant improvements, both in the quality of conversion and in the application interface. As well as being a single-product company, concentrating solely on the development of the RiverDocs Converter, they fund applied research at Queen's University Belfast and other universities working in the fields of accessibility, artificial intelligence and character recognition.

About the reviewer

Dan Champion has worked in the web industry since 1995 through his company Champion Internet Solutions Limited, with clients in the private and public sectors. Between 1999 and 2007 he was responsible for Clackmannanshire Council's multi-award winning websites.

He is a regular speaker on the subjects of web accessibility, web standards and web strategy at conferences and workshops throughout the UK, has written on the subjects of e-government and web accessibility for the Guardian, and featured on national BBC Radio in various guises.

March 2, 2007

Better Connected & web accessibility

@ 9:35 PM

SOCITM's Better Connected 2007 is published next week. Almost a year ago to the day I posted somewhat critically about the report's use of SiteMorse, and its reliance on automated testing for some of its findings. This year I've become rather more personally interested in the report - I was disappointed to learn earlier this week that ClacksWeb is not one of the 2 sites which were found to conform with WCAG level AA.

I'll cut to the chase. BC's assessment of the accessibility of local authority websites is fundamentally flawed. Admittedly this is a reflection of the use of the Web Content Accessibility Guidelines 1.0 as the instrument of measurement, but it's flawed all the same.

The single most important aspect of that flaw is this: syntactically valid HTML is not a primary indicator of web accessibility, and by the same token syntactically invalid HTML does not categorically indicate an inaccessible website.

Valid HTML is at best a proxy indicator of web accessibility - that is, an indicator that doesn't have a causal link with the outcome (in this case an accessible website), but rather is something that is likely to be found where the outcome exists. Simply put, web developers who appreciate the issues around accessibility are more likely to be informed professionals who also appreciate the benefits of adopting and adhering to web standards. However, just as with SiteMorse's much maligned league tables, using HTML validity as an initial filter to identify "more accessible" sites is wholly invalid.

For the purposes of Better Connected an arbitrary threshold of 50 errors across 200 tested pages was used. Any sites reporting fewer than 50 errors went forward to be considered for WCAG AA conformance; those reporting more did not. Leaving aside this arbitrary limit, this also shows a gross failure of logic - to conform to level AA of WCAG a site must surely report zero errors across its 200 pages? A single error breaches checkpoint 3.2 of the guidelines, rendering the site unable to conform to level AA.

The Web Content Accessibility Guidelines are 8 years old this year. In web terms they are at the very least pensionable, and quite probably pushing up the daisies. And remember they are guidelines, and as time passes it becomes more important that those using them as guidance recognise this.

Education is the key to improving the state of web accessibility, whether we're talking about government or any other sector. Web developers and managers, content editors, suppliers of applications that produce web-based output - all of these people require a sound understanding of the accessibility issues in their respective areas of operation to achieve and sustain an accessible online presence, and that understanding can only come through learning.

A good start would be to make the findings of the automated tests for BC available to the local authorities themselves. I was disappointed to discover 158 validation errors had been found on ClacksWeb - was it a single error across 158 pages, or one really bad page? The two scenarios have quite different implications for me as a manager, but to date I've been unable to elicit the details, and the errors aren't apparent on the site any longer.

Little fault, if any, should be attributed to the RNIB for this state of affairs - there is no practical way 468 websites can be adequately tested for accessibility on an annual basis without a significant financial and resource commitment.

The solution, however unpalatable it might be to the bean counters who seem to have a desperate need to rank and score us all, is to abandon the concept of ranking 468 websites for accessibility, and to stop testing them against an 8 year-old set of guidelines. Instead SOCITM should much more wisely employ the expertise of the highly skilled and knowledgeable staff at the RNIB to identify, highlight and promote best practice in web accessibility, both in the local government sector and beyond. I'm certain the WAC staff could come up with some fantastic educational resources if they were given free rein with SOCITM's financial contribution for BC. The current state of affairs is like asking the Michelin Guide to judge restaurants on the quality of their cutlery.

The question I keep coming back to is this - what does the Better Connected reporting of web accessibility achieve? Last year it painted a fairly depressing picture, and this year that picture is almost identical. If SOCITM wants to be an agent for change it needs to do more than just report that a problem exists, and start putting its members' best interests first by helping them to address the problem.

July 31, 2006

Redesign: Rucksack Readers

@ 2:12 PM

I've had a web design business since 1995, and although over the past few years I've gradually run it down and let clients go, largely to spend more time doing other things, there's one client I've kept on and have no plans of letting go.

Rucksack Readers (external link) was founded in 2000 by a good friend of ours, Jetta Megarry, and it's been great to watch the development of and be involved with a successful, growing business. The company produces guide books for long distance walks, treks and now the seven summits (the highest mountain in each continent). Their books are second to none in their sector - superior design and killer content combine to make them essential for those actually undertaking the routes covered in the books, and the photography is enough to make them ideal coffee table reading or gifts.

Anyway, today the Rucksack Readers website (external link) was relaunched, following a standards-based, accessible redesign which has taken shape over the last month. I was responsible for everything at the web end - the scripts, the CMS, and the front-end code and CSS. The site is visually designed by Ian Clydesdale at Workhorse Design (external link), who also does the design for the books, with Jetta producing the content. It's been a real team effort, and hopefully the results will speak for themselves.

July 21, 2006

DTI update - FOI shenanigans

@ 10:10 PM

I've been on my hols this week, and on my return I discovered, as predicted, that the DTI had replied to the follow-up Freedom of Information (FOI) enquiry Bruce Lawson and I submitted to them on 26th June. Unfortunately the DTI has rather neatly, but not necessarily fairly, side-stepped the entire issue, and used a technicality to avoid fulfilling our enquiry.

Here's the email from the DTI in full; I'll leave you to have a quick read and draw your own conclusions before reading mine:

Dear Mr Champion

Thank you for your request for information on the accessibility of the Department of Trade and Industry's website which we received on 26/06/06. I regret that we cannot provide this information, as the cost of administering your request would exceed the limit prescribed under Section 12 of the Freedom of Information Act. This is £600, which represents the estimated cost of spending 24 hours in determining whether the Department holds the information, and locating, retrieving and extracting the information. Where the cost of compliance with a request would exceed the appropriate limit, we are not obliged to comply with that request.

We have received nine separate FOI requests regarding the accessibility of the DTI website. All nine requests appear to have been generated by contributors to the blether.com website and discussion forum:

http://www.blether.com/archives/2006/06/the_dti_respond.php
http://www.brucelawson.co.uk/index.php/2006/stupid-government-websites/

Regulation 5(1) of the Freedom of Information and Data Protection (Appropriate Limit and Fees) Regulations 2004 provides that, where two or more requests for the same or similar information are made to a public authority by different persons who appear to be acting in concert or in pursuance of a campaign, those requests may be aggregated for the purposes of estimating whether compliance with the requests would exceed the appropriate limit.

We have aggregated the nine requests received on this subject, and estimate that the cost of compliance with them would exceed the appropriate limit. We are therefore not obliged to provide the information requested.

However, the DTI is aware of the accessibility issues with the new website. An accessibility audit is planned and the recommendations from the audit will identify accessibility improvements.

If you have any queries about this letter, please contact the DTI Response Centre quoting the FOI reference number above.

Appeals procedure

If you are unhappy with the way the Department of Trade and Industry has handled your request you may ask for an internal review. If you wish to complain, you should contact us at:

Department of Trade and Industry
Response Centre
1 Victoria Street
London SW1H 0ET
dti.enquiries@dti.gsi.gov.uk

If you are not content with the outcome of the internal review, you have the right to apply directly to the Information Commissioner for a decision. The Information Commissioner can be contacted at:

Information Commissioner's Office
Wycliffe House
Water Lane
Wilmslow
Cheshire SK9 5AF

Regards

DTI Response Centre
Tel: 020 7215 5000

For those of you not familiar with the Freedom of Information Act (FOIA), public authorities subject to its provisions are not obliged to respond to requests where the estimated cost of determining what information the authority holds, locating the information, retrieving it, and, if necessary, editing or redacting it exceeds £600, calculated at a notional rate of £25 per hour.

In addition, under section 12(4)(b) of the Act, authorities can aggregate multiple enquiries for information from different individuals where it appears to the authority that the requests have been made in concert or as part of a campaign.

In short, the DTI is refusing to answer legitimate questions about the processes it followed in procuring certain services, and about processes it may or may not have in place for future procurement.

The FOIA is regulated by the Information Commissioner's Office (ICO). The ICO issues guidance to public authorities about their responsibilities under the legislation, and to the public about their rights. The ICO offers this guidance (PDF) to authorities refusing requests due to excessive costs:

In this case the DTI has chosen to ignore the first four of these guidelines, merely providing information about appeals procedures. (Ignoring guidelines is clearly something of a habit for the DTI.) They also seem to have missed the point entirely by stating that:

...the DTI is aware of the accessibility issues with the new website. An accessibility audit is planned and the recommendations from the audit will identify accessibility improvements.

It would be a tad distressing if they weren't aware of the issues by now, but of course what we're trying to discover is how they missed the issues in the first place, and what they are doing to prevent it happening again.

The upshot is that I will be requesting an internal review from the DTI, and if I find that unsatisfactory I'll be perfectly happy to appeal to the ICO.

If anyone reading this made an FOI enquiry about the DTI site please get in touch, since we might as well make it a real campaign if the DTI are going to treat it as one. And if you didn't, and you feel strongly about this, please take Bruce's advice and write to your MP.

June 18, 2006

The DTI Responds

@ 1:26 PM

Last month I posted about the disquiet I felt about the DTI's new website (external link). Subsequently I emailed an enquiry to the department, requesting 6 pieces of information about the development of the website. My main concerns were that the site was inaccessible, ignored almost wholesale the government's own guidelines on the development of websites, and was consequently an example of the misuse of public funds.

Exactly 20 working days later on Friday June 16th (boy do they know their rights under the FOI legislation) I received a reply. It makes for interesting reading, but for me raises more questions than it answers.

The Q&A

For the record here are the 6 things I asked for and the responses I received, verbatim (with my added links):

  1. The total budget and actual spend for development of the new website.
     There was a budget of approximately £200000 for the development of the new website. The spend on website development is estimated at £175000 which includes costs from Fresh01 (external link) and the Department's main IT supplier, Fujitsu (external link).
  2. Whether the website was developed by a team at the DTI or by a private company. If the latter please provide the name of the company.
     The website was designed under contract by Fresh 01 (external link). The design was then implemented by the Department's main IT supplier, Fujitsu (external link), into a Content Management System (external link).
  3. A copy of the requirements document for the production of the new website.
     A copy of the requirements document is attached. This formed part of the 'Invitation to Tender for rebuild of the website, brief for customer research, design & information architecture, and usability testing phases'. [Download the document - dti.pdf (118kb, PDF format); dti.doc (87kb, MS Word format)]
  4. A copy of any tender documentation related to the production of the new website.
     Unfortunately the DTI considered their answer to the previous question to also answer this, despite the mention of a more comprehensive 'Invitation to Tender' document. I'll attempt to secure a copy of this in my next information request.
  5. The basis for this statement on the DTI website: "This website meets the World Wide Web Consortium (W3C) Web Accessibility Initiative (WAI) AA-level standard."
     The statement relating to accessibility was an error. It was removed from the website on 19 May 2006 when we reviewed the site in the light of questions raised.
  6. Details of what quality assurance procedures were followed to ensure the new website met the requirements of the department and satisfied the relevant legal requirements for websites.
     Two main rounds of User Assurance Testing were carried out on each template of the Content Management System, using test scripts. There was no formal User Assurance Testing for accessibility.

Accessibility

Let's take a look at the accessibility issues. The requirements document, echoing the government's own standards, specifies that:

"10. Companies should note that the final website must comply with the Government Website Guidelines: http://www.e-envoy.gov.uk/oee/oee.nsf/sections/webguidelines-handbook-top/$file/handbookindex.htm and Level AA of the Web Accessibility Initiative (http://www.w3.org/WAI)."

It also states that one of the key objectives of the DTI website rebuild is:

"To be a leading example of usable, accessible web design"

Finally, in Annex I, the objectives of the Usability Testing Phase are stated in these terms:

We need to ensure that we provide high quality, usable templates for incorporation into the Percussion CMS. To do this we need a robust programme of usability testing carried out during the design and build phase. This is important in ensuring that the site meets accessibility guidelines for the disabled and other groups, but it is also intended to improve the experience for all users. We need to ensure that users can find what they need to quickly and easily on the site.

We will expect the successful tenderer for this phase of the project to work closely with Percussion and also the company responsible for developing the templates. The results of usability tests will feed into the process of new page templates as they are developed.

Testing should be carried out with representative groups of the DTI site's users and potential users. The final website must comply with the Government Website Guidelines and Level AA of the Web Accessibility Initiative.

All laudable stuff, but sadly the reality doesn't match the rhetoric. Why did the DTI:

This seems to be a classic case of a gap between the standards of accessibility a commissioner stipulates in a requirements document and their ability to verify that those standards have been met by suppliers. What concerns me is that no-one in the DTI or in any other central government department (the eGU anyone?) seems to have taken it upon themselves to fulfil that quality assurance role, and as I've said many times before this is currently a major failing of many government web projects.

Who's to blame?

It's hard to see who comes out of this with any credit at all:

What happens now?

It's fascinating to me how a government department can spend £200,000 in 2006 on such a poor website. There's no shortage of guidance, advice and support for those seeking to produce and commission quality websites today, so why did the DTI and its suppliers fail to take pretty much any of it on board? What steps will the DTI be taking now they know the website doesn't meet the objectives and requirements they stipulated? I'll be trying to learn more over the coming weeks - I've been directed to make further enquiries about the website to the DTI's Response Centre (dti.enquiries@dti.gsi.gov.uk) and of course I'll post what I do learn here.

May 19, 2006

My top 3 public sector sites

@ 9:37 PM

William Heath from the Ideal Government Project (external link) turned the tables on me a bit after my rant about the new DTI website, and asked:

What are your three top-rated public-service web sites? And is it just accessibility you focus on, or content and effectiveness?

To answer the second question first, I don't just focus on accessibility, but I do believe that a high degree of accessibility is a fundamental characteristic of any quality website. It's a good general indicator - to be truly accessible a site must have a number of other inherent qualities, including valid, semantic mark-up, good information architecture and a usable interface.

I also like to see apparently minor features like clean, technology-neutral URLs, good 404 pages, robust error recovery, user-friendly functions like RSS feeds and mailing lists, and effective search. It's this attention to detail that separates a great site from an average site for me.
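
To give a concrete (and entirely made-up) example of what I mean by a clean, technology-neutral URL, compare

http://www.example.gov.uk/news/article.asp?articleid=1234

with

http://www.example.gov.uk/news/2006/may/library-opening

The second gives away nothing about the technology behind it, so it can survive a change of platform, and it's far easier to read, link to and remember.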

My three top-rated public service sites? It's a great question, and the answer is naturally very subjective. All of these sites have issues (as does every site I've ever worked on I hasten to add), but I'll go for:

  1. Royal Borough of Kensington and Chelsea (external link)
  2. London Borough of Lambeth (external link)
  3. Lincolnshire County Council (external link)

Try as I might I couldn't come up with an exemplar from central government. I did consider the National Crime Squad (external link) since it's been standards-based for a long time now, but it's a defunct body and the site won't be there much longer.

So what are your top three public sector sites? What gems have I missed? They needn't be UK or English-language sites - it would be really interesting to hear of some top-notch sites from elsewhere.

May 17, 2006

DTI achieves new low

@ 10:56 PM

Layout tables galore on the DTI website

Usually the launch of yet another poor public sector website is accompanied by a feeling of disappointment, resignation and perhaps mild surprise. This week though I'm truly shocked by the mind-numbing, soul-crushing, bile-inducing awfulness of a new UK central government website. I've checked the date on this news release (external link) at least half a dozen times in the hope that it says May 2000 and not May 2006, or will reveal itself to be a sick joke. But no luck, it's a fact: the DTI's newly revamped website (external link) is about as shit as it's possible for a large, corporate website to be.

To make matters worse it's clear that they either don't know how shit it is, or don't care. Take their accessibility page (external link) for example, which boldly claims AA-level standard (sic), and provides a mine of useful information such as how to change the size and colour of text in Netscape. The entire site (thousands of pages at a guess) appears to be devoid of a single heading. It uses a javascript pop-up to provide a printable version of pages.
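
For the record, the standards-based alternative to a javascript pop-up "printable version" is a single line in the head of every page pointing at a print stylesheet (the file name here is illustrative):

<link rel="stylesheet" type="text/css" href="print.css" media="print" />

With that in place the same page prints cleanly, with no pop-ups or duplicate "printer-friendly" URLs required.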

This time though I'm not just going to whinge about it here: I've been galvanised into action. I'm determined to do some digging to find out just what process was followed to produce this monstrosity, how much it cost, and why the eGovernment Unit (external link) - whose mission according to the PM is "ensuring that IT supports the business transformation of Government itself so that we can provide better, more efficient, public services" - is failing so miserably in its responsibility to promote best practice across government.

March 29, 2006

Guardian Inside View

@ 12:50 PM

When Better Connected (External link) was published at the beginning of the month I was asked if I'd write a short piece for the Guardian newspaper detailing our approach to web development at Clackmannanshire Council, with a slight emphasis on accessibility. I was very happy to do so, and it was finally published in today's paper, in the Epublic supplement. You can also read it online at the Guardian site - Why size doesn't matter in setting web standards (External link).

Also worth a read is the headline article in the supplement, Online, but out of touch (External link).

February 2, 2006

CIPFA Governance

@ 5:43 PM

<sigh>

Here we go again. Another shiny new website launched in the public sector, this time for the CIPFA Governance Network  (external link). Another table-based, inaccessible, doctype-less, javascript-dependent, tag-soup of a site.

They claim:

"the public expect more from those who govern in terms of standards, behaviour and outcomes."

Hmm. So long as those aren't web standards I guess the public will be much happier now.

New professionalism  (external link)? Nah, same old comfortable amateurism.

January 26, 2006

Google Web Authoring Statistics

@ 4:35 PM

As part of their work with the WHAT (Web Hypertext Application Technology) Group  (external link), Google have released the results of an analysis of a billion HTML documents  (external link) in the wild. It makes interesting reading, and there are some horrors and surprises in there - the widespread use of class names like 'smalltext', 'white' and 'link', for example.

I hope this is a baseline for the start of a longitudinal study which will let us see how the web is evolving over time. There's no analysis of doctypes, which would have been useful, and of course with every element taken in isolation any generalisations made from such stats are wholly invalid. My suspicion is that the web standards to tag soup ratio is still pretty darned small, though improving, but I can't prove it. Yet.
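
To illustrate why class names like those are a horror (the markup is invented for the example), compare the purely presentational

<p class="smalltext white">Opening hours</p>

with a class that describes what the content is rather than how it currently looks:

<p class="opening-hours">Opening hours</p>

The first has to be renamed, or starts to lie, the moment the design changes; the second survives any redesign.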

January 5, 2006

@media 2006 yadda yadda yadda

@ 9:42 PM

Gosh, @media 2006 (external link) announced. Cue multitudinous, gushing blog posts extolling the virtues of a conference not even one year old.

They'll be right though: 2005 was top-notch, and the line-up for 2006 is shaping up nicely. Eric Meyer is going to be speaking. I'm so excited (and such a fanboi) I think I feel a song comin' on. If you can only get to one web standards conference at the QEII Centre in London in June in 2006, make sure it's this one.