November 2005 Archive
November 28, 2005
No surprises in EU report
As reported on Out-law and many other sites, a report published by the European Parliament has found that just 3% of public sector websites are achieving WCAG Conformance Level AA. It should be noted that 'public sector' in this report equates to central government and not local government, where the picture is slightly rosier.
It's no surprise to me that conformance levels are so low, at least in the UK. Following my recent mini-rant about visas4UK winning a major award, I contacted the eGovernment Unit (eGU) to ask what their response was to such an inaccessible site being lauded as best practice. They responded (eventually) thus:
The Cabinet Office does provide guidance to both central and local government on the eAccessibility of websites.... However, this is guidance only and it is the practical and indeed legal, responsibility of individual departments and their web management teams on how they interpret and apply such guidelines in order to comply with, eg, the Disability Discrimination Act.
What pressure or sanctions do departments face if they fail to adopt the guidelines issued by the eGU? It appears that you are not interested at all in whether departments actually follow the guidance, you're simply concerned with making sure the guidance is robust. If the only pressure is the threat of legal action then it's effectively no pressure at all.
Who is playing the essential quality assurance role in this scenario, to ensure that poorly designed, inaccessible sites aren't emerging from departments?
I doubt I'll ever get a reply, but until there is significant pressure from within government, the situation isn't going to improve in the near (or possibly even distant) future.
November 18, 2005
Most Wanted - criminal web design
Another major UK public sector site launch, another hideously bloated, nested-table monstrosity. This time it's the Crimestoppers 'Most Wanted' site.
How's this for starters:
<!-- xinclude virtual="/includes/objPageCache.asp" -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<!-- Document coded using XHTML 1.0 rules | HTML 4.01 DTD used for backwards campatibility -->
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
That's a new one for me.
Crimestoppers may be a charity, but in 2003 it received total income of £3 million, including £1 million from the Home Office, so it wasn't a matter of resources. It can only be hoped that PAS 78 will help such organisations commission better sites in the future.
The Register reported how the site was brought down this morning by heavier than anticipated traffic. Looking at the markup used on the site, a good dose of web standards would have at least halved the page weight, and consequently doubled the number of visitors the site could have coped with, potentially avoiding such an embarrassing launch.
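The "halved the page weight" claim is easy to sanity-check. The sketch below compares the byte weight of a typical nested-table layout against a semantic equivalent for the same content; the markup is invented for illustration, not taken from the Crimestoppers site.

```python
# Illustrative only: the markup strings below are made up for this
# example, not copied from the site discussed above.

table_layout = """\
<table width="100%" border="0" cellpadding="0" cellspacing="0">
<tr><td><table border="0" cellpadding="4" cellspacing="0">
<tr><td><font face="Arial" size="2"><b>Most Wanted</b></font></td></tr>
<tr><td><font face="Arial" size="2">Call 0800 555 111</font></td></tr>
</table></td></tr>
</table>"""

semantic_layout = """\
<div id="wanted">
<h2>Most Wanted</h2>
<p>Call 0800 555 111</p>
</div>"""

# With presentation moved into a shared stylesheet, the per-page
# markup shrinks dramatically.
saving = 1 - len(semantic_layout) / len(table_layout)
print(f"Semantic version is {saving:.0%} smaller")
```

The CSS the semantic version relies on is downloaded once and cached, so the per-page saving across a whole site is even larger than this single-fragment comparison suggests.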
November 16, 2005
Google Base goodness
For those who don't know, Google Base is a new service from the search giant. They describe it thus:
Google Base is Google's database into which you can add all types of content. We'll host your content and make it searchable online for free.
What sorts of content? Well, jobs, events, courses, reviews, cars for sale, and so on.
Although Google can host the content for you, if you've got existing web content you can feed Base with 'bulk uploads' using RSS over FTP, and just point it at your content. Since I've got a bunch of RSS feeds for ClacksWeb, it was a breeze to repurpose the feeds for Google Base and set up a daily cron job running a PHP script to FTP the files onto Google's servers.
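The repurposing step amounts to decorating an existing RSS 2.0 feed with Google Base's "g:" namespace metadata. A minimal sketch (in Python rather than the PHP mentioned above; the element names g:item_type and g:location are illustrative assumptions, not taken from the actual ClacksWeb feeds):

```python
# Sketch: add Google Base "g:" namespace elements to an existing RSS
# 2.0 feed. Element names here are assumptions for illustration.
import xml.etree.ElementTree as ET

G_NS = "http://base.google.com/ns/1.0"
ET.register_namespace("g", G_NS)

def repurpose_feed(rss_xml, item_type, location):
    """Add Google Base metadata to every item in an RSS feed."""
    root = ET.fromstring(rss_xml)
    for item in root.iter("item"):
        ET.SubElement(item, f"{{{G_NS}}}item_type").text = item_type
        ET.SubElement(item, f"{{{G_NS}}}location").text = location
    return ET.tostring(root, encoding="unicode")

rss = """<rss version="2.0"><channel><title>ClacksWeb vacancies</title>
<item><title>Headteacher</title><link>http://example.gov.uk/jobs/1</link></item>
</channel></rss>"""

print(repurpose_feed(rss, "jobs", "Clackmannanshire, UK"))
```

A daily cron entry would then run a script like this over each feed and FTP the output to Google's bulk-upload servers.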
What's most interesting is the effect this appears to have had on our Google search rankings. For example, one of the Base items included in the RSS uploaded from the site today was a vacancy for a headteacher at one of our primary schools. Search Google Base for headteacher and there it is, 1st item of 2. Search Google for headteacher and there it is, currently 5th item of 2.4 million.
Coincidence? Possibly, but unlikely. ClacksWeb does do well on Google due to nice URLs, valid, semantic XHTML, reasonable use of headings and so on, but 5th of 2.4 million for what is effectively an ephemeral page?
It's too early to conclude that Google are using Base data to adjust the algorithms used on Google itself, but it would make sense - if it proves to be true it will drive traffic through Base, and a lot of that will be well-classified and tremendously more valuable than data gathered from trawling the web.
I'll be pushing more feeds to Base tomorrow, for events and general vacancies, only this time I'll do some searching beforehand...
November 10, 2005
Tools aren't skills aren't knowledge
Yesterday I was asked by a colleague in another local authority what software I had used to make our website accessible. The question threw me for a second. Since I no longer think of web development in terms of tools, it hadn't consciously occurred to me that others in the business still do. I started to list the software tools I do use - HomeSite, TopStyle, Bullet Proof FTP, Firefox, CSE Validator Pro and so on. But of course software doesn't make websites accessible - at the stage we're at as an industry it's pretty much down to knowledge, research and understanding.
So I explained that it wasn't really the software I had used that was important so much as the time I had invested in learning what makes a site accessible, through the GAWDS mailing list, AccessifyForum and many websites. The more I thought about it the more I realised what a steep learning curve I'd been on in the past 12 months, and just how much time I'd dedicated to re-learning the fundamentals of the job of a web developer. Lists like css-discuss and Evolt's thelist, sites like ALA and QuirksMode, books by Zeldman, Meyer and Clark - I've read them all over the past 2 years or so, tried to learn from them and apply what I've learned. How do you get that across to someone who's looking for a magic software bullet to achieve a worthy aim, without scaring them off?
When it comes down to it, tools can only be used to apply the knowledge you have using the skills you've developed (or, if you're lucky, were born with), and the production of just about anything of quality relies on the presence of all three.
November 9, 2005
GC Accessibility Award
As reported previously, ClacksWeb was shortlisted for the Accessibility prize at the Good Communication Awards. On Monday I was lucky enough to be at the Bafta building in London to hear the site announced as the winner, and to receive the award from Garry Richardson, of BBC Sports Report fame. The first eight awards were presented by Phil Woolas, Minister for Local Government, but he had to dash off to another place, so Garry stepped into the breach for the later awards, including ours. Can't say I was gutted. ;O)
How much an award means depends on a lot of things, such as the number, breadth and quality of entries, and one of the most important factors is the authority of the judges. It was great then to see that the judging panel were all people in the industry I respected - Patrick Lauke, Derek Featherstone, Richard Conyard and Donna Smilie - and it made it a bit more special to know that the judges knew what they were talking about.
After breakfast at the fantastic Patisserie Valerie on Piccadilly we also got to do a bit of shopping, hitting Fortnum & Mason, Hamleys and Molton Brown. I don't miss living in London, but it would be nice if I could have chocolates from F&M's more than once a year!
November 6, 2005
Shaw Trust Report Overview
On Wednesday last week I received the audit report on ClacksWeb from the Shaw Trust Web Accreditation Service. Some of you may be aware that by Friday I had completed the necessary remedial work, and the site was accredited and given the Trust's "accessible plus" award. This was partly due to me having already addressed some of the issues in the report both from the user testing I observed, and from feedback I'd received from the Trust during the audit process. That notwithstanding I thought it would be useful to provide an overview of the report itself, and at a later date some of the more interesting and esoteric audit findings.
Our 56-page written report consists of 5 sections:
- Executive summary
- Background & methodology
- User testing
- Technical audit
- Automated testing
Executive summary
A nicely written couple of pages which were perfect for me to copy to my management, this section provided a high-level overview of the findings together with the remedial action required before accreditation could be awarded. In our case the site didn't require a full repeat audit, but in some cases this is needed to ensure the audit report has been understood and the issues addressed adequately.
Background & methodology
This section provides an introduction to the report, an explanation of what web accessibility is, and why it's important, and information about adaptive technologies. It also provides an overview of the Trust's testing methodology, describing what user testing is carried out and why, and brief details of the technical audit process.
User testing
Into the meat of the report, covering tests performed by each group of users (low vision, blind, mobility impaired, dyslexic & deaf), problems encountered together with URLs, screen shots and comments from testers. A summary of the users' overall impressions is given at the end of the section.
Having observed one of the days of user testing myself, I had already addressed some of the issues in this section, but reading it through with the full range of tester experiences listed reminded me of the value it added.
Technical audit
The contents of the technical audit are based on automated tests, carried out using InFocus, but extensively interpreted and moderated by human checks. It's worth stressing that these results are a long, long way from the raw automated testing results produced by services such as SiteMorse - it brings home how little of the WCAG can be tested by software alone.
Instead of covering each WCAG checkpoint in turn, the technical audit is broken into functional areas such as 'tables', 'images', 'site map' and so forth, and within each of these areas a good deal of narrative is provided explaining the relevant issues as they relate to the site in question. Where action is required this is clearly stated, together with URLs of offending pages, or where necessary broader guidance, for example for the handling of PDFs.
This section surprised me - I was expecting a dry, less useful summary of the technical shortcomings of the site, but it proved to be much more than that.
Automated testing
I received the report by email, and was intrigued to see a zip file attached as well as the report document itself. It contained the 900+ HTML files generated during the InFocus automated testing, along with a screen shot of the options used when the software was run. The last section of the written report provides information about these InFocus files, pointing out false positives and issues to be addressed that aren't covered elsewhere.
At first it was a bit of a slog nailing down individual problems from all those files, but once I'd understood the structure of the reports, using the extended search functions in HomeSite made it a breeze to get a quick list of pages which were affected by a particular problem.
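The multi-file search described above can be sketched in a few lines. This is a rough stand-in for HomeSite's extended search, assuming the reports are plain HTML files in one directory; the file names and issue text below are made up for the example.

```python
# Sketch: scan a directory of generated HTML report files for a given
# issue string and list the affected files. A stand-in for an editor's
# multi-file search; paths and issue text are invented for the example.
import tempfile
from pathlib import Path

def files_mentioning(report_dir, issue_text):
    """Return the names of report files that mention a particular issue."""
    return sorted(
        p.name
        for p in Path(report_dir).glob("*.html")
        if issue_text in p.read_text(errors="ignore")
    )

# Demo against two throwaway report files.
tmp = tempfile.mkdtemp()
Path(tmp, "page1.html").write_text("missing alt attribute on image")
Path(tmp, "page2.html").write_text("all images have alt text")
print(files_mentioning(tmp, "missing alt"))  # → ['page1.html']
```

Pointing something like this at 900+ report files turns "which pages have this problem?" into a one-line query.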
All in all the report is impressive - it provides information for those in the organisation who need a summary but don't need to know the technical nitty-gritty. It provides softer, humanised user testing feedback which reminds you that the work we do is for people, not validators and automated accessibility testing software, and finally it provides expert technical advice which if followed and understood can only elevate the accessibility of the site in question. And above all it's usable - not a tome filled with technical references and jargon, but a practical, real-world guide to improving the audited site.