October 2005 Archive
October 29, 2005
Searching Local Government
The development of a website is almost always an iterative process. Once the core functionality is in place, improvements tend to be incremental, either by extending the range of functions or by improving existing ones. For example, last week I installed the latest version of mnoGoSearch, the search engine software I use for ClacksWeb. For tasks like this I always try to programme in time to look at other UK local authorities and see what they're up to in the same area. It helps me get ideas for future developments, and often yields ideas I can implement at the same time as fulfilling the task in hand.
In this instance I reviewed the search functions of the websites of Clackmannanshire Council and 18 other UK local authorities, looking for examples of good practice and novel ideas which might improve user experience. In this first piece I'll present some of the findings, concentrating on two aspects of search that impact upon the user:
- The search results data itself - how relevant it is (does it answer the user's query?), how comprehensive (does it include results from files in formats other than HTML?), what visible metadata it includes (does it provide file size, date last modified, file type, etc?).
- The presentation of the results - the validity of HTML, the structure and accessibility of the results, what help was provided for users, and so on.
In two future posts I'll cover some of the interesting and novel features I found, and some tips for maintaining and developing a site search function.
For want of a better method I took the top ten sites from the latest (seriously flawed, but that's another matter) UK local government website rankings, plus the sites ranked 50, 100, 150 and so on, up to site 450. A full list of the sites reviewed is provided at the end of this article.
I searched each site using the search function provided on the front page, except in the one case where search was not available on the front page. Since local authorities serve different functions I needed to use neutral search queries. The two I used were:
- make a complaint - I wanted information on making a general complaint to the Council;
- accessibility - I was interested in the Council's web accessibility policy and provision.
I recorded a range of information about the search results, including the product or package used (where known).
In terms of finding what I was looking for it was an encouraging experience, at least for me as an able-bodied, sighted user. I rated nine of the sites as providing 'good' results, and only three as 'poor'. On all except one of the sites I found the required information for query one. Query two was more problematic, with many superfluous results.
Here's a quick summary of some basic indicators:
- Query one produced between 0 and 6510 results, query two between 1 and 21700 results;
- Three of the nineteen sites had error-free HTML, tested with the W3C validator;
- The most common HTML errors were unclosed tags, unencoded characters (especially ampersands) and lack of doctype declaration;
- Only two sites presented results using structured, semantic HTML, where a relationship between results and their metadata was explicit (see below for more detail);
- Eight sites used CSS for page layout;
- Five sites used the POST method for form submission, meaning that users cannot bookmark results;
- Ten sites did not implement stopwords, meaning that the 'a' in query one produced a high noise to signal ratio on these sites;
- Four sites implemented a minimum query length, two of 2 characters and two of 3 characters;
- Only ten sites provided help for search users;
- The longest search took 47 seconds (and had the courtesy to tell me so!).
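On the POST point above: a search form submitted with GET puts the query in the URL, so the results page can be bookmarked and shared. A minimal sketch (the action URL and field name are invented for illustration):

```html
<!-- method="get" puts the query in the URL (e.g. /search?q=accessibility),
     so users can bookmark and share results pages. The action URL and
     field name here are invented for illustration. -->
<form action="/search" method="get">
  <label for="q">Search this site:</label>
  <input type="text" id="q" name="q" />
  <input type="submit" value="Search" />
</form>
```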
Presentation of results
One of the more disappointing aspects was the quality of the mark-up used for the results themselves. In only two cases were results provided with any explicit relationship between the result title and the metadata. In the first case the title was presented as a level 2 heading, with the metadata (the first x characters of the page in question) a paragraph beneath. In the second case the results were presented as a definition list, with the title as definition term and metadata as definition data.
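As a sketch, the definition-list approach described above might be marked up like this (the titles, URLs and metadata are invented for illustration):

```html
<!-- One result per dt/dd pair: the term carries the linked title,
     the data carries the snippet and metadata. All content here
     is invented for illustration. -->
<dl class="search-results">
  <dt><a href="/complaints/">Making a complaint</a></dt>
  <dd>How to make a general complaint to the Council.
      (HTML, 12KB, last modified 14 October 2005)</dd>
  <dt><a href="/complaints/form.pdf">Complaints form</a></dt>
  <dd>Printable version of the complaints form.
      (PDF, 45KB, last modified 3 June 2005)</dd>
</dl>
```

The explicit term/data pairing is what gives a screenreader user the relationship between each result and its metadata.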
All the other sites used either various table-based layouts, none with table header cells or other assistive mark-up, or simply a paragraph per result. I'm sure none of these would have made for a comfortable experience for a screenreader user - the lack of structure between and within results being a real barrier to accessibility.
It's good practice to provide users with as much information as possible about the destination of any hyperlink, and in my opinion it's essential with search results. When users follow links from within a content page of a site, they can take some context from the other information on the page. With search results they don't have that context to inform their judgement, and so must rely on the information the results themselves provide.
Ideally I'd want to know the type of file I'm heading to (is it HTML, a PDF, a Microsoft Word document, etc), the size of that file, and when it was last updated. Here's what I found:
- Nine sites provided the file type, either explicitly or via a graphical icon;
- Only three sites provided the file size;
- Seven sites provided the last modified date.
This lack of metadata surprised me. It's hard to understand why an organisation would consider purchasing or adopting a search package without support for these functions.
It was possible to identify the search engine used by fourteen of the sites:
- Open Objects Kbroker (3 sites)
- Semaphore (3 sites)
- Verity Ultraseek (2 sites)
- Google Mini
- Mambo (CMS)
- Jadu (CMS)
- dotEditor (CMS)
In my unscientific tests the dedicated search packages did seem to produce more relevant results than the CMS searches, but did not necessarily present them in a better fashion. In reality this is far too small a sample to draw any valid conclusions about the value of individual or groups of products.
Search is a critical function on a local government site - the search engine results page (SERP) will without exception feature in the top ten most visited pages. Even ClacksWeb, catering for the smallest Council in Scotland, processes more than 10,000 queries in an average month. Given that, it would be reasonable to expect the SERP to be a lovingly crafted, finely-honed page, with relevance of results, validity of mark-up and accessibility all prime considerations. Clearly this isn't the case, with many of the sites reviewed failing to provide what could be described as a high quality site search.
Although I found what I was looking for on most sites, I was largely disappointed with the technical quality of the SERPs. I did get some good ideas for enhancing our search function, which will be implemented in the near future, but I also picked up a number of examples of how not to approach search. I'll post more about both at a later date, plus some tips for creating and maintaining a top-notch site search facility.
Appendix - the review sites
- Aylesbury Vale District Council
- Cannock Chase District Council
- Clackmannanshire Council
- Dorset For You
- East Riding of Yorkshire Council
- Hambleton District Council
- Hastings Borough Council
- Isle of Wight
- Lewes District Council
- Oldham Council
- Portsmouth City Council
- St. Edmundsbury Borough Council
- Scottish Borders Council
- Sefton Council
- Shepway District Council
- South Bucks District Council
- South Lakeland District Council
- Thurrock Council
- Trafford Metropolitan Borough Council
October 28, 2005
Businesses shun UK eGovernment services
The eGov Monitor reported this week that recent figures published by the European Union show UK businesses lagging behind their continental counterparts in the take-up of online government services. The Monitor makes no effort to explain this position: the UK has one of the highest ratings for availability and sophistication of online services for business, yet the lowest consumption of those services by businesses in the EU. Only 36% of UK businesses polled had used the internet to obtain information from public authorities, compared to 94% in Sweden and 90% in Finland.
It's hard to infer very much in the way of solid conclusions from the figures alone. My first instinct is that the apparent lack of any co-ordinated development strategy for central government web sites and services is a significant factor in the UK's poor showing. The lack of consistency between the multitude of ministry and departmental sites must be a barrier to uptake - which business wants to spend time navigating through perhaps 3 or 4 websites with different schemes of navigation, separate registration processes for users, and different levels of service provision, when they can rely on paper forms, the telephone and the venerable Royal Mail?
As an example, HM Revenue & Customs provide a good range of online services, including PAYE for employers and the well-exposed self-assessment for income tax. Both require separate registrations, and those registrations are only valid at the HM R&C site. If I need to register my business with the Health and Safety Executive at the same time, I need to visit another site, with a different design, layout and navigation system, and download a form to print and send by mail, since there's no online service available. It's not a pleasant prospect.
My wife and I have got a small business (a smallholding) and since we sell food we're VAT registered. I'm also a web professional. So why aren't I using the eVAT service? Because it's just too much trouble. We're sent a paper form every quarter with a prepaid return envelope, and we fill it in and bung it in the mail. It's convenient and it works for us, as I'm sure it does for many, many other businesses. And this illustrates the crux of any successful online service - there has to be a tangible benefit to the user, and where businesses are concerned that means a financial benefit.
There will be other factors of course - mistrust or fear of online services (what if my information is lost?); mistrust of government in general (what will they do with my information?); lack of communication of potential benefits (what's in it for me?); entrenched business practices (we've always done it this way and it's never let us down). But for me the priority for the government is the pursuit of some uniformity in their online offerings, and perhaps even a brand. Just not Directgov.
October 26, 2005
Shortlisted for GC Award
It was very gratifying to learn on Monday that ClacksWeb is one of three government sites shortlisted for the 'Accessibility Award' in the e-Government category of the Good Communication Awards 2005. I'll be popping down to London on 7th November for the awards event, hosted by Phil Woolas MP, Minister for Local Government, to learn the decision of the judging panel, and even if we don't win it's great exposure for the Council.
October 25, 2005
Shaw Trust Prologue
A couple of enquiries from readers of Blether about my relationship with the Shaw Trust have made me realise that a vital part of the story has been missed - the bits that happened before the ClacksWeb audit even began.
Development on the site started in earnest in autumn 2004, with a target completion date of March 2005. The site's development plan had provision for contracting an external expert auditor, preferably to cover both usability and accessibility. I recognised at an early stage that the testing I could arrange myself was inadequate in depth, breadth and quality. In the end the budget didn't exist to get any professional testing done at all before the site's launch. That doesn't mean that no testing was done, however.
I'd imagine that, like many developers in large organisations, my main usability testing pool consisted of colleagues with differing levels of web experience, selected to try different bits of the site as it progressed. Forms, navigation, colour schemes - these things and many others all got the informal review treatment, and predictably the results were never conclusive. It was a useful format for finding some server-side bugs, but beyond that it was hard to separate subjective preferences from true usability issues.
For accessibility testing I had access to one screenreader (Jaws) user, blind, who was relatively new to computers and new to the web. Whilst his input was valuable at the time, in retrospect I'm certain it wasn't representative of the average Jaws user. He lacked the experience to have developed strategies for using the software to overcome the barriers that exist in even the most carefully constructed site, so many of the problems he experienced weren't necessarily site design issues. Apart from this I read the books (Joe Clark's and Jim Thatcher's) and tested and retested against the WCAG myself.
And that was it for testing. The site launched on schedule, but like many sizeable web projects (the site has about 1,000 static pages and tens of thousands of dynamic pages), for a couple of months immediately afterwards I fire-fought problems that would have been picked up by better testing, but that I wouldn't have had the time to fix even if I had known about them. It's called a public beta I suppose!
A new financial year brought the possibility of getting some proper testing done. I contacted a number of usability and accessibility testing service providers and received a few quotes. It soon became clear that the option of separate usability and accessibility testing was beyond my meagre budget, so a decision was made to focus on accessibility, and on getting as much as I could for the Council's money. I looked at a number of accessibility audit providers, but the Shaw Trust's pan-disability user testing was the deciding factor. No other provider I found came close to offering the same breadth of user testing:
- Partially sighted
- Mobility impaired
I also appreciated the Trust's decision to work with a commercial partner, CDSM, in providing the technical audit. The Trust's primary business isn't the web, but CDSM's is - the combination of expert awareness of disability and expert knowledge of the technical web was unique in my experience.
With the decision made it was a matter of getting more details of the audit process, satisfying myself that it was what we needed, establishing cost, and formally contracting with the Trust. A pre-audit questionnaire provided the Trust with full details of the site, covering areas like:
- the site's purposes and its target audiences;
- its size, and whether pages are dynamic or static;
- who edits the site content;
- use of video, audio, or other multimedia;
- areas of the site requiring registration or password access.
That completed, the audit was ready to start.
October 24, 2005
Select-a-Blether
As Blether was launched early (it was supposed to be a November CSS Reboot entry) I'm still working on the site, adding a few features that slipped by the wayside. Tonight's addition is Select-a-Blether, a mechanism for changing the styling of the site.
The first alternative to Verbal Kint is world-class blether Richard Nixon. To have Tricky Dicky grace the site just follow his link in the navigation. More blethers will follow when I get around to them, together with some notes explaining their presence.
Hotmail, the email incinerator
A heads-up for anyone who operates any sort of system which sends emails to customers - Hotmail could well be rejecting your email and not telling you, or the recipient, that they've done so. You can get yourself whitelisted by signing up to Bonded Sender, but that costs money. Oh, and the threshold for Bonded Sender to reject your application is 1 complaint per one million emails sent. Oh, and if they reject your application you can't reapply for 90 days.
The full skinny can be found on e-consultancy . If you're a large organisation sending out a lot of email you might like to reassure yourself that a significant proportion of your customer base doesn't think you're ignoring them.
October 20, 2005
Shaw Trust User Testing
Accessibility was a prominent feature of ClacksWeb's development plan, reflecting its status as a local authority site and my increasing awareness of accessibility as a critical issue. As work on the site progressed in early 2005 I undertook regular checks to make sure it was on track to deliver, both by checking against the WCAG and by using very simple tests like increasing the text size, browsing the site with Lynx, and navigating with the keyboard. Yesterday it became very clear to me just how inadequate that sort of testing was in truly determining whether or not the site was accessible.
I had the pleasure of spending the day at the Shaw Trust's regional headquarters in Llandarcy, Wales, experiencing first-hand some of the user testing of ClacksWeb. Accompanied by Andrea Kennedy, the Trust's Web Accessibility Services Officer, and Grant Broome of CDSM, who co-devised the audit programme and conducts the technical audit, I met, observed and learned from 6 users with various disabilities and levels of web experience. They were:
- Linda - a Jaws user;
- Steve - an accomplished motor-impaired user of a keyboard with a guard;
- Malcolm - a Dragon NaturallySpeaking user who also tested the site with ZoomText;
- Jamie - a Jaws user;
- Mark - an accomplished Jaws user;
- Ann - a dyslexic user who undertook readability testing.
The users are provided with scripts by Andrea - basically a series of tasks; yesterday, for example, the users were required to register with MyClacksWeb - and are asked to record their positive and negative experiences while working their way through them. They all use PCs running Windows XP. The atmosphere is relaxed and informal - this isn't a sterile, clinical testing lab, but somewhere the testers seem to come to socialise as well as make their services available to the Trust. They are all volunteers, although there are plans to create a social enterprise offering user testing on a commercial basis - an excellent idea in my opinion.
I spent some time with each user, observing and asking questions, or in the case of Mark listening in to his use of Jaws. This was an invaluable experience for me, seeing and hearing how these users navigated the site, what barriers they were facing and what strategies they each used to overcome them. It only took a couple of minutes of observing Mark, and having him talk me through his perception of the site, to discover a serious problem with the site's contextual navigation menus. Basically they are placed just above the destination of the site's "skip to content" link, so he never knew that contextual navigation existed, and was forced to use alternative methods such as search or the A to Z to complete the tasks.
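The problem Mark hit can be sketched in markup: if the contextual menu sits between the skip link and its destination, a user who takes the link jumps straight past it. The ids and structure below are invented for illustration:

```html
<!-- Anything placed between a skip link and its target is
     bypassed by every user who takes the link. All ids and
     content here are invented for illustration. -->
<a href="#content">Skip to content</a>

<ul id="main-nav">
  <li><a href="/services/">Services</a></li>
</ul>

<!-- PROBLEM: contextual navigation in the gap is never
     reached by users following the skip link -->
<ul id="contextual-nav">
  <li><a href="/services/complaints/">Complaints</a></li>
</ul>

<div id="content">
  <h1>Page content starts here</h1>
</div>
```

Moving the contextual menu above the skip link, or before the link's target, would make it reachable again.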
Another serious problem was encountered by Steve, the keyboard user - I hadn't specified any focus styling on links, so when tabbing around the site it wasn't obvious which link had the focus. The browser default, faint dotted border just wasn't enough for Steve to perceive the current focus. These two examples show just how important pan-disability user testing is. Neither would have been discovered by automated testing (99% of pages on the site are valid XHTML and satisfy WCAG to at least AA), and neither user was affected by the problem the other discovered.
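A minimal fix for the focus problem, assuming nothing about the site's actual stylesheet: pair every hover style with an equivalent focus style, so keyboard users always get the same visual cue as mouse users.

```css
/* A sketch, not the site's actual stylesheet: give focused
   links the same obvious treatment as hovered ones, rather
   than relying on the browser's faint dotted border. */
a:hover,
a:focus {
  background-color: #ffffcc;
  outline: 2px solid #003366;
}
```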
More testing was scheduled for today, by another group of users with different disabilities, with the technical audit to follow soon thereafter. I expect to receive the initial report late next week, and based on my experiences yesterday have no doubt that it will prove to be an extremely valuable resource.
October 18, 2005
DTI e-commerce award winner horror
The winners of the DTI E-commerce awards for 2005 were announced last week. I must admit it passed me by (I was on my hols) but I was interested to see who had won the 'eGovernment National ICT Innovators Award' on my return. According to the award site:
These awards will recognise best practice in the development of information and communication technologies (ICT) by UK business and public sector communities.
Heady stuff, and there's more:
Awards will be presented to individuals and/or project teams that have demonstrated clarity of thinking and development/deployment of an approach that has the capacity to change the ICT paradigm in the particular technology, market sector.
While it's not a surprise - there are still a vast number of sites out there that don't degrade gracefully - the fact that this site won a national award is evidence of the long, long way we still have to go as an industry before accessibility and web standards are established as de facto good practice.
World Usability Day
3rd November 2005 is World Usability Day and to mark the occasion the Scottish UPA is running a Scottish Usability Showcase in Edinburgh. It's a series of short, 15 minute presentations, and I'll be speaking about my experiences redeveloping ClacksWeb using web standards, and the process of trying to build in accessibility and usability.
If you're in or around Edinburgh why not come along? In my experience SUPA events are always interesting, and cheap - free for members, and a tenner for non-members (a fiver for students). Bargain.
Update: SUPA have just announced that to celebrate WUD this event is now free of charge. So you've got no excuse for not coming along.
October 17, 2005
A Shaw start
Today marks the start of the Shaw Trust web accessibility audit of ClacksWeb, and is hopefully the first step on the way to Shaw Trust Accreditation for the site.
For those of you who aren't aware of the Trust and its web accessibility services, they offer perhaps the most comprehensive pan-disability audit and accreditation scheme available in the UK (and possibly further afield). As well as automated tests, the Trust uses a panel of real users, with a range of disabilities.
There's a lot more information on the Shaw Trust website, and I'll be recording my experiences of the process on this site as we progress. Although I'm experiencing some high anxiety at the moment (what will they find wrong with the site?!) it's an exciting prospect to have quality feedback from real users, and going through the process can only help to improve the site.