Searching Local Government

The development of a website is almost always an iterative process. Once the core functionality is in place, improvements tend to be incremental, either by extending the range of functions or by improving existing ones. For example, last week I installed the latest version of mnoGoSearch, the search engine software I use for ClacksWeb. For tasks like this I always try to schedule time to look at other UK local authorities and see what they're up to in the same area. It helps me to get ideas for future developments, and often yields ideas I can implement at the same time as fulfilling the task in hand.

In this instance I reviewed the search functions of the websites of Clackmannanshire Council and 18 other UK local authorities, looking for examples of good practice and novel ideas which might improve user experience. In this first piece I'll present some of the findings, concentrating on two aspects of search that impact upon the user:

  1. The search results data itself - how relevant it is (does it answer the user's query?), how comprehensive (does it include results from files in formats other than HTML?), what visible metadata it includes (does it provide file size, date last modified, file type, etc?).
  2. The presentation of the results - the validity of HTML, the structure and accessibility of the results, what help was provided for users, and so on.

In two future posts I'll cover some of the interesting and novel features I found, and some tips for maintaining and developing a site search function.

The method

For want of a better method I took the top ten sites from the latest (seriously flawed, but that's another matter) UK local government website rankings, plus the sites ranked 50, 100, 150 and so on, up to site 450. A full list of the sites reviewed is provided at the end of this article.

I searched each site using the search function provided on the front page, except in the one case where search was not available there. Since the authorities differ in size, structure and services, I needed search queries that would apply equally to all of them. The two I used were:

  1. make a complaint - I wanted information on making a general complaint to the Council;
  2. accessibility - I was interested in the Council's web accessibility policy and provision.

I recorded a range of information about the search results, including the product or package used (where known).

Findings

In terms of finding what I was looking for, it was an encouraging experience, at least for me as an able-bodied, sighted user. I rated nine of the sites as providing 'good' results, and only three as 'poor'. On all except one of the sites I found the required information for query one. Query two was more problematic, with many superfluous results.

Here's a quick summary of some basic indicators:

Presentation of results

One of the more disappointing aspects was the quality of the mark-up used for the results themselves. In only two cases were results provided with any explicit relationship between the result title and the metadata. In the first case the title was presented as a level 2 heading, with the metadata (the first x characters of the page in question) a paragraph beneath. In the second case the results were presented as a definition list, with the title as definition term and metadata as definition data.
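To illustrate, the definition-list pattern described above might be marked up something like this. This is a sketch of the general approach, not the actual markup of the site reviewed; the page name, snippet and metadata values are invented:

```html
<dl class="search-results">
  <dt><a href="/council/complaints.htm">Make a complaint</a></dt>
  <dd>How to make a complaint about a Council service, and what
    happens once your complaint has been received...
    <span class="meta">(HTML, 12KB, last updated 12 October 2005)</span>
  </dd>
</dl>
```

The explicit term/data pairing means a screenreader user can move through the list and always know which snippet and metadata belong to which result title.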

All the other sites used either a table-based layout (none with table header cells or other assistive mark-up) or simply a paragraph per result. I'm sure none of these would have made for a comfortable experience for a screenreader user - the lack of structure between and within results is a real barrier to accessibility.

Metadata

It's good practice to give users as much information as possible about the destination of any hyperlink, and in my opinion it's essential with search results. When users follow links from within a content page of a site, they can take some context from the other information on the page. With search results they don't have that context to inform their judgement, and so must rely on the information the search results provide.

Ideally I'd want to know the type of file I'm heading to (is it HTML, a PDF, a Microsoft Word document, etc), the size of that file, and when it was last updated. Here's what I found:
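Putting those three pieces of metadata together with the level-2 heading pattern mentioned earlier gives something like the following sketch. Again, the file name, size and date are invented for illustration:

```html
<h2><a href="/docs/complaints-procedure.pdf">Complaints procedure</a></h2>
<p>Our procedure for handling complaints about Council services,
  including response times and how to escalate...
  <em>(PDF, 240KB, last updated 3 June 2005)</em>
</p>
```

With the file type and size visible up front, a user on a slow connection can decide whether a 240KB PDF is worth the wait before following the link, and the date tells them whether the information is likely to be current.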

This lack of metadata surprised me. It's hard to understand why an organisation would consider purchasing or adopting a search package without support for these functions.

Search engines

It was possible to identify the search engine used by fourteen of the sites:

In my unscientific tests the dedicated search packages did seem to produce more relevant results than the CMS searches, but did not necessarily present them in a better fashion. In reality this is far too small a sample to draw any valid conclusions about the value of individual or groups of products.

Conclusions

Search is a critical function on a local government site - the search engine results page (SERP) will without exception feature in the top ten most visited pages. Even ClacksWeb, catering for the smallest Council in Scotland, processes more than 10,000 queries in an average month. Given that, it would be reasonable to expect it to be a lovingly crafted, finely honed page, with relevance of results, validity of mark-up and accessibility all prime considerations. Clearly this isn't the case, with many of the sites reviewed failing to provide what could be described as a high quality site search.

Although I found what I was looking for on most sites, I was largely disappointed with the technical quality of the SERPs. I did get some good ideas for enhancing our search function, which will be implemented in the near future, but I also picked up a number of examples of how not to approach search. I'll post more about both at a later date, plus some tips for creating and maintaining a top-notch site search facility.

Appendix - the review sites

Comments

Great article, Dan. I hadn't ever really considered how difficult searching must be on very large sites, and I guess accessibility often plays second fiddle to getting the search to actually work. Do you have to write these results up for your bosses, or is it just for your own benefit that you did all this testing?

Posted by: Dave at October 30, 2005 9:05 PM

No, my bosses aren't interested in this sort of stuff, it's purely for my own consumption. I wouldn't normally record so much technical detail, but thought it would be interesting for the purposes of the post.

As for accessibility playing second fiddle, I'm sure you're right in some cases, but where the product is from a commercial provider then users should be pressuring the provider to improve the accessibility of the product. Sometimes the sales claims of AA or AAA compliance seem to be forgotten once the contract is signed.

Posted by: Dan at October 31, 2005 5:42 PM

Interesting article. Would it be too cheeky (or unethical) to ask how you ranked the sites you looked at, and in particular, which were the best, so that we/the others might learn something?

Posted by: catriona hamilton at November 28, 2005 9:58 AM

Catriona, I didn't attempt any formal ranking of the sites as such. It would have meant assigning different weightings to different elements, and trying to come up with some sort of scoring mechanism. I was much more interested in the utility of the results and the technical quality of the mark-up.

As for which were the best, the Lewes, Trafford and Thurrock sites all produced useful results with good metadata, but the mark-up wasn't the best (they all use Open Objects Kbroker). I'd suggest that the Clackmannanshire site provides the best mark-up, with pretty good results and metadata, but then I'm seriously biased! Beyond these four the rest had serious problems in one area or the other.

I will be posting more about search soon, and will highlight particular features on these and other sites that I liked.

Posted by: Dan at November 28, 2005 5:44 PM
