February 2006 Archive

February 27, 2006

Between the Devil and the Deep Blue Sea

@ 8:02 PM

[This piece was written for Public Sector Forums (external link) and is cross-posted here to allow comments from those who don't have access to that site.]

Or SOCITM and SiteMorse vs. the ODPM and Site Confidence...

There's a pithy saying, much loved by researchers and statisticians, that goes something like this:

Be sure to measure what you value, because you will surely come to value what you measure.

Sage advice, which you would be wise to follow whatever business you're in. In reality, a greater danger comes from the likelihood that others will come to value what you measure, or worse still, what others measure about you.

We're all familiar with the arguments for and against automated web testing. It should form an integral part of any web team's quality assurance policy, and can save enormous amounts of time pinpointing problems buried deep in your site. By itself an automated testing tool can be a valuable aid in improving the quality of your website. But when automated tests are used to compare websites the problems start to come thick and fast. The recent disparity between the 'performance' tests from SiteMorse and Site Confidence is a case in point.

Who can you trust? SiteMorse will tell you that their tests are a valid measure of a site's performance. Site Confidence will tell you the same. Yet as previously reported on PSF the results from each vary wildly. SOCITM have offered this explanation for the variation:

"The reality is that both the SiteMorse and Site Confidence products test download speed in different ways and to a different depth. Neither is right or wrong, just different."

And therein lies the real problem. If both are valid tests of site performance then neither is of any value without knowing precisely what is being tested, and how those tests are being conducted. The difficulty is that no-one is in a position to make a judgement about the validity of the tests, because no-one outside of the two companies knows the detail.

It's worryingly easy to pick holes in automated tests. Site Confidence publishes a 'UK 100' benchmark table (external link) on its website, and at the time of writing it has Next On-Line Shopping (external link) sitting proudly at number 1, with an average download speed of 3.30 sec for a page weighing 15.33kb. The problem is that the Next homepage is actually over 56kb. At number 5 is Thomas Cook (external link), with a reported page size of 24.92kb, when in fact it's actually a whopping 172kb. Where does the problem lie in this case? Are the sites serving something different to the Site Confidence tool? Is the tool missing some elements, perhaps those referenced within style sheets, or those from different domains? The real problem is that we can't tell from the information provided, and the same holds true for SiteMorse league tables.
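For what it's worth, sanity-checking a reported page weight yourself is trivial, and doing so makes the ambiguity obvious: a bare HTML fetch misses images, style sheets and scripts, which are exactly the elements a testing tool might or might not count. A minimal sketch (the function name is mine, not any vendor's):

```php
<?php
// Convert fetched content to the "kb" figure quoted in benchmark
// tables. This measures only the content you pass in -- a bare HTML
// fetch excludes images, CSS and JavaScript, which is one plausible
// source of the discrepancies discussed above.
function page_weight_kb($content) {
    return round(strlen($content) / 1024, 2);
}

// Example (requires allow_url_fopen):
// $html = file_get_contents('http://www.next.co.uk/');
// echo page_weight_kb($html) . "kb\n";
?>
```

Even this crude measure would show whether a tool's reported figure covers the HTML alone or the full set of page assets.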

A few associates and I have been in correspondence with SOCITM for some months now about the use of automated tests for Better Connected. To date the responses from SOCITM have not completely alleviated our concerns. While some issues have been addressed by SiteMorse, many remain unanswered, and perhaps the greater concern is the attitude of SOCITM. For example, when pressed on why SOCITM hadn't sought a third party view of SiteMorse's testing methods, the response was:

You wonder why we have not done an independent audit of the SM tests. To date when detailed points have been raised, SM has found the reason and a satisfactory explanation, almost always some misunderstanding of the standard, or some problem caused by the CMS or by the ISP. In other words, there has been little point in mounting what would be an expensive exercise. You may, of course, not be satisfied with the explanations in the attached document to this set of detailed points.

I'll leave you to draw your own conclusions from that response, other than to say that I wasn't the slightest bit comforted by it.

Our concerns extend beyond Better Connected to the publication of web league tables in general. The fact is that we know very little about how SiteMorse conduct their tests, or what they are actually measuring. In some cases SiteMorse, or any testing company, will have to assert their interpretation of guidelines and recommendations to test against them, and have to make assumptions about what effect a particular problem might have on a user. For example SiteMorse will report an error against WCAG guideline 1.1 if the alt attribute of an image contains a filename, despite there being legitimate circumstances where such an alt attribute might be required. The fact is there are only two WCAG guidelines which can be wholly tested by automated tools (external link).

While SOCITM make no use of the accessibility tests from SiteMorse, similar concerns apply to performance tests that are based on no recognised standard, or that measure things with no impact on users. For example SiteMorse raises a warning for title elements with a length of more than 128 characters, citing the 1992 W3C Style Guide for Online Hypertext (external link) as the source of the guidance. This guide is at best a good read for those with an interest in the history of the web, but for SiteMorse to use it as the basis for testing sites over a decade later is highly questionable. To quote from the first paragraph of the guide:

It has not been updated to discuss recent developments in HTML, and is out of date in many places, except for the addition of a few new pages, with given dates.

SiteMorse justifies the use of this test in league tables by saying that many browsers truncate the title in the title bar. But this ignores the fact that the title element is used for more than just title bar presentation (for example for search engine indexing), and that the truncation can depend on the size of the browser window (at 800x600 on my PC, using Firefox, the title is truncated at 101 characters, for example). While it may be useful as a warning to a web developer, who can then review the title for the use of the clearest possible language, it certainly should not be used as an indicator in the compilation of league tables.
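As a developer-facing warning such a check is easy enough to write; here's a sketch, with the 128-character threshold taken from SiteMorse's own figure and otherwise arbitrary:

```php
<?php
// Advisory check only: flag long title elements for human review.
// The 128-character limit mirrors SiteMorse's threshold; as noted
// above, real truncation points vary with browser and window size,
// so this should inform a developer, not score a league table.
function title_length_warning($title, $limit = 128) {
    $len = strlen($title);
    if ($len > $limit) {
        return "Title is $len characters and may be truncated by some browsers.";
    }
    return "";
}
?>
```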

From our correspondence with SOCITM it became clear very quickly that SOCITM don't know much about how SiteMorse tests either - as evidenced above, there has been blind acceptance of the explanations given by the company, and no independent expert view sought.

In most other arenas league tables are based on clear and transparent criteria. Football, exam results, Olympic medals - all rely on known, verifiable facts. Unfortunately the same cannot be said of the current LA site league tables.

Our main assertion is that SOCITM should be working with local authorities and UK e-standards bodies (if there are any left) to produce a specification for the testing of websites using meaningful, independently assessed measures which are based on consensus, rather than blindly accepting the existing, opaque tests offered by SiteMorse, Site Confidence or any other private concern. There needs to be public discussion about precisely what we should be measuring, how those measurements are conducted and what conclusions it would be valid to draw from the results.

In the end it all comes down to a question of credibility - for Better Connected, SOCITM, the testing companies, and most importantly those of us who are responsible for local authority websites. It's likely that league tables are here to stay, but unless we are prepared to question the numbers behind the tables, and the way those numbers are produced, we're probably getting what we deserve.

February 21, 2006

Visionary Design Awards

@ 6:05 PM

I found out today that ClacksWeb (external link) has been shortlisted in the Public Sector category of the Visionary Design Awards (external link), and needless to say I'm chuffed. So I'm off to London next week for the awards, presented by old ice cube head himself, Robert Llewellyn.

There are some interesting sites on the shortlists for all the categories, but what particularly caught my eye was the Inaccessible Website Award category. What a fantastic idea that is - it should get some good publicity, and while I'm sure that Kate Bush and Blays Net Ratings deserve their nominations, I can't help but feel that if Disney World were to win it would bring the greatest benefit to the accessibility cause.

Update 23rd February: Last night BBC Radio Scotland (external link) ran a short piece on the awards on their Newsdrive programme. It's nice to hear web accessibility discussed on national radio, especially at prime time (this went out at about 5:30pm). Kudos to Radio Scotland. You can download it here: radioscotland.mp3 (2mb, 4 minutes 27 seconds, mp3).

Update 4th March: Well, we didn't win, but I had a great night nonetheless, thanks in no small part to the lovely Judy Friend and Pat Beech between whom I sat. The speakers were extremely good, as was the food and wine. Congratulations to Great Sampford Primary School (external link) and all the other winners, except Kate Bush that is. :O)

Off to Zanzibar to meet the Zanzibarbarians

@ 5:47 PM

I'm in need of enlightenment. How would one start with a defined need, say for an online marketplace for suppliers to government and buyers in government to connect with each other, and end up with Zanzibar (external link)?

I know it's Tuesday, so we were about due another .gov.uk website which rides roughshod over the government's own guidelines for producing websites, but this one's a bit more curious and has me puzzled.

Technically it's absolute shite. Table-based layout, no doctype, spacer.gif in abundance, terrible CSS, no headings. It's the brochure site of your favourite local firm of solicitors circa 1999.

No surprises there then, but the sites of Managed Services (external link) and OGC Buying Solutions (external link), the two apparent partners in Zanzibar, are not too shabby at all, and curiously identical.

So why, as a key part of the Government's procurement strategy, is Zanzibar quite so awful? And why is it called Zanzibar? And why is the domain zanzibaronline.gov.uk instead of zanzibar.gov.uk? I think we should be told.

A bonus point to anyone who can name the classic film the title of this post comes from (without cheating!).

February 15, 2006

User-defined accesskeys - update

@ 7:59 PM

In response to a post by Mike Cherim to the GAWDs (external link) mailing list today, I've updated my accesskeys script and re-evaluated the way accesskey defaults should be handled. Mike was contacted by a user whose name contains an accented letter, which he enters using the keystrokes Alt+0228. If a site implements the UK Government accesskey recommendations, Alt+0 is the accesskey for the accesskey page, and a conflict arises, preventing the user from producing his or her accented character. In fact any site implementing accesskeys 0-9 creates potential conflicts for users who need to input extended characters.

The solution is to implement no default accesskeys. To make it easier for a user to set standard keys I've extended the script to allow the site owner to provide suggested keys, which the user can set with a single form button (an idea borrowed from the implementation by Gez and Rich). I've also fixed a bug which was outputting empty accesskey attributes in some instances (thanks to Gez for the heads-up).
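The principle can be boiled down to a few lines: nothing is set unless the user either chooses their own keys or explicitly accepts the site's suggestions. A sketch of that decision (the function name is hypothetical, not the script's actual internals):

```php
<?php
// With no default accesskeys, nothing is assigned until the user
// either supplies their own keys or explicitly accepts the site's
// suggested ones -- so Alt+0 to Alt+9 remain free for entering
// extended characters unless the user decides otherwise.
function resolve_accesskeys($userkeys, $suggested, $use_suggested) {
    if (!empty($userkeys)) {
        return $userkeys;   // the user's own choices always win
    }
    if ($use_suggested) {
        return $suggested;  // one-click acceptance of suggestions
    }
    return array();         // the default: no keys, no conflicts
}
?>
```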

The extended script is running at ClacksWeb (external link) and will be running at Accessites.org (external link) in due course.

February 7, 2006

User-defined accesskeys

@ 10:09 PM

The drawbacks of accesskeys are well documented (external link), but one way of mitigating those drawbacks is to allow the user to define their own accesskeys for a site. In the absence of an established standard this is the best compromise - those who do make use of the functionality can do so, those who have problems with application or OS conflicts can disable them.

The implementation I'll be describing here is a server-side solution, using PHP. Before I started work on this I was aware of the work done by Rich Pedley and Gez Lemon (external link), but hadn't looked at their scripts. I wasn't aware of the work done by Thierry Koblentz (external link), and only found his implementation when searching for Rich and Gez's. Rich and Gez are clearly much classier coders than I, having provided an OOP solution - my version is totally procedural.

So why bother producing yet another script when there are already at least two out there? Two primary reasons - firstly I wanted to provide user-defined accesskeys at Accessites.org (external link) and at my day job (external link), and that meant I needed to be totally comfortable and intimate with the code and the way it worked; secondly it's a great learning exercise to try to reproduce something someone else has already produced, and then to compare and contrast.

Key features

  - Server-side (PHP), so it works without JavaScript
  - The site owner defines which pages can have accesskeys, with optional defaults and suggested keys
  - The user's chosen keys are stored in a cookie
  - Keys can be changed or cleared at any time via a simple form

The script

The script is very easy to install and use:

  1. Download the code (1k), or copy and paste it from below and save it as accesskeys.inc.php.
  2. Edit the $accesskeypages array. This array contains the URLs of the pages you wish to provide accesskeys for. Each URL has an associative array defining the token (internal name), default accesskey (can be left empty), and label (displayed on the user form).
  3. Include the script on every page where you want accesskeys to be available, ideally via a global or header include file. As it sets a cookie it must be included before output is sent to the user's browser.
  4. Pass a URL to the output_accesskey() function and it will return a string containing the accesskey for that URL if one has been set. For example:
    <?php
    $navigation_links = array("Home" => "/index.php", "Contact" => "/contact/", "Accessibility" => "/accessibility/");
    echo '<ul>';
    foreach ($navigation_links as $label => $url) {
      echo '<li><a href="' . $url . '"' . output_accesskey($url) . '>' . $label . '</a></li>';
    }
    echo '</ul>';
    ?>
    

To display the form for a user to set their accesskeys call the output_accesskeys_form() function. For example:

<?php
include "accesskeys.inc.php";
include "header.inc.php";
echo output_accesskeys_form();
include "footer.inc.php";
?>

The script contains no styling, so you'll probably want to add some classes or ids to the output_accesskeys_form() function and apply CSS accordingly.

Possible enhancements

There are a few easy enhancements which could be made - providing suggested keys and a button to implement these, and sanity-checking a user's choice of key (for example detecting and warning of duplicates) to name two.
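The duplicate check in particular would only take a few lines; something along these lines would do (a sketch, not part of the script below):

```php
<?php
// Return any key a user has assigned to more than one page, so the
// form can warn them before saving. Empty entries are ignored.
function find_duplicate_keys($keys) {
    $counts = array_count_values(array_filter($keys, 'strlen'));
    $dupes = array();
    foreach ($counts as $key => $count) {
        if ($count > 1) {
            // numeric-string array keys come back as integers, so cast
            $dupes[] = (string) $key;
        }
    }
    return $dupes;
}
?>
```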

In the wild

The script can be seen in action in two places at the time of writing:

If you do use the script please let me know and I'll add the site to the list.

The script

<?php
// accesskeys.inc.php
$accesskeypages = array(
    "/index.php" => array("token" => "home", "default" => "", "label" => "Home", "suggested" => "1"),
    "/accessibility/" => array("token" => "accessibility", "default" => "", "label" => "Accessibility", "suggested" => "0"),
    "/contact/" => array("token" => "contact", "default" => "", "label" => "Contact", "suggested" => "9")
);
if (isset($_POST["accesskeys"])) {
    $setaccesskeys = array();
    if (isset($_POST["submit"])) {
        // Store the keys the user typed in, indexed by page token
        for ($x = 0; $x < count($_POST["accesskeys"]); $x++) {
            $setaccesskeys[$_POST["token"][$x]] = $_POST["accesskeys"][$x];
        }
    } else if (isset($_POST["suggested"])) {
        // Apply the site's suggested keys in one go
        foreach ($accesskeypages as $accesskeypage) {
            $setaccesskeys[$accesskeypage["token"]] = $accesskeypage["suggested"];
        }
    }
    // "Clear Accesskeys" falls through with an empty array
    setcookie("accesskeys", base64_encode(serialize($setaccesskeys)), 2147483647, "/");
    header("Location: " . $_SERVER['PHP_SELF'] . "?ak=1");
    exit;
}
if (isset($_COOKIE["accesskeys"])) {
    $useraccesskeys = unserialize(base64_decode($_COOKIE["accesskeys"]));
} else {
    $useraccesskeys = array();
    foreach ($accesskeypages as $akarray) {
        $useraccesskeys[$akarray["token"]] = $akarray["default"];
    }
}
function output_accesskey($url) {
    global $accesskeypages;
    global $useraccesskeys;
    if (isset($accesskeypages[$url]) && $useraccesskeys[$accesskeypages[$url]["token"]] != "") {
        return ' accesskey="' . $useraccesskeys[$accesskeypages[$url]["token"]] . '"';
    }
    return '';
}
function output_accesskeys_form() {
    global $accesskeypages;
    global $useraccesskeys;
    $akform = '';
    if (isset($_GET["ak"])) {
        $akform .= '<p>Your accesskeys settings have been saved.</p>';
    }
    $akform .= '<form action="' . $_SERVER['PHP_SELF'] . '" method="post"><fieldset><legend>Current settings</legend>';
    foreach ($accesskeypages as $akarray) {
        $akform .= '<div><label for="' . $akarray["token"] . '">' . $akarray["label"];
        if (isset($akarray["suggested"])) $akform .= ' <em>Suggested key: ' . $akarray["suggested"] . '</em>';
        $akform .= '</label> ';
        $akform .= '<input type="text" maxlength="1" size="3" name="accesskeys[]" id="' . $akarray["token"] . '"';
        if (isset($useraccesskeys[$akarray["token"]])) $akform .= ' value="' . $useraccesskeys[$akarray["token"]] . '"';
        $akform .= ' /><input type="hidden" name="token[]" value="' . $akarray["token"] . '" /></div>';
    }
    $akform .= '</fieldset><div><input type="submit" value="Set Accesskeys" name="submit" /> <input type="submit" value="Use Suggested Keys" name="suggested" /> <input type="submit" value="Clear Accesskeys" name="reset" /></div></form>';
    return $akform;
}
?>

February 2, 2006

CIPFA Governance

@ 5:43 PM

<sigh>

Here we go again. Another shiny new website launched in the public sector, this time for the CIPFA Governance Network  (external link). Another table-based, inaccessible, doctype-less, javascript-dependent, tag-soup of a site.

They claim:

"the public expect more from those who govern in terms of standards, behaviour and outcomes."

Hmm. So long as those aren't web standards I guess the public will be much happier now.

New professionalism  (external link)? Nah, same old comfortable amateurism.