General blether - Blether

General blether Archive

September 11, 2008

Wii Mooch

@ 07:40 PM

One of the benefits of being freelance is the ability to dedicate more time to little side projects. I did this when I was fully employed, but without the freedom to pull an all-nighter when it was needed the possibilities were limited.

In our spare time for the past month or so my bro and I have been working on the first product of our new company, Mooch Marketing. It's called Wii Mooch, and if you own a Wii and live in the UK we hope you'll find it indispensable.

To celebrate the launch we're going to be giving away a copy of Wii Fit - all you have to do is register your interest for the site.

If you've got a Wii and have the time and the will to help us with testing let me know and we'll give you the keys to the backdoor. We're hoping to launch by next Friday, 19th September.

Comment on Wii Mooch (6)

April 05, 2008

Bye bye Bill

@ 02:21 PM

I've always been somewhat equivocal when it comes to choice of desktop operating system. For reasons of convenience more than anything else my personal and business workhorses have until now run on Microsoft OSes. I was a reasonably happy XP user for years - even if the experience of using it wasn't exactly a pleasure, it never got in the way of what I was trying to achieve.

Last summer my main office PC expired, and I had to get hold of a new one in a hurry. I plumped for a Dell - their service was exemplary and the PC itself is a nice piece of kit at a very reasonable price. But like a new girlfriend with an STD it had a dirty little secret that only started to bite once it had inveigled itself into my daily routine. That dirty little secret was Vista.

Vista has plagued my working day for the past 6 months, and my productivity has suffered. I've soldiered manfully on, thinking I could learn to overcome or work around the barriers Vista put in the way of my work, loath as I was to write off the expenditure.

I was wrong. I know this subject has been done to death, but I cannot fathom how Microsoft can invest the time, money and effort they have into an nth generation operating system and produce something that is so god-damned awful to use productively. It. Sucks. Hard. I could install XP on it, but that's not a sustainable strategy.

So I'm joining the ranks of Apple Mac users, for lots of reasons, not least that it will be a joy to work with a Unix-based desktop OS and to have my development platform of choice available natively on the move. Add in no more UAC, TextMate and the beautiful display and I'm wondering why it's taken me so long to see the light.

Must say big thanks to Ann for helping me decide what exactly I needed. I'm off to order my MacBook Pro now, having resisted the temptation to head into Glasgow tomorrow to visit the Apple Store in person.

Comment on Bye bye Bill (9)

February 29, 2008

Leggili - leggi, recensisci, condividi (read, review, share)

@ 10:20 AM

It doesn't take a genius to join the dots to work out what Leggili is. Then again if I'd stumbled across it by accident I'd have no idea, not understanding much more than a word of Italian. Thankfully I'm working with the very clever and nice people at Libreria Ledi who do.

So if you read, speak Italian and like books, please sign up for the Leggili newsletter.

December 11, 2007

Not dead

@ 12:00 PM

No, I'm not dead. Nearly two months between posts is a new record for me though, so you're forgiven if you wondered how I met my demise.

It's been a busy time lately.

One of the things I haven't been doing is keeping up to date with the blogs I used to read regularly - it's amazing how isolated that has made me feel at times. So I've pruned my Bloglines to under 50 feeds and set aside 10 minutes every morning to check out what's happening in the world.

Now the business is running smoothly and all the tedious but important admin stuff is taking less time I hope to post more often (well, at least more often than every 2 months).

To kick off I've got a series of posts about forms lined up, inspired by my experience on the South African Airways website...

Comment on Not dead (7)

September 30, 2007

Does business have to be dog-eat-dog?

@ 12:31 PM

Since I left my job back in April to pursue my own business interests I've been on something of a rollercoaster ride of discovery. I've had the good fortune to work with some very talented, knowledgeable and committed people and to be challenged by the work I've been hired to do. On the whole it's been more interesting, rewarding and fun than I could have imagined.

A few weeks ago something happened that made me stop and question why I'm in business and whether I've got what it takes to succeed in the long run.

I'm doing some work as a technical advisor for a company, let's call them company A, that produces a piece of software that aids accessibility on the web. Everyone I've dealt with at the company is committed, friendly and knowledgeable, and they have a great, successful product. It's the sort of company I hope Champion IS might become in time - always looking for ways to improve the user experience and focussed on their product and their clients.

So when their closest competitor (let's call them company X) started mailing company A's clients with aggressive and pernicious marketing, trashing company A's product with false claims and promoting their own product, I was appalled. It shouldn't have been entirely surprising - company X has a history of dodgy marketing, and has a poor reputation in certain circles - but the two companies are market leaders and have very little other opposition in their sector.

I don't want to go into any detail about the content of that marketing, or what company A will or should do in response, but instead want to consider what it says about company X and whether my reaction was a symptom of naivety. Here are some questions:

  1. To be successful in business do you have to attack the opposition?
  2. Is it naive to think that producing high quality work and maintaining high quality customer service are enough to survive if not thrive in business?
  3. How would you respond if you were company A?

Feel free to guess the companies' identities, but don't expect me to validate them. :-)

Comment on Does business have to be dog-eat-dog? (4)

April 07, 2007

Nintendon't

@ 04:23 PM

As I sold my Wii a couple of weeks ago (it was fun while it lasted, but was gathering too much dust to justify its existence) today I unsubscribed from the Nintendo email newsletter. Or at least that's what I tried to do. But I'll need to wait:

Your Nintendo e-mail subscription status has been updated.

Please be advised that it may take up to 10 business days to unsubscribe you.

WTF?! I can subscribe in an instant, but unsubscription takes up to 10 days? The unsubscription method was a heavily obfuscated URL, so where's the issue? Are Nintendo manually processing unsubscriptions? Meh.

Comment on Nintendon't (4)

February 26, 2007

Leaving Clackmannanshire

@ 05:10 PM

On Thursday last week I tendered my resignation at Clackmannanshire Council. I'll be leaving in April to concentrate on Champion Internet Solutions (external link), its consultancy and training services and some in-house projects, including Revish (external link).

I'm sad to be leaving so many great people behind, especially my immediate colleagues in Communications at Clackmannanshire - my 4 years in the team have whizzed by, which means I must have enjoyed it. But it's tempered by the exciting prospect of working closely with Ian Dunmore, Nick Hill and the other bods at Public Sector Forums (external link) - we've established a strategic partnership and the fit is close to perfect.

So on 19th April I become a free agent. Needless to say if you're interested in working with Champion IS on any forthcoming projects, or want a chat about possible collaborations, please contact me via the company site (external link).

Comment on Leaving Clackmannanshire (13)

December 25, 2006

Alternative Christmas Message BlogSwap

@ 12:01 AM

A few weeks ago Jack Pickard came up with a fun and interesting idea: an Alternative Christmas Message BlogSwap. You can read a fuller explanation on Jack's site, but in short the participants each post an alternative Christmas message on another participant's blog. This year I will be posting Mike Cherim's message here (keep reading), mine will go on Stephen Lang's blog (external link), Jack's will be posted on Mike's blog (external link), and Stephen's message will be posted on Jack's (external link). Should be fun and interesting. Over to Mike:

Mike Cherim's Alternative Christmas Message

A Christmas message should be a simple thing to write, but not if one has writer's block. With writer's block the simplest written works can be challenging. Sure I could tell a related story, describe a childhood Christmas, or even tell you, dear reader, that I would like to see the spirit of the holiday last all year long resulting in world peace. But I'm not. I decided instead to share with you a few little known quotes from some of history's greatest people.

"Ask not what you'll get from Santa; ask if Santa really wants cookies and milk." -- Kennedy to his kids

"I will gladly repay you Tuesday for a Humbug today." -- Wimpy to Popeye

"You just drank the most egg-nog I have ever seen anyone drink, man" -- Chong to Cheech

"Who left the Christmas tree candles on? The wax bill is killing me!" -- Edison grumbling to himself

"One small gift for her, one giant credit card bill for me." -- Armstrong about diamond for Missus

"If I had only known, I would have been an elf." -- Einstein during frustrated moment

"You're gonna need a bigger sled." -- Brody to Santa, looking at bag of toys

"What the hell is Christmas anyway?" -- Overheard from Kwanzaa partygoer

"I'm dreaming of a green Christmas..." -- Eskimo song

"The evidence has spoken: there is no Santa Claus." -- Grissom from CSI

"You think you can catch Santa Claus?" -- Verbal from Usual Suspects

"I am not a reindeer" -- Nixon to the press

"All we are saying is give fruit cake a chance." -- John Lennon sings

"The cookies were eaten, in the library, with a spoon." -- Clue player to friends

"It's beginning to look a lot like profits..." -- Large retailer singing to shareholders

And last but not least...

"The power of Christmas is in its universality. Presents for everyone regardless of income is an essential aspect.”" -- Tim Ho-Ho-Ho Berners-Lee

I have to stop with this else my mind will explode and that'll make a mess, do feel free to add a few quotes of your own.

December 22, 2006

Take one a day

@ 04:31 PM

Inspired by Northshore (external link), and the purchase of a digital camera I can carry just about anywhere (unlike my Sony DSC-717), from 1st January 2007 I'm going to be taking one photo a day and posting it to Flickr, with a thumbnail here somewhere.

It means two things - I'll need to do some redecorating here, which is no bad thing, and you'll probably be subjected to many pictures of Alloa, which may not be such a good thing. I'll do my best to keep it interesting, but sometimes life just ain't that way, is it?

Anyone else fancy taking up the challenge?

Comment on Take one a day (4)

December 17, 2006

5 things you did not know about Dan Champion

@ 06:38 PM

I generally try to avoid memes, not because I don't enjoy reading them on other blogs, but because I don't ever seem to find the time to do them justice myself. But as Mike Cherim's (external link) gone to the trouble of explicitly tagging me on this one, it's a nice theme, and being a pretty private person I don't normally give much personal detail away here, I'm making an exception for this one.

So here are 5 possibly interesting things you're unlikely to know about me:

  1. From about the age of 12 I wanted to be a lawyer. I still don't know where that desire came from, and it wasn't until the age of 19, during the second year of my English Law degree at King's College, London (external link), that I realised I didn't really want to be a lawyer. I completed my law degree nonetheless, although I spent a lot of time during the last two years of the course on "extra-curricular" activities.
  2. I spent two years working as a forester when I left university. It was the most physically demanding work I've ever done, but also some of the most enjoyable. Being outside 10 hours a day in all the weather Scotland can throw at you might not sound like much fun, but by its very nature forestry takes you to some breathtaking places, like the Isle of Mull (external link). After 4 years in London it was somewhat cathartic to spend so much time in so much wilderness with so many midges (external link).
  3. In the last couple of months I've learned to swim at the Scottish National Swimming Academy (external link) at Stirling University. Although I could get from one end of a swimming pool to the other without touching the bottom before the lessons, it would be insulting to swimmers everywhere to describe it as swimming. I was terrified of water too - couldn't abide it on my face, and the idea of putting my head underwater was too much to contemplate. After 8 weeks of tuition from the excellent staff at the academy I can now swim underwater, and do a pretty good breaststroke. I return in February to learn the front crawl and can't wait to get started.
  4. In August 2005 my wife and I featured on BBC Radio Scotland's programme Grassroots (external link). We only expected to be interviewed for a 5 minute slot but ended up being on for half an hour. There's an mp3 of the show on our smallholding site (external link) if you really want to listen to it.
  5. I sponsor a 13 year-old Nicaraguan boy through Plan International (external link). His name is Nelson, he loves baseball and maths and sends me some lovely letters and drawings. If you can spare a small amount of money each month please consider spending it on a sponsorship. The work of organisations like Plan makes a big difference to some of the world's poorest people, and you get the opportunity to establish a lasting bond with a child.

Enough about me, I'm tagging Claire (external link), Grant Broome (external link), Jim O'Donnell (external link), Andy Saxton (external link) and Stephen Lang (external link).

Comment on 5 things you did not know about Dan Champion (13)

June 08, 2006

Commissioner, censure thyself

@ 02:37 PM

The Information Commissioner (external link), the man ultimately responsible for data protection and freedom of information in the UK, has extraordinarily issued a Decision Notice (external link) against himself for the ICO's (external link) handling of a request for information from Friends of the Earth.

What's particularly baffling is that in the first instance the ICO failed to recognise the request as a request for information under the Freedom of Information Act (external link).

There's a very common misconception that one has to "invoke" the FOI for a request to be covered by the legislation - in reality any request for information from any body which falls under the provisions of the Act is required to be dealt with in accordance with the Act. The misconception is perpetuated, perhaps deliberately, by the use of specific FOI email addresses and contact details by many government bodies, the DTI included...

See also: eGovMonitor: Information Commissioner admits he failed to comply with Freedom of Information Act (external link)

Comment on Commissioner, censure thyself (0)

May 26, 2006

Blocking SiteMorse & other unwelcome robots

@ 11:03 AM

Like just about every other site owner on the planet, you probably crave more traffic to your corner of the web. But not every visitor to your website should be welcomed with open arms. A good deal of the hits listed in your logs are likely to come from the many programs - commonly known as robots, bots, crawlers and spiders - that automatically trawl the web for a variety of purposes, including:

  - indexing content for search engines
  - checking links and monitoring sites for changes
  - harvesting email addresses for spammers
  - automated testing of markup, accessibility and performance

Most of these robots are harmless and positively beneficial - there are few reasons for trying to stop the Googlebot having access to your site. But not every robot visits your site with your best interests at heart.

Here are 4 techniques for blocking unwelcome robots from accessing your site. Before using any of these you will need to know either the ip address the bot is originating from, or the user-agent string the bot uses.

1. robots.txt

The robots exclusion standard or robots.txt protocol is the most straightforward way of excluding robots from your site. It's platform independent, flexible and easy to set up, but it does require the co-operation of the robot in question.

To implement robots.txt create a text file called robots.txt in the root directory of your site. Here's a simple example that allows Googlebot full access to your site, but blocks all other bots:

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /

robots.txt is flexible in that it allows you to block access to particular areas of your site, while leaving others open.
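For instance, here's a sketch (the directory names are hypothetical, not from any real site) that lets all bots roam the site apart from a private area and a scripts directory:

```
User-agent: *
Disallow: /private/
Disallow: /cgi-bin/
```

An empty Disallow line means "allow everything", which is why the Googlebot record in the example above still needs one.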

For full details of the robots exclusion standard, including a list of known bots and their purposes, visit the excellent Web Robots Pages (external link).

Most legitimate bots will honour robots.txt, but there are some that don't, including SiteMorse. So, for the purposes of illustration I'll use SiteMorse to demonstrate alternative techniques which can be used for bots that don't offer site owners the courtesy of observing the robots exclusion standard.

2. Blocking ip addresses using <Limit> (Apache only)

If you know the ip address from which the robot is accessing your site, and your site runs on the Apache web server, the <Limit> directive provides a convenient and effective method of blocking access. There are uncertainties involved in using this method - ip addresses can change, so you need to check regularly that the bot you're blocking is still using the same address.

At the time of writing SiteMorse's spider operates from the ip address The easiest way to implement <Limit> is via a .htaccess file. To block access to the SiteMorse bot create a text file called .htaccess in the root directory of your site, or if the file already exists edit it, and include this content in the file:

<Limit GET POST>
order deny,allow
deny from
</Limit>

Alternatively if you have access to the Apache httpd.conf file the directive can be included in that file, in any context - i.e. for the whole server, for a virtual host or for a single directory.
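As a sketch, here's how the same block might be scoped to a single directory in httpd.conf - note the path and the 192.0.2.1 address are placeholders for illustration, not SiteMorse's real details:

```
<Directory "/var/www/mysite/reports">
    <Limit GET POST>
        order deny,allow
        deny from 192.0.2.1
    </Limit>
</Directory>
```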

For full details of the <Limit> directive see the official documentation for your version of Apache - version 1.3 (external link) / version 2.0 (external link).

3. Blocking user agents using SetEnvIfNoCase and <Limit> (Apache only)

A more reliable but also more complex method than using ip addresses is to use the user-agent string to restrict access. The user-agent string for the SiteMorse bot will always contain the characters 'b2w'.

This time we use the Apache mod_setenvif module directive SetEnvIfNoCase. This allows us to set an environment variable based on a regular expression. Note that we're using the case insensitive SetEnvIfNoCase instead of SetEnvIf. Here's how it looks for our SiteMorse block:

SetEnvIfNoCase User-Agent "^b2w" bad_bot
<Limit GET POST>
Order Allow,Deny
Allow from all
Deny from env=bad_bot
</Limit>

The first line tests the user-agent string to see if it starts with 'b2w', and if it does it defines the environment variable bad_bot. We then use the <Limit> directive as before to deny access to our site if the bad_bot environment variable is defined.

As before this can be used in any context. Note, however, that using the SetEnvIfNoCase directive in a .htaccess file requires the FileInfo override to be allowed (i.e. AllowOverride FileInfo).
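A nice property of this approach is that it scales to several bots by reusing the same environment variable. A sketch, with two extra user-agent fragments included purely for illustration:

```
SetEnvIfNoCase User-Agent "^b2w" bad_bot
SetEnvIfNoCase User-Agent "EmailSiphon" bad_bot
SetEnvIfNoCase User-Agent "WebStripper" bad_bot
<Limit GET POST>
Order Allow,Deny
Allow from all
Deny from env=bad_bot
</Limit>
```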

Again please check the official documentation for your version of Apache for full details: version 1.3 (external link) / version 2.0 (external link).

4. Blocking ip addresses or user agents using PHP

The final technique I'm going to cover involves blocking access from within your web content. While I wouldn't recommend this approach for blocking a bot from your entire site (the above techniques are far more efficient) it can be useful when you want to block a bot from very specific content, or serve alternative content based on the user agent or ip address of the bot. For example I used this in the past to serve SiteMorse a different version of the A to Z pages on ClacksWeb, to prevent it from spidering the page for more than one letter.

Here's a quick example of blocking access based on ip address or user-agent:

if (($_SERVER["REMOTE_ADDR"] == "") || (strpos($_SERVER["HTTP_USER_AGENT"], "b2w") !== false)) {
    // the user-agent contains 'b2w', or the request comes from the blocked address
    header("HTTP/1.0 403 Forbidden");
    exit;
} else {
    // serve content normally
}
This code is most effective if you use a single controller script, but can be used in individual scripts as required. Note though that the use of the header function means that it must be the first output from the script.

Other platforms

Comment on Blocking SiteMorse & other unwelcome robots (4)

November 18, 2005

Most Wanted - criminal web design

@ 09:08 PM

Another major UK public sector site launch, another hideously bloated, nested-table monstrosity. This time it's the Crimestoppers 'Most Wanted' (external link) site.

How's this for starters:

<!-- xinclude virtual="/includes/objPageCache.asp" -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<!-- Document coded using XHTML 1.0 rules | HTML 4.01 DTD used for backwards campatibility -->
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">

That's a new one for me.
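For the record, if the markup really does follow XHTML 1.0 rules, the consistent thing would be to declare an XHTML DTD to match, along these lines:

```
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
```

Declaring an HTML 4.01 DTD while writing XHTML syntax just guarantees validation errors against whichever standard you check.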

Crimestoppers may be a charity, but in 2003 it received total income of £3 million, including £1 million from the Home Office, so it wasn't a matter of resources. It can only be hoped that PAS 78 will help such organisations commission better sites in the future.

The Register (external link) reported how the site was brought down this morning by heavier than anticipated traffic. Looking at the markup used on the site, a good dose of web standards would have at least halved the page weight, and consequently doubled the number of visitors the site could have coped with, potentially avoiding such an embarrassing launch.

Comment on Most Wanted - criminal web design (3)

November 16, 2005

Google Base goodness

@ 07:26 PM

For those who don't know, Google Base  (external link) is a new service from the search giant. They describe it thus:

Google Base is Google's database into which you can add all types of content. We'll host your content and make it searchable online for free.

What sorts of content? Well, jobs, events, courses, reviews, cars for sale, and so on.

Although Google can host the content for you, if you've got existing web content you can feed Base 'bulk uploads' using RSS over FTP and just point it at your content. Since I've got a bunch of RSS feeds for ClacksWeb, it was a breeze to repurpose the feeds for Google Base and set up a daily cron job that runs a PHP script to FTP the content onto Google's servers.
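That daily upload boils down to a one-line crontab entry - something like this sketch (the script path and time of day are hypothetical):

```
# run the Google Base upload script every morning at 2am
0 2 * * * /usr/bin/php /home/clacksweb/scripts/googlebase_upload.php
```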

What's most interesting is the effect this appears to have had on our Google search rankings. For example, one of the Base items included in the RSS uploaded from the site today was a vacancy for a headteacher at one of our primary schools. Search Google Base for headteacher  (external link) and there it is, 1st item of 2. Search Google for headteacher  (external link) and there it is, currently 5th item of 2.4 million.

Coincidence? Possibly, but unlikely. ClacksWeb does do well on Google due to nice URLs, valid, semantic XHTML, reasonable use of headings and so on, but 5th of 2.4 million for what is effectively an ephemeral page?

It's too early to conclude that Google are using Base data to adjust the algorithms used on Google itself, but it would make sense - if it proves to be true it will drive traffic through Base, and a lot of that will be well-classified and tremendously more valuable than data gathered from trawling the web.

I'll be pushing more feeds to Base tomorrow, for events and general vacancies, only this time I'll do some searching beforehand...

Comment on Google Base goodness (1)

November 10, 2005

Tools aren't skills aren't knowledge

@ 05:32 PM

Yesterday I was asked by a colleague in another local authority what software I had used to make our website accessible. The question threw me for a second. Since I no longer think of web development in terms of tools, it hadn't consciously occurred to me that others in the business still do. I started to list the software tools I do use - HomeSite, TopStyle, Bullet Proof FTP, Firefox, CSE Validator Pro and so on. But of course software doesn't make websites accessible - at the stage we're at as an industry it's pretty much down to knowledge, research and understanding.

So I explained that it wasn't really the software I had used that was important so much as the time I had invested in learning what makes a site accessible, through the GAWDs mailing list, AccessifyForum and many websites. The more I thought about it the more I realised what a steep learning curve I'd been on in the past 12 months, and just how much time I'd dedicated to re-learning the fundamentals of the job of a web developer. Lists like css-discuss and Evolt's thelist, sites like ALA and QuirksMode, books by Zeldman, Meyer and Clark - I've read them all over the past 2 years or so, tried to learn from them and apply what I've learned. How do you get that across to someone who's looking for a magic software bullet to achieve a worthy aim, without scaring them off?

When it comes down to it, tools can only be used to apply the knowledge you have using the skills you've developed (or, if you're lucky, were born with), and the production of just about anything of quality relies on the presence of all three.

Comment on Tools aren't skills aren't knowledge (3)

October 24, 2005

Select-a-Blether

@ 10:16 PM

As Blether was launched early (it was supposed to be a November CSS Reboot  (External link) entry) I'm still working on the site, adding a few features that slipped by the wayside. Tonight's addition is Select-a-Blether, a mechanism for changing the styling of the site.

The first alternative to Verbal Kint is world-class blether Richard Nixon. To have Tricky Dicky grace the site just follow his link in the navigation. More blethers will follow when I get around to them, together with some notes explaining their presence.

Comment on Select-a-Blether (0)

Hotmail, the email incinerator

@ 06:04 PM

A heads-up for anyone who operates any sort of system which sends emails to customers - Hotmail could well be rejecting your email and not telling you, or the recipient, that they've done so. You can get yourself whitelisted by signing up to Bonded Sender, but that costs money. Oh, and the threshold for Bonded Sender to reject your application is 1 complaint per one million emails sent. Oh, and if they reject your application you can't reapply for 90 days.

The full skinny can be found on e-consultancy  (External link). If you're a large organisation sending out a lot of email you might like to reassure yourself that a significant proportion of your customer base doesn't think you're ignoring them.