A late whinge about iOS7

March 1st, 2014

An edited version of the following text appeared in MacUser magazine, Vol 30 No 4, April 2014.

So, there have been a few complaints about the iOS 7 interface, eh?

Looks like they’ve gone for Big Fun 2.0 – a new kind of fun. A fun that isn’t really based in anything useful but does look kind of, well, flashy. The dark side of chrome, and a chrome that looks suspiciously like other people’s chrome – Dull Chrome 1.0. A fun that plays with you and frustrates you while you pretend to laugh and enjoy it while actually harbouring really very uncharitable thoughts about what you’re going to do to whoever designed this and broke your favourite toy. I mean phone. Very important tool. Grownup tool that you require for work. And other important things that can’t be trifled with or made to look like My First Colouring Book. Except it cannot be “my first” because it is actually the same as everybody else’s.

Rather like Android and Windows Mobile, perhaps? In comparison to iOS up to v6 anyway. I’m not denying earlier iOS versions were fun – far from it – they were great fun, but Fun 1.0 was based on something that we all understood, whether we realised it or not. It was based on experience. Perhaps not our own personal experience, but an experience cloud that we could all tap into. We may never have owned a padded leather diary (I think I was given a corporate one once, but don’t remember ever using it, just prodding the “sumptuous” suede and thinking gently “No, I don’t think so”) but, crucially, we all knew exactly what one was when we saw it.

And then, critically, there are the things that don’t look like anything on earth and are no longer easy to use for people on this earth. The selection scroller interface is broken. I certainly get the idea of fading the adjacent values out and away, but it is now done in such a clueless style. Where are the boundaries? I seem to be endlessly flicking a whole page up and down because I have just missed the edge of the control. The old version was as clear as it was possible to be. And don’t tell me that it works if I start the flick precisely on the current value – I didn’t have to on the old version and I don’t have to in real life. Remember the worst UX crime is the broken metaphor: if you give someone something familiar to use, make sure it works in the familiar fashion, not sort of similarish as long as you know the differences. Because we don’t intuitively know the differences. And the maths used to define the curve that the adjacent values are fitted to is either faulty or based on faulty thinking. It is not a cylinder any more; perhaps it is parabolic, like the front of those idiotic buildings in Las Vegas and London that toast cars?
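For what it’s worth, here is a rough sketch of the cylindrical mapping the old picker appeared to use – my own reconstruction, not Apple’s code, and the function name and numbers are mine. The point is that on a drum the rows compress and foreshorten to nothing exactly at the rim, so the eye can see precisely where the control ends.

```typescript
// A rough reconstruction (my assumption, not Apple's code) of the old
// picker's cylindrical projection: rows sit on a drum of radius R, so a row
// at angle theta from the selected row appears at y = R*sin(theta) and is
// foreshortened by cos(theta). Foreshortening hits zero exactly at the rim,
// which is what made the control's boundary visually unambiguous.
function cylindricalRow(index: number, rowAngle: number, radius: number) {
  const theta = index * rowAngle;                  // angular offset from the centre row
  if (Math.abs(theta) >= Math.PI / 2) return null; // rotated out of sight round the drum
  return {
    y: radius * Math.sin(theta), // projected vertical position
    scale: Math.cos(theta),      // projected row height: 1 at centre, 0 at the rim
  };
}

// Rows visibly compress towards the rim, marking the edge of the control:
for (let i = -4; i <= 4; i++) {
  console.log(i, cylindricalRow(i, Math.PI / 10, 90));
}
```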

Who, honestly, thought that a flat rectangle of the most hideous shade of green possible was genuinely better than the rendered glass battery, filling slowly with glowing green power? Come on, who was it? Put your hand up now. Step forward and explain your thinking – I really want to know.

Those thin fonts are de rigueur these days, aren’t they: everybody is using them. Absolutely everybody. You’d never be able to guess which phone you were using from the typeface. Sadly, legibility was one of the reasons I preferred iOS to Windows Mobile: I could read the text. I have written before on the importance of being able to read instructions in less than perfect conditions (e.g. when not wearing one’s glasses) and I can no longer read the time on the lock screen when I wake up in the morning and haven’t put my glasses on. Hopeless. Yes, I have turned the bold text option on, but I’m not terribly impressed with the difference.

The whole parallax movement effect is quite a fail on an iPad: the screen is so big that you can’t see all the icons and the screen edges at the same time. The illusion that the icons are static is lost because they are not apparently locked to the edges, so your brain tells you that the icons are moving over a static background which your brain “knows” is more likely. It looks like the icons have come loose – and I suspect this may be responsible for the feeling of nausea that some users are reporting. I assumed this would work on the smaller format of a phone because the eye would see that the icons were locked to the edges so it must be the background moving. However I didn’t try it for a couple of months because I was holding off upgrading. But when I did finally try it on my phone I found it pointless and usually not even visible. A lovely idea, a really lovely idea, to give the phone internal depth, but a hopeless CPU cycle sucker in practice.

Flat is trendy. Flat is very now. But think about what that means. Tomorrow may not be flat and our phones will look like last year’s model – and we’ll have to go through this all again. Last September, despite being six versions old, our phones still looked current. They looked different. Now they look the same as everyone else’s. Apple’s history as the relentless innovator has taken a spanking and we are left with an iWindroid phone that is no longer a pleasure to own but has simply become a phone. The phone in my pocket. The phone that I chose and waited for until I could afford it because, although expensive, it was different enough to be desirable. Now it is just another phone. Albeit vastly more expensive than it is worth.

Own goal, Jony.

And so it turns out – it seems to me – that Big Fun 2.0 is actually last year’s Dull Chrome 1.0, nicked from other people’s thinking. Jony, I implore you. Go back to what you are good at. Making the insides of your hardware so good it makes us want to weep. Wearing black Ts. Speaking slowly and quietly without sounding, well, too patronising. And never try to design a customer-facing piece of software again. Not ever.

In a very non-scientific survey I personally cannot find one person that I know who has used both and prefers iOS 7 over iOS 6.

I wonder: we’re not finding that we’re missing skeuomorphism after all, are we?

Size is an accessibility issue.

May 12th, 2012

UX, right?

A slightly left-field post, this one. I’m leaving computers behind. Because they only form a small part of our experience as users. Most of our lives are spent not using them, but (apart from when we’re asleep) using something. So, today, I’d like to talk about shampoo.

There are few times when those of us who wear glasses for farsightedness can successfully take them off. One example would be, once again, in bed – there’s not much I need to look at, and I think I can probably find my wife without resorting to night vision goggles – and another would be the shower. Except, unbelievably, that this isn’t true. I need to wear my glasses in the shower due to the bonkers thoughtlessness of the collective marketing departments of the shampoo industry. So important is it to them that they cover the bottles of their various lotions and unguents with descriptions of the exotic fruit and vegetables they have included that there is never room to write the word “shampoo” in anything but eight-point text. I haven’t a hope in hell of divining which of the two hundred bottles apparently required by my daughters contain shampoo rather than conditioner or drain unblocker – not without my glasses on.

As an aside, God must have been having quite a laugh when he seeded the human race in Africa’s savannahs, and the flora required to wash their hair in South America’s rain forests. Perhaps there is a 2001 parallel here – in the same way that Arthur C Clarke planted a monolith on the moon so the extraterrestrial intelligence that put it there could know when man had advanced sufficiently to unearth (unmoon?) it. Obviously Erich von Däniken’s extraterrestrials planted all this stuff in the Amazon basin while sketching out their Nazca line drawings and have spent the last hundred thousand years expectantly watching our drains – waiting for the moment that jojoba extract (not made from concentrate) appeared there. They now know we have reached some keenly advanced point in our personal grooming regimes, indicating we have reached the requisite level of intelligence for… for precisely what, I wonder?

Whatever: surely even the most stupid marketing exec and overworked artist would have realised by now that we mortals can’t read their labels, when we haven’t got our glasses on, when we’re in the shower? WRITE IT LARGE, PLEASE.

Anyone would think Adobe was in the soap business.

Rant over.

Making selecting easier in layout software (V2)

March 15th, 2012

Back in October 2010 I posted about a 3D editing feature that I thought would make it easier to manipulate complex layered structures such as web pages. Today I have been playing with Firefox 11.0 and there, in the view displayed by Tools->Web developer->Inspect, is a button at the bottom left titled “3D”. You can spin and zoom the result, and select structures below the top, visible, layer. Brilliant for debugging or looking at how a competitor has built their page. If you then click the buttons titled “HTML” and “Style”, two windows open below and to the right, which follow your selections in the 3D view. Outstanding.

OK, Firefox is not the platform of choice for creating web sites(!), but this is a fantastic tool and a really useful feature. I expect to be using it a lot.

Well done Mozilla, oh dear Adobe – where are you? This should be a “must have” feature, particularly for fully manual layout software like Dreamweaver.

Skeuomorphism

May 11th, 2011

An edited version of the following text appeared in MacUser magazine, Vol 27 No 10, 13 May 2011.


Sir,

Re: Alex Watson’s article in MacUser Vol 27 No 8 Pg 82 (15.04.2011)

Wrong wrong wrong.

I do not believe the jury is out on digital skeuomorphism. In fact I’d say the verdict has been in for some time and skeuomorphs are being used successfully now more than ever. And in many, many cases rightly so.

I suggest your author was scraping the barrel with references to easy targets such as the floppy disk “Save” icon. I’m half surprised he didn’t bring up the old “Keyboard not found – press F1 to continue” chestnut just for good measure. My 13-year-old son just about remembers seeing a 3.5” disk in my office but is very aware that the “shape” means save. There are some far more bonkers “shapes” in use that we could never have intuited any meaning from. I suspect that some pretty clever brains have applied themselves to the “Save” icon problem. And yet no one has come up with a better icon. Certainly your author didn’t even try in his article. I have tried several times over the years and have always been dissatisfied for one reason or another. A heart superimposed over a document in a cheesy “I love my document” sort of way. Two cupped hands holding a document safely – foundered because I suspected that in some culture somewhere cupped hands might be some mortal maternal insult. Safes, keys, padlocks – all useless as they have been purloined by the security department. Until, of course, everything is held digitally; then there will be no need for safes, keys and padlocks, at which point those images will start a skeuomorphological demise of their own. Interestingly, Apple seem to have dropped the use of any button requiring an icon for the “Save” command, only ever offering Cmd+S.

Another point: the functionality of the “Save” button remains wholly intact; it is only the image that is losing its meaning. This does not match, for instance, the retention of riveting on jeans, which is there purely for decoration and serves no structural purpose whatsoever any more – and I am not sure that both of these can be skeuomorphism at the same time. Treating our language to that kind of elasticity would not be one of its great pleasures.

Look out, Logic Studio users – in 20 years all music will be created digitally (in someone’s bedroom; now there’s an ugly opportunity for a UI background for a DAW), SSL will not be making mixing desks any more and the sliders on the mixer view will become desktop-space-chomping eye candy. Five years after that someone will be writing in a magazine about the pointlessness of all those sliders. I mean, just what on earth do they represent? But wait, actually, it’s quite a good way of representing relative values across a number of units…

Of course the rate of change is so great in computing these days that it is hard to carry a metaphor over for long before it becomes skeuomorphic. Every three to five years brings radical new experiences – and a new generation comes along every now and again – so the churn rate is so fast that it will soon be impossible to create, for every requirement, an icon that has meaning through experience for everyone. Perhaps your author suggests we should just give up at that point – or even now, to save us the frustration later? I don’t think so. The cost savings permitted by the removal of text are too great. Look at most Adobe software – the sooner they get rid of all the text in their applications the better. And what of the nuclear position – once everything is done on computers there will be no real-world interfaces left to mimic.

Your author has a pop at the iBooks “centre of the book” look that greets you wherever you are within the volume. I too saw that and, yes, it jarred a little. Not because it looked bad (it doesn’t, despite what any pedant might say) but because, although it suggested information, it did not deliver any, so the metaphor is broken. Breaking a metaphor is a UX crime, but using an old one is not – actually, using an old one is the whole point of UX metaphors. And I wonder how many systems he has actually been responsible for whose look he hasn’t later improved? I have to admit here that I look at things and I see opportunities for improvement – not dismal failures. And I don’t think that the iBooks UX is a dismal failure. As an aside, I showed my 75-year-old mother-in-law iBooks running on an iPad the other day. She, uniquely among those I know, has no computer or mobile phone (nor even a car). She hates computers, yet this moment was a revelation for her: she completely got the point of the metaphor and even suggested that her other daughter would probably like such a device. She wasn’t fazed by the equal number of pages either side. She appreciated the joy that had gone into the thinking behind the representation. She appreciated the fun of it.

Ah, here’s the nub. The fun of it. This is a theme I return to endlessly – why can’t a UX be fun? We endlessly choose fun over boring in our private lives – why shouldn’t we at work? Usually because our bosses won’t allow it – which is a large part of why I work for myself. I love the moment when, using something – anything, not just software – I think “oh, s/he must have had a quiet chuckle to him/herself when s/he designed that bit”. It brightens my day.

Let us look at possibly the oldest skeuomorph that still persists. In 1940 a group of boys found what turned out to be 17,000-year-old drawings of animals on a cave roof in Lascaux, created, many believe, to aid in hunting and safe capture. Maps and instruction manuals. To this day people still create drawings and paintings of animals (these new artworks are the skeuomorphs, not the cave drawings), but for fun and even for profit, not always for instruction. So for those new artists (rather than the cartographers and manual writers) the point of their creation is now lost and, apparently, the jury is out on whether or not they should continue. What utter tosh. They do it for fun. And fun is good. Fun is great.

Fun is the reason we should do everything – even work. Fun, satisfaction (personal or philanthropic); these are reasons to enjoy life, not to endure a daily grind. One that relentlessly consigns the past to the trash. As humans we seem innately to believe that the old days were “the good old days”. We are constantly looking backwards to recreate things as lovely as they were in “the good old days”. Granted, this is sometimes, or often, a lie. But why shouldn’t we do the same in software? I don’t mean creating mock-Tudor beamed user interfaces, but I passionately believe in the usefulness of mimicry, humour and unnecessary familiarity in raising a smile and increasing the comfort of a user of any system. Even if it wastes a little screen real estate (which I suspect is why Apple use more visual metaphors on the iPad than the iPhone – because there is more room to do so). Our customers have given us their money and I do not believe it is simply for the functionality – some of it will be because they made a choice based on how that functionality has been presented.

If everyone made their systems a bit more fun then software would be a much better place to be.


The importance of testing – 001

December 9th, 2010

This will be a series that runs and runs… and runs.

As you may have picked up from earlier posts, I think testing is pretty important. Testing for the back end engine is for, well, the back end engineering guys. Usability testing is just as important. It doesn’t matter how clever the database engine is – if the user can’t see the results then who cares about how fast the data was found?

Skinning of applications has brought a new level of ease – and a new level of problems: it is now up to the user to find a colour scheme that works for him or her. Assuming, of course, that a decent set of skins has been made available.

For a while now, companies making photo-editing software have realised that the apparent brightness of a photo can be enhanced by making the surrounding screen real estate dark. Remember, though, that since the original Mac the mimic has been black text on a white background – just like real paper. So a dark background means subclassing all the controls and rendering your own. This should always ring full-test-required alarm bells – anything missed and an editbox, dropdownlist, whatever, will suddenly become invisible or unusable.

Here’s an example:

Depending on the brightness of your display, the four dropdown arrows may be quite hard to see. No drama: the writers of this well-known photo-editing application have provided a “light” version of the interface.

Yes, now those arrows are clear as a bell. Pity about the editboxes though… They’re even worse! Surely this was spotted during test? Obviously not. So, the dark skin is easier to use – just. We change back to it and…

…what’s happened to the two controls on the right? They have kept their “light” version graphics. That kind of error is small and can happen to anybody, and is simply a bit careless. The point I am trying to make is the bigger one concerning genuinely unreadable screens. Photoshop is full of (un)usability problems; the panel bin on the right of the window is appalling for visibility.
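This sort of miss is exactly what an automated check can catch. Here is a minimal sketch – the skins and control names are invented, not from any real product – that asserts every control’s text meets the WCAG minimum contrast ratio against its background in every skin, so a forgotten control fails the test run instead of shipping.

```typescript
// A minimal sketch (the skins and controls are invented, not from any real
// product) of an automated readability check: every control's text must meet
// a minimum contrast ratio against its background in every skin, so a control
// missed during a skin change fails the test run instead of shipping.

type Rgb = [number, number, number];

// WCAG relative luminance of an sRGB colour (components 0-255).
function luminance([r, g, b]: Rgb): number {
  const lin = [r, g, b].map((c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * lin[0] + 0.7152 * lin[1] + 0.0722 * lin[2];
}

// WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05), from 1 to 21.
function contrast(a: Rgb, b: Rgb): number {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// Both failures described above would be caught here: the washed-out "light"
// edit box and the barely visible "dark" dropdown arrows.
const skins: Record<string, Record<string, { fg: Rgb; bg: Rgb }>> = {
  dark: {
    editBox:       { fg: [230, 230, 230], bg: [40, 40, 40] },
    dropdownArrow: { fg: [90, 90, 90],    bg: [40, 40, 40] },    // hard to see
  },
  light: {
    editBox:       { fg: [210, 210, 210], bg: [235, 235, 235] }, // nearly invisible
    dropdownArrow: { fg: [30, 30, 30],    bg: [235, 235, 235] },
  },
};

for (const [skinName, controls] of Object.entries(skins)) {
  for (const [controlName, { fg, bg }] of Object.entries(controls)) {
    const ratio = contrast(fg, bg);
    if (ratio < 4.5) {
      console.error(`${skinName}/${controlName}: contrast ${ratio.toFixed(2)} is below 4.5`);
    }
  }
}
```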

Testing not a priority at Adobe? It MUST be a priority for YOU.

Never tell the user off.

November 21st, 2010

Do not let them get into trouble in the first place.

You don’t like being told off, do you? And you wouldn’t like it if someone you were working with explained your next task to you so badly that you were bound to get it wrong – and guaranteed a ticking off. Too right you wouldn’t. So why should you allow your software to do that to someone who has paid you to help them?

There is an old adage, “the customer is always right” – well, I don’t believe it. In fact the customer is pretty well always wrong. But you don’t need to rub their noses in it. And they have paid you, so they do need indulging. They have a task to perform or require a result, and they may think they know what is required to achieve that result. In fact most of them won’t have a clue what the steps required to get there are – but somehow they will know how to get there. It’s intuitive – and you must not break that unconscious link, otherwise your user will be brought up sharply and uncomfortably. He won’t thank you for that at all.

It may well be harder to create a workflow for your users that always guides them correctly to where they need to go, but surely that is a good investment in time? People may only remember a good software time for minutes, but they will remember a bad software time for ever. Never assume that your users won’t walk, and never hope that they understand the software as well as you do. However well they understand it, you must understand that they will never ever use it the same way that you do. I promise you that the biggest learning curve here is for you, not for your user.

The only way to catch more paths through your software than you can think of is to test it with real people. Other people. Get them to try to use the latest feature with varying degrees of help, but once they start using the software don’t help them anymore, don’t stand behind them, do not guide them, just let them get on with it in as relaxed an atmosphere as you can arrange. But do watch them – and don’t ignore one single hiccough, slip, difficulty or stupid choice that they make.

Every single mistake they make could be blamed on you. If your code hadn’t let them build up to the error then the error itself could not have been made.

I’m not going to talk about the mechanics of testing here for long, but never underestimate how important it is. During a software meeting once, a manager came in and asked the programmers “why do we keep releasing crap software?” He wanted us all to come to the next week’s meeting with two reasons each. Of course no one did, except me (*sigh*), who came back with nearly 20, written up in an essay called “Why we keep releasing crap software”. I have to admit that this didn’t go down too well and not one single thing got implemented until after I had left the company – and then some of it was implemented. And after my ex-boss left the company some more of it was implemented. A large number of my points concerned test, and the lack of it within the company. One of their favourite responses to “We can’t test because we have no hardware to test on” was always “well, there is currently an exceptionally high need for test hardware so you’ll have to wait”. Actually, you’ll hear a very similar excuse nearly every time you phone a call centre – just before you are put on hold with music at 20p per minute – and the magnitude of the stupidity is identical. If you are repeatedly told “this is an exceptional need” then it is clearly not “exceptional”, merely inconvenient – and probably then only according to the accounts department.

Always try to ensure that everyone has enough resource to carry out proper test.

So how do you avoid telling the user off?

There are a number of ways in which a user can feel as if he is being told off, and some of these have nothing to do with programming but everything to do with programmers. Programmers – generally – are highly skilled at writing code. Writing code is – generally – a lone occupation, and programmers are – generally – not brilliant communicators with non-programmers. Telling a user that their “Input is invalid” is nigh on pointless. Particularly if it is displayed as a result of clicking an OK button on a form with 20 fields. Which of the 20 fields of data is (or are) invalid? In what way is it (are they) invalid? What does the user have to do to correct its invalidity? What (if any) damage has been done to the user’s other data? Was anything saved? Was anything lost? How far back will the user have to start again? And was the data on the last form actually valid, but some part of it inconsistent with something entered three screens back?

There are two options here:

  1. Work out all the complex patterns available to the user via the form (and any others before and after), filter the ones that do (or may) cause a problem, then communicate the problem fully, accurately and gently (a sketch follows this list)
  2. Don’t allow the user to enter data that won’t make sense to your code in the first place
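To make the questions above concrete before we move on, here is a minimal sketch of the first option – the form, its fields and its rules are invented for illustration. Every message names the field, says what is wrong with it, and says what to do about it.

```typescript
// A minimal sketch (the form and its rules are invented) of per-field error
// reporting: every message names the field, says what is wrong with it, and
// says what to do about it - and nothing the user typed is thrown away.

interface FieldError { field: string; problem: string; remedy: string }

interface RegistrationForm { loginName: string; email: string; age: string }

function validate(form: RegistrationForm): FieldError[] {
  const errors: FieldError[] = [];
  if (form.loginName.trim() === "") {
    errors.push({
      field: "Login name",
      problem: "is empty",
      remedy: "enter the name you registered with (not your email login)",
    });
  }
  if (!/^[^@\s]+@[^@\s]+$/.test(form.email)) {
    errors.push({
      field: "Email address",
      problem: `"${form.email}" does not look like an email address`,
      remedy: "check for a missing @ or a stray space",
    });
  }
  const age = Number(form.age);
  if (!Number.isInteger(age) || age < 0 || age > 130) {
    errors.push({
      field: "Age",
      problem: `"${form.age}" is not a whole number between 0 and 130`,
      remedy: "enter your age in years, using digits only",
    });
  }
  return errors; // display next to the fields, all at once - no lone "Input is invalid" box
}
```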

We’ll leave error reporting to another time – it’s a big subject. Here we’ll look at designing your screen forms properly. Because if you do it properly then the error reporting can be saved for reporting errors rather than advertising programmer incompetence.

The fundamental aim here is to allow the user as little choice as possible while completely fulfilling his needs in order to progress through the program. The next thing that can be done is fully, completely and clearly describing what is required by every field. Offering help that merely restates the name of a field is a waste of everybody’s time. You will see this over and over again – a handy question mark placed near an input field. You hover the mouse over it and about three words appear, or a simple rewording of the title of the edit box. Believe it or not, the title “User name” next to an edit box is not enhanced by a tooltip that says “Enter user name”, is it? Is it?? No, it isn’t. If you are going to offer help then show the user you understand the problems associated with filling out the form you have put in front of them. Show the user (or at least pretend to them) that you care about them. Think about the user names that your customer may have had to acquire to get this far. An example – my email setup requires two user names: my broadband login and my email login. Make absolutely sure there is no ambiguity as to which is required on the form in front of the user right now. If you get asked “which name do I enter here?” during test, it doesn’t mean that the user is being thick (actually it doesn’t even matter if he is); it means the form is not clear. It might, however, mean that the programmer is. Consider looking back through your interface and giving the two user names two clearly different titles such as “login name”, “user name”, “registered name”. Maybe don’t even use the word “name” in one of them. Think longer. Think harder. Set out to make the user have a good time.

If it is possible, offer a drop-down list of all possible entries (this may not be a good idea where security is an issue, as in this instance with user names) – it is then utterly impossible to put rubbish in that field. Remember to update any other drop-downs once any selection has been made that would cause an entry to become an illegal combination, as sketched below. This simple effort will sidestep a huge number of problems that otherwise have to be dealt with when the user clicks the OK button. With a bit of thought, so many places where a user would normally enter data can be replaced with a fixed set of choices – if a little planning has gone on first.
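A sketch of that dependent-dropdown idea, with made-up data (the paper sizes and orientations are my own example): when the first selection changes, the second list is rebuilt from a table of legal combinations, so an illegal pairing simply cannot be assembled.

```typescript
// A sketch (paper sizes and the combination table are invented) of keeping a
// dependent dropdown legal: when the first selection changes the second list
// is rebuilt, so the user can never assemble an illegal combination for the
// OK handler to reject later.

const legalCombinations: Record<string, string[]> = {
  "A4":     ["portrait", "landscape"],
  "Letter": ["portrait", "landscape"],
  "Banner": ["landscape"],             // a portrait banner would be an illegal combination
};

function onPaperSizeChanged(paperSize: string, orientationList: HTMLSelectElement) {
  const previous = orientationList.value;
  orientationList.innerHTML = "";      // drop the now possibly illegal options
  for (const orientation of legalCombinations[paperSize] ?? []) {
    const option = document.createElement("option");
    option.value = option.textContent = orientation;
    orientationList.appendChild(option);
  }
  // Keep the user's previous choice if it is still legal, rather than silently resetting it.
  if (Array.from(orientationList.options).some((o) => o.value === previous)) {
    orientationList.value = previous;
  }
}
```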

So don’t tell users off. Don’t belittle them. Don’t ignore their needs. Do realise they think differently and while neither your way nor their way is wrong, their way is valid.

The alignment tool

October 10th, 2010

How many times a day when working on layouts do you want to quickly check the alignment of two objects, but it isn’t worth setting up a guide or a ruler? Or the software just doesn’t have that capability? So you slide another open window over to see if things do line up nicely against an edge – for instance a column of numbers in Microsoft Word – but that’s a little clumsy.

Wouldn’t it be cool if the mouse pointer could sprout a line, vertical or horizontal, that you could just move with the mouse to anywhere on screen for a quick check?

And what if the line could be rotated to check any alignment? Say you have two objects, one to the right and up a bit from the other, but you want to check that they are the same height – you sprout your line from the mouse, rotate it to touch the bottom of each object, then slide it up and check that the tops align too. Simple.

And what if this feature was built in to the operating system rather than the application – then your line would cross the whole screen and you could check the size across two applications, neither of which need be aware of the feature!

So we’d need a key sequence to turn the line on and off, one to toggle between vertical and horizontal, and one to enable free rotating. Perhaps one to flip through a number of colour schemes and rendering methods to allow good contrast regardless of background.
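An OS-level version would need platform hooks, but the core of the idea fits in a page of browser code. Here is a minimal sketch – the key bindings and styling are my own choices – of a screen-spanning guide that follows the mouse, toggled with one key, orientation flipped with another; free rotation could be added with a CSS rotate transform in the same way.

```typescript
// A minimal in-browser sketch of the guide line (an OS-wide version would
// need platform hooks; the key bindings here are my own choices): a
// screen-spanning line that follows the mouse, toggled with "g", orientation
// flipped with "o".

const guide = document.createElement("div");
Object.assign(guide.style, {
  position: "fixed", left: "0", top: "0", pointerEvents: "none",
  background: "magenta", display: "none", zIndex: "99999",
});
document.body.appendChild(guide);

let guideOn = false;
let horizontal = true;

function placeGuide(x: number, y: number): void {
  if (horizontal) {
    Object.assign(guide.style, { width: "100vw", height: "1px", left: "0", top: `${y}px` });
  } else {
    Object.assign(guide.style, { width: "1px", height: "100vh", left: `${x}px`, top: "0" });
  }
}

document.addEventListener("mousemove", (e) => {
  if (guideOn) placeGuide(e.clientX, e.clientY);
});
document.addEventListener("keydown", (e) => {
  if (e.key === "g") { guideOn = !guideOn; guide.style.display = guideOn ? "block" : "none"; }
  if (e.key === "o") { horizontal = !horizontal; }
});
```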

Wouldn’t that be pretty useful?

Making selecting easier in layout software

September 6th, 2010

I have been writing an entry about never telling a user off (guess what – they won’t like it) and I had an idea for improving page layout software that I would like to share.

Please excuse the web design slant of this post, I am fighting with Dreamweaver at the moment and it is this that got my subconscious working on a solution.

One of the common problems with layout software (and this is true for all layout software – 3D design, DTP, etc., as well as web page design) is that, because of the layered (and/or nested) nature of the data, any object you want to select may not always be at the top. This means that, depending on how well the software in use has been written, multiple clicks may be required to select an object that is below the top – each click moving down a layer. If the objects have similar extents you may not know for sure when you have selected the object of your desire. In fact it may not be accessible at all until you hide the obscuring objects above. Even if they are not actually obscuring – as far as your eyesight shows you.

An example from a web design package: you have a table under a CSS div – the div is used to hide the tabular matter dependent on an event elsewhere on the page. It is often hard to select the table – even though you can clearly see it – because its edges coincide with the div above it. Another example – you have a table in a div and they are both the same size. Dreamweaver is simply awful at handling this.

It occurs to me that there is an insanely simple way of dealing with this. 3D packages use it all the time, so couldn’t we implement it in 2D applications as well?

All 2D applications display their layouts in what is effectively an “orthographic view”. This means that there is no parallax; two identically sized objects with the same origin will overlay each other perfectly and it will be hard to see or select the one “below”. What I’m suggesting we need is the 3D or “perspective view”. In TrueSpace or Blender or Maya or 3ds Max or whatever, in order to resolve this kind of selection problem you switch to the 3D window or mode and spin the model about the notional Z axis (and perhaps a little about Y as well) so that you can clearly see the stack of objects. Then you select the one “behind” and process it in whatever way you require.

I thought that a similar mode could be made available in a 2D layout package via a hotkey that spins the page layout and reveals all the objects – parted slightly during rendering to give the appearance of a stack and to make unambiguous selection possible. Now you can easily select an object at any depth within the stack. Some small amount of zooming and panning would probably be useful here. Once the selection is made the hotkey is released and the screen returns to 2D – but the software must not lose the selection that has been made (you wouldn’t believe how easy that is nor how often it happens).

This could be very simplistic and stack everything in the order that the objects appear in the source file. Or we could get really clever and show the stack according to any CSS rules to ensure that it is ordered in the same way that it would be rendered by a browser.
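In a web context you could approximate this today with CSS 3D transforms. A rough sketch – the spacing and the depth-by-nesting rule are my own choices, implementing the simplistic source-order option rather than the CSS-aware one – tilts the layout and pushes each element back along Z by its nesting depth, so the stack fans out and any layer can be picked directly.

```typescript
// A rough sketch of the fan-out (the 25px spacing and the depth-by-nesting
// rule are my own choices): tilt the layout and push each element along Z by
// its nesting depth, so overlapping objects separate on screen and any layer
// can be clicked individually.

function nestingDepth(el: Element): number {
  let depth = 0;
  for (let p = el.parentElement; p !== null; p = p.parentElement) depth++;
  return depth;
}

function enterStackView(root: HTMLElement, spacing = 25) {
  const base = nestingDepth(root);
  root.style.perspective = "1200px";
  root.style.transformStyle = "preserve-3d";
  root.style.transform = "rotateX(45deg) rotateZ(-10deg)"; // the "spin" about the notional axes
  for (const el of Array.from(root.querySelectorAll<HTMLElement>("*"))) {
    el.style.transformStyle = "preserve-3d";
    // Deeper-nested (later-painted) elements come towards the viewer.
    el.style.transform = `translateZ(${(nestingDepth(el) - base) * spacing}px)`;
  }
}

function exitStackView(root: HTMLElement) {
  // Flatten back to 2D on hotkey release - but leave the current selection alone.
  for (const el of [root, ...Array.from(root.querySelectorAll<HTMLElement>("*"))]) {
    el.style.transform = "";
    el.style.transformStyle = "";
    el.style.perspective = "";
  }
}
```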

So, what is selected here?

What is selected?

Wrong… it’s a div that is ABOVE the table!

The 3D view separating the objects

As far as Dreamweaver goes, I don’t think Adobe have bought a 3D company, so there is no chance of 3D from them yet. But perhaps Adobe will want to add 3D to Flash to make it compete with HTML5? Now if that happens then we could be in for some fun.

Or maybe someone could develop a plugin?

In the mean time, what are the other companies waiting for?

This is a Blong

July 11th, 2008

I’ve had a few comments from chums that this is very long for a blog. Well, it is long – but what precisely is a blog? As I understand it, a blog is a stream of consciousness that does not have to undergo the test of a publisher, only the reader. It has no recognised “correct” length. However, I do accept that people can get bored, and if you do while reading this, I apologise. Perhaps I should call this a “BLong” to indicate it is a Blog that’s Long? Or perhaps not… Whatever, please let me know what you think. I am very happy for this to be a living, organic text that evolves and – hopefully – improves.

I also want to point out that I may refer to individuals as “he”, “him”, “her”, “she”, whatever. I have no preference, mission or prejudice. If anyone is counting to see which I use more often then I suggest that they are the one with the hang-ups.

Real chapters are on their way.

A little bit of History

July 7th, 2008

User interfaces

I was programming before the IBM PC went on sale. But I have only ever worked on personal computers and microcontrollers. I bought my first computer in 1979. My Nascom 2 was built for me at Henry’s Radio on Tottenham Court Road in London by a man whose name I have shamefully forgotten. I have written software that has never made it out of my office and I have written software that is very public indeed.

I was first introduced to a computer at The Guinness Trust, somewhere in The City, when I was at prep school in London aged about 11 – I had little interest and no understanding. It wasn’t my favourite school outing (though I can’t remember what was my favourite school outing either).

I was next introduced to computers at school near Oxford where, in the head of maths’ office, there was a huge HP desktop calculator – I think it was an HP 9810A – it was the size of a large typewriter and had a tally roll printer. I would go in at lunch times, type in a tic tac toe program a sixth former had given me and play. It fascinated me simply because it was technical, and as a bit of a loner, its exclusivity appealed to me. But I felt no gravity.

I left school with one O level – and even that was a bit of a miracle. I had no interest in school whatsoever; it was, as far as I could see, a total waste of time. I went to work at Oxford airport as a trainee mechanic. Aircraft were fantastically cool. Eventually I decided to return to college to get more O levels – I had realised that being an aircraft mechanic all my life was not going to work. Then A levels beckoned, but not quite hard enough. Education was lost on me – I just couldn’t see the point. My mother suggested I did History of Art, but I wanted to follow my father and as far as I could divine he had done Zoology at Cambridge. But, even doing a subject that genuinely interested me, there was just too much other stuff to do to waste my time on education. A friend of mine, Stephen Conlon, had the coolest calculator imaginable, a Hewlett Packard HP-25. It worked, as they still do, in RPN – Reverse Polish Notation. As someone who had consistently and intentionally failed any kind of maths, it made a kind of sense to me. I would endlessly load a moonlander game into it and, although it was not real time, break into a sweat trying to get the thing down safely.

On my twenty-first birthday Stephen gave me a Sinclair Cambridge programmable calculator. I had been jealous of his HP-25 for years. The Sinclair was temperamental and not in the same class as the HP, and it had an equals button, so it was not RPN, but it was thrilling. It was also a bit of a pig to use: many functions had to be accessed by repeatedly pressing a button, and as soon as the buttons started to wear these operations were fraught. But this birthday present was the first tangible evidence that gravity was forming.

Later I went to Oxford Polytechnic to study building design for no reason except that I had sufficient qualifications to get in and had figured that zoology, where my interest did lie, was not going to offer me much of a future. My uncle sent me an HP-33E calculator from Chicago; I was aware even at the time that this was an important event. Like Stephen’s HP-25, the buttons were simply in a class of their own; as you pressed them down they clicked over centre. The thing exuded quality. In the second term we did an hour a week of computing, as it was assumed that we would probably bump into some sort of computer – probably in the accounts department – when we left.

It was as if I had walked from a dark room into one filled with brilliant light. Someone had turned both colour and gravity on and the pull was irresistible.

I changed course to computer studies and threw myself into the parts of the course that I deemed important. Everything else simply got ignored. Cobol, maths and ethics were for other people – ethics simply because most people seem to need to be taught it (including, as far as I could see, the lecturer), whereas for me it was already part of my foundation. The subjects seemed to be neatly divided in line with their teachers – Cobol was taught by a couple of dropkicks; maths – I don’t even remember. Fortran and assembler were taught by Geoff Tag, the second of three people who have totally changed my life by their input. Most people didn’t seem to get his classes; they were on the course to become Cobol programmers at a bank or something. My time in Mr Tag’s classes was the high point – what that course was all about, for me. So while about a third of the course drove me, the majority just didn’t get looked at. We also skirted around a bit of hardware with another lecturer whose name I have rudely forgotten. My second-year project was to have been to write an operating system for my Nascom, interfacing it to a solenoid-controlled Studer tape deck – which one of the lecturers had pulled out of a skip – for the file system. While I found it really interesting, I was not sufficiently hooked to ever learn how to design hardware, and I regret that.

I failed the course.

Comprehensively and miserably.

And I regret that too.

So while all the other students hit the second year and went on to get their qualifications I went home and I programmed, some Basic and a lot of assembler. I spent a year programming for myself before I found a job. I managed to do a little programming there, and I would sit at my desk and laugh uncontrollably at the idea that someone could pay me to have so much fun. A year later I changed job and was programming full time. There I met Keith Frewin, the third of the three people to whom I owe such a very large debt of gratitude.

One day a box arrived from a contact in America and out of it fell a strangely shaped computer with a small built-in screen. We found a 110V converter and turned it on. With a strange synthesised sort of trumpet parp a smiley face appeared and we sat there pole-axed. Hornswoggled. There are a few times in our lives when we can look back at an event and think, “oh yeah, that’s when everything changed”, but we rarely get the insight “strewth, this is where everything changes – right now”. The latter was the feeling I got when I first looked at a Mac. Now, I don’t really stand anywhere on the Mac/PC debate – I actually think that while they both have the capability of being great, both of them can be pretty mediocre. Whose fault is that?

I suspect I just lost 9/10ths of the audience, but hopefully that means I have only the quality readers left and that’s good, because I’m not interested in talking to people who know so much already that they cannot learn any more. But what’s bad about it is that those that just left are the ones who really need to learn more.

Being partisan is an excuse for getting into a fight. Whether it’s about which PC camp you belong to or which version of creation you believe in (and I believe in my version passionately), I cannot see how life can be improved by getting into a knife fight over which football team is the best. It’s just not important. Really. Communicating is important, and giving someone a good time is important, particularly if they are doing something they have to do rather than whatever it may be that they want to do. I believe in many things and, because of who I am and the insecurities I feel about much of my life, I can get very passionate about arguing for them, but I will not trash or flame someone simply because I think they are wrong – and there are many, many wrong people when it comes to user interfaces.

Eventually I found myself working on a project for a bank (not using Cobol – yay!) where there were a large number of very technical tasks to do, plus the user interface. Well, that’s how they saw it anyway. No one wanted the user interface as a task because no one wanted to have anything to do with the users, who were basically idiots because they weren’t programmers. It is an unfortunate fact that I have spent so much of my working and nonworking life with two groups of similarly aggressive people: programmers and guitarists. Now don’t get me wrong here – actually most of the programmers that I have worked with have been really good people – but having read many blogs and forums over the years, I don’t think I like programmers very much (there is one shining exception to this – Joel Spolsky). I have spent many years playing bass and singing in bands but I generally don’t like guitarists, although there are exceptions to this too. There is a joke that works just as well for programmers and guitarists:

Q/ How many [programmers|guitarists] does it take to change a lightbulb?
A/ 100. One to change the bulb and 99 to tell you how much better they would have done it.

So the user interface task fell to me because I didn’t take a step back fast enough and because I was the least confident member of the team and I saw the task as a refuge where no one would poke me with a sharp stick – in case they had to take some of the UI work on. And as I worked on it the brilliant light became brighter and someone turned the gravity up even more.

When that job finished I decided I’d had enough of working for other people, and I’d had an idea for a product of my own. I wanted to make a device that controlled several synthesizers all at the same time using only feet. The control mechanism was already invented – MIDI, the Musical Instrument Digital Interface – so that part was easy: all I had to do was follow the rules. The activation method was already invented too: most musicians have at least one foot. The bit that I wanted to think about was how people would program the thing. People, as in human beings, not programmers. There was a big thing at the time (late 1990) about how musicians hated computers – computer sequencers were available and pretty ugly. Only geeks were into them. I wanted to make something that was good for everyone. So the first design decision was that it needed a huge screen. The unit was going to be built into a 1U-high 19” rack, so a huge screen was a two-line, 40-character LCD. 80 characters in all. No graphics. This component was by far and away the most expensive part of the product and the only part that I never agonised over. Nowadays of course this would be laughably small, but for exactly the same reason – now people have an expectation of a graphical interface with menus and icons and help. There were to be no freaky numbers, there must be help, everything had to be obvious from what was currently on the display.

In a review in MT, the Music Technology magazine in August 1993, Ian Waugh said of the result “…the controlling front end is superb; the designer is obviously a member of the Musicians School of Friendly Interfaces.” I was pretty chuffed.

After 12 years on my own I went back to having a proper job, at a company that made big test equipment. They felt that their current range was coming to the end of its life and a radical new product was required. The initial plan was to use the same UI that they had developed for the current product as it meant (quite reasonably) that the work had already been done. Actually I believe that’s being charitable – I don’t think they thought about the UI for one second. It just wasn’t important enough to warrant thinking about. Although my primary responsibility was low-level control of the hardware devices, once again I slipped the user interface into my task list. They were not happy, but heck, I was used to that. Few bosses ever want expensive developer effort wasted on the user interface – besides, this was a technical product for technical people; there wouldn’t be any users, only technicians. So I let people complain but left my artwork or workflow in. After a week or so people would accept that that was how that feature worked. As a late entrant into that particular market the company had some catching up to do – but we did it. And a lot of customers specifically commented on the ease of use, which obviously pleased me. But not my boss. He felt that the unit was inherently easy to use purely by dint of its existence. After a few versions had gone by, the pressure on me to give up my work on the user interface was becoming pretty intense. I was doing (and had done) all the artwork for all the programmers at home in my own time, but there was a lot of programming to do and the whole thing was getting more complex, trying to squeeze more performance out of cheaper hardware. Eventually I ran out of time and, under almost constant pressure to stop, I dropped the user interface work. The next version was met by a pretty universal wailing and gnashing of teeth from our customers. There were several reasons for this, but one of them was stated loud and clear by several customers and by potential customers – the software was not easy to use.

We got talked to by the management – what had gone wrong? Why weren’t the customers – and potential customers – happy any more? I pointed out that it was because the system was not easy to use – it was just a mashup of different functions, all operating differently; there was no consistency across the functionality written by different programmers. My boss said no, that cannot be it; our customers are saying that we have the easiest-to-use system on the market. Except of course that they weren’t saying that any more. They were bizarre conversations and I would like to be able to go back and listen to them again to see if I remember them correctly. This mindless stonewalling went on for a while, so eventually I asked him why he thought Microsoft and Apple had rooms full of artists and psychologists – and he replied “because they can afford to”. He was completely unable to take the next step and ask himself how they can afford to. People buy their software because, despite all the whinging, it’s actually not that bad and it always looks good. You get a good feeling from looking at it. Our relationship disintegrated and a while later, along with others, I was made redundant. That was three years ago. I was in their R&D department a few months ago and saw that not much has changed – bigger screens, more functionality, but still a bit of a mess really.

Now I do feel that if I had been allowed to continue with my work on the user interface they could have had a better product. I feel that if everyone spent more time, effort and money on their user interfaces they could have better products. If you spend more time on your user interface you will have a better product. What are you waiting for? How will you know if your user interface is better than it was? Well, how will you know if it isn’t?

Humans (well, most humans) enjoy humour. Humour survived because it was funny. Films can be serious or they can be funny. Plays, books, stories, cartoons, pictures, everything can be serious or funny (or, of course, they can be boring or ugly). But we choose not to waste our time in front of boring stuff. We choose not to take everything seriously. Most of us choose “enjoyable” over “difficult” when we can.

Why can’t we inject stuff into our user interfaces that makes a small flicker of a smile crawl across a user’s face? I don’t mean jokey manuals (actually I really, really don’t mean jokey manuals – they’re awful) but something so that a user can think “oh – they enjoyed writing that bit of code, I can tell”. Why not? One of the first examples I can think of was on the original Mac: if you disabled the sound interface, where the buttons had been there was a hole, with mounting-screw fixings and a couple of disconnected wires dangling. Some people would see that and think, “oh – that doesn’t work at the moment” and not give it another second’s thought (which is fine); some, like I did, would laugh out loud. But I can’t think of many who would think “well that’s a stupid waste of time”, can you? My boss did at the test equipment company when he asked me to remove an indicator that didn’t need removing. So I replaced it with a hole with mounting-screw fittings and a couple of disconnected wires dangling… <if anyone has an image of the original disabled sound control panel I’d be very grateful for a copy>

When the marketing department of a company creates a brochure they try to show the product in the best possible light, often using humour to catch the eye of the audience. Actually, these days they tend to use good-looking people, usually drinking coffee – I’m not quite sure what they are trying to express here; perhaps that if you use our software you will have more time to drink coffee, and either our product or the coffee will make you and your workmates better looking. Or, if you are a programmer and therefore already beautiful, it will make you worse looking. I don’t know, but personally I prefer a screenshot of the product. I know what I look like drinking coffee and I can assure you that it is pretty unlikely that you will ever see me drinking anything in an advert. But I digress. Surely if we are admitting that humour and good looks shift product, why don’t we make the product humorous and good looking? Doh!

Because it takes time and money.

But so did the marketing.

I suspect that some part of the smiling people drinking coffee is about familiarity. And familiarity is absolutely fundamental in user interfaces. Absolutely fundamental. How many of you remember one of the main thrusts of the Windows 1 advertising, which did actually follow all the way through (I think) to Windows 3? It was all about how, if you could use one program in Windows, you could use them all. Every program would use the same set of simple, obvious controls in the same way. Of course Microsoft pretty quickly allowed everyone to break that by letting programmers subclass the controls and stick their own graphics onto the functionality. But fundamentally they were still sliders, progress bars, check boxes and radio buttons; they just didn’t always look like sliders, progress bars, check boxes and radio buttons. But this did allow a terribly important change to occur. The concept of the mimic panel is about making a user interface that looks like the real-world environment it is replacing. If you were to replace a nuclear power station control panel with a personal computer, it would make the transfer easier if the user interface exactly replicated the hardware – the engineers would immediately know where to find every switch, dial and panic button. You’d barely need a manual. And if the UI controls were overlaid with photographs of the actual controls, surely that would enhance the users’ experience even further. If anyone out there has used a flight sim over the last few years, take a look at how the graphics have matured. If you haven’t, then here are some screenshots of Microsoft’s flight simulator over the years:

The Cessna 172 instrument panel from Microsoft Flight Simulator II

The second version of Microsoft Flight Simulator. In many ways it was a miracle – we could fly any time we wanted and at the time it felt like we were in the cockpit.

The Cessna 172 instrument panel from Microsoft Flight Simulator 5.1

The fifth version of Microsoft Flight Simulator. From here it looked like things just couldn’t get better.

The Cessna 172 instrument panel from Microsoft Flight Simulator X

But they did. The tenth version of Microsoft Flight Simulator.

All of this effort was for fun. Even if the product was being used for serious purposes – real pilot training – the sense of immersion was important. It was never dull or boring.

Oh, sorry, but that’s a game right? So it doesn’t count.

Gimme a break.

Again I ask – why can’t software be humorous and good looking?

Good user interface design is neither impossible nor even hard, but it can involve the unexpected. It doesn’t even have to have gorgeous and expensive graphics. There are a few things to learn, a lot of things to realise, and a great deal of fun to be had. Give it a go. Spend some time making your software better than everyone else’s and, if we all do that, then we will have raised the bar and made software a better place to go.

And after all this, what of my decision not to do History of Art? Well, I think I have discovered that I am much more artistically minded than I realised. I still could not tell you the difference between a good piece of art and a bad one, nor could I draw you anything – but I have tools for that. I think now that perhaps I should not have been a programmer; I am not confident enough to compete aggressively, nor to believe that I am right and the compiler/operating system/API/library/whatever is buggy – I have wasted too many hours of my life in the mistaken belief that a bug must be mine. So now I work with the artistic side of my brain and I feel a lot more confident about what I think. Oh yes – now I know I’m right!

I was having dinner a few years ago and one of the party was the Financial Director of a very, very large international corporation. He asked me what I did for work and I explained my interest in user interfaces in a way I hoped he would understand. His response was “So, you’re a programmer – but on our side?” Once again, I was pretty chuffed.

I’ve told you why I’m so passionate about this, and how I came to be – next I’ll try to tell you some of the things you can do to your systems to make more sales.

See you then.