Numbers for numbers' sake?
When I was about 10 I used to sit by the side of the road with a small notebook and pencil and write down the license plate numbers of passing cars. I can still remember some of them: SVY673 and XDN210 spring readily to mind, and MDT20 was a particular favourite. Usually as I was writing down one number, ten more would whizz by unnoticed, but I didn't care. It was a good way of passing time and it felt important. It was also perfectly pointless.
I didn't realise it at the time, but the same approach to data collection exists in the adult world, too. Soon after a company discovers the value of usability, it wants to start measuring it. Over time, UX teams gather a sizeable body of metrics and begin to feel important — but these measurements are rarely, if ever, used for anything related to the business and they seldom seem to surface in design meetings or drive development decisions.
Has the exercise of writing down usability numbers become perfectly pointless too?
How to link stats to business metrics
Let's step back for a moment and look at why most organisations measure stuff.
C-level managers use numbers because they have predictive value. For example, I may notice that when the number of people entering my store increases (at Christmas, say), so does the number of sales I make. So if I can increase the number of shoppers at other times of the year — for example, by advertising — I would expect that to lead to an increase in sales and profit. "Footfall" becomes a leading indicator that I can use to predict profit.
Most companies have no shortage of metrics relating to the success and failure of their products. They have sales figures, service incident rates, customer loyalty indicators, product return rates, customer support call-centre volumes, ‘customer instruct’ rates and so on.
These are useful metrics, but they all suffer from the same problem: they are so-called ‘lagging’ indicators. You can obtain these metrics only after you have launched the product — and sometimes long after the launch date. This is fine if the figures are strong, but if your product is failing, this is too late to find out about it. All you can focus on then is costly damage limitation via retroactive fixes, additional call-centre agents, and product replacements or, in some cases, a product recall.
What businesses need is a ‘leading’ indicator: a metric that can predict product success or failure before the product has been released. That's where usability measurements come in.
Step 1: Measure usability
The international usability standard, ISO 9241, contains a definition of usability that we can operationalise and measure. I've written about this elsewhere, but briefly we need to measure the effectiveness and efficiency of a system (both can be measured objectively) and include a measure of user satisfaction (a subjective measure).
You need to measure these three components of usability for each red route; the results can then be combined across tasks to give overall measures of effectiveness, efficiency and satisfaction. Finally, you can aggregate the three measures into a single usability metric.
When arriving at your single usability metric, you may want to weight one of your usability measures more highly than the others. For example, for a museum kiosk, effectiveness might be the most important of the three measures. For an intranet, efficiency might be the most important measure. And for a game on an iPhone you might want to make satisfaction the most important measure. The point is that although you need to collect all three measures to get a fully rounded picture of usability, it's OK to prioritise one of the measures over the others in coming up with your single usability metric.
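As a concrete sketch, the aggregation described above might look like this in Python. The task scores and weights are invented for illustration; real values would come from your own usability tests, and the choice of weights is exactly the prioritisation decision discussed above:

```python
# A minimal sketch of combining per-task usability measures into a
# single usability metric. All scores and weights below are
# illustrative assumptions, not figures from the article.

def single_usability_metric(tasks, weights=(1/3, 1/3, 1/3)):
    """tasks: list of (effectiveness, efficiency, satisfaction) tuples,
    one per red-route task, each score expressed as a percentage (0-100).
    weights: relative importance of the three measures; should sum to 1."""
    n = len(tasks)
    # Average each measure across all red-route tasks.
    effectiveness = sum(t[0] for t in tasks) / n
    efficiency = sum(t[1] for t in tasks) / n
    satisfaction = sum(t[2] for t in tasks) / n
    # Weighted combination into one number.
    w_eff, w_effy, w_sat = weights
    return w_eff * effectiveness + w_effy * efficiency + w_sat * satisfaction

# Example: three red-route tasks, with satisfaction weighted most
# heavily (as you might for a game on an iPhone).
tasks = [(90, 70, 80), (60, 55, 75), (85, 65, 90)]
print(single_usability_metric(tasks, weights=(0.25, 0.25, 0.5)))
```

The equal-weights default gives a plain average of the three measures; shifting weight towards one measure is how the museum-kiosk versus intranet versus iPhone-game prioritisation would be expressed.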
Step 2: Correlate your UX metric with business metrics
In this step, you need to work out the predictive value of your metric. There's a slow way of doing this and a fast way.
The slow way is to gradually build up a database of usability metrics for your products, wait for them to be released and then afterwards examine the business metrics associated with the product, like return rate and calls to customer support. Depending on your product, this might take months at best and could even take years.
The quicker way is to run usability tests on products that are already in the marketplace. I'd suggest picking three products: one with (say) a high volume of calls to customer support; one with a ‘typical’ volume of calls; and one with a lower-than-average volume of calls. Run a usability test of each product with about 20 participants and calculate the single usability metric for each one. You now have three data points you can use to predict call volume from a usability metric.
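To make the correlation step concrete, here is a sketch of fitting a straight line through those three data points with ordinary least squares and then reading off a predicted call volume for a new usability score. All figures are invented for illustration; with only three points the fit is rough, and more products will sharpen it:

```python
# Fit a straight line through (usability metric, call volume) data
# points with ordinary least squares, then use it to predict call
# volume from a usability score. All figures are invented.

def fit_line(points):
    """Return (slope, intercept) of the least-squares line y = slope*x + intercept."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in points)
             / sum((x - mean_x) ** 2 for x, _ in points))
    return slope, mean_y - slope * mean_x

# (single usability metric %, monthly support calls per 1,000 units sold)
data = [(45, 120), (62, 70), (80, 30)]
slope, intercept = fit_line(data)

# Predict support-call volume for a new product scoring 53% on usability.
predicted_calls = slope * 53 + intercept
print(round(predicted_calls))
```

A negative slope here is what you would hope to see: as the usability metric rises, support calls fall. This fitted line is exactly the kind of evidence behind the business-meeting statement in step 3.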
Step 3: Start predicting product successes and failures
As your body of data increases you will be able to correlate usability metrics with business metrics and start to make predictions about the likely success of a product. You will be able to stand up in a business meeting and say things like this: "Our new product has a usability metric of 53%. If we launch it now we can expect to see a customer satisfaction rate of only 40% and a return rate of 25%. In addition, the customer support agents are likely to see a 20% increase in call volume. Are we sure we want to risk this?"
User experience is about more than numbers
Paul Brodeur has written, "Statistics are human beings with the tears wiped off". The same could be said for usability measurements. I'd be the first to confess that these numbers capture only one aspect of the user experience — but it's an aspect that we can use to provide real business value and to ensure that user experience has a voice at the business table. It's not only about numbers, but numbers are certainly part of what we do. And if we're going to collect numbers, we owe it to ourselves to do something useful with them, and not simply write down the license plates of passing cars.
Sign up for our newsletter to hear about future articles where we'll show you how to use usability metrics to diagnose problems with your product or interface.
Thanks to Beth Maddix, Lynne Tan and David Travis for their comments on this article.
About the author
Dr. Philip Hodgson (@bpusability on Twitter) holds a B.Sc., M.A., and Ph.D. in Experimental Psychology. He has over twenty years of experience as a researcher, consultant, and trainer in usability, user experience, human factors and experimental psychology. His work has influenced product and system design in the consumer, telecoms, manufacturing, packaging, public safety, web and medical domains for the North American, European, and Asian markets.