"Most software," argues interface design guru, Alan Cooper, "deserves to be spanked". Those of us who use software in our day-to-day jobs would agree. Most software is too difficult to be used by the people who have to use it.
We've all experienced the frustration of software that's made us feel stupid, and we've all lost countless hours grappling with an application that seems to have a mind of its own (or no mind at all). So we are not surprised to learn that poorly designed software can cost a large corporation tens, if not hundreds, of millions of dollars every year in lost productivity, employee training, and technical support. Companies that develop software incur costs of the same magnitude as they attempt to shore up difficult-to-use applications with help systems, user manuals, online tutorials, and technical support departments.
In fact, the true scale of the problem may well be understated, because companies don't always attribute an end user's difficulties to failings in the software. Often they assume that the end user is the problem. For example, it is not uncommon for a Help Desk ticket to be logged in a report as "PICNIC": problem in chair, not in computer. It is a common enough diagnosis to have spawned its own line of T-shirts and mugs. But the root of the problem is almost never the end user, who admittedly may not be a computer expert but who may, instead, be a medical expert, a legal expert, a financial expert, an administrative expert, or some other person who is indeed an expert at their own job, and who is simply trying to use an application as a tool to get that job done.
The PICNIC is over
So why do companies buy difficult-to-use software applications in the first place? Why don't they buy easy-to-learn and easy-to-use applications instead? What are they thinking?
Well, they are thinking more or less what you and I think when we buy certain products (an alarm clock is a good example): there is no real way of knowing whether it is the right product until after you have bought it and lived with it for a while. "We often can't discover the inefficiencies of a software product until we bring it in-house," says Doug Francisco, Director of IS architecture at Boeing. For companies like Boeing, with thousands of end users, that means design flaws can cost millions of dollars in lost time and productivity, and can trigger further losses if a system then has to be replaced and relearned. Alas, software doesn't typically come labelled with an ease-of-use score, like the calorie count on a jar of peanut butter, or the tog rating on a duvet.
But it could. Ease of use is rapidly rising to the top of the list of purchasing requirements that companies consider when investing in software applications (and other technology products). We now have a way for IT managers and corporate purchasers to know just how usable an application is before they commit to buying it. Enter CIF.
What is CIF?
CIF is a Common Industry Format for usability reporting. It became an ANSI standard (ANSI/NCITS 345-2001) in 2001 and an ISO standard (ISO/IEC 25062:2006) in 2006. CIF describes a method for reporting the findings of usability tests that collect quantitative measurements of user performance. CIF does not describe how to carry out a usability test, but it does require that the test include measurements of the application's effectiveness and efficiency as well as a measure of the users' satisfaction. These are the three elements that define the concept of usability (from ISO 9241-11).
CIF is aimed at usability professionals, at stakeholders in an organisation who have to decide whether a software product is ready for release, and at people whose responsibility it is to make informed decisions about which software tools to purchase for their organisation. To comply with the CIF standard, a usability report must include the following information:
- A description of the product.
- The goals of the test.
- The test participants.
- The tasks the users were asked to perform.
- The experimental design of the test.
- The method or process by which the test was conducted.
- The usability measures and data collection methods.
- The numerical results.
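The required sections above can be sketched as a simple data structure. This is a hypothetical illustration only: the field names and example values are mine, not part of the ISO/IEC 25062:2006 standard, which specifies report content in prose rather than code.

```python
from dataclasses import dataclass

@dataclass
class CIFReport:
    """Illustrative sketch of the sections a CIF-compliant usability
    report must contain. Field names are the author's, not the standard's."""
    product_description: str   # what was tested
    test_goals: str            # why the test was run
    participants: list         # who took part
    tasks: list                # what users were asked to do
    experimental_design: str   # e.g. within- or between-subjects
    method: str                # how the test was conducted
    measures: dict             # usability measures and data-collection methods
    results: dict              # the numerical results

# Hypothetical example of a populated report skeleton:
report = CIFReport(
    product_description="Invoice-entry module, v2.1",
    test_goals="Assess readiness for release",
    participants=["8 accounts-payable clerks"],
    tasks=["Enter a supplier invoice", "Correct a rejected invoice"],
    experimental_design="Within-subjects, counterbalanced task order",
    method="Moderated lab test with timed tasks",
    measures={"effectiveness": "task completion rate",
              "efficiency": "time on task",
              "satisfaction": "post-test questionnaire score"},
    results={"completion_rate": 0.75, "mean_time_s": 184},
)
print(sorted(report.measures))
```

Note that the `measures` field covers all three elements of the ISO 9241-11 definition of usability; a report that omits any one of them would not be CIF-compliant.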
Does CIF have industry backing? It certainly does, and usability-savvy companies are already saving millions of dollars. In one pilot test of CIF, Boeing reported a $45 million cost benefit over the total ownership of a widely used productivity application. In another example, State Farm avoided paying $5 million for a system, and avoided the further costs of supporting it, when testing showed that none of the test users could complete their tasks. Beyond these companies in the vanguard, about 150 others were involved in CIF's development and testing, among them Fidelity Investments, Hewlett-Packard, Intel, Kodak, Oracle, Microsoft, Northwestern Mutual, PeopleSoft, and Sun Microsystems, to name just a few.
The power of usability metrics
Usability is often thought of as a "soft" product attribute — something intangible and difficult to pin down. In the past, usability was often assessed by asking people if they liked a product, or if they felt it was easy to use. Those days are over. It's time to get real. The three critical elements of usability — effectiveness, efficiency, and satisfaction — can be operationalised and they can be measured, just like any other attribute of a product or system.
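As a concrete sketch of what "operationalised" means here, the three elements can each be reduced to a number computed from raw test observations. The formulas below are common conventions in usability practice (completion rate, mean time on task, mean questionnaire score), assumed for illustration; CIF mandates that the measures be reported, not how they are computed.

```python
# Hypothetical sketch: turning the three ISO 9241-11 usability elements
# into numbers. Formulas are common conventions, not mandated by CIF.

def effectiveness(completed: int, attempted: int) -> float:
    """Task completion rate: share of attempted tasks completed successfully."""
    return completed / attempted

def efficiency(times_s: list) -> float:
    """Mean time on task, in seconds, over successfully completed tasks."""
    return sum(times_s) / len(times_s)

def satisfaction(scores: list) -> float:
    """Mean post-test questionnaire score (e.g. on a 0-100 scale)."""
    return sum(scores) / len(scores)

print(effectiveness(6, 8))               # 0.75
print(efficiency([120.0, 180.0, 150.0])) # 150.0
print(satisfaction([72.5, 80.0, 65.0]))  # 72.5
```

Once the measures are numbers like these, they can be tracked across releases and compared across products, which is exactly what the questions below require.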
The ability to measure usability changes the game for companies. The astronomer Carl Sagan had this to say about quantitative data:
“If you know a thing only qualitatively, you know it no more than vaguely. If you know it quantitatively — grasping some numerical measure that distinguishes it from an infinite number of other possibilities — you are beginning to know it deeply. You comprehend some of its beauty and you gain access to its power and the understanding it provides.”
In addition to "How usable is this product?", here are some more questions that can only be answered by quantifying usability:
- Are our products more usable than the competition?
- Is our software usability improving over time?
- Is the usability of products within a brand consistent?
- Is the new version easier to use than the previous version?
- Have we met our usability targets?
- Can we exit the design cycle yet?
- Is the product ready for release?
- What must we improve to "move the needle" on the usability meter?
- Which of these three competing software applications should I buy for my company?
- Can usability predict our traditional success indicators?
- Is our investment in usability making a difference?
These are vital questions. We don't need to rely on guesswork, hunches, or personal opinions to answer them; we can use numerical data derived from observing real users carrying out real tasks with the target product. And CIF provides the framework to ensure the data are presented in a standardized way, and provides the information stakeholders will need to understand how usability was measured so that the data can be verified.
What you should do
How can you make CIF and usability metrics work for your organisation? Here are some important leadership steps you can take:
If you are responsible for purchasing software for your company
- Add easy-to-use to your list of purchasing criteria.
- Demand that your software suppliers provide a CIF-compliant usability report.
- Decide what level of usability you will accept: the higher the usability score, the lower your support costs will be.
- If your supplier cannot provide usability metrics, have your in-house usability team, or your external usability supplier, test the software and its competitors before you buy it.
If you are responsible for developing software
- Make sure you have usability experts on your team.
- Establish user-centric performance indicators early in the project.
- Introduce usability toll gates to your development process.
- Talk to your marketing experts and let them know you will be providing hard usability metrics that can be used to promote the application.
If you are responsible for marketing your company's software
- Demand that your development team provides you with bullet-proof usability metrics that you can use to make reliable and verifiable claims about how easy your applications are to use.
- Put the weight of your marketing organisation behind a push to invest in usability for your company.
- Tell your customers that you are about to begin providing them with a reliable answer to the question they keep asking: "How usable is this application?"
If you are responsible for usability in your company
- Start measuring.
- Measure the usability of your current products and those of your competitors, so that your stakeholders can see how your company compares.
- Let your marketing and development teams know that you intend to start measuring usability, and work with them to set targets.
- Kick off an internal program to educate stakeholders about CIF and usability metrics.
- Get usability into the front line, and show how its impact can be quantified.
The introduction of usability metrics and CIF-compliant reporting will give your company a new vocabulary with which to discuss your products. Promote that language. Host a "Usability Day" within your company to raise awareness of CIF and to launch a new metrics-driven approach to design.
The more that purchasers demand CIF-compliant reports, the more software vendors will have to provide reliable usability measurements. And that will force usability into the design cycle. It will trigger an upward spiral that can only result in easier-to-use software applications. And that is good news for everyone.
About the author
Dr. Philip Hodgson (@bpusability on Twitter) holds a B.Sc., M.A., and Ph.D. in Experimental Psychology. He has over twenty years of experience as a researcher, consultant, and trainer in usability, user experience, human factors and experimental psychology. His work has influenced product and system design in the consumer, telecoms, manufacturing, packaging, public safety, web and medical domains for the North American, European, and Asian markets.