The concept of heuristics has a long history, spanning philosophy, law, psychology, and human-computer interaction, among other fields. Wikipedia's discussion of the topic includes links to numerous well-known psychological heuristics, such as anchoring and adjustment, availability, and the peak-end rule.

Heuristics in Human-Computer Interaction

In the field of human-computer interaction (HCI), one of the most popular inspection-based methods for evaluating usability is the Heuristic Evaluation (HE), as described originally by Nielsen and Molich and later refined by Nielsen. Promoted for its cost efficiency and ease of implementation, the HE method consists of one or more experienced evaluators (3-5 recommended) applying an established set of guidelines (or heuristics) as they review a given system. Also known as an 'inspection' or 'discount' method of usability evaluation, the Heuristic Evaluation is seen as an economical alternative to empirical usability tests involving actual users.
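The usual justification for recommending 3-5 evaluators is the problem-discovery curve commonly attributed to Nielsen and Landauer: each evaluator finds only a fraction of the problems in a system, and each additional evaluator finds a shrinking share of new ones. The short Python sketch below illustrates the shape of that curve; the 31% per-evaluator detection rate is a typical illustrative value, not a figure from this article.

    # Illustrative sketch of the problem-discovery curve often attributed to
    # Nielsen and Landauer. The detection rate is an assumed, typical value.
    def share_of_problems_found(evaluators, detection_rate=0.31):
        """Expected share of usability problems found by n independent evaluators."""
        return 1 - (1 - detection_rate) ** evaluators

    for n in range(1, 6):
        print(f"{n} evaluator(s): ~{share_of_problems_found(n):.0%} of problems")

With a detection rate of around 0.3, three evaluators already uncover roughly two-thirds of the problems and five uncover around 84%, which is why adding further evaluators quickly yields diminishing returns.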

In a “true” HE, the group of evaluators apply Nielsen's ten heuristics during their independent review of a system. The evaluators systematically explore the product or system, considering each heuristic in turn in an effort to identify potential usability issues.
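By way of a concrete (and entirely hypothetical) illustration, the sketch below shows one way the independent reviews in a "true" HE might be recorded against Nielsen's ten heuristics and merged only afterwards, so that agreement between evaluators becomes visible without the reviews influencing one another. The data and structure are my own example, not part of the method as Nielsen describes it.

    # A minimal, hypothetical sketch: evaluators log findings independently
    # against Nielsen's ten heuristics; the reports are merged only afterwards.
    NIELSEN_HEURISTICS = [
        "Visibility of system status",
        "Match between system and the real world",
        "User control and freedom",
        "Consistency and standards",
        "Error prevention",
        "Recognition rather than recall",
        "Flexibility and efficiency of use",
        "Aesthetic and minimalist design",
        "Help users recognize, diagnose, and recover from errors",
        "Help and documentation",
    ]

    def merge_findings(evaluator_reports):
        """Combine independent reports into unique problems, counting how many
        evaluators identified each one."""
        merged = {}
        for report in evaluator_reports:
            for heuristic, problem in report:
                assert heuristic in NIELSEN_HEURISTICS, f"Unknown heuristic: {heuristic}"
                merged[(heuristic, problem)] = merged.get((heuristic, problem), 0) + 1
        return merged

    # Example: two evaluators, one overlapping finding.
    reports = [
        [("Visibility of system status", "No progress indicator during upload")],
        [("Visibility of system status", "No progress indicator during upload"),
         ("Error prevention", "Delete has no confirmation step")],
    ]
    for (heuristic, problem), count in merge_findings(reports).items():
        print(f"[{heuristic}] {problem} (found by {count} evaluator(s))")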

The Value of Heuristic Evaluations

Heuristic evaluations are arguably most valuable as an early evaluation technique, performed on an existing system or first prototype in an effort to identify the major usability problems. Conducting a heuristic evaluation early on accomplishes two things:

  • The major 'showstopper' problems are identified without incurring the expense, time, and user frustration that would accompany formal usability tests
  • Anticipated problem areas are identified for later evaluation when empirical sessions with users are conducted

When a project's circumstances and constraints warrant it, heuristic evaluations have been shown by many researchers to offer designers and evaluators a valuable alternative to user testing.

The Challenges of Heuristic Evaluations

In my experience, true Heuristic Evaluations are rarely performed in the manner that Nielsen and Molich originally intended. Rather, they often amount to a single evaluator's 'expert review' of a system, and are consequently open to criticism about the reliability of their findings and recommendations.

Even when multiple evaluators are used in a Heuristic Evaluation, it often proves difficult for them to interpret the guidelines consistently and to identify the same usability issues. Another common criticism of HEs is that they can generate many 'false positives': usability problems that would likely go unobserved in an empirical usability study, but that are identified by evaluators whose sensitivity to relatively minor problems is heightened by the guidelines.

Modifying Heuristics to Meet New Needs

It has now been over 15 years since the Heuristic Evaluation method was first developed, and while it continues to be regarded as something of a standard in the HCI industry, many evaluators have found that Nielsen's original list does not always meet their specific needs. They frequently require alternative guidelines, or some re-interpretation of Nielsen's original descriptions, for each heuristic to make sense in their context. As a result, several modifications of Nielsen's heuristics have been developed over the years in an effort to improve their interpretation, reliability, and 'goodness of fit' for new technologies and industries.

So what's next for the Heuristic Evaluation method? Here are my predictions:

  • The HE method will continue to be regarded as a core method in the HCI toolbox.
  • It will continue to be practiced largely in an ad-hoc, customized manner that only loosely resembles its original form, typically as an informal review by an individual evaluator. This will hamper the method's further development and reduce both its perceived and actual value.
  • As new technologies, products, and interactions are developed, new heuristics will be developed in concert.

Reading list

If you are interested in reading more about Heuristic Evaluation, you may find this reading list useful.

Original paper on heuristic evaluation
Nielsen, J., & Molich, R. (1990). Heuristic evaluation of user interfaces. Paper presented at the ACM CHI'90 Conference on Human Factors in Computing Systems, Seattle, WA.
Refinements on heuristic method
Nielsen, J. (1994). Heuristic evaluation. In J. Nielsen & R. L. Mack (Eds.), Usability Inspection Methods. New York: John Wiley & Sons, Inc.
Nielsen's heuristics
Nielsen, J. Ten usability heuristics.
Pierotti, D. Heuristic evaluation - A system checklist.
Support for inspection-based methods and their comparison results
Fu, L., Salvendy, G., & Turley, L. (2002). "Effectiveness of user testing and heuristic evaluation as a function of performance classification". Behaviour & Information Technology, 21(2), 137-143.
Kantner, L., & Rosenbaum, S. (1997) "Usability Studies of WWW Sites: Heuristic Evaluation vs. Laboratory Testing". In the Proceedings of the 1997 Association for Computing Machinery's Special Interest Group on Documentation (ACM SIGDOC).
Nielsen, J. (1992). "Finding usability problems through heuristic evaluation". Paper presented at the ACM CHI'92 Conference on Human Factors in Computing Systems, Monterey, CA.
Nielsen, J., & Phillips, V. L. (1993). "Estimating the relative usability of two interfaces: Heuristic, formal, and empirical methods compared". Paper presented at the ACM INTERCHI'93, New York.
Sawyer, P., Flanders, A., & Wixon, D. (1996). "Making a difference: The impact of inspections". Paper presented at the ACM CHI 1996 Conference on Human Factors in Computing Systems, Vancouver, BC, Canada.
Challenges to inspection-based methods and their comparisons
Cockton, G., & Woolrych, A. (2002). "Sale must end: Should discount methods be cleared off HCI's shelves?" Interactions, 9.5 (September & October), 13-18.
Gray, W. D., & Salzman, M. C. (1998). "Damaged merchandise? A review of experiments that compare usability evaluation methods". Human-Computer Interaction, 13, 203-261.
Jeffries, R., Miller, J. R., Wharton, C., & Uyeda, K. M. (1991). "User interface evaluation in the real world: A comparison of four techniques". Paper presented at the ACM CHI'91 Conference on Human Factors in Computing Systems, New Orleans, LA.
Karat, C. M. (1994). "A comparison of user interface evaluation methods". In J. Nielsen & R. L. Mack (Eds.), Usability Inspection Methods. New York: John Wiley & Sons, Inc.
Alternatives to Nielsen's Heuristics
Bastien, J.M.C. & Scapin, D.L. (1995) "Evaluating a user interface with ergonomic criteria". International Journal of Human-Computer Interaction, 7(2), 105-121.
Borges, J. A., Morales, I., & Rodriguez, N. J. (1996). "Guidelines for designing usable World Wide Web pages". Paper presented at the ACM CHI 1996 Conference on Human Factors in Computing Systems, Vancouver, BC, Canada.
Emmus (European Multimedia Usability Services) (1999). CELLO: Evaluation by inspection.
IBM (2003). Design basics.
Instone, K. (1997). Site usability heuristics for the Web.
ISO 9241-12 (1998). ISO 9241-12:1998 Ergonomic requirements for office work with visual display terminals (VDTs) -- Part 12: Presentation of information
ISO 9241-110 (2006). ISO 9241-110:2006 Ergonomics of human-system interaction -- Part 110: Dialogue principles.
Kamper, R. J. (2002). "Extending the usability of heuristics for design and evaluation: Lead, follow, and get out the way". International Journal of Human-Computer Interaction, 14(3&4), 447-462.
Kantner, L., Shroyer, R., & Rosenbaum, S. (2002). Structured Heuristic Evaluation of Online Documentation. IPCC 2002 Proceedings. IEEE Professional Communication Society.
Mayhew, D. (1992). General principles of user interface design. In Principles and guidelines in software user interface design. Englewood Cliffs, NJ: Prentice Hall.
Miller, R. H. (2000). Web site evaluation criteria.
Muller, M. J., Matheson, L., Page, C., & Gallup, R. (1998). "Participatory heuristic evaluation". Interactions, 5 (September & October), 13-18.
Purho, V. (2000). Heuristic inspections for documentation - 10 recommended documentation heuristics. STC Usability SIG Newsletter, 6(4).
Rosenfeld, L. (1998). Information architecture heuristics. LouisRosenfeld.com, Aug 17, 2004.
Rosenfeld, L. (1998). Heuristics for Search Systems. LouisRosenfeld.com, Sept 02, 2004.
Sears, A. (1997). "Heuristic walkthroughs: Finding problems without the noise". International Journal of Human-Computer Interaction, 9(3), 213-234.
Shneiderman, B. (1998). Eight golden rules of interface design. In "Designing the User Interface". Addison Wesley, 3rd Edition.
Skinner, Grant. (2003). Usability heuristics for rich internet applications. boxesandarrows.com.
Tognazzini, B. (2001). First principles.
Usability.gov (2003). Research-Based Web Design & Usability Guidelines.

About the author

Todd Zazelenchuk

Dr. Todd Zazelenchuk (@ToddZazelenchuk on Twitter) holds a BSc in Geography, a BEd, an MSc in Educational Technology, and a PhD in Instructional Design. Todd is an associate of Userfocus and works in product design at Plantronics in Santa Cruz, CA, where he designs integrated mobile, web, and client-based software applications that enhance the user experience with Plantronics' hardware devices.


