The 3 Lenses of Usability Evaluation

The first lens: One-to-one moderated usability testing

Organisation A is in the banking sector. Like many companies in that sector, the firm has undergone an enormous amount of change in the last five years. As part of that change, the organisation has re-oriented itself around customers, and it now has a large (and growing) user experience team.

Prior to the development of the user experience team, most of the bank’s knowledge about customers came from market research carried out by external agencies. This meant that internal teams tended to base their product decisions on aggregate data, and they often felt one step removed from customers. As a reaction to this, the team’s manager encouraged her London-based team to get face-to-face with customers in usability testing sessions. These sessions take three forms:

  • Lab-based tests. These take place in the organisation’s newly-built usability lab. Building a lab was one of the manager’s first demands when she took over the team. She told me that the lab makes a statement to the organisation: ‘our users matter’. It has good viewing facilities and this means developers, business analysts and project managers can drop in and see real users struggling to use their software.
  • Pop-up ‘guerrilla’ research sessions. These are held every sprint, on the fly, at locations where customers congregate. This allows the team to test design hypotheses with users and feed the results back rapidly to the design team.
  • Remote moderated usability tests using technologies like Skype. This ensures the team can include users who live outside the capital or are otherwise difficult to involve in the research.

This team’s strategy is all about qualitative, in-depth, one-to-one sessions. This approach works well for the organisation because it ensures the team gets first-hand exposure to users rather than having user needs filtered through a third party. It also ensures that the team can carry out quick, focused tests that support the organisation’s agile way of working.

The second lens: Remote, unmoderated usability testing

As a web-based start-up, Company B has a very different culture from Organisation A, and this is reflected in its approach to usability testing. The company CEO is driven by the numbers, and he has recruited a user experience manager in his image. The manager of the user experience team doesn’t see a lot of point in investing in lab facilities when there are so many useful tools on the web. His team uses remote, unmoderated tools for usability testing. They carry out these kinds of research:

  • Benchmark usability testing using tools like Loop11. These tests provide solid numbers that the team can use to validate hypotheses: for example, they can definitively measure task completion rates and time on task with sample sizes in the 100s.
  • Survey-like tests that uncover where people would click, which of two designs people prefer, and what people can remember after just a few seconds of exposure, using tools like Zurb's Verify.
  • Online card sorting tests that explore navigation, using tools like Optimal Workshop.
  • Remote, unmoderated, ‘think aloud’ usability tests, using tools like What Users Do and Usertest.io. In these sessions, participants are set various tasks, record their experiences as they carry them out, and the sessions are uploaded to the cloud where the team can watch them at their leisure.
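To make the ‘solid numbers’ claim concrete, here’s a minimal sketch of how a team might put a confidence interval around a measured task completion rate. It uses the standard Wilson score interval; the sample figures are hypothetical, not taken from any of the tools mentioned above:

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a binomial proportion,
    such as a task completion rate from a benchmark usability test."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half_width = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half_width, centre + half_width

# Hypothetical benchmark: 80 of 100 remote participants completed the task.
lo, hi = wilson_interval(80, 100)
print(f"Completion rate 80%, 95% CI: {lo:.1%} to {hi:.1%}")
# → Completion rate 80%, 95% CI: 71.1% to 86.7%
```

With sample sizes in the 100s, the interval is narrow enough to compare two designs meaningfully; with the 5–8 participants typical of moderated lab sessions, it would span most of the scale.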

This team’s strategy is mainly about using quantitative usability data to steer the organisation. The team also ensures the organisation stays grounded in real user behaviour by encouraging everyone to view videos from the remote, unmoderated ‘think aloud’ sessions. This approach works well for this organisation because it is strongly driven by quantitative data.

The third lens: Measuring real-world use

Company C is a large retailer with a number of bricks and mortar stores. With its history in retail outlets, this organisation favours data from natural, real-world use, not the kind of data that comes from the scripted studies favoured by Companies A and B. Company C knows that moving a product to the end of an aisle in a physical store increases its sales, and it is continually on the prowl for similar behaviours customers exhibit with its web site. This firm wants to know how to optimise the details of each element on its web pages. This means the design team wants to know what real customers are doing with the web site right now.

This design team favours the kind of quantitative, real world usage data that comes from A/B and multivariate testing. They use Google Analytics to:

  • Identify the top destinations in their site and how these fit with their advertising and campaign performance.
  • Run A/B tests using Google Analytics Content Experiments to optimise individual page layouts.
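Whatever tool runs the experiment, the arithmetic behind an A/B test result is straightforward. Here’s a sketch using a standard two-proportion z-test, with made-up traffic numbers rather than anything from Company C:

```python
import math

def ab_test_significance(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    # Two-sided p-value from the standard normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 100/1000 conversions on layout A vs 130/1000 on layout B.
z, p = ab_test_significance(100, 1000, 130, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A p-value below 0.05 is the conventional threshold for declaring a winner, though teams running many simultaneous experiments need to correct for multiple comparisons.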

This approach suits the company culture because the web site is treated as just another retail outlet (though an extremely profitable one) that submits its sales returns at the end of each week.

Which approach is best?

One-to-one moderated usability testing solves the problem of getting the design team exposed to customers so they make better design decisions. Remote, unmoderated usability testing solves the problem of having quantitative data to choose between design ideas. Measuring real-world use solves the problem of knowing how customers use your system right now.

But none of these is an ideal strategy on its own.

The best kind of user research is triangulated. Triangulation is like observing your users through different lenses. Sometimes you want to be up close and personal. Other times you want to sit back and observe the crowd.

One difficulty is that different organisations (and different user researchers) tend to favour one or other of these three approaches, so it’s rare to find a company observing its users through all three lenses. The good news is that if your organisation favours just one of these approaches, you’ll be able to seriously improve design outcomes by showing the design team how to look through a different lens. Train all three lenses on your users and you’ll have the beginnings of an unbeatable usability evaluation strategy.

About the author

David Travis

Dr. David Travis (@userfocus) has been carrying out ethnographic field research and running product usability tests since 1989. He has published three books on user experience including Think Like a UX Researcher. If you like his articles, you might enjoy his free online user experience course.


