The first lens: One-to-one moderated usability testing
Company A is in the banking sector. Like many firms in that sector, it has undergone an enormous amount of change in the last 5 years. As part of that change, the organisation has re-oriented itself around customers and now has a large (and growing) user experience team.
Prior to the development of the user experience team, most of the bank’s knowledge about customers came from market research carried out by external agencies. This meant that internal teams tended to base their product decisions on aggregate data, and they often felt one step removed from customers. As a reaction to this, the team’s manager encouraged her London-based team to get face-to-face with customers in usability testing sessions. These sessions take three forms:
- Lab-based tests. These take place in the organisation’s newly-built usability lab. Building a lab was one of the manager’s first demands when she took over the team. She told me that the lab makes a statement to the organisation: ‘our users matter’. It has good viewing facilities and this means developers, business analysts and project managers can drop in and see real users struggling to use their software.
- Pop-up ‘guerrilla’ research sessions. These are held every sprint, on the fly, at locations where customers congregate. This allows the team to test design hypotheses with users and feed the results back rapidly to the design team.
- Remote moderated usability tests using technologies like Skype. This ensures the team can include users who live outside the capital or are otherwise difficult to involve in the research.
This team’s strategy is all about qualitative, in-depth, one-to-one sessions. This approach works well for the organisation because it ensures the team gets first-hand exposure to users rather than having user needs filtered through a third party. It also ensures that the team can carry out quick, focused tests that support the organisation’s agile way of working.
The second lens: Remote, unmoderated usability testing
As a web-based start-up, Company B has a very different culture from Company A and this is reflected in their approach to usability testing. The company CEO is driven by the numbers and he has recruited a user experience manager in his image. The manager of the user experience team doesn’t see a lot of point in investing in lab facilities when there are so many useful tools on the web. His team uses remote, unmoderated tools for usability testing. They carry out these kinds of research:
- Benchmark usability testing using tools like Loop11. These tests provide solid numbers that the team can use to validate hypotheses: for example, they can definitively measure task completion rates and time on task with sample sizes in the 100s.
- Survey-like tests that uncover where people would click, which of two designs people prefer, and what people can remember after just a few seconds of exposure, using tools like Zurb's Verify.
- Online card sorting tests that explore navigation, using tools like Optimal Workshop.
- Remote, unmoderated, ‘think aloud’ usability tests, using tools like What Users Do and Usertest.io. In these sessions, participants are set various tasks and record their experiences as they carry them out; the recordings are uploaded to the cloud, where the team can watch them at their leisure.
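Benchmark numbers like task completion rates are only as useful as their margin of error. As an illustrative sketch (the figures are hypothetical, not from any of these companies), here is how a team might put a 95% confidence interval around an observed completion rate using the Wilson score method:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score interval for a task completion rate.

    Returns (low, high) bounds on the true proportion, given
    `successes` completions out of `n` attempts, at roughly
    95% confidence when z=1.96.
    """
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - margin, centre + margin

# Hypothetical benchmark: 78 of 100 participants completed the task
low, high = wilson_interval(78, 100)
print(f"Completion rate: 78% (95% CI {low:.0%} to {high:.0%})")
```

Even with a sample size in the 100s, the interval spans several percentage points, which is why benchmark tests compare against a baseline rather than treating a single number as definitive.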
This team’s strategy is mainly about using quantitative usability data to steer the organisation. The manager also keeps the organisation grounded in real user behaviour by encouraging the team to watch videos from the remote, unmoderated ‘think aloud’ sessions. This approach works well because the organisation is strongly driven by quantitative data.
The third lens: Measuring real-world use
Company C is a large retailer with a number of bricks-and-mortar stores. With its history in retail outlets, the organisation favours data from natural, real-world use rather than the kind of data that comes from the scripted studies favoured by Companies A and B. Company C knows that moving a product to the end of an aisle in a physical store increases its sales, and it is continually on the prowl for similar behaviours that customers exhibit on its web site. The firm wants to optimise the details of every element on each web page, which means the design team needs to know what real customers are doing with the site right now.
This design team favours the kind of quantitative, real world usage data that comes from A/B and multivariate testing. They use Google Analytics to:
- Identify the top destinations on their site and how these fit with their advertising and campaign performance.
- Run A/B tests using Google Analytics Content Experiments to optimise individual page layouts.
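Deciding whether an A/B variant genuinely outperforms the control is a statistics question, not a matter of eyeballing the numbers. As a hedged sketch (the conversion counts below are hypothetical, not Company C’s data), a two-proportion z-test on the raw counts looks like this:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B test.

    Takes conversion counts and visitor counts for variants A and B.
    Returns the z statistic; |z| > 1.96 suggests the difference is
    significant at the 5% level (two-tailed).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical 50/50 split: variant B lifts conversion from 3.0% to 3.6%
z = ab_test_z(300, 10000, 360, 10000)
print(f"z = {z:.2f}")
```

Tools like Google Analytics Content Experiments run this kind of calculation behind the scenes; the point of the sketch is that apparently large lifts on small traffic can still fall inside the noise.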
This approach suits the company culture because the web site is treated as just another retail outlet (though an extremely profitable one) that submits its sales returns at the end of each week.
Which approach is best?
One-to-one moderated usability testing solves the problem of getting the design team exposed to customers so they make better design decisions. Remote, unmoderated usability testing solves the problem of having quantitative data to choose between design ideas. Measuring real-world use solves the problem of knowing how customers use your system right now.
But none of these is an ideal strategy on its own.
The best kind of user research is triangulated. Triangulation is like observing your users through different lenses. Sometimes you want to be up close and personal. Other times you want to sit back and observe the crowd.
One difficulty is that different organisations (and different user researchers) tend to favour one or other of these three approaches, so it’s rare to find a company observing its users through all three lenses. The good news is that if your organisation favours just one of these approaches, you’ll be able to seriously improve design outcomes by showing the design team how to look through a different lens. Train all three lenses on your users and you’ll have the beginnings of an unbeatable usability evaluation strategy.
About the author
Dr. David Travis (@userfocus on Twitter) is a User Experience Strategist. He has worked in the fields of human factors, usability and user experience since 1989 and has published two books on usability. David helps both large firms and start-ups connect with their customers and bring business ideas to market. If you like his articles, why not join the thousands of other people taking his free online user experience course?