Check the requirements for most user research jobs and you’ll see that they often ask for a background in psychology or behavioural science. People usually assume this is because psychologists know secret mind hacks like reciprocation, social proof and framing that can be used to manipulate people.
In truth, there are a small number of fundamental principles that (a) psychologists know, (b) most people don’t and (c) are relevant to user researchers.
Four of the most important are:
- Your users do not think like you think.
- Your users don’t have good insight into the reasons for their behaviour.
- The best predictor of your users’ future behaviour is their past behaviour.
- Your users’ behaviour depends on context.
Your users do not think like you think
Of all the principles in psychology relevant to user researchers, this is the easiest to understand intellectually — but the hardest to appreciate intuitively. In fact, I’d go so far as to say that not only do most people on design teams fail to appreciate this principle, most adults don’t either. It’s the principle I need to consciously remind myself of on every new project — because I forget it too.
This principle tells us that our users don’t think like we think.
- They don’t value what we value: optimising our product’s splash screen may be important to us but nowhere near as important to our users as having a larger font size.
- We do not see things like our users: they think that the grey placeholder text inside the form field needs to be deleted before they can enter a value.
- We do not know what our users know: they use workflow shortcuts, acronyms and jargon that are entirely missing from our application.
One area where this is most obvious is in users’ technical skills. Design teams almost always overestimate the technical competence of their users.
Since your users will never stop surprising you, there is only one solution to this: you need to put the design team (and yourself) in front of users at every opportunity. This will help you gain empathy for your users and help you see the world through their eyes.
Your users don’t have good insight into the reasons for their behaviour
We like to think that our decisions are rational and made after conscious deliberation. That’s why it’s tempting to believe participants when they tell us why they did what they did. But people are poor at introspecting into the reasons for their behaviour. In reality, people want to tell a good story — a ‘narrative’ — of their life and will change what they say to fit the view of who they are.
One of many studies demonstrating this comes from the field of choice blindness. In this study, a researcher showed a participant two pictures of different women, and asked the participant to point to the more attractive. If you were a participant in this study, you would have seen the experimenter hand you your chosen picture, discard the other photo, and then ask you to justify your choice.
Unknown to participants, the experimenter was a part-time magician, and using a sleight of hand technique he was really showing the participant the picture of the woman rated less attractive. He now asked participants why they had chosen that picture.
Remarkably, even when the photos weren’t that similar, the majority of participants didn’t spot that they were now looking at the woman they thought was less attractive. Even more curiously, participants now provided “explanations” for their choice. So for example they might say, “Well, I chose this picture because I like blondes”, even though the participant had really chosen a brunette (whose picture was now face down on the table). People made up reasons to justify their choice.
In practice, this principle means that asking people to introspect on the reasons for their behaviour isn’t always useful. You are much better off designing experiments where you can observe people’s behaviour. Which takes me to my next principle.
The best predictor of your users’ future behaviour is their past behaviour
Opinion polling and exit polling provide a nice demonstration of this principle.
An opinion poll asks people to predict what they would do in the future (their intention). An exit poll asks people what they did in the past (their action).
Intention research is the field of market researchers and the tools of choice are usually surveys and focus groups. These are devised to ask questions like, ‘How likely are you to recommend our company?’, ‘Would you use this feature in the next release of the system?’ and ‘How much would you pay for this product?’ Unsurprisingly, the results are variable and often have little predictive value, despite the fact that sample sizes are often huge.
For example, opinion pollsters failed to predict the Leave vote in the 2016 EU referendum, as well as the results of the last two general elections in the UK. In contrast, the results of exit polls — where voters are asked to recast their vote on leaving the polling station — were spot on.
Action research is the field of user researchers. With action research, we interview users about how they behaved in the past and we spend time observing how users are behaving now. How are people solving this problem at the moment? What is the workflow across multiple channels? What systems, tools or processes do people use? How do people collaborate?
Because we are observing real behaviour, action research has strong predictive value, even though the sample sizes are often small. This is because the best predictor of future behaviour is past behaviour.
Your users’ behaviour depends on context
Back in the 1930s, psychologist Kurt Lewin proposed a formula, B = f(P, E). The formula states that behaviour (B) is a function of the person (P) and his or her environment (E). Lewin’s equation teaches us that the same person (such as our user) will behave differently in different environments. This means the best way to predict the way our user will behave with our system is to observe them in their natural environment. Context is a skeleton key for unlocking user needs — which in turn leads to new feature and product ideas.
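Lewin’s formula can be made concrete with a small sketch. Everything below — the attribute names, the threshold values, the decision rule — is an illustrative assumption of mine, not anything Lewin specified. The point is only this: the person stays fixed while the environment varies, and the behaviour changes.

```python
# A toy illustration of Lewin's equation, B = f(P, E):
# behaviour is a function of both the person and the environment.

def behaviour(person: dict, environment: dict) -> str:
    """Return a hypothetical behaviour given a person (P) and an environment (E)."""
    # Illustrative rule: a user with low technical confidence abandons
    # a task when the environment is full of distractions.
    if person["tech_confidence"] < 0.5 and environment["distractions"] > 2:
        return "abandons task"
    return "completes task"

user = {"tech_confidence": 0.4}    # P: the same person throughout

quiet_lab = {"distractions": 0}    # E1: a moderated usability lab
busy_office = {"distractions": 5}  # E2: the user's real workplace

print(behaviour(user, quiet_lab))    # completes task
print(behaviour(user, busy_office))  # abandons task
```

The same user “passes” the task in the lab and fails it at their desk — which is why observing people only out of context can mislead us.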
This principle is easily understood with an analogy from animal behaviour: if you want to understand how an exotic animal behaves, you would plan to observe the animal in the wild rather than in a zoo. User research techniques that take place out of context — like interviews and moderated usability tests — are like a visit to the zoo. In contrast, techniques like field research are like going on safari. With field research you observe real behaviour: you see what people really do in a particular situation, not listen to what they say they do or see them act it out. In short, you go where the action happens.
This doesn’t mean that out-of-context research offers no value. In the same way that a zoo can normalise behaviour by taking animals out of cages and putting them into a wildlife park, with out-of-context research we just need to find some way of incorporating the user’s context.
One way you can address this in out-of-context interviews is with cognitive interviewing. With this technique you put the user back in the situation by having them restate the context — research shows this aids memory retrieval.
With usability testing, you can recreate the context with realistic task scenarios. There’s a big difference between pretending to buy car insurance and really buying car insurance. No matter how well intentioned they are, participants know that, if they get it wrong, there are no consequences. You can mitigate this risk by giving participants real money to spend on the task — what I’ve called in the past, ‘skin in the game’ tasks. If, in contrast, you find yourself asking the user to ‘just pretend’ in a usability test, don’t expect to observe authentic behaviour.
Applying these principles in the field
These principles are easy to understand intellectually but they are a little counter-intuitive. Knowing something in your head is different from believing something in your gut. This means it may take a while before they change your own behaviour. But these principles repay reflection, because behind every good user research plan is an intuitive understanding of these four principles.
About the author
Dr. David Travis (@userfocus on Twitter) is a User Experience Strategist. He has worked in the fields of human factors, usability and user experience since 1989 and has published two books on usability. David helps both large firms and start-ups connect with their customers and bring business ideas to market. If you like his articles, why not join the thousands of other people taking his free online user experience course?