A still from the movie 'Kitchen Stories', about Swedish home scientists carrying out observational studies in consumers' homes in the 1950s.
Predicting what will work best for users requires a deep understanding of their needs. Research methods like focus groups and surveys have obvious face validity, but they continually fail to provide the insights that design teams need in product discovery. That's because these techniques require users to predict their future behaviour, something people are poor at doing.
An alternative method is to examine what people do, rather than what they say they do. This approach is based on a simple premise: the best predictor of future behaviour is past behaviour. What people do is a better indicator of the underlying user need than what people say.
To avoid simply asking users what they want, user researchers have appropriated methods from ethnography and applied them to user research. This technique is broadly known as ‘design ethnography’ but it differs in important ways from traditional ethnography.
What is ethnography?
Ethnography is the study of culture. Bronislaw Malinowski, who studied gift giving amongst the Trobriand Islanders of Papua New Guinea, wrote:
“The final goal… is to grasp the native’s point of view, his relation to life, to realise his vision of his world”.
Replace the word ‘native’ with the word ‘user’ — or extend the metaphor and think of your users as a ‘tribe’ — and you can see why this approach could offer value in product and service design.
Some of the defining characteristics of ethnography are that:
- Research takes place in the participants’ context.
- Participant sample sizes are small.
- Researchers aim to understand the big picture: participants’ needs, language, concepts and beliefs.
- Artefacts are analysed to understand how people live their lives and what they value.
- Data is ‘thick’, comprising written notes, photographs, audio and video recordings.
To one degree or another, design ethnographers appropriate each of these characteristics in the work that they do.
In addition to Bronislaw Malinowski, other notable ethnographers include:
- Margaret Mead, who studied ‘coming of age’ rituals in Samoa.
- Sudhir Venkatesh, who embedded himself with Chicago drug gangs to understand drug culture.
- Matthew Hughey, who spent over a year attending the meetings of a white nationalist group and a white antiracist group.
So how does design ethnography differ from traditional ethnography?
It’s a struggle to use a traditional ethnographic approach in modern product development, mainly because of the timescales. That’s not to say it’s impossible: Jan Chipchase (who specialises in international field research) says he spends half the year travelling to far-flung destinations. But most people who practise design ethnography in business would agree with these distinctions:
- The purpose of traditional ethnography is to understand culture. The purpose of design ethnography is to gain design insights.
- The timescale of traditional ethnography is months and years. The timescale of design ethnography is days and weeks.
- Traditional ethnographers live with participants and try to become part of the culture. Design ethnographers are visitors who observe and interview.
- With traditional ethnography, data are analysed in great detail over many months. With design ethnography, there is “just enough” analysis to test the risky assumptions.
- The findings of traditional ethnography are shared in books and academic journals. The findings from design ethnography are restricted to a team or an organisation.
How should you approach design ethnography?
Instead of asking people what they want, with a design ethnography approach the user researcher tries to discover why people want those things. Through observation and interview, they answer questions like these:
- What goals are users trying to achieve?
- How do they currently do it?
- What parts do they love or hate?
- What difficulties do they experience along the way?
- What workarounds do they use?
Answering these questions means watching users work in their own context, not simply asking them about it.
Malinowski spent several years on a single research project in the Trobriand Islands, so we don’t know what he would make of the compromises in design ethnography. Our view is that if traditional ethnography is a prize heavyweight boxer, then design ethnography is more akin to a street fighter. It doesn’t follow all of the rules, but it gets the job done. That’s usually acceptable for most design projects, but be aware that too much compromise can jeopardise the quality of your results. Let’s look at some of the ways we’ve seen that happen.
Avoiding some common mistakes
When we work with companies and we suggest a design ethnography exercise, we often hear, "But we already do that."
It's true that most companies carry out some up-front, customer-focused field research (distinct from their traditional market research). They often dub it "insights research", carried out by an Insights Team or an Innovation Team.
But these activities frequently amount to nothing more than going to a customer site to carry out the same interviews or surveys the team would normally do out of context, with little to no observation of behaviour. We've even seen it done as 'concept testing', where researchers write a descriptive paragraph of their idea and ask respondents to read it and say what they think of it, which has to be the worst kind of customer research imaginable.
The consequence is that development teams often set out to create the wrong product or system. The team continues blindly on until the UX team gets involved and runs a usability test. Only then does the development team see real users at work, at which point they begin to suspect they have built the wrong concept. But by now the team is too far along in development, and too wedded to its idea, to pivot.
The mistakes we see most often are:
- Doing research in the field — but doing the wrong kind of research.
- Not knowing what is and what isn't data (because the research lacks focus), so user opinions and comments are prioritised over user behaviour.
- Not sending experienced field researchers, and instead sending people whose only skill is interviewing.
- Doing it after the company has already decided what the design solution is going to be — therefore looking only for confirmatory evidence and missing other opportunities.
If you would like to avoid these mistakes and find out more, we're running a two day course on design ethnography in May.
About the author
Dr. David Travis (@userfocus on Twitter) is a User Experience Strategist. He has worked in the fields of human factors, usability and user experience since 1989 and has published two books on usability. David helps both large firms and start ups connect with their customers and bring business ideas to market. If you like his articles, why not join the thousands of other people taking his free online user experience course?
Dr. Philip Hodgson (@bpusability on Twitter) holds a B.Sc., M.A., and Ph.D. in Experimental Psychology. He has over twenty years of experience as a researcher, consultant, and trainer in usability, user experience, human factors and experimental psychology. His work has influenced product and system design in the consumer, telecoms, manufacturing, packaging, public safety, web and medical domains for the North American, European, and Asian markets.