I've recently been thinking about what makes for a good UX debrief meeting. After all, it's an important part of the UX research process that is sometimes overlooked.
But, if I'm being honest, my thinking was really prompted by two debrief meetings I held recently.
The first bore less resemblance to a productive meeting than it did to the Gunfight at the O.K. Corral. The second debrief meeting ran as smoothly as the engine in a high-performance car.
And what made this all the more surprising was that both meetings were with the same client.
I thought hard about both of these meetings. What went wrong in the first one? And what worked well in the second?
What really is a UX debrief meeting?
When you debrief someone, you question that person about a mission or an undertaking that they've completed, or about some experience they've been through. A UX debrief has this component too, because it's an opportunity for project team members to ask you questions about the work you've done. But if that's all you achieve in a debrief meeting then you haven't moved the project, or the design team, forward — so it’s a missed opportunity.
An effective UX debrief meeting has to accomplish much more than just clarify things people don’t understand. (You could have done that by writing a better report.) To warrant having conducted UX research in the first place, your research findings must connect with the product design process, and the debrief meeting is an opportunity to make sure that happens. What the project team learns from the debrief of a UX study must in some way effect change:
- Change to the design of the system or product.
- Change to the design process itself.
- Change in the way the project team (and the sponsoring company) thinks about its users.
- Change in people's attitudes towards the value of UX.
So you're failing your client and your project team if you think a UX debrief meeting is merely a way of wrapping up the job. Experienced researchers know that the debrief meeting is a golden opportunity to answer the most important question a project team can ask at the end of a study:
"What do we do next?"
Debrief 1: Getting it wrong
Every now and again I find myself trying to accommodate a client's request that, somewhere deep down, goes against my better judgment. I find myself agreeing to a plan that I should have questioned. Sometimes when I do this, the project still works out OK and I live to consult another day.
But sometimes things go wrong.
On this occasion, things went wrong.
Corners got cut during the study because of client pressure on the timeline and budget. As a result, no members of the project team observed the study. In fact, my primary contact in the company (I'll just call him 'The Berlin Wall') had created an invisible barrier, with me on one side and the project team on the other.
As a result, the designers and engineers had little awareness of the study — until, that is, they got invited to the UX debrief meeting. I don’t need to say much more. You can already sense this is not going to end well.
Suffice it to say that, one way or another, I managed to conjure up all of the following potential hazards in a single meeting. (No mean feat, I might add.)
- No one in the meeting (other than The Berlin Wall) knew who I was.
- No one had read the report (The Berlin Wall had distributed it only a few minutes before the meeting).
- The project team was on the other end of a phone line.
- The main decision makers did not turn up.
- The meeting could not be postponed due to deadline demands.
- The 1-hour meeting lasted 59 minutes and 30 seconds longer than the aforesaid gunfight, which was over in just 30 seconds.
Because no one had read the report or had time to think about the findings or discuss them in advance, the meeting pretty much amounted to "some bloke on the other end of the phone" telling the designers and engineers where they had screwed up. No useful discussion could take place until everyone had at least some common grasp of the findings, so the meeting degenerated into little more than a monologue describing the study and its results. Each finding was either summarily disputed (if there was any question about its interpretation) or met with silence (if its interpretation was undeniable).
Because the team was encountering the usability problems for the first time, the tone of the meeting was characterized by their honest (and sometimes brutal) first reactions. An imaginative stenographer might have captured something like this:
- "Here are the top five reasons why your baby is ugly."
- "Oh yeah? Well here are the top five reasons why your test participants are stupid."
The debrief meeting should probably not have taken place. And, on reflection, the study should not have taken place either — at least not under those particular circumstances. Much was learned by all concerned.
Happily, I was able to hold a second debrief meeting shortly afterwards with The Berlin Wall (who had been curiously silent throughout the team debrief). We worked through the problems together, the wall came down, and I was able to build a proper relationship with the designers, the engineers and the marketers. That led to another UX project, which in turn led to a second debrief meeting.
Debrief 2: Getting it right
The second debrief worked well. The foundations were in place this time to ensure that the project team was engaged and on board from the beginning. They had attended the kick-off meeting, provided the participant screening criteria and the core tasks, reviewed and signed off on the usability test plan, and (most important of all) they had all attended the usability test sessions and shared their thoughts at the end of each test day.
What a transformation.
Now, rather than being hidden behind an invisible wall, I was working with a team that was actively seeking UX guidance.
But I was leaving nothing to chance when it came to the debrief meeting. I had a plan and a clear objective — to get the team discussing what would happen next.
I also wanted them to reflect on the kinds of findings they had observed in the usability test and read about in the report, and contrast those with the kinds of findings they usually got from inadequate focus groups and online concept-validation tests (previously their two main sources of customer information). I wanted the team to experience how confident they could be in objective behavioral usability data and to see how it took the ambiguity, debate and politicking out of making design decisions.
There was also one looming issue that needed to be discussed. The usability test data had strongly suggested that the system under test might not be the right thing to have built in the first place. That finding needed to be out in the open, and this was the opportunity to get the dialogue going.
In contrast to the first meeting, the second meeting was characterized by:
- Attendees who knew the study and its findings and who had all read the report and had time to think about it, and who came prepared.
- The presence of the main project leaders and decision makers.
- A genuine eagerness, on the part of the team, for guidance about achieving optimal usability (fueled mainly by having observed participants struggling with basic tasks that the team had assumed were easy).
- A clearly defined 3-step agenda to: (a) Get all team reactions and feedback on the table — the surprises, the confirmations, the learnings, the "Aha!" moments. (b) Reach consensus on the 5 main 'buckets' of usability problems that needed to be addressed. (c) Confirm which issues could realistically be fixed.
- No slideshow presentation, and no page-by-page re-reading of the report. Instead, an open discussion of the usability issues following only a brief introductory summary from me.
- A discussion — initiated by the team members themselves — about the 'big issue' of whether the concept we had tested was the right thing to be building, and whether it even made sense to customers.
- Ownership of the usability problems by the designers, the engineers and the marketing leaders.
I made sure to co-chair the meeting with the project team's own designated UX representative, and that helped overcome any sense of "me vs. them". Ahead of the meeting I had made it clear that this was not going to be a slideshow presentation and that I was assuming everyone would come prepared having read the report. This seemed a reasonable 'ask' given that they had commissioned the research and the report took only 30 minutes to read.
So I knew that we now had grounds for a common starting point and that we would not be derailed by people attributing the usability problems to "stupid users" as had happened in the earlier meeting.
But just in case, I had prepared a usability test highlights video proving that it was the interface that was stupid.
Ten minutes of listening to what the team had learned from the study told me there would be no disputing the findings this time, and it was easy to flow from their list of critical issues right into prioritizing the most important usability problems they would have to fix, and what process they would use to manage the design changes. Within 30 minutes of starting the meeting I had four different disciplines as well as an overseas satellite group, all singing from the same song sheet.
Then, as the meeting neared its close, something unexpected happened.
The attendees started asking strategic questions. Questions like:
- How can we discover the usability problems earlier in the future?
- Can we usability test without building out the whole system next time?
- How can we mitigate the risk of conceiving the wrong product offering in the future?
- How can we get better at discovering genuine customer and user needs?
And I have to admit, that was one happy, and rather excited, project team.
So here are my 10 practitioner tips for running an effective UX debrief meeting. I recommend you read them here rather than learn them the hard way. Take it from me: learning them the hard way is a lot less fun.
10 tips for an effective UX debrief meeting
- Don’t wing it. Prepare thoroughly. Have a plan.
- Don’t think of the debrief as a wrap-up meeting — think of it as a springboard to the next step. And make sure UX is part of the next step.
- Co-chair the meeting with a lead member of the project team.
- Make sure the main decision-makers attend.
- Don't give a PowerPoint presentation, and don't rehash the report. If you have 60 minutes, talk for 20 minutes and discuss for 40.
- Insist that attendees read the report and prepare comments and questions in advance.
- Before you summarize the study findings, ask the team to share what they learned from the study, what surprised them and what usability issues they feel are most important or most serious.
- Simplify your message. Focus on just the 5 most severe usability problems. Don’t overwhelm the team by trying to cover everything (they can check the report for those details).
- Get consensus on the problems rather than argue over solutions.
- Don’t expect or insist that everything must be fixed. Focus next steps on things that can realistically be changed within the budget and timeline.
About the author
Dr. Philip Hodgson (@bpusability on Twitter) holds a B.Sc., M.A., and Ph.D. in Experimental Psychology. He has over twenty years of experience as a researcher, consultant, and trainer in usability, user experience, human factors and experimental psychology. His work has influenced product and system design in the consumer, telecoms, manufacturing, packaging, public safety, web and medical domains for the North American, European, and Asian markets.