What do you mean?

Photo by Jon Tyson on Unsplash

A few years back, I attended a Q&A with the British photographer David Bailey. Bailey is legendary for two things: chronicling the sixties (with photographs of people like Mick Jagger, the Beatles and the Kray twins) and being cantankerous (his standard reply to the greeting, "Pleased to meet you," is, "How do you f–ing know? You haven't even spoken to me yet.").

After a review of Bailey's photographs, the session was opened to questions from the 1,000-strong audience. A hush descended as a young woman bravely took the microphone. She asked Bailey, "You must have used a lot of different cameras in your time. What was your favourite?"

Bailey's face visibly tightened. It was as if someone had quizzed Michelangelo on his favourite paintbrush. Or asked Gordon Ramsay to identify his favourite oven. "What kind of f–ing stupid question is that?" he responded.

Survey respondents are rarely as forthright as Bailey, though they may share his opinion of some of the questions we ask. Not only do some of our questions appear stupid to respondents; they are also frequently misunderstood. Yet the unspoken assumption of the survey creator is usually, "How is it possible for someone to misinterpret a question that is so clear to me?"

Ambiguity is the enemy of the survey designer. Even a trivial question like, "Do you know the time?" can be interpreted in different ways. One person may answer, "9:30" whereas another may simply say, "Yes". Which answer is given depends on context. When it comes to the much more complicated questions asked in surveys, it's no surprise that different people interpret the same question in different ways.

But there's good news. Experienced survey practitioners are aware of these issues and have an incredibly useful tool you can use to test your own survey questions. It's called the cognitive interview.

Fixing ambiguous questions with the cognitive interview

The cognitive interview is a technique for pilot testing survey questions. Interviews are typically run one-to-one with a handful of volunteers. As with any pilot test, you ask your volunteer the questions on the survey, but you're not really interested in their answers. With a cognitive interview, you're interested in how your participants arrived at their answers. In some ways, a cognitive interview is similar to a usability test: you ask a participant to "think aloud" as they try to answer the question.

The process is as follows. You ask your pilot participant the question on the survey (such as, "How many times have you talked to a doctor in the last year?"), and wait for the answer.

Once the participant has provided an answer, you then ask these questions (one by one):

  • "In your own words, what is this question asking?"
  • "How did you arrive at your answer?"
  • "How sure are you of your answer?"

It's useful to follow up with specific questions of the form, "What does the term…mean to you in this question?" For example, we might ask, "What does the phrase, 'talked to a doctor' mean to you in this question?"
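If you plan to run several of these interviews, it pays to keep the protocol identical across participants. Here's a minimal sketch in Python of a note-taking script built around the probes above. The probe wordings come from this article; the structure, the function name and the idea of a follow-up term list are assumptions, just one way you might organise it.

    # A minimal sketch of a cognitive interview protocol runner.
    # The probe wordings are from the article; the structure and
    # names are illustrative assumptions.

    PROBES = [
        "In your own words, what is this question asking?",
        "How did you arrive at your answer?",
        "How sure are you of your answer?",
    ]

    def run_cognitive_interview(survey_question, follow_up_terms=()):
        """Walk the interviewer through one survey question,
        capturing the participant's answer and the probe notes."""
        notes = {"question": survey_question}
        notes["answer"] = input(f"ASK: {survey_question}\n> ")
        for probe in PROBES:
            notes[probe] = input(f"ASK: {probe}\n> ")
        for term in follow_up_terms:
            probe = f"What does the term '{term}' mean to you in this question?"
            notes[probe] = input(f"ASK: {probe}\n> ")
        return notes

    if __name__ == "__main__":
        print(run_cognitive_interview(
            "How many times have you talked to a doctor in the last year?",
            follow_up_terms=["talked to a doctor", "the last year"],
        ))

Nothing here is clever: the point is simply that every participant hears the same question and the same probes, in the same order.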

Problems uncovered with the cognitive interview

To be confident in someone's answer to a survey question, we need to ensure that they can:

  • Understand the question.
  • Recall the answer from memory.
  • Estimate the answer.
  • Map their estimate to the answer choices we provide.

Let's look at these checkpoints by pilot testing the survey question, "How many times have you talked to a doctor in the last year?" Before continuing, try answering this question yourself and then see if the following points resonate with you.

Understand the question

This is the most obvious checkpoint: do people understand the question? Do they understand the individual words, and do they interpret the overall question in the way we intended? When I checked my "doctor" question with a volunteer, here are some of the issues that arose:

  • Should I count visits to the doctor's surgery when I saw the practice nurse, but not a doctor?
  • Should I count those times when I've talked to the doctor about my child's health?
  • I have a friend who's a doctor who I see for a beer every month. Should I include that?
  • Do they mean the last 12 months or the last calendar year?

Recall the answer from memory

I know it's hard to believe, but people often aren't as bothered as we are about the topic of our survey. This becomes a problem when we ask people about events that happened a while back: they may simply have forgotten.

But failures of memory are also a problem with more recent events. For example, a question like, "How many times have you seen the O2 brand in the last week?" asks about something your participants may never have bothered to encode in memory, even though it happened quite recently.

Taking our "doctor" question, if you've not visited a doctor for years, then you'll probably find this easy to recall. But if you've been a few times for a worrying health issue, you may be avoiding the memory. And even if you've been for a routine test or a minor ailment, it may be easy to forget, particularly for appointments that happened more than a few months ago.

Estimate the answer

The way we make estimates varies depending on the question. Taking our question about visits to the doctor's surgery, you may interrogate your memory for the last 12 months and attempt to count the visits you have made. But if I asked you a question like, "In the past month, how many times have you brushed your teeth?" you would probably resort to an average ("I do it twice a day, so…"). This can lead to an overestimation as you forget the times you went to bed drunk or tired or skipped brushing your teeth for some other reason.
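A quick back-of-the-envelope calculation shows how a rate-based estimate drifts away from reality. The figures below are invented purely for illustration:

    # Why rate-based estimates tend to overstate frequency.
    # All figures are invented for the example.
    days_in_month = 30
    rate_based = 2 * days_in_month      # "I brush twice a day, so..." -> 60
    skipped = 4                         # nights skipped (tired, travelling, ...)
    actual = rate_based - skipped       # 56
    print(f"Rate-based estimate: {rate_based}, actual: {actual}")
    print(f"Overestimate: {rate_based - actual} ({(rate_based - actual) / actual:.0%})")

The rate-based answer is tidy and confident, but it quietly ignores every exception to the rule.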

Map the estimate to the answer choices we provide

The answer options we provide frame the way people answer the question. Here's a good example: the researcher Norbert Schwarz and his colleagues asked participants the survey question, "How successful would you say you have been in life?"

One group of participants made their response on a scale that went from -5 to +5. With this scale, the researchers discovered that 13% of people rated themselves in the bottom half of the scale (i.e. they gave a rating of -5 to 0).

Another group had a scale that went from 0 to 10. With this scale, 34% of people rated themselves in the bottom half of the scale (i.e. they gave a rating of 0 to 5). This huge difference was simply down to using two different response scales. (Schwarz, N. et al., 1991. "Rating scales: Numeric values may change the meaning of scale labels." Public Opinion Quarterly, 55, 570-582.)
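What makes this result so striking is that the two scales are numerically interchangeable: a rating of r on the -5 to +5 scale sits in exactly the same position as r + 5 on the 0 to 10 scale. The sketch below makes the mapping explicit (the mapping is mine; the percentages are from the study):

    # The two scales are formally equivalent: rating r on the
    # -5..+5 scale occupies the same position as r + 5 on 0..10.
    for r in range(-5, 6):
        print(f"{r:+d} on the -5..+5 scale == {r + 5:2d} on the 0..10 scale")

    # Yet Schwarz et al. found 13% of respondents in the bottom
    # half of the -5..+5 scale (-5 to 0) but 34% in the bottom
    # half of the 0..10 scale (0 to 5). Identical positions,
    # different numeric labels: the labels changed the meaning.

Since the positions are identical, the only thing left to explain the gap is how respondents read the numeric labels themselves.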

Similarly, returning to our "doctor" question, people may edit their responses to be socially desirable. If we offer category choices like "0", "1-4", "5-10" and "More than 10" and you have been to the doctor 5 times, you may decide to pick the lower category so as not to look like a hypochondriac.
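Once you've walked a few participants through your question, it helps to tag every problem you uncover against the four checkpoints above: the pattern tells you which part of the question to fix. Here's a minimal sketch, assuming your findings are a simple list of (stage, note) pairs. The stage names come from this article; the example findings and the structure are illustrative.

    # A sketch of tagging pilot-test findings by checkpoint.
    # The four stages are from the article; the findings are
    # illustrative examples.
    from enum import Enum

    class Stage(Enum):
        UNDERSTAND = "Understand the question"
        RECALL = "Recall the answer from memory"
        ESTIMATE = "Estimate the answer"
        MAP = "Map the estimate to the answer choices"

    findings = [
        (Stage.UNDERSTAND, "Unsure whether the practice nurse counts as 'a doctor'"),
        (Stage.UNDERSTAND, "Read 'the last year' as the last calendar year"),
        (Stage.RECALL, "Couldn't remember appointments more than 6 months back"),
        (Stage.MAP, "Hesitated between '1-4' and '5-10'"),
    ]

    # Summarise where the question is failing.
    for stage in Stage:
        count = sum(1 for s, _ in findings if s is stage)
        print(f"{stage.value}: {count} issue(s)")

A question whose problems cluster under "Understand" needs rewording; one whose problems cluster under "Map" needs different answer options.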

Next steps

The best way to experience the usefulness of the cognitive interview is to try it yourself. If you don't have your own survey to pilot test, then take a question from a survey you've been asked to complete for someone else. Pick a question with a reasonable degree of complexity — "What is your age?" would be a bad choice — and run a cognitive interview with a volunteer. I guarantee you'll discover many ways in which the question can be improved.

About the author

David Travis

Dr. David Travis (@userfocus) has been carrying out ethnographic field research and running product usability tests since 1989. He has published three books on user experience including Think Like a UX Researcher. If you like his articles, you might enjoy his free online user experience course.


