The checkpoints

These checkpoints are based on the work of Julian Meltzoff (1997).

  • Is the research question clearly stated?
  • Do the introduction, statement of the problem, and overview of any literature or previous reports adequately set the background for the reader, and is this material consistent with the research question?
  • Is it clear why the study was conducted?
  • Given the research question and the background material, are the research hypotheses and objectives appropriate and clearly stated?
  • Are key terms well defined?
  • Is the independent variable appropriate given the question of the study? Are the levels of the independent variable appropriate?
  • Is the dependent variable appropriate for the study?
  • Are the criterion measures of the dependent variable appropriate, valid, and reliable?
  • Are the scoring, rating, and judging procedures valid and reliable?
  • Is the measuring apparatus (if any) accurate and reliable?
  • Are the controls appropriate? Can the results be affected by variables that have not been controlled? Are the controls or control groups (if used) properly selected?
  • Is the research design suitable to meet the objectives of the study? Is the research design appropriate to test the hypotheses and answer the research question?
  • Are the methods and procedures clearly described in sufficient detail to be understood and replicated?
  • Is the presentation sequence of test stimuli (including any randomisations or counter-balancing) appropriate?
  • Are the test participants properly oriented and motivated? What is their understanding of the task? Are the instructions sufficiently clear and precise?
  • Are there any signs of experiment bias in the design, data collection, assessment, analysis, or reporting?
  • Are the participants properly selected? Is the sample representative and unbiased? Do the procedures adhere to the guidelines for the protection and well-being of participants?
  • Is the sample size appropriate? Are the appropriate procedures used to assign participants to groups, treatments, or conditions? Are suitable techniques used to establish group equivalence, such as matching, equating, or randomising?
  • Does participant attrition occur, and if so, does it bias the sample?
  • Are bad data properly identified and set aside (not included in the final test data set), and are instances of bad data reported and explained as such?
  • Have the data been appropriately analysed, sorted, categorised, grouped, prioritised, etc.?
  • Are descriptive statistics used? Are these accurate?
  • Are the inferential statistical tests appropriate? Are the assumptions for their use met? Are there any errors in the calculation or presentation of statistical results?
  • Are all graphs correctly labelled (both the X and Y axes)? Are data elements on graphs properly coded and identified?
  • Are tables and figures clearly labelled and accurately presented and referenced in the text? Are results and findings correctly interpreted, properly reported, given meaning and placed in context?
  • Are recommendations unambiguous? Do recommendations follow clear usability, human factors, or ergonomics guidelines?
  • Are recommendations supported by references to prior literature or to industry standards?
  • Is the discussion section of the report reasonable in view of the data?
  • Are the conclusions valid and justified by the data?
  • Are the generalisations valid?
  • Do references (if used) match the citations in the text?
  • Have ethical standards been adhered to in all phases of the research?
  • What can be done to improve or re-design the study?
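Several of the checkpoints above — random assignment to groups, counter-balancing the presentation sequence — are mechanical enough to sketch in code. The following is a minimal illustration only, not part of Meltzoff's checklist: it shows one way to assign participants to equally sized groups and to build a simple Latin square of presentation orders, so that each condition appears once in each serial position. The function names and participant labels are hypothetical.

```python
import random

def assign_to_groups(participants, n_groups=2, seed=None):
    """Randomly assign participants to groups whose sizes differ by at most one."""
    rng = random.Random(seed)  # seeding makes the assignment reproducible and auditable
    shuffled = list(participants)
    rng.shuffle(shuffled)
    # Deal the shuffled participants out round-robin, one per group in turn
    return [shuffled[i::n_groups] for i in range(n_groups)]

def latin_square_orders(conditions):
    """Simple (cyclic) Latin square of presentation orders: each condition
    appears exactly once in each serial position, controlling for order effects."""
    n = len(conditions)
    return [[conditions[(i + j) % n] for j in range(n)] for i in range(n)]

groups = assign_to_groups(["P1", "P2", "P3", "P4", "P5", "P6"], n_groups=2, seed=42)
orders = latin_square_orders(["A", "B", "C"])
```

Note that a cyclic square like this balances serial position but not first-order carryover effects; a Williams design would be needed for that, which is why a reviewer should check *which* counter-balancing scheme a study actually used.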


Meltzoff, J. (1997). Critical Thinking About Research: Psychology and Related Fields. American Psychological Association.


About the author

Philip Hodgson

Dr. Philip Hodgson (@bpusability on Twitter) holds a B.Sc., M.A., and Ph.D. in Experimental Psychology. He has over twenty years of experience as a researcher, consultant, and trainer in usability, user experience, human factors and experimental psychology. His work has influenced product and system design in the consumer, telecoms, manufacturing, packaging, public safety, web and medical domains for the North American, European, and Asian markets.

