Toolkit

How to conduct a usability test

Usability tests can help teams develop products that are user-centered, accessible, and inclusive. This guide will help you conduct a successful usability test, from coordinating with participants to analyzing your findings.


Getting recurring feedback from the end users of your product is a critical component of iterative design and development. During a usability test, designers ask participants to think out loud while they complete tasks using a prototype or test site. By engaging directly and frequently with end users, particularly those from less-represented communities, teams can identify and validate assumptions about how users will interact with their product or service, ensure that the product or service is user-centered, and improve inclusivity and accessibility.

This toolkit is the second in a series of two focused on usability tests. The first toolkit covers all of the steps that lead up to conducting a usability test, while this toolkit covers the steps involved in implementation, synthesis, and analysis. The process described here was developed and streamlined through Nava’s work with the Commonwealth of Massachusetts’ Paid Family and Medical Leave program. 

A usability test is: 

A user research method used to identify problems and opportunities in design by observing participants as they attempt to complete defined tasks on a prototype or interface.

This toolkit can help you:

  • Screen and schedule participants from your pool
  • Conduct a usability study
  • Synthesize and analyze data to craft actionable insights and recommendations

Screen and schedule participants

After completing recruitment, you should have a pool of potential participants for your test. Your next step is narrowing down the pool to determine which participants best fit the screening criteria you defined earlier and, together, are representative of your target users.

  1. If you’re using a tool like Ethnio, you’ll be able to screen within the tool itself. If you’re using another approach like Google Forms, Microsoft Forms, or Formstack, you’ll screen participants more manually through a spreadsheet. 

  2. From the participant pool, select a group of 7-10 possible participants to reach out to, with the expectation of conducting your usability test with no more than five. This leaves you with a few additional potential participants to account for non-responses or scheduling challenges. In most cases, a test with no more than five users is sufficient and will help you identify as many usability challenges and opportunities as you might identify with a larger group (see Why You Only Need to Test with 5 Users from the Nielsen Norman Group for more on this).

  3. For a moderated test, after you’ve identified the participants you’d like to test with, send those participants a second email that includes scheduling information as well as an informed consent form. Ethnio has its own scheduling tool, but you can also schedule participants manually or with other scheduling tools like Calendly.

  4. For an unmoderated test, you won’t need to schedule a specific time with each participant as participants will complete the test independently. 
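If you’re screening manually through a spreadsheet export (as in step 1 above), the filtering step can be sketched in a few lines of Python. The column names, criteria, and respondent data below are hypothetical stand-ins, not part of any particular form tool’s export format:

```python
import csv
import io

# Hypothetical respondent data; in practice this would be a CSV
# exported from a tool like Google Forms, Microsoft Forms, or Formstack.
RESPONSES = """name,uses_screen_reader,applied_for_leave
Ana,yes,yes
Ben,no,yes
Cam,no,no
Dee,yes,no
Eli,no,yes
"""

def screen(rows, criteria):
    """Keep only respondents who match every screening criterion."""
    return [r for r in rows if all(r[k] == v for k, v in criteria.items())]

rows = list(csv.DictReader(io.StringIO(RESPONSES)))

# Example criterion: prior experience applying for leave.
eligible = screen(rows, {"applied_for_leave": "yes"})

# Reach out to up to 10 eligible people, expecting to test with ~5.
shortlist = eligible[:10]
print([r["name"] for r in shortlist])  # ['Ana', 'Ben', 'Eli']
```

In practice you would also want the shortlist to be representative of your target users (for example, including assistive technology users), which may mean selecting across criteria rather than simply taking the first matches.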

Conduct the test

  1. Determine which tools to use. For a moderated test, you’ll need to decide on a videoconferencing tool, such as Zoom, Google Meet, WebEx, or Teams, as well as a tool to take notes, such as Google Docs, Google Sheets, or Word. For an unmoderated test, you’ll need to set up the test in an unmoderated testing tool, such as UserZoom, UserTesting, or Maze. You’ll also need to plan for a secure location to store any notes or videos. You may need to work with your organization or agency’s IT, legal, or operations teams to decide on tooling.

  2. Consider your approach to informed consent and protecting privacy. You can use Nava’s toolkit on Plain Language templates for user research as a starting point. 

  3. Determine your protocol for providing incentives, such as gift cards, to research participants. Offering incentives encourages a more diverse, representative group of participants and compensates them for their time. However, if you work for a government agency, you should check with your legal team for rules on compensation. For more on incentives, see the 18F Method card on Incentives.

  4. Conducting usability studies is a skill that requires practice and refinement. If possible, conduct a practice interview or two with a colleague to refine the script and flow and to work out any technical issues. The more you practice, the more you can focus on the content during the interview rather than reading off your script or resolving technical difficulties. Even so, be prepared for surprises, like a participant showing up late, leaving early, or bringing someone else to the interview.

Synthesize and analyze data

The process of synthesizing your data will vary depending on the nature of the test, time available, and end goals. One example process is as follows:

  • If the test is moderated, a notetaker takes notes during the interview on a separate note-taking document. After each test, conduct a short post-interview debrief with the researcher, notetaker, and any observers. 

  • If the test is unmoderated, the researcher will watch each recording while transcribing or taking notes on a separate note-taking document. 

  • After all interviews are completed, schedule a longer collaborative meeting with the research team and any other relevant stakeholders to review the findings. You may find a whiteboarding tool like Mural or Miro useful.

  • In the synthesis meeting, discuss what you observed during the usability studies, what issues participants experienced, and how the team can turn those learnings into actionable product or service improvements that better meet the needs of end users.

Written by


Nikki Brand

Design Manager

Nikki Brand is a Design Manager at Nava. She previously served as a Senior Human Centered Designer at Viamo and has significant field experience across Latin America, Africa, and Asia.

Makaela Stephens

Design Lead

Makaela Stephens is a designer/researcher at Nava. Previously, Makaela worked with California, New York City, and civic non-profits to design and deliver critical services that benefit the public.
