Is a usability evaluation right for your project?
User research establishes the foundation for all of the work we do here at Crux Collaborative. At the outset of a project, we use it to identify opportunities to improve an application's functionality and to gain insight into the behavior of our clients' customers. We also include it as part of our design process to validate that our proposed solution works for the intended end users.
There are two types of research: quantitative and qualitative. There is often some confusion about what each entails and when to use them.
Quantitative research involves collecting statistical data to understand something. This type of research requires a large number of responses and provides insight into how often participants make one choice vs. another. Surveys are most often used to gather quantitative data: everyone who responds answers the same questions, the results are tabulated, and a hypothesis is formed based on the data.
Qualitative research involves gathering the opinions and experiences of people to understand something. This type of research requires fewer responses and provides insight into why participants make one choice vs. another and how they feel about it. Qualitative studies can be conducted using a variety of approaches, including one-on-one interviews, focus groups, diaries or journals, and usability evaluations. Data is collected by observing and interviewing participants to gain a clear understanding of their behavior and motivations.
At Crux Collaborative, we conduct qualitative research. Most often, that takes the form of a usability evaluation. These evaluations are a combination of task completion and interview questions, and are usually conducted with 8 participants. We ask participants to complete key tasks in an interface and ask them for their opinions about what they are seeing. This gives us great insight into what is working well, what can be improved, and why.
How do you know a usability evaluation is the right approach for your project?
What are we evaluating?
A usability test or evaluation is an effective tool to uncover barriers and discover opportunities to improve the user experience of an interface. It is also effective when evaluating the perceived value of an interface. The key is that there is something that a user needs to interact with, such as a website or software, to accomplish a goal.
A web application or online tool is a great candidate for a usability study because the interface supports specific tasks and outcomes. For example, if our client is asking us to evaluate the process to enroll in a health insurance plan, a usability evaluation will show if participants can complete the enrollment and if they understand the options being presented to them in order to make an informed choice.
A usability evaluation is not the best tool to use to determine if participants are more interested in one type of health insurance plan vs. another. In this case, you’d want a larger number of people to provide their opinions to get an accurate understanding of interest.
One of the common questions we get from clients when we start planning for a usability evaluation is: “Will 8 people really give us the information we need? We have a lot of different audiences and users. How can we be sure of a direction if we only talk to 8 people?”
While it is true that your audience may include a number of personas, most of them will visit your site or use your application to accomplish the same primary goals. We always recommend 8 participants, which is the industry standard for this type of research. In a usability evaluation, the same issues tend to surface within the first 4 to 5 sessions, and interviewing more participants rarely uncovers additional insights.
If specific audience segments will complete different tasks or encounter unique content on your site, it may be beneficial to test with more participants. For example, if you wanted to see whether people can choose and enroll in a health insurance plan, you might recruit some participants who are young and healthy, and some who are managing a chronic health condition. Each of these groups will have a different perspective on what is important to them; they may start in the same place and complete similar tasks, but they will be interested in different information and details as they make a choice.
It can also be beneficial to recruit additional participants if you want to see how your application will be used by different age groups. A 22-year-old will probably access details of their bank account differently than someone who is over 65.
The most effective way to understand how many participants you need for an evaluation is to clarify what you want to learn about your interface and determine if a particular audience segment warrants its own test day.
Research is at the core of everything we do at Crux Collaborative. We are very good at understanding and extracting information from people about the quality of an experience, which helps us make recommendations about how to improve experiences and make them easier to use. If you would like to chat with us about your users and how we can help identify the right people to uncover your application's biggest opportunities, contact us. We'd love to learn more.