Usability Tests

Nick Disabato

Usability testing is a time-tested research tool that involves recruiting a small group of users, usually non-paying customers, to complete a series of predetermined tasks on your site or product. During the process, participants think out loud, explaining their actions and thought process, while you record their behavior and identify any issues that arise.

The Usability Testing Process

The main book on usability testing, "Don't Make Me Think" by Steve Krug, was first published over 20 years ago and has been updated several times since.

Transcript

Usability testing is one of the more powerful research tools in your arsenal. It essentially involves recruiting a handful of users, generally non-paying or not-yet-paying customers, to complete a series of predetermined tasks on your site or product. They explain their inner monologue out loud, you record where they're looking and what they're trying to do, and you try to identify any issues that come up as a result. It's a very old-school, classical way of vetting a product's usability. It's been going on since, effectively, the dawn of personal computing over 40 years ago. And you can use it too.

It's essentially evergreen. The main book around usability testing is called Don't Make Me Think. It's by a man called Steve Krug. It came out over 20 years ago. It's been updated a handful of times since.

It is tremendously powerful. Essentially, the process involves going to a platform or tool that helps manage the recruitment side of things, because that's the hardest part. You give it a series of questions and a series of criteria for the customers who will actually go through these tasks. Then you wait, and you get back a series of videos. I then go transcribe those videos.

I'll use an AI tool like Whisper, which is open source, free, and really easy to use. Then I'll go through and try to isolate any issues. If somebody is furrowing their brow at something, they might be confused. If someone enthusiastically clicks on precisely the wrong thing, that's very interesting to keep in mind. You sit there with a pen and paper as you go through these transcriptions and recordings, and you identify any issues that might have arisen. At this point you're really just trying to report out; you're observing before you jump to conclusions about how to address a specific problem. The deliverable is what those problems are, right?
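
To make that transcription step concrete, here's a minimal sketch using the open-source openai-whisper package; the file names are placeholders for whatever session recordings you've downloaded.

```python
# Minimal sketch: transcribe downloaded usability-test recordings with the
# open-source openai-whisper package (pip install openai-whisper; needs ffmpeg).
# The file names below are placeholders for your own session videos.
import whisper

model = whisper.load_model("base")  # small and fast; "medium" trades speed for accuracy

for path in ["session-01.mp4", "session-02.mp4"]:
    result = model.transcribe(path)
    with open(path + ".txt", "w") as f:
        f.write(result["text"])     # plain-text transcript you can annotate by hand
```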

And that's really what you're trying to do when you're putting together a usability test. So let's dive into a few of the things I like to do when I'm putting one together. First, for the setup, you need a series of criteria. Who is typically likely to use your product? What situation are they in?

Are they in a B2B context? Are they in a B2C context? Are there specific demographic constraints you want to think about? Are there specific lifestyle traits you want to consider as somebody's going through? Write those down.
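
If it helps, here's a hypothetical sketch of what writing those criteria down might look like as a simple screener; the field names and values are made up for illustration, not a required format.

```python
# Hypothetical screener: the recruiting criteria written down explicitly so every
# candidate participant is checked against the same constraints.
screener = {
    "context": "B2C",                          # B2B or B2C
    "age_range": (25, 45),                     # demographic constraints
    "lifestyle": ["shops online weekly"],      # lifestyle traits to look for
    "exclude": ["existing paying customer"],   # recruit non-paying users
}

def qualifies(candidate: dict) -> bool:
    """Check a candidate's screener answers against the criteria before recruiting."""
    low, high = screener["age_range"]
    return (
        candidate.get("context") == screener["context"]
        and low <= candidate.get("age", 0) <= high
        and all(t in candidate.get("traits", []) for t in screener["lifestyle"])
        and not any(f in candidate.get("traits", []) for f in screener["exclude"])
    )
```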

Make sure that you have them really clearly defined so that you're recruiting the right kinds of people, the ones most likely to use your product or service. Then you write down a series of tasks for people to complete. If you are new to usability testing, know that the task list can be pretty vast. You can really have somebody do anything. I like having them provide first impressions about the site.

So you get a clear sense of what the branding is, what the overall vibe is, whether it looks professional, whether it looks trustworthy. Those are good things to get a sense of as people go through. Then you ask them to offer suggestions for improvements. Usually a lot of people in usability testing will just say, everything looks great, no notes. Well, you want notes, right?

So really what you want to do is get them thinking creatively about how to actually start addressing things. You say, identify three things that you would fix on this page. That gets them thinking: all right, well, there are some things that might need to be fixed on this page, which is really useful for you. If they still come up with nothing, you might need to throw out that result. So what you want to do is run this with five to seven participants.

That way, you get three to five viable responses from them, which can be really useful. As far as the tasks are concerned, I then move into things like: try to search for a given product that fulfills these specific criteria. Check out with that product, or, if it's a software business, take a look at the pricing page and sign up for a plan that matches your specific business's needs. Describe what that business might look like.

Get imaginative and creative with it, and have people complete it up to the point where they would be entering personal information or a credit card. I love having people go all the way through checkout, so if you have the ability to give them a dummy credit card or something like that, that can be really useful. Then you go through and try to synthesize all of this information. You get back these videos; download them locally so you have them to share out.

Then I transcribe them, and I write down as many interesting things as I've observed, especially anything that happened with multiple participants. Then I write up a big PDF that outlines my findings, attach all of the recordings and transcripts, and send it along to the rest of the team. I'll also create Trello cards, or whatever project cards you use, that synthesize decisions into something more actionable. So if something is broken, we need to fix it.

If a particular headline is not landing with people, we might need to research it further and understand what actually resonates with them. If they think that the site does not feel credible, that might call for a redesign or a deeper rebranding, something like that. Either way, you're going through and trying to isolate things that might not be working for you and the product. I do this roughly every three months. It's a relatively involved technique, but it yields so much deep insight.
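
As a rough illustration of that synthesis pass, here's a hypothetical sketch that tallies how many participants hit each observed issue, so findings that recur across sessions rise to the top of the report; the issue labels are invented.

```python
# Hypothetical synthesis pass: count how many participants hit each issue,
# so problems observed across multiple sessions lead the findings report.
from collections import Counter

observations = {
    "P1": ["confused by pricing tiers", "missed the search field"],
    "P2": ["confused by pricing tiers", "clicked the wrong call to action"],
    "P3": ["clicked the wrong call to action", "site felt untrustworthy"],
}

issue_counts = Counter(issue for issues in observations.values() for issue in issues)

for issue, count in issue_counts.most_common():
    print(f"{count}/{len(observations)} participants: {issue}")
```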

Next up, we'll be talking about the most powerful and profitable research method in your arsenal, which is interviews.