UX Research Blog

A Step-by-Step Guide to Usability Testing

[fa icon="calendar"] Nov 4, 2015 12:30:00 PM / by Rachel Decker

This article was originally posted on the AppCues Academy.

It’s hard to get something right the first time you try it. The New England Patriots had four grueling decades before they won three Super Bowls in four years. Morgan Freeman didn’t land his first major Hollywood role until he was 52.

And this is especially true with software. It’s rare that a new feature is perfect after its first build. And when things go wrong, they can go very wrong. Launching a bad product experience can mean hordes of upset customers, lost revenue and, of course, a waste of your team’s most precious resource: time.

So to mitigate these risks, we turn to user testing. User testing provides us the customer insights needed to ensure our next release isn’t a total flop. And whenever you are designing a new user onboarding flow, you’ll want to incorporate extensive user testing into the scope of the project. Testing your onboarding experience with real users before you release it will save you tons of time, and help eliminate confusion that makes users bounce.

Don’t have a UX researcher on your team? Not to worry. You don’t have to be a designer, developer or UX researcher to get great customer insights from user testing. But you do need to know how to ask the right people the right questions to come to important design conclusions. Here’s everything you need to know about how to run an effective user test (and how one particular user test helped HubSpot achieve a 400% lift in one of our KPIs).

Setting Up Your Test

First, you need a design to test. It doesn't have to be a working piece of software; it can be sketched on paper, built as clickable mockups, or created in whatever medium you like. There are just two requirements:

  1. The design should be task-based
  2. The task(s) should be useful and relevant to the tester. This often means incorporating the user's own data into the mockup. (Yes, we customize each mockup for each usability test. It's worth it!)

If you’re designing an onboarding flow, you have a goal: get the user to complete (at least) one meaningful task in your product. For Twitter, this may be following people the user admires, while for Duolingo this may mean starting your first Spanish lesson.

When designing an onboarding flow for HubSpot’s 30-day free trial, our goal was to increase the number of trial users who install the HubSpot tracking code on their site. Our hypothesis was that once they had the tracking code installed, they would get more value from the product and thus be more likely to convert into paying customers. Here’s what the freemium dashboard looked like at the time:

[Image: the old free-trial dashboard]

By doing research calls with trial users, we discovered that many of them had no idea what the tracking code was, where to go to install tracking code on their own site, or what the benefit of installing the tracking code would be.

To remedy this, we designed a task in the trial onboarding flow that would teach them why this was important and make it easy to get help with installation. Our goal was to user test this task to learn what was keeping people from installing the HubSpot tracking code, and to identify what we could change to increase conversion between signup and install.

Recruiting the right people

In order to get the right results from your user test, you need to recruit the right users. Google’s Michael Margolis has a great piece on the best way to select participants. For our HubSpot user onboarding designs, we wanted to get feedback from 30 people. To recruit them for the study, we sent emails like this one to unengaged trial users (we identified them using Intercom tracking):

[Image: the recruiting email we sent to unengaged trial users]

There are two important things to note about these emails:

  1. If your company has an inside sales team, anybody who is trialing your software is likely already getting inundated with emails. Make it very clear that this is not a sales call.
  2. Including an incentive (like an Amazon gift card) and being very clear about the time commitment will improve your odds of getting someone to commit.

For any substantial change to your product, like a new feature or a new user onboarding flow, we recommend no fewer than five interviews. Keep testing until you are no longer surprised by the results. You can expect a surprisingly high response rate (about 50%) to these emails, so it's often best to send only 10 at a time until you have your desired number of tests scheduled. Also, remember to follow up after a few days with those who haven’t responded - you’d be surprised how many appointments get scheduled on the second attempt.
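The staggered outreach described above can be sketched in a few lines. This is a hypothetical helper, not a real Intercom integration: `send_email`, `scheduled_count`, and the user list are stand-ins you would wire up to your own tools.

```python
# Sketch of staggered recruiting: email candidates in batches of 10,
# stopping once enough sessions are booked. All names here are
# illustrative stand-ins, not a real email or Intercom API.

def batches(users, size=10):
    """Split the candidate list into batches of `size`."""
    return [users[i:i + size] for i in range(0, len(users), size)]

def recruit(users, needed, send_email, scheduled_count):
    """Email one batch at a time until `needed` sessions are scheduled.

    `send_email(user)` sends one recruiting email; `scheduled_count()`
    returns how many sessions are booked so far.
    """
    for batch in batches(users):
        if scheduled_count() >= needed:
            break  # enough tests booked; stop emailing
        for user in batch:
            send_email(user)
```

With a ~50% response rate, a batch of 10 emails often fills five slots, which is why sending everything at once can leave you over-booked.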

At HubSpot, we schedule these calls virtually and use WebEx to run them. There are other software options that will work, but you should look for one that has screensharing, recording and chat capabilities (as well as the ability to switch who is sharing the screen). If you are doing in-person interviews, download Steve Krug’s in-person planning checklist.

Creating your Interview Script

For each user test you perform, you’ll want to follow the same process as everyone else in the study. This ensures you can compare observations apples to apples and draw unbiased conclusions from your findings. There should be three parts to your script:

  1. Background questions
  2. Task-based walkthrough
  3. Wrap up

1. Background questions

First, you want to ask basic questions to get to know users, put them at ease, and learn some relevant information about their experience with the trial thus far. Here are some of the background questions we used in our HubSpot user test:

  • How did you find out about HubSpot?
  • What do you know about us?
  • How would you explain HubSpot to a friend?
  • Do you have someone dedicated to marketing or with a marketing background? Are you considering hiring someone for this?
  • Are you familiar with inbound marketing? Have you done any learning on this topic?
  • Why did you decide to start a trial?
  • What expectations did you have about what HubSpot is? Did the trial match that?
  • What have you done with HubSpot so far?
  • What value have you found in HubSpot?
  • Why haven’t you come back in to your trial?

Notice that most of these aren’t yes-or-no questions. They are open-ended, designed to get people to open up about their experiences and expectations.

2. Task-based walkthrough

Next, you want to move into the task-based portion of the interview. Before your user takes any action, you need to set the context. In our HubSpot test, we told one user to “imagine you’ve been doing research on the HubSpot website and decide to start a free trial. You know that you’ve been looking for a way to consolidate all of your marketing to one place. Start the trial and walk through what you are doing.”

We knew from the background questions and our recruiting process that this user was looking for one tool to handle all of their marketing, and that’s what we used to make the tasks contextual. We could have framed the task for another user by saying, “Your boss said you need to sign up for HubSpot and decide whether it’s the best marketing solution for your company. Start the trial and explain what you are looking for. How do you decide that this is the best solution?”

During the task-based portion of your interview, ensure that you:

  • Record the session so you can share the observations with your colleagues later on.
  • Encourage the user to think out loud. You want to hear everything that goes through the user’s head when completing the tasks. 
  • Tell the user that you are testing yourself and your designs, NOT them. Put them at ease and let them know they cannot do anything wrong.
  • Get every user to go through the same tasks without your help. This allows you to aggregate your findings and agree on design changes with your team.

Once you’re ready, share the task with your user and ask them to share their screen. At HubSpot, we give users clickable wireframes made with InVision. But this could also be a staging version of your software or even hand drawn mockups.

With each new screen or step in the task, first let the user think out loud and make note of their initial impressions. If you think there’s more to learn, there are some questions you can ask (download our user testing toolkit to check them out).

3. Wrap up

Once you’ve gone through all the tasks, wrap up the interview. Let them know that they’ve reached the end of the tasks, that their feedback was incredibly helpful (this part is really important - don't forget it!), thank them for their time and say goodbye. Shortly after the interview, follow up with a thank you email and promised gift.

Important note: After each interview, take 5 minutes with your team to go over what you just heard. This ensures everyone who was watching the interview is on the same page about what just happened, and it leaves time for discussion and synthesis. Those 5 minutes save hours of debate and video review later. It's one of the most important parts of the process.

Turn your observations into action

Since you had each user go through the same set of tasks, it’s time to aggregate and see how many users actually completed those tasks. This could be something as simple as:

  • 1/5 users understood why they should add the tracking code to their site
  • 2/5 users clicked the button to send the tracking code to their developer
  • 4/5 users clicked the “What can I do with this tracking code data?” button
  • All users completed the entire flow
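A tally like the one above is easy to produce with a small script. A minimal sketch, assuming each session's observations are recorded as pass/fail per task; the task names and data here are illustrative, not the actual results from the HubSpot study.

```python
# Minimal sketch of tallying task completion across test sessions.
# Task names and pass/fail data below are made up for illustration.
from collections import Counter

def completion_rates(sessions):
    """Summarize how many sessions completed each task.

    `sessions` is a list of dicts mapping task name -> True/False.
    Returns a dict of {task: "completed/total"} summary strings.
    """
    total = len(sessions)
    counts = Counter()
    tasks = set()
    for session in sessions:
        for task, done in session.items():
            tasks.add(task)
            if done:
                counts[task] += 1
    return {task: f"{counts[task]}/{total}" for task in sorted(tasks)}

sessions = [
    {"understood tracking code": True,  "emailed developer": False},
    {"understood tracking code": False, "emailed developer": True},
    {"understood tracking code": False, "emailed developer": True},
    {"understood tracking code": False, "emailed developer": False},
    {"understood tracking code": False, "emailed developer": False},
]
print(completion_rates(sessions))
```

Keeping the tasks identical across sessions is what makes this aggregation valid in the first place.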

It’s also helpful to share questions or comments that came through during testing. For example, we noted things like:

  • “Oh I don’t have a web developer, I can’t [install the tracking code]!” - This user wasn’t familiar with the term “developer” and used “webmaster” instead. The user really did have all the resources she needed to be successful but didn't know it. We can use that information to remember to speak the language of our users.
  • “This is cool! This is all the stuff I did on HubSpot.” - Indicated that the contact timeline we showed made sense to this user in the way we intended.

You don’t need a lengthy write up on the findings. If you took 5 minutes after each interview to go over what happened, it should not take much time at all. Just get your team to agree on what the most important takeaways were, document them, and decide on next steps together. Here is how we document our user testing results at HubSpot.

Three themes emerged from these customer interviews:

  • Users didn’t understand the value of the contact timeline until they saw their own data on it. They understood it most when the timeline information related back to the original goal of why they started a trial in the first place.
  • Code is intimidating. We needed to make it as easy as possible for the trial user to send the tracking code to the appropriate person to install via email, and explain the value of the code to motivate users to do so.
  • If people got stuck right away, they didn’t want to spend the time to figure it out and wanted to quit.

From these observations, we made the following changes to our free trial experience:

  1. We always showed each user his or her actual data on the contact timeline. No more dummy data (read more on this concept here.)
  2. We made our copy incredibly clear that the tracking code only has to be installed once. We also made our onboarding flow longer with additional benefits of installing the tracking code.
  3. We asked users to install the embed script right after showing them these benefits, and gave users the ability to email the embed script to another person on the team. The email contained text explaining why this is important so the developer is more likely to install it.

After making these changes, we shipped a variation of the screens below and saw a 400% lift in the number of tracking code installations after one week. All because we solicited feedback on a design before building it out.
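To put the headline number in context: "lift" here means relative improvement over the baseline, so a 400% lift means installs grew to five times the baseline. A quick sketch with made-up numbers (the article doesn't give the actual install counts):

```python
# Illustrative arithmetic only: these are hypothetical numbers,
# not HubSpot's actual install counts.
def lift_pct(before, after):
    """Percentage lift from `before` to `after`."""
    return (after - before) / before * 100

print(lift_pct(20, 100))  # a jump from 20 to 100 installs is a 400% lift
```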

[Images: the three redesigned onboarding screens]

Get out there

Now’s the time to try your hand at user testing your onboarding flow. Start with designs, call trial users, and get feedback! Use that feedback to make something even better. You won’t get it right the first time, and maybe not even the 10th time, but keep trying. We’d love to see what you come up with.

 

Ready to dive in and get your hands dirty? Download our user testing toolkit to access schedules, proven email templates, and an example script to get started.


Topics: Usability Testing

Rachel Decker

Written by Rachel Decker

Rachel is the sole UX Researcher at cybersecurity startup Barkly. Previously she spent 3 years as a Researcher at HubSpot with her UX Sister Molly. She loves making ice cream, riding her bike, and thinking about adopting a cat.