Avoid the Top 3 Usability Testing Pitfalls

Posted by Robyn Benensohn



Usability testing is a powerful tool, but it’s not a versatile one.

Your web site exit surveys, as a source of data, can uncover commonly failed tasks, general satisfaction with the web site, and demographic information about your users. Your web analytics tool can reveal effective marketing channels, help you find the top pages to fix, and help you spot ideal candidates for split testing.

Usability tests, as a source of data, can do none of that.

Instead, usability tests can do one thing, and do it really well: Uncover where people will fail when using your site.

Seeing where people fail and understanding the nature of that failure can help you fix the user experience … but you have to be pretty methodical about how you approach your tests.

You have to be careful that you don’t fall into one or more of these traps.

1. Using Usability Tests to do Market or Usability Research

For companies with tight budgets, it can be tempting to try to kill two birds with one stone. So when they conduct usability tests and ask users to perform tasks, they also ask questions like these:

  • Is this feature valuable or intuitive?
  • Would you pay for something like this?

Other companies go right ahead and try to use usability tests to check if they are building the right thing.

The problem with this is that usability tests are great at finding out where users stumble when using something … and not great at market or broader user research.

It’s just not built for that.

You’ll get “good” signal or data about where users succeed or fail. Almost everything else - their thoughts about your pricing, business model, etc. - is just the noise that comes with the signal.

You can use other usability research techniques like interviews or market research to find out if you’re building the right thing or whether people will pay for a feature you’re developing. But you can’t burden usability tests with that task. The scope creep will make the tool fail miserably, even if it’s a great tool.

2. Looking for Design Suggestions

When you conduct your usability tests, you’ll often encounter users who will suggest potential “fixes” to your web site’s issues. Some users will say they wish your site used more blue. Others might say they want a popup somewhere to explain something.

You have to be very disciplined in the way you approach comments like those.

On the one hand, you want to encourage the user to say whatever is on his or her mind - your users need to talk aloud if you want to get the most out of your usability test. On the other hand, the usability test is about uncovering the actual issues on the web site - it’s your job to determine the fix.

Most users are not going to have the design, analytics, online marketing and user experience background to come up with viable fixes - don’t make them do your job.

Use their comments to find the critical issues they encounter, and get to the nature of those issues.

When it’s time to come up with the fix, you should meet internally about the best way to solve issues, rather than rely on usability test comments about fixes.

3. Making Usability Tests too Cumbersome

Usability testing often doesn’t get done because it becomes this … THING. You know, a THING.

It was supposed to be just this tiny little project that happens once every month or every two months, but instead has become this multi-committee, time-sucking, wait-three-weeks-for-a-big-report-from-an-agency line item on your expense report.

It doesn’t have to be that way - you just need to treat it as a process and avoid overcomplicating things:

  • Don’t get impractical about selecting users.
  • Don’t get hung up on finding exactly the right people. Make sure they can use a browser, that they are from your target region (not state!), and that they are native speakers of the language. Everything else is just gravy. Even people who are outside your core market will be able to spot crippling usability issues on your site.
  • Don’t get too many users.
  • You need about 4 users per round, for about 3 rounds of tests. Any more will just add to your time and expenses, and make testing tougher to get buy-in for.
  • Don’t present it as something you’ll need statistical significance for.
    • Split tests require statistical significance to be valid. Usability tests do not. They are there to find critical issues quickly - don’t oversell the science internally.
  • Don’t go for the fancy report.
  • Because some companies pay for usability tests via agencies, it can be tempting to demand the posh report that takes just shy of a month to generate. You don’t need that fancy report.
  • You need to find the top 5 or so critical issues that keep people from doing what they need to do on your site - that takes less than a day to put together after watching the usability tests. You need some bullet points in an email or Word document, and no more. Creating the expensive-looking report keeps you from actually fixing the issues sooner.

Putting It All Together

Usability tests are more like scissors than Swiss Army knives.

They do one thing, they do it really well, and if you use them for their intended purpose, they will always work.

For that to happen, though, you have to know when to use the tool, and when to use something else. They are not market research tools. They don’t serve the same purpose as user acceptance tests or split tests.

If you …

  • avoid scope creep,
  • keep the tests about the issues rather than the fixes, and
  • schedule usability tests as regular small projects rather than large tasks with fancy reports

... usability tests will help you drastically improve the user experience on your web site.

