User Tests

User Tests are a practical, data-driven way to find out whether your website is really as good as it could be. If both new and returning users can use it successfully, you know you've got a great user experience.

What do User Tests involve?

First we need to identify your goals, and what you expect users to do. For example, if your goal is for them to hit that big 'BUY' button, we can test the following things:

  • How they get from the homepage to purchasing a product
  • Which journeys they take through the website, and whether they follow the route you expect
  • What confuses or hinders them at different stages of their journey
  • What makes them sure or unsure that they can trust your website before making a purchase
  • Why they're choosing your website instead of a competitor's

Why not just ask people these questions?

Generally, people don't give honest answers. They'll say whatever makes them look more favourable, or their answer will be biased in some way by previous experience. If you ask someone to criticise your website out loud, they probably won't, for fear of being rude. Especially in Britain! We're forever afraid of being impolite or hurting people's feelings. And that doesn't give you accurate data to improve a website with.

We do Focus Groups; those are basically the same as User Tests, right?

Wrong. Put a group of people together, ask them questions and they will always - without fail - be affected by three things:

What the other people in the group think of them

They won't say anything that might make them look bad in front of their peers - even if they've never seen these people before and never will again. It's human nature to want to be perceived as better than you are. An example: ask people which side dish they order with their food, chips or salad, and most will reply 'salad' even if they've hardly ever ordered it. They simply want to boost their status within the group and be seen as healthier than they really are. These small lies do nothing for the study and can skew the results.

The bias of the interviewer

Because a group can give such a wide range of answers, the interviewer has to ask vague, open-ended questions. This avoids simple yes-or-no answers, which split the group and make later answers less helpful. But because the questions are so vague, they don't produce useful data either, and any data they do produce is open to the interviewer's own interpretation.

Group-think

In any group there tend to be one or two people who speak louder than the others, and the rest of the group subconsciously side with them rather than forming their own opinions. If someone talks loudly and confidently, it's natural to assume they know what they're talking about and to follow them. The quieter members of the group have opinions that are just as valid, but they're overshadowed by the louder voices. So again, you end up with biased results.

"Why aren't you clicking there?! It's obvious! - The usual reaction designers have to seeing a user test

The real results come from observations

If your UX specialist can get someone to use the website while vocalising their thoughts, you'll see results that reflect how people genuinely behave. The test participants need to understand that you're not testing them - you're testing the product. That's a key distinction, and if it isn't made clear from the start, it can again produce biased results.

The best results come from observing and recording someone using the website. You'll often find users do things you don't expect, which can be frustrating to watch. It shows that not everyone uses your website the way you do. For your website to be more successful, it's important to account for people who think differently from you.

We're currently taking on new clients!

Let's make your design problems disappear.
