I recently worked on a project where we conducted ten 90-minute moderated remote usability tests, and I want to share my biggest takeaways. We ran five tasks and asked participants to follow both a concurrent think-aloud (CTA) protocol (asking them to speak while they complete their tasks) and a retrospective think-aloud (RTA) protocol (asking them questions at the end of each task).
Usability tests are a great way to uncover what your users actually think and feel when using your site. Even if you think that your design is easy and awesome to use, the only way to find out is by putting it to the test with real users.
1. Always run a pilot test
Can’t stress this enough – pilot tests are truly eye-opening. I spent a lot of time with my team preparing our usability test – writing the script, creating the assets to show the participants, creating a notes template, you name it. I thought we were good to start testing, but boy was I mistaken! We ran a pilot with a test participant before starting our official testing, and it revealed odd task flow transitions, confusing follow-up questions, technological challenges, and unexpected comments.
Don’t forget to plan a few days in between the pilot run and your first actual test in order to have enough time to make all the revisions you need.
2. Remain neutral and friendly
It’s easy to build up an opinion about the design you are testing, but in order to collect truly valid results, it’s important to keep those opinions out of sight of the participant. Biases can creep into your tone, your reactions, and your actual questions, so make sure to keep these in check while running your test.
A tricky time is when a participant gets stuck on a task and asks you for help. One tip is to ask the participant to imagine that you aren’t there, and then ask them what they would do in real life.
Participants can also be quite vocal about their thoughts on the design at hand. We had a couple of people say things like “This is really confusing, I would add a button there and have voice control there,” and we had to kindly remind them that we were not looking for design advice, but rather insights into how they would go about completing the tasks.
One relatively easy way to help eliminate bias in your actual questions and create a friendly tone is to write a script and validate it with your team. This leads me to my next point…
3. Prepare a script with an introduction, task flow, and follow-up questions
This is essential, not only so that your team won’t be at a complete loss if you, the moderator, suddenly get sick, but also because it gives each participant a consistent test experience and therefore makes your data more comparable.
The script we used had the following format:
- Introduction and Purpose of Test
- Pre-Test Questionnaire
- Tasks 1–5, each consisting of:
  - Participant Scenario and Objective
  - Pre-Task Questions*
  - Task Questions*
  - Post-Task Questions*
- Post-Test Questions
- Thank You and Closing
*Some questions may be unscripted based on participant answers.
Some quick tips for the script: Introduce yourself and anyone else on your team who is sitting in on the test to help the participant feel more welcome and comfortable. Make sure to stress that you are testing the design, not the participant, so there are no right or wrong actions. Also give the participant a chance to ask questions and to take a break, should they need one.
4. Include technical requirement details in your invitation e-mail
Technology is a crucial part of conducting a remote usability test. Odds are that you will be using some sort of screen share or screen tracker that only works in certain browsers. Maybe it needs a plugin, maybe it only works on Macs, maybe it only works when you click your heels three times. The point is, in order to save time during the actual test, make sure you give all the requirement details to your participant beforehand to help them prepare for the test.
Our invitation included the confirmed time slot our participant signed up for, the meeting share link, the password, the browser and version they needed to use, the phone number to call in to the meeting, and an ask to test out their Java plug-in before the meeting started (just to name a few). Don’t forget to ask them to sit in a quiet location with a good internet connection.
5. Build in enough time in the test session for resolving technology issues
I don’t think even one of the ten tests we ran had zero technical issues. Sometimes the participant couldn’t access the meeting, sometimes the screen share suddenly closed and the participant couldn’t see the prototype, sometimes there was a delay between our script and showing the prototype, and sometimes the browser just crashed. It’s technology – expect the unexpected. Nobody likes it when meetings take longer than they should, so do yourself a favor by planning extra buffer time for these types of issues so you don’t need to run over the scheduled time.
Check out http://www.usability.gov/how-to-and-tools/methods/remote-testing.html for more tips on remote usability testing.