The Process of Usability Testing

July 23, 2015

In case you missed the first post, this is part two in a two-post series about our work with Visit Indy. The first part covered how user research brought value to the beginning of this project. Now, we’d like to cover how usability testing bookended our website production work with them. (Nerd alert: get ready for some charts and graphs and fun data points!)

After the launch of Visit Indy’s new website, we (and the Visit Indy team) wanted to understand how visitors used some of the site’s new features. While there are various forms of usability testing (remote and in-person, moderated and unmoderated, to name a few), in-person usability testing worked best in this case because it allowed us to observe each tester and ask follow-up questions to clarify their decisions and thought processes.

Begin with goals

We began the process by brainstorming what we wanted to get out of the testing. What were our hunches and assumptions about how people used the site? Were there any we could confirm or disprove with testing? Did analytics show us anything that seemed off? In addition, the Visit Indy team was curious about how people engaged with the site’s content and navigated through search results, and we ultimately landed on centering most of the testing around the site’s new search and filtering functionality.

Create and implement the usability test

Once we determined we wanted to know as much as possible about how people used the search and filtering functionality, we crafted a test that included five tasks asking testers to search the Visit Indy site in various ways. We recruited event-goers (a key site audience) in the Indy area to be our testers and offered an incentive to sit down with us, interact with the site, and complete the tasks. After each task, we asked the tester how easy or difficult they perceived the task to be. At the conclusion of each session, we had them fill out a usability scale based on their experience with the site during the session.


One of my favorite parts of testing is the measuring: gathering both quantitative and qualitative data. It’s really fascinating to observe the different ways people use and think about websites (often completely different from the way I would use a site) and to back those observations up with data. For this test, we chose to measure ease of use, overall usability, and task success.

Ease of Use
Ease of use tells us how easy or difficult people perceive a task to be. Keep in mind that this measurement is based on a person’s perception, which can be very subjective. A tester may have barely been able to complete a task yet still perceived it as fairly easy. That’s why it’s important to pair this measurement with a success rating (discussed below) to see how the two compare.

For Visit Indy, a few tasks received higher marks than others, so we took a closer look to determine why that was the case. Reviewing the qualitative data revealed a few ways people were getting confused or tripped up on the site, which we noted for the Visit Indy team to consider when tweaking the site.

Task Success
For each task, we captured whether the tester was completely successful, partially successful (reaching the goal in a roundabout or slower-than-normal way), or failed altogether. This rating, along with ease of use, can show which tasks are easier or more problematic than others, and to what degree. As with ease of use, we noticed that a few tasks saw a significant amount of partial success or failure, while one task saw mostly complete successes.
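To compare tasks side by side, results like these can be rolled up into per-task numbers. Below is a minimal sketch (not our actual analysis; the task names, ratings, and scoring scheme are illustrative) that uses a common convention of counting complete successes as 1, partial successes as 0.5, and failures as 0, then pairs each task’s success rate with its average perceived ease rating:

```python
from statistics import mean

# Hypothetical data: one entry per tester per task.
# "ease" is the tester's perceived ease rating (1 = very difficult, 5 = very easy).
results = {
    "Find a free event this weekend": [
        {"outcome": "complete", "ease": 5},
        {"outcome": "partial", "ease": 4},
        {"outcome": "fail", "ease": 3},
    ],
    "Filter search results by neighborhood": [
        {"outcome": "complete", "ease": 4},
        {"outcome": "complete", "ease": 5},
        {"outcome": "partial", "ease": 5},
    ],
}

# Common convention: full credit for complete success, half for partial.
CREDIT = {"complete": 1.0, "partial": 0.5, "fail": 0.0}

for task, attempts in results.items():
    success_rate = mean(CREDIT[a["outcome"]] for a in attempts)
    avg_ease = mean(a["ease"] for a in attempts)
    # A high ease rating paired with a low success rate flags tasks where
    # testers felt confident but still struggled.
    print(f"{task}: success {success_rate:.0%}, average ease {avg_ease:.1f}/5")
```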

System Usability Scale
The System Usability Scale was developed in 1986 and is considered a standard usability measure that is technology independent and used to measure a range of systems, from software interfaces to hardware and beyond. It consists of a questionnaire with five positively worded statements and five negatively worded statements that testers rate on an agree/disagree scale. Those responses are then converted into an overall score between 0 and 100 (which, despite the range, is not a percentage). For websites, a score of 68 is considered average.
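For reference, standard SUS scoring works like this: each of the ten statements gets a 1–5 rating, odd-numbered (positive) statements contribute their rating minus one, even-numbered (negative) statements contribute five minus their rating, and the sum is multiplied by 2.5 to land on the 0–100 scale. Here’s a small sketch, with hypothetical responses from a single tester:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 ratings."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, rating in enumerate(responses, start=1):
        if i % 2 == 1:
            total += rating - 1   # positive statement: rating minus 1
        else:
            total += 5 - rating   # negative statement: 5 minus rating
    return total * 2.5            # scale the 0-40 raw total to 0-100

# Hypothetical ratings from one tester (statements 1-10):
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # 85.0, above the 68 average
```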

The Visit Indy site scored higher than average for websites, which we considered to be good news! While it didn’t point to specific changes to make, it did tell us that a lot of great things were happening on the site.

Insights and Revisions

After presenting this data and our observations to the Visit Indy team, we discussed ways to evolve the site. Many of the revisions centered on specific areas of the site where visitors needed more clarity on the content displayed (are these events this month’s or next?) and finding ways to make the search results more relevant and clear.

After this first round of testing and site updates, we conducted another round of testing to further iterate on the site. It’s been really fun to see the site continue to evolve over time. If you’re ever looking for something fun to do in Indy (or have out-of-town friends who need to book a hotel), be sure to check out Visit Indy for everything that’s happening in this awesome city!
