Usability Tests

About Usability Testing

Usability testing is a method for observing people using a resource (software, websites, dashboards, physical objects) in order to measure the effectiveness, efficiency, and satisfaction of the experience. Before testing, a site or prototype is assessed for areas of interest; i.e., what the team or client wants to learn more about. The team then creates a task script to be used with each participant so that sessions are consistent. Tasks can range from very general, such as, “Take a couple of minutes to look around this site and tell us what you think about it while you look,” to more specific, such as, “Please try to find information about how to simplify an expression with negative exponents.” Usability tests can include talk-aloud protocols, in which participants are encouraged to express their thinking about the experience while using a site. Other times, usability tests are administered with far less communication between the tester and the participant: participants are asked to complete tasks on a site, and the facilitator is there only to address technical difficulties. This latter approach is useful for quantitative measures of effectiveness and efficiency, such as time on task, accuracy, number of requests for assistance, and number of errors.
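
To make those quantitative measures concrete, here is a minimal sketch (in Python) of how effectiveness and efficiency might be summarized from session logs; the record layout, field names, and numbers are hypothetical, not data from our tests:

    # Summarize effectiveness (completion) and efficiency (time on task,
    # assists, errors) across sessions; field names and values are invented.
    from statistics import mean

    sessions = [
        {"participant": "P1", "seconds": 95,  "completed": True,  "assists": 0, "errors": 1},
        {"participant": "P2", "seconds": 140, "completed": False, "assists": 2, "errors": 3},
        {"participant": "P3", "seconds": 110, "completed": True,  "assists": 1, "errors": 0},
    ]

    completion_rate = mean(1 if s["completed"] else 0 for s in sessions)  # effectiveness
    avg_time = mean(s["seconds"] for s in sessions)                       # efficiency
    avg_assists = mean(s["assists"] for s in sessions)
    avg_errors = mean(s["errors"] for s in sessions)

    print(f"completion: {completion_rate:.0%}, time on task: {avg_time:.0f}s, "
          f"assists: {avg_assists:.1f}, errors: {avg_errors:.1f}")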

Usability Testing Process for the Universal Design of College Algebra Project

Our goal was to identify freely available basic algebra learning resources, evaluate them for redesign opportunities, create the redesigns, and then run comparison usability tests of the redesigns against the originals. We looked for six sites to test that were freely available, had low-level technical requirements, needed minimal technical support, and were applicable to the Algebra 1 curriculum. We created a learning resource evaluation rubric [PDF] for comparing a variety of websites, and to find popular math support sites we consulted the math instructors on our advisory board and resource-sharing websites such as Merlot.

We used multiple rounds of testing, first to probe for general impressions and then to observe use of the resources through directed tasks. In the second, third, and fourth rounds, each participant was asked to imagine being a student struggling with math and to use a website to complete two assignments (one on factoring and one on exponents). In the second and third rounds we used talk-aloud protocols, and we used the results of the first two rounds to create design guidelines for our own learning resources. In the third round, we tested our redesign alongside two of the original sites. In the fourth round, we dropped the talk-aloud protocols in order to get cleaner measures of effectiveness and efficiency. In every round, we administered a satisfaction survey after each site.
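
As an illustration of the kind of comparison the fourth round supports, the sketch below contrasts time on task for an original site and a redesign; all numbers are invented for the example, not results from our study:

    # Compare hypothetical time-on-task measurements (seconds) for the same
    # assignment on an original site and on a redesigned version of it.
    from statistics import mean, stdev

    original = [182, 240, 205, 310, 198]
    redesign = [150, 171, 160, 220, 149]

    print(f"original: mean {mean(original):.0f}s (sd {stdev(original):.0f}s)")
    print(f"redesign: mean {mean(redesign):.0f}s (sd {stdev(redesign):.0f}s)")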

Usability Test Task Scripts

  1. Task Script Round 1: Focus questions [PDF]
  2. Task Script Round 2: Tests of six websites [PDF]
  3. Task Script Round 3: Tests of exponents & factoring [PDF]
  4. Task Script Round 4: Tests of exponents & factoring (no talk-aloud protocols) [PDF]

Common Issues Found

In the first two rounds of testing (pre-redesign), 13 of the 24 scheduled usability tests were completed with students; most consisted of qualitative evaluations of accessibility and usability rather than quantitative assessments. An initial analysis of student feedback yielded 174 unique comments grouped into 37 general themes. Because of the number of learning resources (LRs) reviewed, and because the primary goal of the evaluations was to develop guidelines for designing usable and accessible LRs, comments were aggregated across LRs to identify barriers and develop design guidelines that addressed student usability concerns. Below are the themes mentioned by at least two different participants (a sketch of this tallying step follows the list):

  • Details or explanations are missing from site
  • Helpful examples are provided (with video, sequenced steps, presentations)
  • Navigation is confusing
  • Evidence of the site’s validity needs to be provided for the student to trust the information
  • Advertisements are distracting and prevent student from focusing on task
  • Clear layout and use of graphic/text cues make it easy to follow the lesson
  • Color coding provides unique cues to facilitate understanding
  • User workflow should be consistent with presentation and resources
  • Clarity of links is important to indicate what can and should be clicked
  • Lack of feedback prevents student from knowing whether a problem is being done correctly
  • Text density is overwhelming and confusing
  • Appeal of attractive colors and animations engages students
  • Clarity of information makes it useful as a study aid
  • Site error causes browser to crash or information to not display properly
  • Supportive information such as summaries and rules helps communicate important information
  • Unclear instructions prevent student from knowing what to do next
  • Video & animation can be helpful for learning when student has control over information delivery speed
  • Vocabulary is confusing and needs clarification
  • Distractions from moving images and poor formatting make it difficult to focus
  • Positive features such as print-ready materials and clickable examples make it easier to learn
  • Readability needs to be supported through effective use of whitespace, chunking, and clear wording
  • Voice tone needs to be consistent and written for a student instead of an instructor
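
As a rough sketch of the aggregation step described above, comments can be tallied by theme and filtered to those raised by at least two different participants; the participant IDs and comment strings here are invented:

    # Group comments by theme, track which participants mentioned each theme,
    # and keep only themes mentioned by at least two different participants.
    from collections import defaultdict

    comments = [
        ("P1", "Navigation is confusing"),
        ("P2", "Navigation is confusing"),
        ("P2", "Text density is overwhelming"),
        ("P3", "Text density is overwhelming"),
        ("P3", "Advertisements are distracting"),
    ]

    participants_by_theme = defaultdict(set)
    for participant, theme in comments:
        participants_by_theme[theme].add(participant)

    recurring = {t: ps for t, ps in participants_by_theme.items() if len(ps) >= 2}
    for theme, people in sorted(recurring.items()):
        print(f"{theme}: mentioned by {len(people)} participants")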

Usability Test Example Video Clips

The videos linked below are clips from the usability tests. Each clip includes video of the participant’s face and of their actions on the screen (a yellow circle highlights the cursor’s movements and a red triangle highlights clicks), along with audio of the interaction between the participant and the facilitator. (The videos will open in a new window. Transcripts will be available soon.)

Pop-ups Block Text

In this usability session clip, the participant uses his mouse cursor to follow along with the words while reading. Each time his mouse hovers over the left margin of the text, an advertisement pops open, blocking the view of the text.
View this usability session clip (link opens new window)
Text transcript of video

Provide Feedback to Users

The website in this clip offers a practice exercise. The user spends four minutes working on the matching exercise but gets no feedback about the accuracy of her answers.
View this usability session clip (link opens new window)
Text transcript of video

Provide Procedure Steps

In this usability session clip, the user comments on the website’s use of dense text and the lack of explanation given for the steps in the site’s animated example of finding prime factors.
View this usability session clip (link opens new window)
Text transcript of video

Lessons Learned

Students who participated in the usability evaluations and design activities appeared to genuinely enjoy the experience and were engaged in the process. Below are some of the techniques we used to get the most useful data within our brief timeline:

  • Double-book test sessions. We had a 50% no-show rate, possibly due to the low incentive amount ($25), the topic, or complex academic schedules. We therefore double-booked time slots; if both students showed up, one was engaged in design activities instead. An alternative would be to run the two sessions separately with different facilitators, who could afterward watch video of the session they missed. (See the sketch after this list for the arithmetic behind double-booking.)
  • Review written responses with the participant to ensure your interpretation is accurate. Handwriting can be difficult to read, and participants often offer useful, unexpected feedback as they review their responses.
  • Debrief at the end of each usability testing day. Conversation almost always brings out observations that were not captured in notes during the session and that will be lost if not discussed soon. Ideally, at least two people observe each session (one can be remote) while it takes place or later that day. Together, the observers identify problems and successes the user had, problems with the usability test protocol or its administration, and opportunities for further investigation.
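
The double-booking arithmetic in the first item is worth spelling out: if each participant shows up independently with probability 0.5, a single booking fills a slot only half the time, while booking two raises the chance of at least one arrival to 75%. A small sketch:

    # Probability that a time slot is usable, given an independent
    # per-participant show probability p (our observed rate was about 0.5).
    p = 0.5
    single_filled = p                  # one booking: slot filled if that person shows
    double_filled = 1 - (1 - p) ** 2   # two bookings: at least one shows
    both_show = p ** 2                 # both show: one joins design activities
    print(f"single: {single_filled:.0%}, double-booked: {double_filled:.0%}, "
          f"both show: {both_show:.0%}")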

Recommended practices in usability testing can be found in the resources and references listed below.

Usability Testing Resources

Usability.gov
“A guide for creating usable and useful websites” created by the U.S. Department of Health and Human Services. Includes usability and accessibility basics, design guidelines, and example documents, such as a sample statement of work for hiring a usability consultant.

UsabilityNet
“A project funded by the European Union to promote usability and user-centered design.” Offers resources for usability professionals and for managers learning about usability testing and planning a project.

Usability Professionals’ Association
Offers usability basics, a community for usability professionals, a consultants directory, and hiring tips.

References

Caldwell, B., Cooper, M., Guarino Reid, L., & Vanderheiden, G. (2008). Web content accessibility guidelines 2.0. Retrieved May 2, 2008, from http://www.w3.org/TR/WCAG20/

Dumas, J. S., & Redish, J. A. (1999). A practical guide to usability testing. UK: Intellect Books.

Koyani, S. J., Bailey, R. W., & Nall, J. R. (2006). Research-based web design and usability guidelines. Retrieved October 24, 2007, from http://usability.gov/pdfs/guidelines_book.pdf

Rubin, J. (1994). Handbook of usability testing. New York, NY: John Wiley & Sons.

Shneiderman, B. (2004). Designing the user interface. MA: Addison-Wesley.

Travis, D. (2003). E-commerce usability. NY: Taylor & Francis.