Unmoderated, Remote Usability Testing:
Common Sense Media

 
[Image: Common Sense Media desktop site (CSM-Desktop.jpg)]

Note: The Common Sense Media® logo is used for educational purposes only, as part of the User Experience Design master's degree program at Kent State University. Common Sense Media has no association with me or with Kent State University.
 

Determining the Need for Testing

The purpose of this study is to gather data about how people use review sites in order to improve the overall experience. For this case study, completed for Kent State University's Usability II class, I selected Common Sense Media as my review site.

THE PROBLEM

As a parent, I was introduced to the Common Sense Media website while walking the aisles of Barnes and Noble with my daughter (pre-pandemic). I was unfamiliar with the titles she was browsing in the fantasy fiction section, and an employee saw me reading the book jackets feverishly and pointed me to this website.

Fast-forward to my Usability II class, where we were tasked with selecting a review site. Having used Common Sense Media, if only in a limited capacity, I thought it would be a solid candidate for usability testing: its reviews contribute to the greater good and safety of children.

Common Sense Media’s website states that it “rates movies, TV shows, books, and more so parents can feel good about the entertainment choices they make for their kids.” 

PANDEMIC DIGITAL CONSUMPTION

Technology usage was increasing by the second well before the pandemic catapulted digital consumption. It is important for parents to know what their children are reading, watching, hearing, and playing, whether for school or for pleasure.

According to research conducted in July 2020 by the Pew Research Center, more than nine in ten parents say parents bear a lot of responsibility for protecting children from inappropriate online content.

 
View these charts and more in the Pew Research Center article: Parenting in the Age of Screens (July 28, 2020)

 

Parenting sites and blogs resonate across multiple educational backgrounds as a resource parents use for advice, according to the Pew Research Center. The efficiency of search results on websites such as Common Sense Media affects a broad audience of parents, from those with a high school education or less to those with a college degree and beyond.

 

NAVIGATING THE DIGITAL LANDSCAPE

  • The Common Sense Media website is structured to provide parents with “reviews and advice to help them navigate the digital world with their kids.” It was created to take the guesswork out of digital consumption uncertainties.

  • This usability case study was conducted (remotely, during the COVID-19 pandemic) to identify whether the site was performing as intended or whether improvements were needed, and, if so, where.

 

Initial Questions

  • Can users successfully navigate the website?

  • What problems do users encounter?

  • How can we resolve the problems identified?

 

Usability Testing Goal

Successfully conduct remote, unmoderated usability testing to assess the functionality of Common Sense Media®, in order to determine whether it performs as intended for parents, guardians, and children searching for content.

 
Book review search: age 12; type: books; genre: fantasy; popular with kids.

Where We Began ...

METHODOLOGY OVERVIEW

Diving deeper with user research, we inquired about what users do, why they do it, and how they feel about their experience, so that we could make informed, research-driven design decisions based on their needs, desires, and perceptions while also meeting the business goals of Common Sense Media®.

TESTING TYPE

Unmoderated
Remote Usability Testing

PARTICIPANTS

FIVE
(2 MALE; 3 FEMALE)

AGE RANGE

18-45

TESTING DATES

March 11 & 12, 2020

TESTING PLATFORM

Desktop

TASKS

Planning included generating carefully worded tasks around the top areas of concern. Participants were reminded that there was no right or wrong answer. Measurement of the tasks was both qualitative and quantitative, with results capturing expressed user feedback along with task success rates among users.

  • TASK 1:

    • Next week is your niece’s 11th birthday, and she is an avid poetry reader. 

    • Using this site, where would you find poetry books for an 11 year old?

  • TASK 2:

    • Where would you find ratings of books about friendship, for 8 year olds?

  • TASK 3:

    • Your child brought home a new book from the library, “To All the Boys I’ve Loved Before.”

    • How would you find more information and reviews on this book?


Note: Participant count is limited, based on UX graduate school assignment parameters.
 

ABOUT THE PARTICIPANTS

RECRUITMENT

Five participants were recruited for the testing sessions. Each provided consent prior to the test.

  • There was a mix of experience with review sites among participants ranging from having never used one to using review sites often.

  • The age range for participants was 18-45, with two males and three females participating.

  • Four of the five participants had visited a review site at least once before, and one visited review sites often.

EQUIPMENT

  • Sessions were conducted remotely on each participant’s own computer (with microphone and camera) using Validately.

  • Users were prompted to download the Chrome extension, prior to testing.

  • Users tested on desktops.

  • (Note: 3 out of 5 users prefer to access websites on desktops)

Measurement Overview

Measurement of the tasks was both qualitative and quantitative. The official report contains the expressed user feedback, along with the success rates of the tasks among users.

USER PERSPECTIVE

Step into the users’ shoes and watch through their eyes as they navigate the three tasks set before them.

  • Listen as they verbalize their likes, dislikes and frustrations. 

  • Explore the top video clips for each of the three tasks, along with a few discovery points revealed along the way, in the short video (00:02:47) below.


TASK SUCCESS MEASUREMENT

As a quantitative measure, each task was scored based on whether individual users succeeded or failed in accomplishing it using the current interface.

Task Scoring Key

Task Success Definitions:

  • Located quickly, preferred method = 1

  • Located in a roundabout way = 2

  • Could not locate = 3

The scores were averaged to determine the overall success rate of each task, in order to convey whether a section of the interface needs improvement.
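The averaging described above can be sketched in a few lines of Python. The per-user scores below are hypothetical placeholders (the real scores are in the full report); a score below 3 counts as a success under the scoring key.

```python
# Hypothetical per-user scores for illustration only; real scores are in the full report.
# Scoring key: 1 = located quickly, 2 = located in a roundabout way, 3 = could not locate.
scores = {
    "Task 1": [1, 2, 3, 3, 3],
    "Task 2": [2, 2, 2, 1, 2],
    "Task 3": [1, 1, 3, 2, 1],
}

def task_summary(task_scores):
    """Return the average score and the count of successful users (score below 3)."""
    average = sum(task_scores) / len(task_scores)
    successes = sum(1 for s in task_scores if s < 3)
    return average, successes

for task, s in scores.items():
    average, successes = task_summary(s)
    print(f"{task}: average score {average:.1f}, {successes}/{len(s)} users succeeded")
```

A lower average indicates a smoother path to completion; a task whose average drifts toward 3 flags a section of the interface worth redesigning.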

REVIEWING THE CHARTS

Overall, two of the three tasks were successful, and one was not.

  • In this usability study, users were successful 11 out of 15 times across the three tasks.

  • Users were unsuccessful four times across the three tasks.

Reviewing the time-on-task data reveals the bumpy path some users took to success.

  • Task one was mostly unsuccessful (3 out of 5 users failed), and the two successful users took more time to complete it than those who failed.

  • Task two was successful, although users completed it in a roundabout way.

  • Task three was successful, with the failing users spending the most time on task.

 
 

Task Findings Detail

The following links detail the findings by task in the full report, along with conclusions that can be drawn from the successes and failures.

 

OPPORTUNITIES FOR CHANGE

As shown in both the quantitative and qualitative testing results in the full report, users’ ease, disappointment, disruptions, and positive experiences differed during the usability tests, providing actionable insights that can become opportunities for change.

Future testing sessions will take place to identify if the implemented changes are making a measurable difference. 


Key Findings

  • A filtering bug removes the applied filter and displays unfiltered results; some users did not realize the filter had been removed.

  • Result counts conflict when looking for books on Friendship. 

  • Additional filtering options desired by users in Best Book List section.

  • Users experienced issues with navigational terminology. 

  • Users expressed they liked the age filtering option and the site overall. 

Primary Recommendations

  • Additional filters should be added to the “Book Lists” section.

  • Fix the functionality of existing filters in the “Book Lists” section.

  • Include a filter for reading level and/or Lexile score. 

  • Additional testing is needed to evaluate genre and topic similarities.

  • Content analysis needed to evaluate “Books” secondary navigation terminology.

  • A card-sorting exercise is needed to address the slim navigation under “Books.”

  • Reduce pop-ups on the site. 

  • Further evaluate and conduct additional testing on search icons.

Important Items to Keep in Mind

 

REMOTE, UNMODERATED USABILITY TESTING

  • Data accuracy equates to credibility. Listen, re-listen, watch, and re-watch recordings to pick up on both expressed issues and those that can be observed through users’ task-completion methods.

  • Design with research in mind. Have a good understanding of major tasks, mental models, opportunities and the like.

  • Know the mission. Be sure to gain a clear understanding of the client's mission.

  • Accept feedback. Insight from outside sources can help push us beyond our personal boundaries and create the best experience for our users.

  • Do not lead the user. While we want the answer to be correct, we want the user to find the answer on their own. Tasks are crafted to allow the user to provide answers to the tasks and to collect their unique answers, which can always provide additional insight unknown prior to testing.

  • Plan for the unexpected. Recruit extra participants; you never know what may come up that prohibits someone from participating. Prepare for technology glitches and the like with your equipment and supplies. Prepare to mitigate UI issues relating to testing: before launching, test the test yourself. Example: because this was done here, the instructions included the request to download the Chrome extension.

  • Always provide a clear justification for your case. Show value in your results and how it would relate to the client.

  • Save time and money. Proper research can prevent spending unnecessary dollars, dealing with unsatisfied users, extending timelines, and so forth, provided the issues are caught in time during the UX process. Advocate for additional testing.

  • Iteration is the word. The creative process is never complete.