Archivists Round Table Website Moderated Usability Testing


Overview

This was a team project conducting moderated remote usability tests to evaluate the website of The Archivists Round Table of Metropolitan New York, in order to identify users’ search and navigation behaviors, potential issues, and design opportunities. The team collaboratively developed user profiles, a screener, a recruitment strategy, tasks, a script, pre- and post-task questionnaires, and a consent form; conducted the usability tests on UserZoom Go; and offered recommendations to the client to improve the accessibility, discoverability, and findability of content on the website.

Purpose

Moderated Remote Usability Testing of a website

 

My Role

Research

Scripting

Storytelling

Mockup building

Tools

UserZoom Go

Validately App

Figma

 

Client

The Archivists Round Table of Metropolitan New York, Inc.

Project Type

Graduate Usability Theory Course Project

Team

Lu Pei-Yun Chung, Danielle Kingsberg, Yuki Shimano (Me), Yi Zhong

Period

Oct 2020 - Dec 2020


 


The Client

The Archivists Round Table of Metropolitan New York, Inc. is a non-profit organization founded in 1979 that represents a diverse group of more than 400 archivists, librarians, and records managers in the New York metropolitan area. The A.R.T. website serves as an advocacy platform featuring collections, events, forums, archival journals, and mentorship programs for the archivist community, raising awareness of the cultural preservation of materials.

 

The Challenge

The main problems the A.R.T. website had been facing were its difficulty of use for first-time or non-expert (in archives) users and poor content discoverability. The A.R.T. wanted to make its website easy to navigate, informative about the skills of the organization, and organized in its content structure.

 

Goals 


The study was designed around both qualitative data (test observations, notes, and video recordings) and quantitative data (metrics of task difficulty, finding severity, questionnaire responses, etc.) in order to:

  1. Gather visitors’ first impressions of the website

  2. Make sure the website tells visitors who/what the A.R.T. and archivists are

  3. Make sure opportunities to get involved and network are discoverable

  4. Find redundant information

  5. Find out the expectations people have of the website

  6. Make sure visitors can easily navigate the website on both desktop and mobile.

 

My Role

Three other usability experts and I designed the moderated remote usability tests, recruited test participants, conducted the tests, analyzed the collected data, and identified problems and recommendations for the A.R.T. In creating the final report and presentation, I was in charge of describing the findings and suggestions related to the Events, Members, and Mentorship pages, as well as minor content fixes.

 

Overall Process

The team followed a process of meeting with the client, test preparation, data collection, data analysis, and delivering the report. The team made sure that the recommendations in the final deliverables were supported by user research and usability test results.

Project timeline

 

Client Meeting

Screenshot of a Zoom meeting with the A.R.T. staff

The team met with the client in the first week of the project to ask about the website’s existing challenges and goals for the project (see The Challenge and Goals sections).

The client also wished to find out what expectations visitors had of the website and whether information on ways to get involved was discoverable, in order to change the layout and content of some pages, especially the homepage and About page.

 

The team also noted the key tasks and features that the client wanted to test; these shaped the test scenarios described under Usability Test Scenarios and Tasks below.


 

Participant Recruitment

From the client meeting, the main users of the A.R.T. website were revealed to be archivists, A.R.T. members, employees of organizations that have archives who are looking for a community or for advice, non-expert people who want to connect with archive experts, and educators. Based on this information, a target/primary user persona was designed.


Primary User

Based on the target audience description, the research team created user profiles of aspiring archivists looking to connect with local archivists in the community to broaden their network and engage in events to progress their careers.

Screening

As per the user profile, participants for this study were expected to be interested in archiving and archiving events. The team recruited a total of 9 people (8 students from the School of Information at Pratt Institute and 1 alumnus) who expressed interest in archiving content and visiting museums. A recruitment email with a screener was sent to the Pratt Institute School of Information listserv. The screening questions listed below allowed the research team to qualify or disqualify participants:

  1. What degree are you pursuing at Pratt Institute?

  2. How often do you visit museums or libraries? (Including Pre-COVID) 

The first question identified participants in the Library and Information Science (MSLIS), Museums and Digital Culture (MS), or History of Art and Design (MSLIS/MA) degree programs, based on the assumption that students pursuing those degrees are most likely to want a career in archiving.

The second question was asked based on the hypothesis that if a student frequented museums or libraries, they were more likely to pursue career opportunities in archiving or attend events hosted by the A.R.T. 
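
As an illustration only, this qualification logic can be expressed as a simple filter. The degree list comes from screening question 1 above; the visit-frequency cutoff for question 2 is an assumed threshold, since the study’s exact cutoff is not documented.

```python
# Hypothetical sketch of the screener's qualification logic. The
# qualifying degree programs come from screening question 1; the
# visit-frequency cutoff for question 2 is an assumed threshold.
QUALIFYING_DEGREES = {
    "Library and Information Science (MSLIS)",
    "Museums and Digital Culture (MS)",
    "History of Art and Design (MSLIS/MA)",
}

# Ordered answer options (least to most frequent) for question 2.
VISIT_FREQUENCY = ["Never", "A few times a year", "Monthly", "Weekly"]

def qualifies(degree: str, visit_frequency: str) -> bool:
    """Return True if a respondent matches the target user profile."""
    in_target_program = degree in QUALIFYING_DEGREES
    # Assumed cutoff: visits museums or libraries at least a few times a year.
    frequent_visitor = VISIT_FREQUENCY.index(visit_frequency) >= 1
    return in_target_program and frequent_visitor

print(qualifies("Museums and Digital Culture (MS)", "Monthly"))  # True
print(qualifies("Computer Science (MS)", "Weekly"))              # False
```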

Questionnaires

Pre-Test

Before proceeding to the tasks portion of the session, the moderator asked the following questions, similar to the screener:

  1. [If participant did not sign the consent form before the test] Can you review, sign, and date the consent form?

  2. What degree are you currently pursuing?

  3. Which websites do you commonly use for networking?

  4. How frequently do you visit libraries? [can be pre-covid]

Post-Task

The following questions were asked after most of the tasks (tasks #2, 3, and 4) in order to learn how difficult the user found each task:

  1. Is this experience what you expected? 

  2. Do you feel you have completed this task successfully? 

  3. On a scale of 1-5, with 5 being very easy, how would you rate this task?

Post-Test

The following questions were asked after the test in order to learn more about the users’ overall behaviors, expectations, and experiences.

  1. How does this experience compare to other archive websites that you’ve used? (1 = worse, 5 = better)

  2. Which of the following did you encounter as you were using the site? (Choose all that apply) 

    Hard to navigate; Too much scrolling; Not well organized; Too much reading; Text too small; Too much clicking; None of the above

  3. What frustrated you most about this site?

  4. What did you like most about this experience?

  5. Did this activity improve your understanding of what archivists do? If not, what would help you to better understand their role?

  6. Is there anything else that you would like to share or discuss?

Usability Test Scenarios and Tasks

Participants were asked to complete the following tasks, which were designed to cover the goals and key tasks the client wished to address:

  1. Scroll through the homepage without clicking on anything. What are your first impressions?

  2. Use the website to find an A.R.T. event that you would be interested in attending.

  3. You met an A.R.T. member from Princeton University, but forgot to get their contact information. Use the website to find and contact the member.

  4. You’re interested in progressing your career. Use the website to learn about the mentorship programs.

 

Usability Testing/Data Collection

The 9 participants were given the choice to take the moderated usability test via UserZoom Go in a Chrome browser on either their laptop or mobile device; ultimately, 7 tested on laptops and 2 on mobile. The research team conducted a pilot test before starting the series of usability tests to get used to UserZoom Go and resolve any technological issues. Each moderated usability test was conducted by two team members: one acting as the moderator, who asked the participant the questions on the script, and one acting as the note-taker, who recorded the participant’s behavior and comments. Each team member acted as the moderator for at least two tests. The moderated format made it possible to ask participants about the A.R.T. website in depth, and participants were asked to think aloud throughout the test.

Screenshot of a recorded Moderated Remote Usability Test from UserZoom Go

Notes compiled on the Miro Board Chart

The notes were made into cards and compiled into a single chart in Miro, sorted by participant and task. Each card recorded observations about the participant’s behavior and quotes from the tasks and the post-test questionnaire.

Data Analysis

The quotes and observations written on cards on the Miro board were put into a spreadsheet and assigned a usability severity level (0-4, with 4 being most severe) and a frequency (how many participants encountered the problem) to determine which issues required priority. A total of 183 findings were aggregated from the sessions.

34% of these findings contained positive or neutral feedback (severity level 0 or 1), and 35% were identified as issues that should be prioritized (severity level 3 or 4). The team decided to focus on providing recommendations for the severity level 3 and 4 issues.
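
To illustrate this aggregation step, here is a minimal sketch: the 0-4 severity scale and the “prioritize severity 3-4” cutoff follow the study’s method, but the sample findings are invented for demonstration (the real study aggregated 183 findings).

```python
from collections import Counter

# Invented sample findings: (description, severity 0-4,
# frequency = number of participants who encountered it).
findings = [
    ("Liked the use of photographs", 0, 5),
    ("Homepage content too long", 3, 7),
    ("No members directory on Members page", 4, 6),
    ("Font size too small", 3, 4),
    ("Search bar purpose unclear", 2, 3),
]

# Distribution of findings across severity levels.
severity_counts = Counter(severity for _, severity, _ in findings)
total = len(findings)
for level in range(5):
    print(f"Severity {level}: {severity_counts[level]} ({severity_counts[level] / total:.0%})")

# Issues to prioritize: severity 3 or 4, most severe and frequent first.
priority = sorted(
    (f for f in findings if f[1] >= 3),
    key=lambda f: (f[1], f[2]),
    reverse=True,
)
for description, severity, frequency in priority:
    print(f"[severity {severity}, {frequency}/9 participants] {description}")
```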

Average difficulty rating of Tasks 2, 3, and 4

From the collected answers to post-task question #3, it was found that participants had the most difficulty completing Task 3, finding an A.R.T. member, which had an average rating of 2.2 out of 5 (5 being very easy). Four of the nine participants gave it a 1 on the rating scale, with one stating:

“I don't even know how I can find the directory. It's not what I expected when I clicked ‘Members’.”

Task 2, finding an event, was the next most difficult, with an average rating of 3.7 out of 5. Participants’ feedback surfaced concerns about missing information, such as “links to certain websites” or a “price for the session,” particularly in the case of non-member vs. member tickets.

Task 4, finding the mentorship programs, was the least difficult, averaging 3.8 out of 5. Some participants pointed out that they

“Couldn't see who the mentors or mentees are or detailed info”

However, most participants successfully navigated to the right information and became interested in the program.

Pie charts showing the percentage of each difficulty rating for each task
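
The per-task scores reported above are simple means of the nine post-task ratings. In the sketch below, the individual ratings are invented for illustration, chosen only so that the means match the reported averages of 3.7, 2.2, and 3.8.

```python
from statistics import mean

# Illustrative post-task ratings (1-5, 5 = very easy), one per participant.
# The individual values are invented; only the means match the study.
ratings = {
    "Task 2: find an event":         [4, 4, 3, 4, 4, 3, 4, 4, 3],
    "Task 3: find an A.R.T. member": [1, 1, 1, 1, 3, 3, 3, 3, 4],
    "Task 4: find mentorship info":  [4, 4, 4, 3, 4, 4, 4, 4, 3],
}

for task, scores in ratings.items():
    print(f"{task}: average {mean(scores):.1f} out of 5")
```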

Findings and Recommendations

Participants provided positive feedback about the website’s posted engagement opportunities, its use of photographs, and its uncluttered look compared to other library websites. For areas to improve, the team identified the following key findings based on the usability test results:

 

Finding 1: Content appears too long and is hard to navigate on the homepage.

Participants found an overload of unordered content on the homepage, redundant information linked from the homepage to ‘Open Call to [activity]’ pages, and difficulty navigating the homepage and header on mobile devices.


Recommendation 1a: Order and group content by priority

  1. Adopt a grouped card format that segments topics into easily digestible sections by type and priority to improve navigation. Each card would have a small image and a brief summary of a topic, with a hyperlink to more information.

  2. Add a column for time-sensitive or highlighted topics to make better use of the margins.

  3. Make the logo the home button to reduce page redundancy.


Recommendation 1b: Implement mobile friendly navigation

Implement a more responsive, modular format to improve the mobile navigation experience:

  1. Make the A.R.T. logo the ‘homepage’ icon

  2. Add a ‘search’ icon for quick navigation

  3. Add a ‘hamburger’ icon that opens the page tabs in a dropdown menu

Finding 2: Inconsistent purpose of search bar use and results.

The search bar was used by 4 of the 9 participants, and 3 of them gave feedback that the search capability could benefit from standard features such as filters to make it clear what they could search for.


Recommendation 2: Enhance the search bar to set expectations

Although participants requested more advanced filtering, adding a type-ahead feature with suggested drop-down lists would help set expectations and improve the overall usability of the search feature.
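
As a sketch of the suggestion logic only, not the site’s actual implementation, a type-ahead can match the typed prefix against a curated list of site topics; the topic list below is a hypothetical sample of A.R.T. content.

```python
# Minimal sketch of type-ahead suggestion logic. SITE_TOPICS is a
# hypothetical sample of A.R.T. site content, not the real search index.
SITE_TOPICS = [
    "Events calendar",
    "Membership",
    "Mentorship program",
    "Advocacy",
    "Archival journals",
    "About A.R.T.",
]

def suggest(query: str, limit: int = 5) -> list[str]:
    """Return up to `limit` topics with a word starting with the query."""
    q = query.strip().lower()
    if not q:
        return []
    matches = [
        topic for topic in SITE_TOPICS
        if any(word.startswith(q) for word in topic.lower().split())
    ]
    return matches[:limit]

print(suggest("me"))  # ['Membership', 'Mentorship program']
```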

Finding 3: Missing information on ways to get involved.

On the ‘Events’ page, most participants preferred the calendar view over the list view for finding upcoming events, because future events did not show up in the list view and the long list of events written in the same font made it “hard for [their] eyes [and] to know where to go.”

On specific event pages, participants expected to see additional information such as other events happening in the same week, resources, discussions, and the event facilitator’s contact information.

Participants also expected to find a list of members for networking purposes when they clicked on the Members tab, but there was no members directory. The page title also differed from the tab label: the page was called ‘Membership’ instead of ‘Members’, and it gave information on how to apply for membership rather than contacts of individual members.

When searching for mentorship programs, at least five participants noticed that all of the links for the ‘Open Call for A.R.T. Mentorship Program’ were broken because the application deadline had passed. Participants wished the broken links to mentor and mentee information would be updated, or that the error page would say where else to look for the information, because it would still be a good resource.


Recommendation 3a: Prioritize Calendar View and make additional event information accessible

Users should be directed to the Calendar View first to benefit those looking for upcoming events. The List View is good at showcasing past events, so it should be a secondary feature, viewed only when the user wishes to see it. The team suggests putting it at the bottom of the page under a section that expands when the user clicks on it.


Other event details can be added to individual event pages for convenience, such as facilitator information and links to other events happening the same week.


Recommendation 3b: Change the Membership page title and add Members Directory

To improve the experience of users seeking member contacts for networking, the title of the Members landing page should be changed from ‘Membership’ to ‘Members’, and a members directory, visible only to logged-in users to maintain privacy, should be added.


Recommendation 3c: Hyperlink the communications email

The email address provided at the bottom of most pages should be changed to a clickable hyperlink to be more accessible.


Recommendation 3d: Update links and add additional mentorship information

Application links should be updated, or information about the next application date should be provided. The ‘Mentor’ and ‘Mentee’ links should show lists of current mentor and mentee profiles if those are made public; otherwise, the links should be removed.


Finding 4: Minor content fixes

Minor, easy-to-fix issues were found relating to the About page and the font used throughout the site. On the About page, participants expected to find a clear description of what the A.R.T. does, and two participants expected to find information about mentors and members there as well.

Second, while most participants praised the website’s font colors, the font size was “too small” and difficult for some participants to read.


Recommendation 4a: Add a clear explanation of A.R.T. services and other relevant links to the About page

A link to a members directory that is only accessible to members, a link to the Mentorship program page, and a section dedicated to explaining the services offered by the A.R.T. should be added to the About page.


Recommendation 4b: Increase font size and distinguish working links

The font size should be increased. Working links can also be underlined to show that they are clickable, especially headers that lead to another page. Out-of-date links should be removed.

Presentation

The deliverables to The A.R.T. included:

  1. A written report of the moderated remote usability tests, their findings, and the team’s recommendations for improvement.

  2. A slide presentation of the final findings and recommendations.

Client Response

“It was great to learn that the language was hard for non-native English speakers. I even think the font is small and don’t know if links are clickable.”

“I liked the highlighted side bar you guys made to scroll the homepage. It allows people to skim.”

“Hopefully we can work with the current Wild Apricot platform to make these changes.”

 

Conclusion

The moderated remote usability tests accomplished the goals and answered the questions the client wanted to explore.

  1. The team was able to uncover visitors’ first impressions of the website: it made good use of images and looked informative about engagement opportunities, but the homepage content was long.

  2. It was found that participants were able to learn what archivists do from reading about the engagement activities mentioned throughout the website, but they had difficulty figuring out what exactly the A.R.T. organization does.

  3. Some opportunities to get involved and network were hard to discover due to missing links or information.

  4. Redundant information was found on the Open Call pages, which repeated the exact same description provided on the homepage.

  5. The team found out that visitors expected to see more details about events, a members directory on the Members page, lists of mentors and mentees, and a clear description of the A.R.T.’s services.

  6. It was found that navigation was extremely hard on the mobile version of the website due to a hidden navigation menu.

Challenges

The project faced the following limitations:

  • Qualitative data for mobile was less rich: only two participants took the test on a mobile device, limiting the number of findings for the mobile version of the website.

  • Reduced participant engagement and adherence: some participants were so focused on the task that they forgot to think aloud.

  • Difficulty controlling the process due to technological issues: two participants who signed up for mobile testing encountered issues downloading the testing app on their devices and had to switch to testing on their laptops instead.

Next Steps

Richer qualitative data could be gathered if more usability tests were conducted on the mobile version of the website. More participants who are non-experts in archives could also be recruited to collect feedback on whether the A.R.T. website is easy to understand and navigate.

Additional research methods such as card sorting or tree testing could be applied to further validate user navigation. These methods would help determine how users group the different topics the website presents and identify the paths users take to navigate to the right information, leading to more direct paths and fewer clicks to reach a specific place.

Suggestions could also be presented to the client more effectively in the form of a prototype. A mockup of the entire website was not made due to time constraints, but team member Lu Pei-Yun Chung built a high-fidelity wireframe and a prototype of the homepage with simple clickable features.