Accessibility usability testing with blind and low vision users
How might we ensure the platform is usable for teachers and students who rely on assistive technology?
My Role: UX Researcher
Tools: Google Forms, Zoom
The Problem:
National Geographic Learning (NGL), an educational publisher of K-12 digital learning products, needed to identify usability issues within their platform, especially as it related to users who rely on assistive technology such as screen readers, screen magnifiers, refreshable braille devices, and inverted color applications.
As a first step, NGL participated in a platform and content Voluntary Product Accessibility Test (VPAT). While the VPAT addressed accessibility standards and requirement deficiencies within the platform, it did not address any usability issues that blind or low vision users may encounter.
Considering that 15% of the world’s population experiences some form of disability, it was important to ensure the NGL product and platform not only meet accessibility standards, but are also usable for all teachers and students who access the platform with assistive technology.
“If I was in a real classroom, I would be nervous that the other students would have already gotten to the assignment. The teacher may have moved on, and I might be behind.”
— Usability Test Participant 6
The Solution:
An extensive usability test with users who rely on assistive technology:
7 tests with blind and low vision users, across varied devices and forms of assistive technology
Key insights shared across internal teams for transparency and knowledge sharing
Recommendations to development team for improvements
Greater accessibility awareness promoted across National Geographic Learning teams
Research Objectives
Objectives:
Knowing that time was limited, and that this research would be shared across several internal NGL teams, it was important to align on research objectives from the very start of the project. This was NGL's first accessibility usability study, so we needed to be clear on our objectives and what was in and out of scope. Therefore, we focused on high-level, key user tasks for this test.
When using assistive technology, what barriers occur when:
Logging in (teacher and student)
Interacting with content, including: text, videos, tables and assessment items (teacher and student)
Annotating content, including: making a highlight or note (teacher and student)
Navigating between pages in an activity and between activities (teacher and student)
Making a student assignment (teacher)
Completing and submitting a student assignment (student)
Grading an assignment and adding comments (teacher)
Viewing graded assignments and comments (student)
User Test Protocol
Writing the Test Protocol:
Once the research objectives were determined, I wrote the usability test protocol that would guide our participants through various tasks in the platform. The test protocol was an important tool to keep the usability tests on track and to ensure that the tasks users ran through aligned with our overall objectives.
Our original protocol was written to take each participant through all tasks (each participant would act as both a student and teacher) within the newest NGL product, AP Human Geography. It was important to write the script so that it flowed well and made sense sequentially.
I ran through the protocol with a teammate to ensure the wording and sequencing of tasks was intuitive. After a few edits, the protocol was ready for the tests.
Participant Recruitment
Participant Recruitment in Partnership with The Carroll Center:
For this test, we needed to recruit participants who utilized assistive technology. Therefore, we partnered with The Carroll Center for the Blind, a non-profit organization that helps blind and visually impaired people gain the skills they need to be confident, active, independent individuals.
The Carroll Center sent an email to their network with the details of our usability test, including the participant requirements:
At least 18 years old
Visually impaired or blind (we are looking for participants who rely on some form of assistive technology when using a computer, such as a screen magnifier, screen reader or refreshable braille display)
A steady Wi-Fi connection
A desktop computer/laptop (Windows or Mac) or a Chromebook
Zoom access with the ability to share audio and video during the test
We asked participants to express interest via a Google Forms survey included in the outreach email sent by The Carroll Center. Before launching this recruitment initiative, we tested the survey across several screen readers (JAWS, NVDA and VoiceOver) to ensure it was accessible.
We received over a hundred responses to our survey, but narrowed down the final participant list to a group of 7 who represented varied types of devices (Mac, Windows, Chromebook), varied types of assistive technology (VoiceOver, JAWS, ChromeVox, screen magnifier, inverted colors, refreshable braille display and ZoomText) and varied years of experience with their assistive technology (from 1 year up to 20 years).
Conducting the Usability Tests
With the protocol written and participants confirmed, it was time to conduct our usability tests! Each one-hour test was conducted over Zoom.
To limit the technical difficulties participants could encounter when accessing Zoom via their assistive technology, we sent each participant a pre-test email with instructions on how to enable Zoom screen and audio sharing (audio sharing was important so that the recording would pick up their screen reader).
Each participant's usability test session was recorded, with their permission. To ensure thorough note-taking, my teammate joined each test to act as scribe.
Test Results and Insights
After completing the 7 usability tests, I identified key accessibility issues to address in the platform, grouped into three takeaways:
TAKEAWAY 1
Screen reader users require clear audio cues to tell them where they are and where they can go.
TAKEAWAY 2
When assistive technology is enabled, some functionality becomes difficult, or even impossible, to use.
TAKEAWAY 3
Users with assistive technology are forced into very long navigational journeys, with no way to quickly jump to where they want to go.
TAKEAWAY 1: Screen reader users require clear audio cues
Throughout the tests, it became apparent that not all icons and elements were properly labeled to communicate their function when read by a screen reader. Pages were also missing headings.
Screen reader users rely on page headings (H1, H2, H3, etc.) to quickly understand the content on a page. When headings are mislabeled, or missing altogether, navigating a page by heading becomes very difficult.
As a user navigates through the Table of Contents (TOC), the “back” button moves them back to previous levels within the TOC. A sighted user associates “back” with the course table of contents, but participants who could not rely on that visual hierarchy were often confused: they did not understand what, or where, they were going back to.
Additionally, when navigating within the course content, if a user wanted to navigate to a previous activity, they could not distinguish between the “previous” button and “back” button.
Providing more specific labels and roles to these buttons would help screen reader users navigate more efficiently.
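As a rough sketch of this recommendation, distinct accessible names can make controls that look different visually also sound different to a screen reader. The markup below is hypothetical (the label text and structure are assumptions, not the platform's actual code):

```html
<!-- Hypothetical sketch: aria-label gives each control a specific
     accessible name, so "Back" and "Previous" are no longer ambiguous. -->
<button aria-label="Back to unit list">Back</button>
<button aria-label="Previous activity">Previous</button>
<button aria-label="Next activity">Next</button>
```

A screen reader would announce “Back to unit list, button” rather than just “Back, button”, telling the user exactly where activating it will take them.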
Screen reader users need to know when a new page or activity has loaded in the reading pane. Across the tests, we found that some new page loads announced “activity is ready”, but not all. Users also gave feedback that “activity is ready” is sometimes too vague, and that more specific wording would better tell them what has loaded.
This was especially apparent when users interacted with videos. Missing context and audio cues, along with hidden video controls, made interacting with a video very difficult.
“You have to let people know when the new page load is done. You’ve got to manage focus.” -Participant 6
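One common way to address both problems the participant describes, announcing the load and managing focus, is an ARIA live region paired with a focus move. This is a minimal hypothetical sketch, not the platform's implementation; the IDs and wording are assumptions:

```html
<!-- Hypothetical sketch: a polite live region announces exactly what
     loaded, and keyboard focus moves to the new content's heading. -->
<div id="load-status" role="status" aria-live="polite"></div>
<h1 id="activity-title" tabindex="-1">Chapter 2 Review</h1>
<script>
  // After the new activity renders: announce it with specific wording
  // (not just "activity is ready"), then move focus into the content.
  document.getElementById("load-status").textContent =
    "Chapter 2 Review has loaded";
  document.getElementById("activity-title").focus();
</script>
```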
TAKEAWAY 2: Some functionality becomes difficult, or impossible, to use
Screen magnifiers made the content within the reading pane very difficult to read and navigate. The participant using ZoomText was only able to view a few lines of text at a time.
“I’m only seeing a couple lines at a time. There is a lot of header text and that space isn’t that useful to me. It would take a very long time to get through this.” -Participant 3
When inverted colors were enabled on a PC, links should have appeared yellow, but some inaccurately showed up as white. This made it difficult for the user to distinguish links from other types of content and functionality.
When interacting with an assessment (PDF format), there were a lot of usability issues including:
No notification that the activity (PDF) loaded into the frame
Unlabeled page forward & page backward buttons
An unexplained audible “X” alert
“The PDF is going to present a huge accessibility challenge and the experience of filling out PDFs is going to range across operating systems and screen readers. For these types of assessments, it’s better to have some sort of online fillable form.” - Participant 7
TAKEAWAY 3: Forced navigational journeys, with no quick way to navigate
The tab order is the order in which a user moves focus from one control to another by pressing the Tab key. This is one of the most common ways screen reader users move through a page.
Users often got stuck tabbing through the global navigation and table of contents when trying to reach the main course content. The tab order should be rethought completely, and applying headings would also help users navigate via keyboard hotkeys.
“I don’t want to go through every single thing on the page to get to the good stuff.” - Participant 7
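One established pattern for letting users “get to the good stuff” is a skip link as the first focusable element on the page. This is a hypothetical sketch (the class, IDs, and landmark names are assumptions, not the platform's actual markup):

```html
<!-- Hypothetical sketch: a skip link lets keyboard users bypass the
     global navigation and table of contents in a single Tab stop. -->
<body>
  <a class="skip-link" href="#main-content">Skip to main content</a>
  <nav aria-label="Course table of contents">
    <!-- long global navigation and TOC -->
  </nav>
  <main id="main-content" tabindex="-1">
    <h1>Activity content</h1>
  </main>
</body>
```

The `tabindex="-1"` on the `main` landmark ensures that following the link actually moves keyboard focus into the content region, not just the visual scroll position.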
Screen reader users had a lot of difficulty with the calendar selector when setting an assignment due date. There is no option to enter a desired date via the keyboard (MM/DD/YYYY), and when a screen reader user activated the calendar selector, the screen reader read every date aloud until the user told it to stop. This proved a very inefficient way to input a date.
When making an assignment for students, the platform prompts the user to select their students first, but then hides this task below the fold.
A user with a screen magnifier could not see the student selector in the reading pane, even though this is the first action they must complete.
Revising the information architecture on this assignment page would greatly help all users (including sighted individuals) complete this task more efficiently.
Next Steps and Key Learnings
Next Steps:
Following the completion of the tests, I synthesized all of the findings, issues and pain points into a priority list. Jira tickets were created to move these issues into the development queue. The UX design team will be brought back into the process when needed (for example, for any issue that needs design work beyond role and function labels).
I gave a one-hour presentation of all test findings to a large group of internal NGL employees including designers, developers, product managers and business stakeholders. Video clips of major issues were included, which helped many of the meeting attendees understand how screen reader users and other assistive technology users interact with the platform. For some attendees, this was the first time they had ever heard a screen reader in action! This meeting brought broader awareness of accessibility issues to several teams at NGL and has promoted accessibility thinking in their work.
Learnings from Running Usability Tests with Blind and Low Vision Users:
Clear communication with participants is key
Set the expectation of all technology that will be used during the test with clear instructions. We found that many of our participants practiced setting up Zoom and sharing their audio a couple days before the test because they had never done it before. This helped limit any technical difficulties at the start of our tests.
Ensure all email communication and surveys are clear, concise and accessible
Always double check that any and all third party applications or products you ask the participants to use are accessible. We found Google Forms was a very accessible survey option.
Send an email reminder to each participant at least 24 hours ahead of the test
This helped to prepare the participants and set expectations. Overall, we wanted our participants to feel comfortable. We sent an email reminder of the test date and time, NGL portal login URL, Zoom meeting URL and Zoom sharing instructions.
Avoid using the Zoom chat function during the test
We found the Zoom chat function to be very difficult for our participants to use. This feature is difficult for screen reader users to open, close and activate. We avoided sending any URL and logins via Zoom chat and instead included that information in our email communication to the participants ahead of the test.