By using their imagination and listening to audio description from assistive devices, people who are blind or visually impaired can enjoy movies. Having access to reliable audio description is essential to a positive movie theater experience. However, movie theaters often don’t provide adequate services, and the process of getting and using the assistive devices is often frustrating.
How might we improve accessibility for the blind and visually impaired at movie theaters?
AudioView is an app that enhances the entertainment experience for everyone. The app provides audio description tracks for movies in theaters. With audio description available directly on their phones, users no longer need to rely on inadequate theater services and clunky devices.
Accessible design
A critical part of this project was understanding a user group I had never considered before. To empathize with the users and experience movies from their perspective, I visited two movie theaters and used the audio description devices. I also joined several Facebook groups specific to people who are blind and visually impaired, where members shared their first-hand experiences of going to the movies.
Due to limited access to my target audience, I also conducted secondary research via Reddit forums and online articles to gain a better understanding of who they are, their frustrations, and how they use smartphones to tackle daily challenges.
The audio description is transmitted to a receiver box and the user listens to it through headphones provided by the movie theater. The current method relies on proximity to the data transmitter and having a strong signal in order to successfully listen to the audio track.
After synthesizing the research into an affinity map, I uncovered consistent pain points among the users. I grouped them into three big themes and distilled insights that summarized the main problems with the movie experience.
Based on my learnings from the research, I created two personas as my target users. My first persona is partially visually impaired and my second persona is blind; I chose to put them into two groups because they have very different needs and considerations.
The current user journey in getting and using the audio description device is a long, tedious process. I listed out all of the current steps that the user takes so that I could find ways to reduce the steps and improve the experience.
After understanding the problem space and defining the users’ pain points, I jumped into ideating solutions to answer the three “how might we” questions. My goal was to reduce friction, ensure reliable access to audio description tracks, and lessen the steps a user needs to take.
The problem centered on physical devices that didn’t work properly and staff who weren’t trained to provide proper service. I came up with AudioView, an app that lets users access audio description tracks directly on their phones. This eliminated the need to rely on physical devices that could break and reduced users’ dependence on staff who didn’t understand their needs.
Because the purpose of AudioView is specifically to give access to audio description tracks, it was important to keep the user flow very simple and easy. Considering that the users have difficulties with vision, I had to avoid making the app more complex than it needed to be.
Part of the challenge of this project is to be very mindful of accessible design. After reading extensively about the topic, I learned that blind users have very different needs from low-vision users when it comes to accessible apps. Details like contrast, typography, and color don’t matter to blind users, because they will interact with the app using VoiceOver.
From my research, I learned how blind and visually impaired people navigate a smartphone, which is essentially a visual touchscreen. Users develop a mental map of how a device or an app is laid out. By using VoiceOver and moving from left to right and top to bottom of the screen, they memorize the basic anatomy of their phones. Similarly, they learn the typical skeleton of apps: for example, they know that a back button usually sits in the top left corner, while navigation tab bars usually sit at the bottom of the screen.
To make sure that the app was usable by blind and visually impaired people, I had to design my screens by following standard conventions and making it easy for users to navigate through the app.
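To illustrate those conventions, here is a minimal SwiftUI sketch of a conventional app structure. The screen names are hypothetical; the point is that using the system `TabView` and `NavigationStack` keeps the tab bar at the bottom and the back button at the top left, matching the mental map VoiceOver users carry over from other iOS apps.

```swift
import SwiftUI

// Hypothetical top-level structure for an app like AudioView.
// System containers place navigation elements where VoiceOver
// users expect them, so no custom chrome is needed.
struct ContentView: View {
    var body: some View {
        TabView {
            NavigationStack {
                MovieListView()
            }
            .tabItem { Label("Movies", systemImage: "film") }

            NavigationStack {
                SettingsView()
            }
            .tabItem { Label("Settings", systemImage: "gear") }
        }
    }
}

// Placeholder screens so the sketch is self-contained.
struct MovieListView: View {
    var body: some View { Text("Now Playing").navigationTitle("Movies") }
}

struct SettingsView: View {
    var body: some View { Text("Settings").navigationTitle("Settings") }
}
```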
For people who have low or blurred vision, it’s really important for the app to support Dynamic Type. Text should scale smaller or larger depending on the user’s preference for readability, and app layouts should accommodate the scaled text.
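In SwiftUI, Dynamic Type comes mostly for free when text uses the built-in styles rather than fixed point sizes. The sketch below shows a hypothetical showtime row; the row itself is an assumption, not a screen from the final design.

```swift
import SwiftUI

// Sketch of Dynamic Type support for an illustrative showtime row.
// Text styles like .headline and .body scale automatically with the
// user's preferred text size; hard-coded point sizes would not.
struct ShowtimeRow: View {
    var body: some View {
        VStack(alignment: .leading, spacing: 4) {
            Text("AudioView")
                .font(.headline)
            Text("Audio description available")
                .font(.body)
        }
        // Let the row grow vertically instead of truncating
        // when the user chooses a larger text size.
        .fixedSize(horizontal: false, vertical: true)
    }
}
```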
When it comes to color, people with visual impairments find it easier to read light text in dark mode. They also read better when there is high contrast between the background and the text. I used online tools like Color Safe to determine an accessible color palette, and I used the Sketch plugin Stark to simulate different types of color blindness.
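The contrast check these tools perform is defined by WCAG: compute the relative luminance of each color, then take the ratio. A sketch of that calculation in plain Swift, assuming sRGB components in the 0...1 range:

```swift
import Foundation

// WCAG 2.1 relative luminance of an sRGB color (components in 0...1).
func relativeLuminance(r: Double, g: Double, b: Double) -> Double {
    func linearize(_ c: Double) -> Double {
        c <= 0.03928 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
    }
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)
}

// Contrast ratio between two luminances; WCAG asks for at least
// 4.5:1 for normal text and 7:1 for the stricter AAA level.
func contrastRatio(_ l1: Double, _ l2: Double) -> Double {
    let (lighter, darker) = (max(l1, l2), min(l1, l2))
    return (lighter + 0.05) / (darker + 0.05)
}

// White text on a black background: (1.0 + 0.05) / (0.0 + 0.05) = 21:1.
let white = relativeLuminance(r: 1, g: 1, b: 1)
let black = relativeLuminance(r: 0, g: 0, b: 0)
print(contrastRatio(white, black)) // 21.0
```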
Users who are blind rely heavily on the VoiceOver feature in iOS. By reading the contents of the screen aloud, VoiceOver lets users hear and interact with the screen without needing to see it.
One of the most overlooked accessible design and development mistakes is missing labels, or labels that lack context. Buttons and inputs should have associated labels and descriptions written the way VoiceOver will read them to users.
Loading states are another example of content labeling. When a screen is loading and no label is present, a blind user perceives only that nothing is happening. Labeling the loading state tells users that the app is working and the content will be available soon.
With appropriate labeling, VoiceOver provides feedback and gives users confidence in knowing where they are in the app.
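Both points can be sketched in SwiftUI with the `accessibilityLabel` modifier. The playback screen below is hypothetical; it shows how a labeled spinner and a labeled icon button would read under VoiceOver.

```swift
import SwiftUI

// Sketch of labeled controls for an illustrative playback screen.
// Without labels, VoiceOver would read the icon as just "button"
// and say nothing at all while the spinner runs.
struct PlaybackControls: View {
    @State private var isLoading = true

    var body: some View {
        if isLoading {
            ProgressView()
                // Announced instead of silence while the track loads.
                .accessibilityLabel("Loading audio description track")
        } else {
            Button(action: { /* start playback */ }) {
                Image(systemName: "play.fill")
            }
            // Read as "Play audio description, button".
            .accessibilityLabel("Play audio description")
        }
    }
}
```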
Turn on the volume on your computer to experience the app walkthrough with VoiceOver simulation.
Due to time constraints and limited access to the user groups, I was unable to conduct user tests on the prototype. Given more time, I would recruit participants from university student groups and community groups for the blind and visually impaired.
During the testing session, I would test the usability of the app and observe how the users interact with it. Are there certain features that they didn't notice? How can I improve on the design so that it's more accessible for them? Are certain areas unclear? These are some of the questions I would seek feedback on.
One of the biggest learnings from this project was realizing how unaware I had been of accessible design and the needs of people with disabilities. To better empathize with them, I used audio description devices at the theater to get a better understanding of their movie-watching journey from beginning to end. By putting myself in their shoes (to a certain extent), I was able to uncover pain points in their experience.
From this project, I learned a variety of things to consider in my designs such as color palettes, dynamic type, and content labels. These are all things that I can easily implement into the rest of my projects to improve the usability of a product. Accessible design doesn't have to be a constraint. On the contrary, accessible design makes the product better for everyone.