Enhancing the movie experience for people who are blind and visually impaired.



The Problem

By using their imagination and listening to audio description from assistive devices, people who are blind or visually impaired can enjoy movies. Having access to reliable audio description is essential to a positive movie theater experience. However, movie theaters often don’t provide adequate services, and the process of getting and using the assistive devices is often met with frustration.

How might we improve accessibility for the blind and visually impaired at movie theaters? 

The Solution

AudioView is an app that enhances the entertainment experience for everyone. The app provides audio description tracks for movies in theaters. With access to audio description directly on their phones, users no longer need to rely on inadequate theater services and clunky devices.


Accessible design

My Role

UX Researcher, UX Designer, Visual Designer, Logo Designer


Sketch, Adobe XD, InVision, Illustrator



  • Field Research
  • Online Forums
  • Secondary Research



  • Themes & Insights
  • “How Might We” Questions
  • Personas
  • Storyboarding



  • Ideation
  • User Flow
  • Wireframes
  • Visual Design



  • Interactive Prototype
Jump to Prototype


Primary & Secondary Research

A critical part of this project was to really understand a user group I had never considered before. To empathize with the users and experience movies from their perspective, I visited two movie theaters and used the audio description devices. I also joined several Facebook groups specific to people who are blind and visually impaired where they shared their first-hand experiences of going to the movies.

Due to limited access to my target audience, I also conducted secondary research via Reddit forums and online articles to gain a better understanding of who they are, their frustrations, and how they use smartphones to tackle daily challenges.

See Primary Research
Requesting device at AMC’s Guest Services
Audio description device with headphones from AMC
Testing the device at Regal Cinemas

How Audio Description Devices Work

The audio description is transmitted to a receiver box and the user listens to it through headphones provided by the movie theater. The current method relies on proximity to the data transmitter and having a strong signal in order to successfully listen to the audio track.

Image copyright Sony

Distilling The Research

Themes & Insights: Key User Pain Points

After organizing the research into an affinity map, I uncovered consistent pain points among the users. I grouped them into three big themes and came up with insights that summarized the main problems with the movie experience.

Untrained Staff

  • Staff often give users the wrong device, confusing “audio description” for the visually impaired with “assistive listening” for the hard of hearing.
  • Users are annoyed that they have to educate the staff about the accessibility services available at the movie theaters.
How might we shift users’ negative perception of going to the movie theaters into an enjoyable leisure activity?
How might we offload users’ dependence on others to receive the proper services?

Additional Planning

  • Users take additional steps to account for problems that may occur during their trip to the movie theater, such as calling in advance to request a device ahead of time.
  • Users expect to encounter problems every time they go to the movie theater.

Technical Issues

  • Audio description devices are often incorrectly configured, out of battery, or simply do not work due to weak signal.
  • Users are upset to miss parts of the movie because they have to return to Guest Services to fix the issue.
How might we provide a frictionless experience for users in getting and using audio description tracks?


Based on my learnings from the research, I created two personas as my target users. My first persona is partially visually impaired and my second persona is blind; I chose to put them into two groups because they have very different needs and considerations.

Storyboarding: Current User Journey

The current user journey in getting and using the audio description device is a long, tedious process. I listed out all of the current steps that the user takes so that I could find ways to reduce the steps and improve the experience.


Ideation: Design Concept

After understanding the problem space and defining the users’ pain points, I jumped into ideating solutions to answer the three “how might we” questions. My goal was to reduce friction, ensure reliable access to audio description tracks, and lessen the steps a user needs to take.

The problem was centered around the physical devices not working properly and the staff not being trained to provide proper service. I came up with AudioView, an app that allows users to access audio description tracks directly on their phones. This eliminated the need to rely on physical devices that could potentially break and reduced users’ dependence on staff who didn’t understand their needs.

Storyboarding: New User Journey

User Flow

Because the purpose of AudioView is specifically to give access to audio description tracks, it was important to keep the user flow very simple and easy. Considering that the users have difficulties with vision, I had to avoid making the app more complex than it needed to be.

Wireframes & Visual Design

Design with Accessibility in Mind

Part of the challenge of this project was to be very mindful of accessible design. After reading extensively about the topic, I learned that blind users have very different needs from low-vision users when it comes to accessible apps. Details like contrast, typography, and color don’t matter to blind users, because they will interact with the app using VoiceOver.

Designing for the partially visually impaired (color blind or low vision)

  • Scalable font sizes
  • Accessible color palettes
  • High contrast between background and text

Designing for the blind

  • VoiceOver on iPhone and other screen reading tools
  • Use sound to communicate with the user
  • Inputs/buttons with associated labels and descriptions

A Mental Map: Using a Smartphone

From my research, I learned how blind and visually impaired people navigate a smartphone, which is essentially a visual touch screen. Users develop a mental map of how a device or an app is laid out. By using VoiceOver and moving from left to right and top to bottom of the screen, they memorize the basic anatomy of their phones. Similarly, they learn the typical skeleton of apps. For example, they know that a back button is usually in the top left corner while navigation tab bars are usually at the bottom of the screen.

To make sure that the app was usable by blind and visually impaired people, I had to design my screens by following standard conventions and making it easy for users to navigate through the app.

Dynamic Text Size

For people who have low or blurred vision, it’s really important for the app to support dynamic type. Text should scale smaller or bigger depending on the user’s preference for readability, and app layouts should accommodate the scaled text.

Standard text size
Larger text size
People with low or blurred vision are unable to read the text at standard size
Supporting large text sizes improves readability for users
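As a rough sketch of how this scaling behaves, the snippet below maps a user's preferred content size to a scaled point size. The size table and helper are hypothetical, loosely modeled on the Body text style in Apple's Dynamic Type (where "large" is the system default); in the real app this would come from `UIFont.preferredFont(forTextStyle:)` on iOS rather than hand-rolled values.

```python
# Illustrative point sizes per content size preference, loosely modeled on
# Apple's Dynamic Type "Body" text style; "large" is the system default.
BODY_SIZES = {
    "xSmall": 14, "small": 15, "medium": 16, "large": 17,
    "xLarge": 19, "xxLarge": 21, "xxxLarge": 23,
}

def scaled_size(base_size, preference):
    """Scale a base point size relative to the default ("large") setting."""
    return round(base_size * BODY_SIZES[preference] / BODY_SIZES["large"], 1)
```

A 17pt body label stays at 17pt for the default preference and grows toward 23pt at the largest setting, which is why layouts must reflow rather than assume fixed text heights.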

Accessible Color Palette

When it comes to color, people with visual impairments find it easier to read light text in dark mode. They also read better when there is high contrast between the background and the text. I used online tools like Color Safe to determine an accessible color palette, and I used the Sketch plugin Stark to simulate different types of color blindness.
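The contrast guideline above can also be checked programmatically. The sketch below computes the WCAG 2.x contrast ratio between two sRGB colors, which is the same math tools like Color Safe apply; the function names are mine, but the luminance and ratio formulas come from the WCAG specification.

```python
def _linear(channel):
    """Convert an 8-bit sRGB channel to linear light (WCAG 2.x formula)."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an (r, g, b) color with 0-255 channels."""
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """WCAG contrast ratio, from 1:1 (identical) up to 21:1 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# WCAG AA requires at least 4.5:1 for normal body text,
# e.g. near-white text on a near-black dark-mode background.
meets_aa = contrast_ratio((255, 255, 255), (18, 18, 18)) >= 4.5
```

Light text on a dark background easily clears the 4.5:1 AA threshold, which matches the dark-mode preference the research surfaced.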

VoiceOver with Input Labels

Users who are blind rely heavily on the VoiceOver feature in iOS. By reading the contents of the screen aloud, VoiceOver allows users to hear and interact with the screen without needing to see it.

One of the most overlooked accessible design and development mistakes is missing labels, or labels that lack context. Buttons and inputs should have associated labels and descriptions written the way VoiceOver would read them to the users.

Example of VoiceOver Labeling

“Select this theater button”
“Select this theater button. Tapping this will show you the movie list.”

Example of Loading States for VoiceOver

Another example of content labeling is loading states. When a screen is loading and a label is absent, a blind user would only perceive that nothing is happening on the screen. Adding a label to loading states lets users know that the app is loading and the content will be available soon.

With appropriate labeling, VoiceOver provides feedback and gives users confidence in knowing where they are in the app.

“Synchronizing 25%”
“Synchronizing 50%”
“Synchronizing completed”
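The loading-state labels above follow a simple rule: announce progress while synchronizing, then announce completion. A hypothetical helper sketching that rule (in the real app, the resulting string would be posted as an iOS accessibility announcement for VoiceOver to read aloud):

```python
def sync_announcement(percent):
    """Build the label VoiceOver should read for the synchronizing state."""
    if percent >= 100:
        return "Synchronizing completed"
    return f"Synchronizing {percent}%"
```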

Final Prototype

Turn on the volume on your computer to experience the app walkthrough with VoiceOver simulation.

Next Steps: User Testing

Due to time constraints and limited access to the user groups, I was unable to conduct user tests on the prototype. However, given more time, I would recruit participants from university student groups and community groups for the blind and visually impaired.

During the testing session, I would test the usability of the app and observe how the users interact with it. Are there certain features that they didn't notice? How can I improve the design so that it's more accessible for them? Are certain areas unclear? These are some of the questions I would seek feedback on.

Project Learnings

Mind the empathy gap

One of the biggest learnings from this project was realizing how unaware I was of accessible design and the needs of people with disabilities. To better empathize with them, I used audio description devices at the theater to get a better understanding of their movie-watching journey from beginning to end. By putting myself in their shoes (to a certain extent), I was able to uncover pain points in their experience.

Accessible Design Benefits Everyone

From this project, I learned a variety of things to consider in my designs such as color palettes, dynamic type, and content labels. These are all things that I can easily implement into the rest of my projects to improve the usability of a product. Accessible design doesn't have to be a constraint. On the contrary, accessible design makes the product better for everyone.