
Video Script for Auditory


Video script for Auditory from the page Diverse Abilities and Barriers (in the 2020 Update version).



Script

Sequence 1, 0:00 - 0:10
Audio: There are many forms of auditory disabilities, including people who have reduced hearing, do not hear, or hear sounds differently.
Visual: We see a collage of five people going about their lives in their different settings [they are the protagonists that we will see in the coming scenes].
Sequence 2, 0:10 - 0:25
Audio: Some people with auditory disabilities can hear sounds but not sufficiently to understand all speech. Cochlear implants and other hearing aids often don’t fully compensate for auditory disabilities, especially when there is background noise.
Visual: [New scene; zoom into the context/setting of one of the people (#1) shown in the collage in the previous scene.] The person (#1) is with a group of people standing and talking to each other (like in a cocktail party / reception setting). We realize that the person (#1) is having difficulty understanding what others are saying. The camera continues to zoom in on that person (#1), and we realize that the person (#1) has a cochlear implant and that the surroundings are loud.
Sequence 3, 0:25 - 0:50
Audio: For online meetings and multimedia like videos, podcasts, and music, people with auditory disabilities rely on captions. Captions provide important audio information in text format. For example, captions indicate who is speaking and important sounds, such as the creaking sound of a door in a thriller movie. Many people who use captions also need to adjust the text size, font, and color to make the captions more readable.
Visual: [New scene; switch to another person (#2) from the initial collage in the first scene.] We see someone (#2) watching a movie (thriller) and turning on the captions [there is no visible indication that the person (#2) has any disability].
Sequence 4, 0:50 - 1:10
Audio: While automatic captions are gradually improving, they are currently too inaccurate for reliable use. For example, they don’t recognize specialized terms and names, it’s hard to tell who is speaking, and sentences can sometimes run together, making it hard to keep up.
Visual: [Continuation from previous scene.] We see the person (#2) switch to another video and turn on automatic captions for that second video. The captions are clearly inaccurate, and the person (#2) looks confused / frustrated.
Sequence 5, 1:10 - 1:30
Audio: High-quality foreground audio that is clearly distinguishable from background noise is also important for many people with auditory disabilities, with and without hearing aids. Volume controls to turn up the audio independently of other system sounds also make it easier for people to hear.
Visual: [New scene.] We see a person (#3) watching a video and adjusting their hearing aid. We see the person (#3) adjust the volume of the video independently from other audio [we see a sound mixer widget with multiple volume sliders, and they adjust one of the volume sliders on the mixer].
Sequence 6, 1:30 - 2:00
Audio: While many people with auditory disabilities do not know sign language, for many others sign language is the primary language for communication. Sign language has a different grammar and vocabulary than spoken and written language, so some people might not be as fluent in reading and writing, depending on the schooling they received. Sign language users rely on high-quality video transmission for online communication. This includes the speed of their connection as well as the capability of the video device and software.
Visual: [New scene.] We see someone (#4) in an online meeting with sign language interpreters on the call. We see the sign language interpreters sign to the person (#4), and the person (#4) nodding/signing/gesturing back to signal acknowledgment to the interpreters. We see the person (#4) open an electronic document [maybe one sent in the online meeting app]; the document has large blocks of text (walls of text) without any structuring, and the person (#4) has difficulty reading it. We see the person (#4) turn their attention back to the sign language interpreters and maximize their window to communicate more easily with them.
Sequence 7, 2:00 - 2:40
Audio: Finally, auditory disabilities also include people who are deaf-blind. This involves varying degrees of auditory and visual disabilities at the same time. Most people who are deaf-blind rely on tactile means of communication. For example, screen readers and portable braille displays convert text on the computer to braille letters that can be felt with the fingertips. For multimedia, people who are deaf-blind rely on descriptive transcripts that they can read instead of watching a video. These include descriptions of the auditory as well as the visual information in text format.
Visual: [New scene.] We see someone (#5) using a portable braille display to read what’s on the screen. The person (#5) selects a “read transcript” link beside a video. Zoom into the transcript to show that there are captions as well as descriptions of the visual content [we can’t really read all the text, but the formatting makes it clear that there is text for the audio and the visual content].
Sequence 8, 2:40 - 2:45
Audio: All these people have one thing in common: your design can include or exclude them.
Visual: [New scene.] We see a collage of the five protagonists from the previous scenes [in the same style and continuing the first scene] happily using computer technologies [each person’s setting is a continuation of their respective scenes].
