Sounding out the web: web accessibility for people who are deaf or hard of hearing

By David Swallow.

This is a condensed version of a two-part article originally published on the blog of The Paciello Group, an international accessibility agency. Links to the original posts can be found at the end of this article. David Swallow is an Accessibility Associate at The Paciello Group.

The largely visual nature of the web means that there’s a lot of focus on supporting people who are blind or partially sighted. But deaf and hard of hearing people are often overlooked. Ruth MacMullen, a copyright and licences officer at York St. John University, explains her experience of being deaf and how it affects her use of the web.

“It’s a matter of quite ferocious debate at times,” says Ruth on the topic of preferred naming for deafness and hearing loss. “For me, it depends on what sort of mood I’m in. If I’m feeling a bit shy about it, I think ‘hearing impaired’ sounds softer than ‘deaf’. Sometimes I say ‘hard of hearing’. That’s usually enough to tell people that you’ve got a problem hearing, that you need a bit of extra consideration.”

Ruth was born profoundly deaf, meaning she has little to no residual hearing. Powerful hearing aids later gave Ruth some hearing, and with the support of a speech and language therapist, she learned to speak. A combination of cochlear implants and hearing aids gave Ruth what she describes as “pretty reasonable hearing”.

Although Ruth now has access to a variety of sounds, her hearing loss still affects her day-to-day life. She cites the example of working in a busy open-plan office, where spontaneous and informal conversations spring up. Ruth says: “A lot of information is shared by accident. I’m aware I miss all that. If there’s something important, people should try and draw you in, but it’s impossible to do that all the time. Accessibility is often built around structures and planning, isn’t it? So I think ‘unplanned accessibility’ is a really hard thing.”

A similar lack of structure and planning among many developers hinders Ruth’s ability to use the web. The growing volume of video shared online, particularly through Facebook Live, is rarely subtitled. Ruth says: “If subtitles aren’t possible, then at least provide a description of what the video is about. That’s increasingly missing as people get more casual about posting content.”

Ruth recalls a video of Aretha Franklin that a friend shared on Facebook: “I see her playing the piano and think, ‘What song is she playing?’ There’s no description. It’s not that I need to hear every word, but sometimes I have no idea what’s happening.”

Here, Ruth explains more about what makes life easier for her online.

– Provide subtitles/captions.

“Subtitles, that’s the really obvious one,” says Ruth. The term ‘subtitles’ typically refers only to spoken content, whereas ‘captions’ also includes descriptions of non-speech sounds, such as music, applause and laughter. Outside of North America, the terms are often used interchangeably.
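
On a web page, captions are usually supplied as a separate timed text track alongside the video. As a rough sketch in HTML (the file names here are invented for illustration), a captioned video might look like this:

  <video controls>
    <source src="interview.mp4" type="video/mp4">
    <!-- kind="captions" indicates that the track describes music and
         other non-speech sounds as well as dialogue -->
    <track kind="captions" src="interview-captions.vtt"
           srclang="en" label="English" default>
  </video>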

– Check the accuracy of captions.

In the United States, YouTube and Facebook offer free automated video captioning, but since there aren’t any humans involved, the captions they produce can be wildly inaccurate. “I’ve seen captions where irrelevant and inappropriate words came up, including expletives,” says Ruth. “I was surprised there was no protection or filtering to stop that.”

Google provides clear instructions on how you can review and edit automated captions on YouTube
(link below to Google’s captioning advice:
eab.li/5x ),
but you still need to take the time to edit them, because automated captions are rarely accurate to begin with. Ruth explains: “It makes such a difference to a deaf person if a little bit of effort has been made to clean up subtitles. You can see that they are so much more accurate.”

– Make sure that captions are synchronised with the audio.

One advantage of automated captioning is that the captions are automatically synchronised with the audio. Some video makers choose instead to generate their captions from an existing transcript; if you do this, make sure that each line appears on screen at the same time as the corresponding audio is heard. “When you’re deaf, you want the captions to run in time,” explains Ruth.
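
The caption file itself (WebVTT, the format used by the HTML track element shown earlier) ties each line of text to explicit start and end times, so keeping captions in time is largely a matter of getting these cue timestamps right. A minimal, invented example:

  WEBVTT

  00:00:01.000 --> 00:00:04.500
  [Gentle piano introduction]

  00:00:04.500 --> 00:00:08.000
  Good evening, and welcome to tonight's concert.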

– Provide a summary of audio and video content.

Earlier, Ruth explained how a brief summary of what a video is about can be just as important as captions or a transcript. A video’s summary may be as simple as a list of topics or songs that the video includes, which Ruth likens to “alternative text for someone with a hearing impairment”.

Ruth recalls watching a concert on YouTube: “There’s a piano player playing Gershwin songs and I’ll think, ‘Which one do I want to listen to?’ I can’t skip through and listen to fragments, I need to know what’s in there before I make a choice to watch it. And I feel like I don’t have that choice if I don’t know what’s in it.”
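
On a web page, such a summary can simply sit alongside the embedded video as ordinary, visible text. A rough HTML sketch (the video, song titles and timings are made up for illustration):

  <figure>
    <video controls>
      <source src="gershwin-concert.mp4" type="video/mp4">
      <track kind="captions" src="gershwin-concert.vtt" srclang="en" default>
    </video>
    <!-- A visible summary acts as "alternative text" for the audio -->
    <figcaption>
      Piano concert of Gershwin songs: Summertime (0:00),
      Rhapsody in Blue (4:30) and I Got Rhythm (13:15).
    </figcaption>
  </figure>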

– Make sure that audio doesn’t play automatically.

Deaf and hard of hearing people may have a difficult time gauging how loud videos are, particularly when they play automatically and unexpectedly. If it’s unavoidable for audio to play automatically, make sure that users have an easy way to turn it off. “You see a video on your Facebook feed and there’s no sound,” explains Ruth. “If you click on it to make it big, the sound plays, but you don’t realise. Sometimes I have my hearing aid off and I feel like the video is blaring out sound.”
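
In HTML, one sensible default is to let video start silently and leave the volume decision to the viewer. A hedged sketch, assuming autoplay really is unavoidable (the file names are invented):

  <!-- Start muted and keep the controls visible, so the user
       decides if and when sound plays -->
  <video autoplay muted controls>
    <source src="feed-video.mp4" type="video/mp4">
    <track kind="captions" src="feed-video.vtt" srclang="en" default>
  </video>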

– Structure your content.

“I rely so much on visual information,” says Ruth. “The more clearly it is structured and explained, the better.” Using semantic HTML helps websites remain flexible and extensible. It makes the content reusable and conveys more meaning to assistive technologies. As Ruth points out, “This is good practice generally, but for people who are completely reliant on visual information it’s even more important.”
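
As a brief illustration of what semantic HTML means in practice, here is the same fragment marked up twice: once with generic containers that convey nothing to assistive technologies, and once with elements that make the structure explicit (the text is invented):

  <!-- Generic containers: the structure is only visual -->
  <div class="title">Provide captions</div>
  <div class="text">Captions describe speech and other sounds.</div>

  <!-- Semantic elements: the structure is part of the markup -->
  <h2>Provide captions</h2>
  <p>Captions describe speech and other sounds.</p>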

– Keep your content flexible.

Underlying each of these suggestions is an emphasis on clear communication and flexibility of content. Mobile technology has proven particularly useful to Ruth in this respect. “I think if you speak to any person with a disability, they will tell you that an iPhone or an Apple Watch is a game changer,” says Ruth.

The benefit for Ruth is being able to curate information in a way that suits her. “I can’t hear a radio very well if it’s in a room, but if I plug in my Bluetooth headphones to my iPhone, I can listen to a radio programme. It’s a bit of a lifesaver in terms of being able to get information and content, and modify it in a way that works for me.”

Making content flexible enough to be delivered as captions, indexed by transcripts, enlarged by screen magnifiers, and rendered by screen readers is a key principle of web accessibility. And while the tips in this article are aimed at deaf and hard of hearing people, they ultimately benefit everyone.

Read more about Ruth’s experiences and work at her blog:
eab.li/5s .

This article is a condensed version of two pieces that appeared on The Paciello Group’s website. Links to the original posts are below:

Part one of ‘Sounding out the web’:
eab.li/5v .

Part two of ‘Sounding out the web’:
eab.li/5w .

Read more of The Paciello Group’s accessibility blogs at the company’s website:
eab.li/5t .

Find out more about David Swallow’s accessibility work at his website:
eab.li/5u .
