BBC - Ouch! (disability) - Features - Subtitles for life


Subtitles for life

by Charlie Swinbourne

14th December 2009

My top three inventions that would improve deaf life in this country read like this. One: realistic robotic sign language interpreters who run on electricity and can be plugged into the wall and work all day long without charging an extortionate hourly rate. Two: hearing aids that can selectively tune into any conversation we like in a crowded room (that one's just for entertainment reasons, really). And three: subtitles that appear underneath everyday life. All the time.
[Image: spectacles that transmit subtitles direct to the eye]
Actually, scrap the first two. Having subtitles alongside real life's been a dream of mine – maybe the only dream I've ever had – since I was a little tot. I grew up watching them, these words floating helpfully underneath people while they talk on TV.

If you're not sure what I'm talking about, they're blocks of text that reveal dialogue and sound effects for hearing impaired people, sometimes known as closed captions, and I loved getting every single word of what was going on. But the minute I stepped out of the front door and into regular life - classrooms, lecture theatres and finally the workplace - I suddenly didn’t get every word any more. I was lucky to get just half of them, with guesswork filling in the rest.
I nearly threw my laptop in the air with delight recently when I read a BBC News report saying that spectacles have been invented which will enable subtitles to be beamed onto the wearer’s eye, in real time, during a conversation.
These gizmos are currently aimed at making conversations possible between people who speak different languages, such as a French person speaking to an Italian, but it’s just a small skip and a leap to imagine the possibilities for deaf and hard of hearing people.

Hooked up to good voice-recognition technology via a laptop or mobile phone, the specs could enable a deaf person to read the words a hearing person is saying, almost as they speak them. They could also provide subtitles when you're at the shops, a meeting, a cinema, you name it.
[Image: subtitled Doctor Who from the 1980s]
Subtitling has hitherto only been seen on screen-based media – TV, movie theatres and so on – making it a fairly immobile and inflexible experience.

Taking cinema as an example, mainstream audiences don’t like subtitles affecting their view of the big picture (oh the hardship!). So the multiplexes tend to screen text accessible films at highly inconvenient times, like Tuesday afternoons when it just so happens that most deaf people are at work and can't go.
With these amazing specs, the subtitles wouldn't need to be burnt onto the bottom of films for everyone in the cinema to see. The glasses could just be handed out whenever a deaf person asks for them. You could watch any film, any time, and it wouldn't bother anyone else. It would almost be like freedom!

The specs could also become very useful at conferences that are currently palantyped – where people type dialogue phonetically into a keyboard so that it comes up on a screen soon after it is spoken (more on this next month).
One limitation of palantyping is that deaf people have to look away from the person who is speaking, because the words come up on a separate screen. With the subtitles right in your eye, however, you'd be able to look at the speaker and know what they were saying at the same time.

There might be other deaf-friendly ways of using the technology. You know the annoying way hearing people walk around with Bluetooth headsets on, nattering away into what at first seems like thin air?
Imagine it. You’re on a bus, and there’s this deaf person with funny glasses on smiling at you. Laughing. You wonder why. Then you twig. They’re reading their text messages. On their eye.

I can imagine a world where deaf people become just as annoying as Bluetooth headset wearers, taking part in instant messaging conversations as they walk, reading messages and replying to them with a few deft keyboard presses in their pockets.

No longer will we see the familiar sight of a deafie leaning down to look into a mobile phone as they walk, narrowly avoiding other commuters and head-on collisions with lamp-posts. But you will see smug deaf people multi-tasking with their expensive accessibility specs.
The big limitation of the spectacles for some might be that, while they can beam spoken dialogue into the eye of a deaf person, the technology doesn't yet exist to beam sign language dialogue back. So it might get a bit one-way. It wouldn't be perfect, but it could still make a big difference, particularly for hard of hearing people.

Now all we need to do is get onto the tech boffins about the robotic interpreters and conversation-specific hearing aids. I’ve got a feeling those might be a bit longer coming.


