In the past, engineers working on technology to aid the deaf had focused primarily on hearing devices, such as hearing aids and cochlear implants, but recently they’ve been getting into what’s known as deaf technology: applications designed to make the day-to-day lives of the deaf and hearing-impaired easier. Now engineers from the University of Washington, in Seattle, and Cornell University, in Ithaca, N.Y., have taken a big step toward developing a mobile phone that allows real-time conversations in sign language.
Of course, many in the deaf community already use mobile phones to communicate via text messaging and e-mail, but deaf people almost always prefer sign language: It’s faster and more natural, just as speaking is easier than writing for most hearing people. Laptops are getting smaller and more portable, making video chats outside the home possible, but Wi-Fi–enabled cellphones would provide even more freedom. When cellphones became capable of video sharing a few years ago, Eve Riskin, Sheila Hemami, and Richard Ladner, all newly minted IEEE Fellows, felt the time was right to develop a sign-language-capable phone. “Today’s world is more connected by cellphones than by any other device,” says the University of Washington’s Ladner, whose parents were deaf.
From the beginning, the researchers knew that their project, which they named MobileASL (for mobile American Sign Language), would be a challenge. The low bandwidth available on wireless networks in the United States forced them into the balancing act between speed and quality that’s familiar to anyone who works with video, but there was an added twist: Most compression algorithms don’t preserve the aspects of video that make ASL easily understandable, says Riskin, an electrical engineering professor at the University of Washington.