Sign language translating devices are cool. But are they useful?

DeepASL's camera. (Michigan State University)

Over the past several decades, researchers have regularly developed devices meant to translate American Sign Language (ASL) into English. The goal is to ease communication between people who are deaf and hard of hearing and the hearing world. Many of these technologies use gloves to capture the motion of signing, and the gloves can be bulky and awkward.

A group of researchers at Michigan State University (MSU) has developed a glove-less device the size of a tube of Chapstick, which they hope will improve ASL-English translation.

The technology is called DeepASL. It uses a camera device to capture hand motions, then feeds the data through a deep learning algorithm that matches it to signs of ASL. Unlike many previous devices, DeepASL can translate whole sentences rather than single words, and it doesn't require users to pause between signs.
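
For readers curious what that kind of pipeline could look like, below is a minimal, hypothetical sketch in Python using PyTorch. It assumes per-frame hand features from a camera, a small recurrent network, and a CTC-style decoder that merges repeated predictions so the signer never has to pause between signs. The layer sizes, the toy sign vocabulary, and the choice of model are illustrative assumptions; the article does not describe DeepASL's actual architecture.

# Hypothetical sketch: per-frame hand features -> recurrent encoder ->
# per-frame sign scores -> CTC-style greedy decoding into a sign sequence.
import torch
import torch.nn as nn

SIGN_VOCAB = ["<blank>", "HELLO", "HELP", "WHERE", "HOSPITAL"]  # toy vocabulary (assumed)

class SignSequenceModel(nn.Module):
    def __init__(self, feature_dim=42, hidden_dim=128, num_signs=len(SIGN_VOCAB)):
        super().__init__()
        # feature_dim: e.g. x/y positions of 21 hand keypoints per frame (an assumption)
        self.encoder = nn.LSTM(feature_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_signs)

    def forward(self, frames):
        # frames: (batch, time, feature_dim) tensor of per-frame hand features
        encoded, _ = self.encoder(frames)
        return self.classifier(encoded)  # per-frame scores over the sign vocabulary

def greedy_decode(frame_scores):
    # Collapse per-frame predictions into a sign sequence, CTC-style:
    # merge repeated labels and drop the <blank> symbol (index 0).
    best = frame_scores.argmax(dim=-1).tolist()
    signs, previous = [], None
    for label in best:
        if label != previous and label != 0:
            signs.append(SIGN_VOCAB[label])
        previous = label
    return signs

if __name__ == "__main__":
    model = SignSequenceModel()
    clip = torch.randn(1, 60, 42)   # one fake 60-frame clip of hand features
    scores = model(clip)[0]         # shape: (time, num_signs)
    print(greedy_decode(scores))    # untrained model, so the output is arbitrary

The key idea the sketch illustrates is that the model labels every video frame and the decoder collapses those labels into a sentence, so sign boundaries never have to be marked by pauses.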

"This is a truly non-intrusive technology," says Mi Zhang. He is a professor of electrical and computer engineering. He led the research.

Zhang and his team hope DeepASL can help people who are deaf and hard of hearing by serving as a real-time translator. It could be especially useful in emergency situations, Zhang says. The device can be used with a phone, tablet or computer. It can also help teach ASL, he says.

More than 90 percent of deaf children are born to parents who are hearing, so there is a large community of adults who need to learn ASL quickly. DeepASL could serve as a digital tutor, giving feedback on whether learners are signing correctly.

Zhang has applied for a patent and hopes to have a device on the market within a year. Because it's based on affordable technology, it could be more widely accessible than previous efforts.

Christian Vogler is a professor of communication studies at Gallaudet University, a university for people who are deaf or hard of hearing. He is skeptical of devices designed to translate ASL, and his skepticism is shared by many in the Deaf community.

Devices generally do not truly "translate" ASL, Vogler says. They merely recognize hand signs and turn each sign into an English word. This means key grammatical information is lost: whether a phrase is a question, a negation, a relative clause and so forth.

DeepASL does translate full sentences. But some features of ASL grammar go beyond hand signs. Facial expressions are often used as modifiers: a raised eyebrow can turn a phrase into a question, and body positioning can show when the ASL user is quoting someone else.

So far, "none of the systems have been even remotely useful to people who sign," Vogler says. He adds that researchers often seem to have "very little contact with the [Deaf and hard of hearing] community and very little idea of their real needs."

Zhang's team did not test the device on people who were deaf and hard of hearing; they tested it on students in a sign language translation program. Zhang emphasizes that DeepASL is designed to enable only basic communication and that it is just a starting place. He says his team hopes to extend DeepASL's capabilities in the future so it can capture facial expressions as well.

"That will be the next significant milestone for us to reach," he says.

Vogler says it's a positive that the MSU technology uses deep learning methods, which have had success with spoken language. But even though the device doesn't need a glove, it likely has the same pitfalls as any previous system, because it doesn't capture face and body movements.

Vogler thinks researchers should move away from the idea that sign language recognition devices can really meet in-person communication needs.

"We have many options for facilitating in-person communication. And until we have something that actually respects the linguistic properties of signed languages and the actual communication behaviors of signers, these efforts will go nowhere near supplanting or replacing them," he says. 

"Instead, people need to work with actual community members, and with people who understand the complexities of signed languages."

Vogler says it would be useful for sign language recognition technology like MSU's to work with voice interfaces like Alexa. The growth of these interfaces poses an accessibility challenge for people who are deaf and hard of hearing, much as the internet, a largely visual medium, has presented challenges for people who are blind over the years.

"We presently do not have an effective and efficient way to interact with these voice interfaces if we are unable to, or do not want to, use our voice," he says. "Sign language recognition is a perfect match for this situation, and one that actually could end up being useful and getting used."

CRITICAL THINKING QUESTION
Why do you think researchers are hesitant to reach out to the Deaf community to get help creating their devices?
Write your answers in the comments section below


COMMENTS (28)
  • biankab-
    9/30/2019 - 02:27 p.m.

    This could be very helpful in the future for people who are deaf or have other issues.

  • jamesf-5
    9/30/2019 - 02:27 p.m.

    Sign language devices would be useful because they would give people with disabilities a chance to try to live a normal life.

  • raeganh-
    9/30/2019 - 02:37 p.m.

    They want to make sure it works before they let someone try.

  • LatoyaH-bad1
    10/23/2019 - 10:34 a.m.

    i think that because like people who are deaf is sad and that’s why smart people like this can help them by creating a device to at least help them like see a little bit so that way they wont have to use like that much sign and just be happy of what the device that they got and they will be able to see Here their people.

  • ZinP-bad1
    10/23/2019 - 10:34 a.m.

    I think researchers are hesitant to reach out to the deaf community to get help creating their devices because i think language is smart because some time you may not know what they saying and i think is will smart to know different languages.

  • JaheimJ-bad1
    10/23/2019 - 10:34 a.m.

    I think researchers are hesitant to reach out to the deaf community to get help creating their devices because they’re not that many inventors and engineers to help make these devices.

  • DiamonteA-bad1
    10/23/2019 - 10:43 a.m.

    I THINK ITS A GREAT IDEA BECAUSE IT WHOULD HELP IMPROVE THEM WHAT WERE DOING??

  • JakyrahJ-bad
    10/23/2019 - 10:51 a.m.

    Because so people was born like that.

  • SwahayB-bad
    10/23/2019 - 10:51 a.m.

    I think researchers are hesitant to reach out to deaf community to get help creating devices because they want to communicate with deaf people but don’t know how to.

  • NataveonJ-bad
    10/24/2019 - 09:17 a.m.

    They aim to help people who are deaf and hard of hearing and the ... It can help teach ASL, Zhang says. ... Why do you think researchers are hesitant to reach out to the Deaf community to get help creating their devices?
