Sign language translating devices are cool. But are they useful?

DeepASL's camera. (Michigan State University)

Researchers have regularly made devices meant to translate American Sign Language (ASL) into English. They aim to make communication between people who are deaf and hard of hearing and the hearing world easier. Many of these devices use gloves to capture the motion of signing, but gloves can be bulky and awkward.

A group of researchers at Michigan State University has made a glove-less device the size of a tube of Chapstick. They hope it will help with ASL-English translation.

The device is called DeepASL. It uses a camera to capture hand motions, then feeds the data through a deep learning algorithm that matches them to ASL signs. Unlike earlier devices, which translated only single words, DeepASL can translate whole sentences, and it doesn't make users pause between signs.

"This is a truly non-intrusive technology," says Mi Zhang. He is a professor. He teaches electrical and computer engineering. He led the research.

Zhang and his team hope DeepASL can serve as a real-time translator, which could be especially useful in an emergency. The device can be used with a phone, a tablet or a computer, and it can also help teach ASL.

More than 90 percent of deaf children are born to hearing parents, which means there is a large group of adults who need to learn ASL quickly. DeepASL could serve as a digital tutor, giving users feedback on whether they are signing correctly.

Zhang has applied for a patent and hopes to have a device on the market within a year. Because it's based on affordable technology, it could be more widely accessible than previous efforts.

Christian Vogler is a professor of communication studies at Gallaudet University, a university for people who are deaf or hard of hearing. He is doubtful that such devices can really translate ASL, and his doubts are shared by many in the Deaf community.

Devices generally do not truly 'translate' ASL. They recognize hand signs and turn them into English words. That means key grammatical information is lost: users may not know whether a phrase is a question, a negation or a relative clause.

DeepASL does translate full sentences, but some parts of ASL grammar go beyond hand signs. Facial expressions are often used as modifiers: a raised eyebrow can turn a phrase into a question, and body positioning can show when the ASL user is quoting someone else.

So far, "none of the systems have been even remotely useful to people who sign," Vogler says. He adds that researchers often seem to have "very little contact with the [Deaf and hard of hearing] community and very little idea of their real needs."

Zhang's team did not test the device on people who are deaf or hard of hearing; they tested it on students in a sign language translation program. Zhang says DeepASL is made to help with basic communication and is just a starting place. His team hopes to extend its tools to capture facial expressions as well.

"That will be the next significant milestone for us to reach," he says.

Vogler says it's a positive that the MSU technology uses deep learning methods, which have had success with spoken language. But even though the device doesn't need a glove, it likely has the same pitfalls as any previous system, because it doesn't capture facial and body movements.

Vogler thinks researchers should move away from the idea that sign language recognition devices can really meet in-person communication needs.

"We have many options for facilitating in-person communication. And until we have something that actually respects the linguistic properties of signed languages and the actual communication behaviors of signers, these efforts will go nowhere near supplanting or replacing them," he says. 

"Instead, people need to work with actual community members. And with people who understand the complexities of signed languages."

Vogler says it would be useful for sign language recognition technology to work with voice interfaces like Alexa. The growth of these interfaces is an accessibility challenge for people who are deaf and hard of hearing, much as the internet, a largely visual medium, has presented challenges for people who are blind over the years.

"We presently do not have an effective and efficient way to interact with these voice interfaces if we are unable to, or do not want to, use our voice," he says. "Sign language recognition is a perfect match for this situation, and one that actually could end up being useful and getting used."

CRITICAL THINKING QUESTION
Why do you think researchers are hesitant to reach out to the Deaf community to get help creating their devices?