Students at the University of Houston designed a device called MyVoice, an ASL translator that uses a video camera to capture a person’s sign language movements. For people with hearing disabilities, sign language can greatly improve communication with the world; however, it can become a barrier when they attempt to communicate with someone who does not understand ASL. Technology that automatically translates hand motions into audible speech for a non-signing person can help reduce this barrier.
MyVoice contains a small video camera, a monitor, a microphone, and a speaker. Software processes the captured images to determine what was signed, then translates the word or phrase into speech, which is transmitted through an electronic voice. The device also works in reverse, capturing a person’s spoken words and projecting the appropriate hand sign onto the monitor. According to a UH news release, the students sampled a database of images to train their software to recognize hand signs, using between 200 and 300 images per sign.
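The article does not describe the team's actual recognition algorithm. As a rough illustration of the training step it mentions, here is a minimal sketch of sign classification: each training image is reduced to a feature vector, a mean (centroid) vector is computed per sign, and a new image is labeled with the nearest centroid. All names, the synthetic data, and the nearest-centroid approach are assumptions for illustration only, not the UH team's method.

```python
# Illustrative sketch only: a nearest-centroid classifier over image feature
# vectors. A real ASL translator would extract hand-shape/motion features
# from camera frames; here the "images" are synthetic vectors.
import random


def train(samples):
    """samples: dict mapping sign -> list of feature vectors (one per image).
    Returns one centroid (mean feature vector) per sign."""
    centroids = {}
    for sign, vectors in samples.items():
        n = len(vectors)
        dim = len(vectors[0])
        centroids[sign] = [sum(v[i] for v in vectors) / n for i in range(dim)]
    return centroids


def classify(centroids, vector):
    """Return the sign whose centroid is closest in squared Euclidean distance."""
    def dist(sign):
        return sum((a - b) ** 2 for a, b in zip(centroids[sign], vector))
    return min(centroids, key=dist)


# Synthetic stand-in for the 200-300 training images per sign the article cites.
random.seed(0)

def noisy_copies(base, n=250):
    return [[x + random.gauss(0, 0.1) for x in base] for _ in range(n)]

samples = {
    "good": noisy_copies([1.0, 0.0, 0.0]),
    "job": noisy_copies([0.0, 1.0, 0.0]),
    "cougars": noisy_copies([0.0, 0.0, 1.0]),
}
model = train(samples)
print(classify(model, [0.9, 0.1, 0.05]))  # prints "good"
```

The design choice here is deliberate simplicity: averaging many noisy examples per sign is why a larger training set (the 200–300 images per sign) makes recognition more robust.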
MyVoice is still in the prototype stage, and it is unclear how well the translation algorithms perform; however, the device was able to congratulate the students on winning a first-place award at the American Society for Engineering Education conference by translating a single phrase: “Good job, Cougars.”
For more on technology for individuals with hearing disabilities, please visit AbleData. Check out research in REHABDATA on sign language interpretation, sign language translation, and other topics related to hearing disabilities.