Too often, communication barriers exist between those who can hear and those who cannot.
Sign language has helped bridge such gaps, but many people are still not fluent in its motions and hand shapes.
Thanks to a group of University of Houston students, the hearing impaired may soon have an easier time communicating with those who do not understand sign language. During the past semester, students in UH’s engineering technology and industrial design programs teamed up to develop the concept and prototype for MyVoice, a device that reads sign language and translates its motions into audible words. Recently, MyVoice earned first place among student projects at the American Society of Engineering Education (ASEE) – Gulf Southwest Annual Conference.
MyVoice was developed through a collaborative senior capstone project by engineering technology students (Anthony Tran, Jeffrey Seto, Omar Gonzalez and Alan Tran) and industrial design students (Rick Salinas, Sergio Aleman and Ya-Han Chen). Overseeing the student teams were Farrokh Attarzadeh, associate professor of engineering technology, and EunSook Kwon, director of UH’s industrial design program.
MyVoice’s concept centers on a handheld tool with a built-in microphone, speaker, soundboard, video camera and monitor. Placed on a hard surface, it reads a user’s sign language movements. Once MyVoice processes the motions, it translates the sign language into speech through an electronic voice. Likewise, it captures a person’s voice and translates the spoken words into sign language, which is projected on its monitor.
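The article does not describe MyVoice's software, but the two-way pipeline it outlines (recognized signs rendered as speech, recognized speech rendered as signs) can be sketched at a high level. The sketch below is purely illustrative: the sign vocabulary, function names and lookup-table approach are assumptions, and the camera-based gesture recognizer and speech synthesizer are stubbed out with simple label lookups.

```python
# Hypothetical sketch of a two-way sign/speech translation loop.
# A real device would feed camera frames into a gesture recognizer and
# route text to a speech synthesizer; both are stubbed here.

# Assumed mapping from recognized sign labels to English words.
SIGN_TO_WORD = {
    "HELLO": "hello",
    "MY": "my",
    "NAME": "name",
    "WHAT": "what",
}
# Reverse mapping for the speech-to-sign direction.
WORD_TO_SIGN = {word: sign for sign, word in SIGN_TO_WORD.items()}

def signs_to_speech(sign_labels):
    """Turn a sequence of recognized sign labels into a sentence for the
    electronic voice; unrecognized signs are flagged rather than guessed."""
    words = [SIGN_TO_WORD.get(label, "[unknown]") for label in sign_labels]
    return " ".join(words)

def speech_to_signs(sentence):
    """Turn recognized speech into sign labels to display on the monitor;
    words without a known sign would fall back to fingerspelling."""
    return [WORD_TO_SIGN.get(w.lower(), "[FINGERSPELL]") for w in sentence.split()]

if __name__ == "__main__":
    print(signs_to_speech(["HELLO", "MY", "NAME"]))  # hello my name
    print(speech_to_signs("what name"))              # ['WHAT', 'NAME']
```

In practice the hard parts are on either side of this lookup: recognizing continuous hand motion from video and producing natural speech, which is the work the engineering technology team took on.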
The industrial designers researched MyVoice’s application by reaching out to the deaf community to understand the challenges that arise when others do not understand sign language. They then designed the device, while the engineering technology students took on the arduous task of programming it to translate motion into sound.