Mobile app for sign language translation and learning by image recognition
IB302 14:30 ~ 14:55 English

Sign language is a means of communication that uses a visual-manual modality to convey meaning, used daily by deaf and hard-of-hearing people and their family members. As of 2011, over 360 million people worldwide had disabling hearing loss, and as of 2013 over 150 thousand people in Hong Kong had hearing difficulty. Despite its importance, most of the general public has little knowledge of sign language, which imposes barriers to communication and interaction with the hearing impaired and their family members.
Mobile applications are effective tools for learning and translation thanks to their high accessibility and interactive interfaces. Many language-learning apps are available on the market, but very few cover sign language, and those that do rely heavily on simple drawings, whereas effective learning requires interaction. Because capturing visual cues is challenging, progress in sign language translation has lagged. Some translation apps have emerged that generate sign language animation from word input; nevertheless, no sign-to-word translation app is yet available.
With recent advances in visual recognition using neural-network machine learning, converting visual-manual signs into other, verbal languages has become possible. Inspired by this, we are creating a mobile application to facilitate the learning and translation of sign language, and to apply the software to simple phrases in Hong Kong Sign Language.
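The sign-to-word step described above can be sketched in miniature. In a real pipeline, a neural network would extract hand-keypoint features from camera frames; the toy sketch below skips that stage and classifies invented 4-dimensional feature vectors with a nearest-centroid rule, purely to illustrate how extracted features map to a word label. All labels and values here are hypothetical, not from the actual app.

```python
import math

# Toy training data: word label -> example feature vectors.
# In practice these would be keypoint features extracted by a neural
# network from video frames of each sign; these numbers are invented.
TRAINING = {
    "hello": [[0.1, 0.9, 0.2, 0.8], [0.15, 0.85, 0.25, 0.75]],
    "thanks": [[0.9, 0.1, 0.8, 0.2], [0.85, 0.15, 0.75, 0.25]],
}

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

CENTROIDS = {label: centroid(vs) for label, vs in TRAINING.items()}

def classify(features):
    """Return the word whose centroid is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))

print(classify([0.12, 0.88, 0.22, 0.78]))  # prints "hello"
```

A nearest-centroid rule is far simpler than the neural networks the talk refers to, but the input/output contract is the same: feature vector in, word label out.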
To foster continuous development of the application, we would like to release the preliminary source code under a Creative Commons license, to encourage joint development toward more complex sentences and localization to sign languages in other regions.
Since recognition accuracy depends on the amount of input data, we also call for the sharing of more images and clips to improve the machine learning model.
Through this app, we hope to help diminish communication barriers between different members of society and raise awareness of this underprivileged minority. People in need can install the application and let their conversation partners read their signs, and anyone can learn some fundamentals of the language to show their care.
Collaborative note: https://hackmd.io/@coscup/Hk_6CGlNS