Please use this identifier to cite or link to this item: https://rda.sliit.lk/handle/123456789/1595
Title: Utalk: Sri Lankan Sign Language Converter Mobile App using Image Processing and Machine Learning
Authors: Dissanayake, I.S.M.
Wickramanayake, P.J.
Mudunkotuwa, M.A.S.
Fernando, P.W.N.
Keywords: Sinhala Sign Language
Computer Vision
Machine Learning
Issue Date: 10-Dec-2020
Publisher: 2020 2nd International Conference on Advancements in Computing (ICAC), SLIIT
Series/Report no.: Vol.1;
Abstract: Deaf and mute people face various difficulties in daily activities due to the communication barrier caused by the lack of Sign Language knowledge in society. Many studies have attempted to mitigate this barrier using Computer Vision based techniques to interpret signs and express them in natural language, empowering deaf and mute people to communicate with hearing people easily. However, most such studies focus only on interpreting static signs, and understanding dynamic signs is not well explored. Understanding dynamic visual content (videos) and translating it into natural language is a challenging problem. Further, because of the differences between sign languages, a system developed for one sign language cannot be directly used to understand another, e.g., a system developed for American Sign Language cannot be used to interpret Sri Lankan Sign Language. In this study, we develop a system called Utalk to interpret static as well as dynamic signs expressed in Sri Lankan Sign Language. The proposed system utilizes Computer Vision and Machine Learning techniques to interpret signs performed by deaf and mute people. Utalk is a mobile application, hence it is non-intrusive and cost-effective. We demonstrate the effectiveness of our system using a newly collected dataset.
URI: http://rda.sliit.lk/handle/123456789/1595
ISBN: 978-1-7281-8412-8
Appears in Collections:2nd International Conference on Advancements in Computing (ICAC) | 2020
Department of Information Technology-Scopes
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.