Hand Gesture Recognition for Sign Language Translation
Abstract
This project aims to develop software that translates Indian Sign Language (ISL) hand gestures into text and speech in real time. A custom symbol module lets users add new gestures, improving the user experience, and voice-call integration enables real-time gesture-to-speech translation during calls. Optimized machine learning models keep the system efficient and accessible, providing a practical solution for inclusive communication. The project belongs to the domain of hand gesture recognition for sign language translation, where automated systems that interpret sign-language gestures enable fluid communication with people who are deaf or hard of hearing. Hand gesture recognition combines image processing and machine learning algorithms to capture, understand, and translate signs into text or spoken language. Over time, these systems have evolved from traditional rule-based and statistical methods to more advanced models, especially deep learning, that achieve greater accuracy and more reliable recognition.
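The capture-understand-translate pipeline described above can be illustrated with a minimal sketch. This is not the project's actual implementation: it assumes hand landmarks have already been extracted from a camera frame (for example, by a hand-tracking detector, which is not shown) and substitutes a toy nearest-neighbor classifier with two hypothetical gesture templates for the trained deep learning model a real system would use.

```python
import math

# Hypothetical gesture templates: label -> flat vector of normalized
# landmark coordinates. A real system would learn these from data.
TEMPLATES = {
    "HELLO": [0.1, 0.9, 0.5, 0.8, 0.9, 0.9],
    "THANKS": [0.2, 0.1, 0.5, 0.2, 0.8, 0.1],
}

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(features):
    """Return the label of the stored template closest to the input pose."""
    return min(TEMPLATES, key=lambda label: euclidean(TEMPLATES[label], features))

# A pose close to the "HELLO" template is recognized as that gesture;
# the resulting text could then be passed to a text-to-speech engine.
print(classify([0.12, 0.88, 0.52, 0.79, 0.9, 0.91]))  # prints HELLO
```

The sketch shows only the recognition step of the pipeline; in the full system the emitted label would feed the text display and the speech-synthesis stage used during voice calls.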