A Real-Time System to Bridge Communication Gaps Between Deaf and Hearing Communities using Machine Learning and Computer Vision

Authors

Alex Zhu¹ and Carlos Gonzalez², ¹USA, ²California State Polytechnic University, USA

Abstract

Communication barriers between the deaf and hearing communities remain a significant challenge due to the lack of widespread knowledge of American Sign Language (ASL) [1]. Motivated by a personal experience at a Boy Scouts summer camp, I developed a real-time ASL translation app to bridge this gap [2]. The app leverages Google's MediaPipe for precise hand landmark detection and the PointNet model for gesture recognition, translating ASL letters into text in real time [3]. Built with Flutter and Dart for a seamless cross-platform experience, the app integrates a Flask-based backend for efficient processing. Key challenges, including environmental variability and computational efficiency, were addressed through data augmentation, model optimization, and extensive testing. Experiments demonstrated high accuracy and usability, validating the app's effectiveness across diverse real-world scenarios. Future plans include expanding recognition to full ASL sentences, integrating text-to-speech functionality, and leveraging cloud storage for scalability. This project exemplifies how technology can foster inclusivity, creating a practical tool to empower communication and bridge societal gaps.
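For illustration, the following is a minimal sketch of the landmark-extraction step described above, assuming MediaPipe's Python Hands API; it is not the authors' code. The classify_letter function stands in for the PointNet-based recognizer and is a hypothetical placeholder.

```python
# Minimal sketch (not the authors' implementation): extract 21 hand
# landmarks per frame with MediaPipe and pass the point set to a
# classifier. classify_letter is a hypothetical stand-in for the
# PointNet-based gesture model described in the paper.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=1)

def landmarks_from_frame(frame_bgr):
    """Return a list of 21 (x, y, z) hand landmarks, or None if no hand is found."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    results = hands.process(rgb)
    if not results.multi_hand_landmarks:
        return None
    hand = results.multi_hand_landmarks[0]
    return [(lm.x, lm.y, lm.z) for lm in hand.landmark]

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    points = landmarks_from_frame(frame)
    if points is not None:
        letter = classify_letter(points)  # hypothetical PointNet-style model
        print(letter)
cap.release()
hands.close()
```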

Keywords

ASL Recognition, Real-Time Translation, Machine Learning, Inclusive Communication