
Cite this article:
Karan Kumar Giri, Abhishek Singh, Mohammad Intiyaj Alam, Basanta Mahato (2023). The Importance and Challenges of Sign Language Translator - A Review. Spectrum of Emerging Sciences, 3(1), 36-38. DOI: 10.55878/SES2023-3-1-8




1. INTRODUCTION

Sign language is a visual means of conveying messages for deaf and mute people. It combines gestures, orientations, and movements of the hands, arms, or body with facial expressions [1]. Like spoken languages, sign languages vary with region and culture; standard sign languages in use around the world include Indian Sign Language (ISL) and American Sign Language (ASL). As the author Paul J. Meyer put it, "Communication - the human connection - is the key to personal and career success" [2]. Communication allows others and ourselves to understand information more accurately and quickly, but because most people cannot understand sign language, communication between deaf or mute individuals and the hearing population is difficult. This project offers one solution for improving that communication. In the current digital era, a mobile application is a convenient solution for most users, so we built a working mobile application using the machine learning and image processing capabilities of the TensorFlow library [3]. The user captures an image as input and receives output in the form of text and audio. This eases communication for these users, and in some cases the application's Frequent Phrases feature removes the need to capture photos at all: the user only needs to press a single button [4].

2. BACKGROUND

In the past, communication between deaf and hearing individuals was often limited to written notes or gestures. However, with the advent of new technology, such as the Sign Language Translator, communication has become more accessible and inclusive for the deaf community. The Sign Language Translator uses advanced artificial intelligence and computer vision algorithms to recognize and interpret signs and gestures[5].

3. METHODOLOGY 

To develop the Sign Language Translator, researchers used machine learning algorithms to train the system to recognize and translate signs and gestures from multiple sign languages, including American Sign Language (ASL) and British Sign Language (BSL). The system consists of a camera or sensor that captures the signer's movements and converts them into data. Machine learning algorithms then process this data to identify the signs and gestures used in the language, and the system translates the recognized signs and gestures into text.
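The recognition stage described above - features extracted from the captured movement, then matched against trained examples - can be sketched with a minimal nearest-neighbour classifier. This is only an illustration of the pipeline, not the system's actual model: the feature vectors and labels below are hypothetical placeholders, and a real system would use a trained neural network over much richer features.

```python
import numpy as np

# Hypothetical training data: each gesture reduced to a small feature
# vector (e.g. normalized hand-landmark coordinates). The vectors and
# labels are illustrative placeholders, not real ASL/BSL data.
TRAINING_FEATURES = np.array([
    [0.1, 0.9, 0.2, 0.8],   # features for the sign "A"
    [0.8, 0.1, 0.7, 0.2],   # features for the sign "B"
    [0.5, 0.5, 0.5, 0.5],   # features for the sign "C"
])
TRAINING_LABELS = ["A", "B", "C"]

def classify_gesture(features):
    """Return the label of the closest stored gesture (1-nearest-neighbour)."""
    distances = np.linalg.norm(TRAINING_FEATURES - np.asarray(features), axis=1)
    return TRAINING_LABELS[int(np.argmin(distances))]

print(classify_gesture([0.12, 0.88, 0.22, 0.79]))  # closest to "A"
```

The same capture-features-classify-output structure holds whatever classifier is plugged into the middle step.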


Fig. 1- Block Diagram

4. PROJECT OVERVIEW

The initiative's primary goal is to improve the deaf community's ability to engage and communicate with the people around them. The intention is to recognize the 26 basic characters of the English alphabet, transcribe them to text, and display them on a smartphone.

Fig. 2- The fingers, thumb, and palm bends are detected

The concept of using hand motions to control a robotic arm served as the inspiration for the project [6]. Most of the work is straightforward, but putting the remaining portions into practice is a challenging undertaking. An accelerometer measures the tilt of the palm. Four bend sensors are located on the fingers of a glove, and one on the thumb; together these sensors detect the bends of the fingers, thumb, and palm. The Arduino Nano microcontroller uses the bend-angle values to identify the set of values that belongs to each symbol, then sends the corresponding result over Bluetooth to the Android app, which displays the generated symbol.
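The matching step on the Arduino Nano - checking which symbol's set of bend-angle values the current readings fall into - can be approximated with a simple range-table lookup. The sketch below is a Python simulation of that logic, not the firmware itself; the two symbols and their calibrated ranges are hypothetical, standing in for the per-symbol values established during calibration.

```python
# Illustrative mapping from five bend-sensor readings (thumb plus four
# fingers, in degrees) to a symbol. The ranges are hypothetical; the
# real ranges would be fixed during glove calibration.
SYMBOL_TABLE = {
    "A": [(60, 90), (60, 90), (60, 90), (60, 90), (0, 30)],   # closed fist, thumb extended
    "B": [(0, 30), (0, 30), (0, 30), (0, 30), (60, 90)],      # flat hand, thumb folded
}

def match_symbol(readings):
    """Return the first symbol whose per-sensor ranges contain every reading."""
    for symbol, ranges in SYMBOL_TABLE.items():
        if all(lo <= r <= hi for r, (lo, hi) in zip(readings, ranges)):
            return symbol
    return None  # no symbol matched the current hand posture

print(match_symbol([75, 80, 70, 65, 10]))  # falls inside every range for "A"
```

On the real device, the matched symbol (or a no-match code) would be the value sent over Bluetooth to the Android app.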

While working on this project, a significant problem arises every time the glove is put on: it requires constant calibration. In addition, whenever the wearer changes, the glove must be recalibrated and checked against that person's hand and gestures.
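One common way to handle this kind of per-wearer recalibration is to record each sensor's minimum and maximum flex for the current hand and normalize raw readings into a fixed range. The routine below is a hedged sketch of that idea, not the project's actual calibration procedure, and the raw readings used are hypothetical.

```python
def calibrate(raw_min, raw_max):
    """Build a normalizer mapping one wearer's raw sensor reading into [0, 1],
    given that wearer's recorded minimum and maximum flex readings."""
    span = raw_max - raw_min
    return lambda raw: max(0.0, min(1.0, (raw - raw_min) / span))

# Hypothetical readings from a quick per-user calibration step: the wearer
# holds the hand flat (minimum flex), then makes a fist (maximum flex).
normalize_index = calibrate(raw_min=120, raw_max=540)
print(round(normalize_index(330), 2))  # mid-flex reading maps to 0.5
```

With readings normalized per wearer, the symbol ranges can be defined once in the normalized scale instead of being re-tuned for every hand.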

Accuracy was improved by routinely refreshing the data set for each sign.

For now, we have implemented a few gestures in a prototype to demonstrate that the project works; with further study, the problems described above can be addressed.

5. RESULTS

The Sign Language Translator has many potential applications, including in education, healthcare, and business settings. In education, the system can help teachers communicate with deaf students and make their lessons more accessible. In healthcare, the system can help medical professionals communicate with deaf patients and provide them with better care. In business settings, the system can help deaf individuals communicate with hearing colleagues and clients.

6. BENEFITS

The Sign Language Translator can improve communication between deaf and hearing individuals, making the world more inclusive for the deaf community. The technology can also help break down language barriers and promote equal access to education, healthcare, and employment opportunities.

7. CHALLENGES

The Sign Language Translator technology is still in its early stages and faces several challenges. One of the major challenges is the variability of sign languages across regions and cultures. Different sign languages have different signs and grammar rules, making it difficult to develop a universal system that can recognize and translate all sign languages accurately.

8. CONCLUSION

The Sign Language Translator is a promising technology that has the potential to improve the lives of deaf individuals by making communication more accessible. While there are challenges to overcome, the continued development and refinement of this technology can lead to a more inclusive and accessible world for all.


