Gesture Vocalizer for Deaf and Dumb

DOI : 10.17577/IJERTCONV10IS11143


Rajeshwari H M

Department of ECE Jain Institute of Technology Davangere, Karnataka, India

Basavaraj N V

Department of ECE Jain Institute of Technology Davangere, Karnataka, India

Kiran S Terdal

Department of ECE Jain Institute of Technology Davangere, Karnataka, India

Nandeesh B S

Department of ECE Jain Institute of Technology Davangere, Karnataka, India

Manoj G

Department of ECE Jain Institute of Technology Davangere, Karnataka, India

Abstract – People work together to share their views, ideas, and knowledge with those around them. However, it is not the same for the deaf and dumb. People who are deaf or mute communicate with others via sign language. A sign language recognition system facilitates communication between people with speech impairments and non-speech-impaired people, thus bridging the communication gap. Hand gestures are more crucial than other types of gestures (arm, face, head, and body) as they convey the user's intent in a short amount of time. The flex sensor converts bending into resistance: the more the sensor bends, the higher its resistance. The accelerometer measures the displacement of the hand. The values of these sensors are converted to digital data and processed by the microcontroller, and the results are transmitted to the output device (phone) via the Bluetooth module (HC-05).

Keywords: Gestures, flex sensors, accelerometer, Bluetooth module, Arduino UNO.


    With a population of about 7.6 billion people, communication today is a strong means for understanding each other. But people with speech impairments need a special skill called sign language to communicate with the rest of the world. They find it hard to be involved in a society where a large portion of the population does not understand sign language. For people who are unable to hear or speak, sign language is the most convenient and simple way to communicate. People without these disabilities rarely learn sign language in order to communicate with people who are deaf or dumb. As a result, the deaf and dumb community is isolated. This device will help the deaf and dumb community communicate with the rest of the world using acoustic sounds. Indian Sign Language (ISL) uses both hands to represent different alphabets [1], numbers, and words. Most researchers in this field focus on American Sign Language (ASL) recognition, because most ASL signs use only one hand and therefore have a lower level of complexity. In recent years, more and more researchers have begun working on ISL. The proposed system can recognize numerous alphabets, numbers, and words, which provides a wide range of gestures to the user.


    • To help the deaf and dumb community

    • To read hand gestures accurately using the sensors

    • To compare these hand gesture values with the standard values in the code

    • To transfer the gesture's equivalent speech and text data to the output device via the Bluetooth module

    • To convert customizable words, symbols, numbers, and sentences into speech and text using the Bluetooth text-to-speech converter application on the phone


    Over the years, various attempts have been made to close the communication gap between the deaf-mute community and the general public. Many of these attempts involve translating a hand sign into a voice signal. This paper presents brief summaries of various previous efforts to develop gloves that translate hand gestures using a variety of technologies.

    Sanish Manandhar et al. [1] proposed a Hand Gesture Vocalizer for Dumb and Deaf People. Using components such as an Arduino Mega, flex sensors, and an accelerometer, the system translates a given sign used by a disabled person into its proper written, aural, and graphical form, which can be comprehended by a common person. On the present training model, the system employs the Random Forest algorithm to predict the correct output with an accuracy of 85 percent.

    Mali Pooja Dadaram et al. [2] proposed Sign Language to Speech Conversion Gloves using Arduino and Flex Sensors. The proposed system uses a microcontroller (Arduino UNO) and a flex-sensor-based data glove. While the data is being transferred, an LED lights up. The flex sensors are inserted inside the glove and produce a change in resistance for each bend, measuring the shape of the hand. The controller processes these hand signals, compares each gesture with stored values, and generates output in the form of audio and LCD text.

    Mangesh T Nikam et al. [3] proposed Talking Hands for Deaf and Dumb People. The system consists of a 3-axis accelerometer sensor mounted on the hand/head of a disabled person. Movement of the hand/head is sensed from the accelerometer by a microcontroller with the help of an ADC. Specific words are assigned to each significant movement of the hand/head. The microcontroller reads the movement and commands a voice IC to speak out the pre-recorded message assigned to that particular movement.

    Kshirasagar Snehal P et al. [4] proposed a gesture vocalizer for the deaf and dumb. The system consists of a flex sensor, accelerometer, speech synthesizer, speaker, and LCD, all integrated with an AVR microcontroller (ATmega16). The glove is equipped with flex sensors and an accelerometer. The AVR microcontroller compares the sensor values with the stored values and selects the nearest match, i.e. the description of the action, which is displayed as text on the LCD and spoken over the speaker.

    Supriya Shevate et al. [5] proposed a Gesture Based Vocalizer for Deaf and Dumb: a gesture-based vocalizer that detects the gestures of deaf people, turns them into voice, and shows them on an LCD screen. An ARM7 controller handles all the sensors and the speech synthesizer; the data gloves contain two types of sensors, flex sensors and an accelerometer used as a tilt sensor.

    Boppana, Lakshmi et al. [6] proposed an Assistive Sign Language Converter for Deaf and Dumb. This device allows a person to interact using gestures and identifies which gesture was made. The controller of this assistive device processes gesture images using a variety of image processing techniques and deep learning models to identify the sign. The recognized sign is converted to real-time speech using a text-to-speech module.

    Khan, Mubashir et al. [7] proposed "SignTalk and animator for speech and hearing impaired."

    Akthar, S. Javeed et al. [8] presented a survey on communication among blind, deaf, and dumb people using a smart glove.

    Kakde, Manisha U et al. [11] presented a review paper on sign language recognition systems for deaf and dumb people using image processing. It contains a review of different sign acquisition methods and methods for sign identification systems.


    Fig 1: Block diagram for gesture vocalizer

    The system receives the gesture as input, which is measured by four flex sensors and an accelerometer sensor. The flex sensors measure the bending of the fingers, and the accelerometer measures the displacement and position of the hand in x, y, and z coordinates in analog form. These values are fed to the Arduino UNO, which has a built-in ADC. The resulting digital values are compared with the values stored in the predefined code; if a match is found, the gesture is identified and the corresponding text data is transferred to the Bluetooth module, which transmits the data wirelessly to the output device (phone). A Bluetooth text-to-speech converter application converts the text into speech. Both text and speech are given as output, as shown in Fig 1.


    Fig 2: Flow chart for working of gesture vocalizer



    Fig 3: Arduino UNO R3

    The Arduino UNO is a microcontroller board based on the ATmega328P microcontroller. It includes 14 digital input/output pins (six of which can be used as PWM outputs), six analog inputs, a 16 MHz ceramic resonator (CSTCE16M0V53-R0), a USB connection, an ICSP header, and a reset button. It comes with everything you need to get started with a microcontroller, as shown in Fig 3.


    Fig 4: Flex Sensors

    Flex sensors are made of conductive ink, and their resistance varies with the bending value: they convert a change in bending into a change in resistance. The resistance of a flex sensor is directly proportional to how much it is flexed. They usually come as thin strips 2–3 inches in length, and the ideal resistance of each flex sensor may differ. A fixed resistor is placed in series with each flex sensor to obtain a decent variation in the measured voltage, as shown in Fig 4.
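The series-resistor arrangement forms a voltage divider, from which the sensor's resistance can be recovered. Below is a minimal sketch of that arithmetic, assuming a 5 V supply, a 10-bit ADC, the 47 kΩ series resistor mentioned in the hardware section, and the flex sensor on the VCC side of the divider (the exact orientation is our assumption, not stated in the paper):

```cpp
// Voltage divider: VCC -- flex sensor -- ADC node -- 47k resistor -- GND.
// From the ADC count we can recover the flex sensor's resistance.
const double VCC = 5.0;           // supply voltage (assumed)
const double R_SERIES = 47000.0;  // fixed series resistor, 47 kOhm
const int ADC_MAX = 1023;         // 10-bit ADC full scale

// ADC count -> voltage at the divider node
double adcToVolts(int count) { return VCC * count / ADC_MAX; }

// Node voltage -> flex resistance, from Vout = VCC * R_SERIES / (R_flex + R_SERIES)
double flexResistance(int count) {
    double v = adcToVolts(count);
    return R_SERIES * (VCC - v) / v;
}
```

With this orientation, more bend means higher resistance and therefore a lower ADC count, which is what the gesture thresholds would be tuned against.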


    Fig 5: Accelerometer (MPU 6050)

    The MPU-6050 is a capable motion-processing module. It combines a MEMS 3-axis gyroscope with a 3-axis accelerometer on the same silicon die, together with an onboard Digital Motion Processor capable of running a 9-axis MotionFusion algorithm. It is very accurate, as it contains a 16-bit analog-to-digital converter for each channel, and therefore captures the x, y, and z channels simultaneously [7]. The sensor uses an I2C bus to communicate with the Arduino. All the pins you need to get up and running are broken out, including the auxiliary I2C bus that allows the MPU-6050 to access external magnetometers and other sensors. An onboard LDO regulator allows you to connect the board to both 5 V and 3.3 V MCUs without level shifting, as shown in Fig 5.
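Each accelerometer channel arrives over I2C as a signed 16-bit sample split into two bytes. A small sketch of turning those raw bytes into g units, assuming the MPU-6050's default ±2 g full-scale range (datasheet sensitivity 16384 LSB/g); this is illustrative, not the paper's code:

```cpp
#include <cstdint>

// Sensitivity of the MPU-6050 accelerometer at the default +/-2 g range.
const double LSB_PER_G = 16384.0;

// Reassemble the high and low register bytes into one signed sample.
int16_t combineBytes(uint8_t high, uint8_t low) {
    return static_cast<int16_t>((high << 8) | low);
}

// Raw signed sample -> acceleration in g.
double rawToG(int16_t raw) { return raw / LSB_PER_G; }
```

In the firmware, the same conversion would be applied to each of the x, y, and z register pairs before comparing against stored orientation values.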


    Fig 6: Bluetooth Module (HC-05)

    The HC-05 is a Bluetooth module designed for wireless communication and can be configured as either master or slave. Any serial-enabled devices can be connected to each other using this Bluetooth module. The module operates at 3.3 V, but because it contains an onboard 5 V-to-3.3 V regulator, a 5 V supply can also be attached. No level shifting is needed on the HC-05's RX/TX lines, as they use a 3.3 V standard and the microcontroller can detect 3.3 V logic, as shown in Fig 6.



    Fig 7: Arduino IDE Interface

    The open-source Arduino Software (IDE) makes it easy to write code and upload it to the board, and can be used with any Arduino board. A UART TTL-serial connection is available on digital pins 0 (RX) and 1 (TX) of the Arduino UNO's ATmega328. The Arduino software has a serial monitor that allows easy data entry and display. When data is transmitted via USB, the two on-board RX and TX LEDs flash. To communicate with personal computers, the boards provide serial ports, including Universal Serial Bus (USB), among a variety of other features. The microcontrollers are programmed using features of the C and C++ programming languages. The Arduino project provides an integrated development environment (IDE) based on the Processing language project, in addition to a chain of the relevant build tools, as shown in Fig 7.


    An Arduino UNO R3, four flex sensors, an accelerometer, and a Bluetooth module are all attached directly to the gesture vocalizer glove. The left pin of each flex sensor is connected to a common ground, and the right pins are connected to 47 kΩ resistors whose other ends are connected to the common VCC. The Bluetooth module's TX and RX pins are connected to the RX and TX pins of the Arduino, and the MPU-6050 accelerometer's SCL and SDA pins are connected to the A5 and A4 pins of the Arduino, as shown in Fig 8.

    Fig 8: Hardware part of the gesture vocalizer device
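The per-cycle sensor read implied by this wiring can be sketched as below. This is a desktop-runnable illustration, not the actual firmware: analogRead() is stubbed with fixed values here, and the analog pin assignments A0–A3 for the four flex sensors are our assumption, since the paper does not list them:

```cpp
// Stub readings standing in for the four flex-sensor voltage-divider nodes.
int fakeReadings[4] = {710, 300, 512, 820};

// Stub for the Arduino analogRead() API; on the real board this would
// sample the flex sensors wired to analog pins A0-A3.
int analogRead(int pin) {
    return fakeReadings[pin];
}

// Read all four flex sensors into a buffer, as loop() would each cycle
// before comparing the values against the stored gesture table.
void readFlexSensors(int out[4]) {
    for (int pin = 0; pin < 4; ++pin)
        out[pin] = analogRead(pin);
}
```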

    To convert the text into speech, a Bluetooth text-to-speech converter application is used. The Bluetooth module appears as HC-05 when selecting a Bluetooth device. Both text and speech output are produced whenever a gesture is made, and the pitch, speed, and volume of the output voice can be customized, as shown in Fig 9.

    Fig 9: Bluetooth text to speech converter android application





    Sample output – Display: 1; Speech: "One"





    Table 1: ASL hand gestures output

    Table 2: Customized hand gestures output






The gesture vocalizer is an easy-to-use hand glove device that will help the deaf and dumb community communicate with the rest of the world using acoustic sounds. It receives a gesture as input and provides speech and text output on the Android application. Hand gestures are more crucial than other types of gestures (arm, face, head, and body) as they convey the user's intent in a short period of time. People without these disabilities rarely learn sign language in order to communicate with people who are deaf or dumb, and as a result, the deaf and dumb community is isolated. The gesture vocalizer will help this community engage with the rest of the world and make their lives much easier through the ability to communicate.


In the future, a dedicated application for the gesture vocalizer can be developed in which users can set their own gestures and shortcuts. Users can customize different output voices and can even set their own voice for the gestures. Personal voice assistants such as Alexa and Google Assistant can be integrated into the glove, which could also support basic home automation shortcuts such as switching lights on and off, increasing/decreasing the thermostat, etc.


[1] Manandhar, Sanish, Sushana Bajracharya, Sanjeev Karki, and Ashish Kumar Jha. "Hand Gesture Vocalizer for Dumb and Deaf People." SCITECH Nepal 14, no. 1 (2019): 22-29.

[2] Mali Pooja Dadaram, Gosavi Deepali Balu, Sonawale Rutuja Ramesh, Prof. S.N.Wangikar. Sign Language to Speech Conversion Gloves using Arduino and Flex Sensors. International Research Journal of Engineering and Technology (IRJET) Volume: 07 Issue: 04 | Apr 2020

[3] Nikam, Mangesh T., S. R. Ghodunde, K. V. Sonawane, and V. R. Gawade. "Talking Hands for Deaf and Dumb People." International Journal of Scientific Engineering and Technology Research (2015): 565-568.

[4] Kshirasagar Snehal, P., Shaikh Mohammad Hussain, S. Malge Swati,

S. Gholap Shraddha, and Mr Swapnil

Tambatkar."INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY GESTURE VOCALIZER FOR DEAF AND DUMB." [Ashwini*, 5(4): April 2016] [5] Supriya Shevate , Nikita Chorage , Siddhee Walunj, Moresh M. Mukhedkar. Gesture Based Vocalizer for Deaf and Dumb

IJARCCE Vol. 5, Issue 3, March 2016

[6] Boppana, Lakshmi, Rasheed Ahamed, Harshali Rane, and Ravi Kishore Kodali. "Assistive Sign Language Converter for Deaf and Dumb." In 2019 International Conference on Internet of Things (iThings) and IEEE Green Computing and Communications (GreenCom) and IEEE Cyber, Physical and Social Computing (CPSCom) and IEEE Smart Data (SmartData), pp. 302-307. IEEE, 2019.

[7] Khan, Mubashir, and Abdus Salam Shaikh. "SignTalk and animator for speech and hearing impaired." (2017).

[8] Akthar, S. Javeed, P. Yuvaraju, M. Ramanjaneyulu, and S. Premkumar. "Survey on communication among blind, deaf and dumb people using a smart glove." (2018).

[9] Sapkota, Bijay, Mayank K. Gurung, Prabhat Mali, and Rabin Gupta. "Smart glove for sign language translation using Arduino." In 1st KEC Conference Proceedings, vol. 1, pp. 5-11. 2018.

[10] Yudhana, Anton, J. Rahmawan, and C. U. P. Negara. "Flex sensors and MPU6050 sensors responses on smart glove for sign language translation." In IOP conference series: materials science and engineering, vol. 403, no. 1, p. 012032. IOP Publishing, 2018.

[11] Kakde, Manisha U., Mahender G. Nakrani, and Amit M. Rawate. "A review paper on sign language recognition system for deaf and dumb people using image processing." International Journal of Engineering Research & Technology (IJERT) 5, no. 03 (2016).

[12] Ail, Suyash & Chauhan, Bhargav & Dabhi, Harsh & Darji, Viraj & Bandi, Yukti. (2020). Hand Gesture-Based Vocalizer for the Speech Impaired. 10.1007/978-981-15-1002-1_59.

[13] Sethi, Ashish, S. Hemanth, Kuldeep Kumar, N. Bhaskara Rao, and R. Krishnan. "Signpro-An application suite for deaf and dumb." IJCSET 2, no. 5 (2012): 1203-1206.

[14] Bajpai, Dhananjai, Uddaish Porov, Gaurav Srivastav, and Nitin Sachan. "Two way wireless data communication and american sign language translator glove for images text and speech display on mobile phone." In 2015 Fifth International Conference on Communication Systems and Network Technologies, pp. 578-585. IEEE, 2015.

[15] Loke, Pranali, Juilee Paranjpe, Sayli Bhabal, and Ketan Kanere. "Indian sign language converter system using an android app." In 2017 International conference of Electronics, Communication and Aerospace Technology (ICECA), vol. 2, pp. 436-439. IEEE, 2017.