Design of Smart Gloves

DOI : 10.17577/IJERTV3IS110222


Ms. Pallavi Verma, M.E. Scholar, EE, NITTTR Chandigarh
Mrs. Shimi S.L., Assistant Professor, EE, NITTTR Chandigarh
Dr. S. Chatterji, Professor and Head, EE, NITTTR Chandigarh

Abstract: Communication is the medium through which we share our thoughts and convey messages, but a person with a hearing and speech disability faces difficulty in communicating with a normal person. Because of this, a person who lacks the ability to hear and speak is not able to compete on equal terms. Communication for a person who cannot hear is visual, not auditory. Deaf and dumb people generally use sign language, but they find it difficult to communicate with others who do not understand sign language, so a barrier stands between these two communities. This work aims to lower that barrier. The main aim of the proposed project is to develop a cost-effective system that can give voice to voiceless persons with the help of Smart Gloves. With the Smart Gloves, communication will no longer be a barrier between the two communities, disabled persons will get the chance to grow in their respective careers, and the use of such devices by disabled persons also helps the nation grow.

Keywords: Sign Language, Gesture Recognition system, Flex Sensors.


India has a deaf and dumb population of 2.4 million, which accounts for 20% of the world's deaf and dumb population. These persons lack the amenities that a normal person enjoys. The major reason behind this is the lack of communication, as deaf people are unable to hear and dumb people are unable to speak. Fig. 1 shows a survey analysis [1].

Fig. 1: Deaf and Dumb Work Survey (bar chart comparing the total, literate, graduate, and employed deaf and dumb population)

This low ratio of literate and employed deaf and dumb persons results from the physical disability of hearing for deaf people and of speaking for dumb people, which leads to a lack of communication between normal persons and deaf and dumb persons. It is essentially the problem of two persons who speak two different languages with no common language between them: talking to each other becomes difficult, and a human translator is required, which may not always be convenient to arrange. The same kind of problem occurs between a normal person and a deaf person, or a normal person and a dumb person [2-4]. To overcome this problem, we introduce a unique application. Our application model is an interpreter that translates in both directions: natural English sentences, entered as text by a normal person, are presented to the deaf person, while sign-language gestures made by a dumb person are translated into the corresponding synthesized English words and delivered as audio output for the normal person. This will help the normal and the deaf and dumb communities by removing the communication gap between them [5-8].

Sign language is an important, and often the only, method of communication for deaf and dumb persons. It is a formal language employing a system of hand gestures for communication by the deaf. Sign language symbols are shown in Fig. 2 [9]. In this project the flex sensor plays the major role: sensors are placed on the fingers, and as a finger bends, the sensor's resistance changes depending on the amount of bend [11].

Fig. 2 Sign Language Symbols



The main aim of the proposed project is to develop a cost-effective system that can give voice to voiceless persons with the help of Smart Gloves. Using the Smart Glove enables a deaf person to communicate with others, which helps to bridge the gap between persons with disability and normal persons; the problems faced by deaf persons regarding employment can also be overcome by this method. In the proposed work, an intelligent microcontroller-based system using flex sensors will be developed, which is able to:

  • Receive its instructions from the gesture recognition system built around flex sensors.

  • Recognize a gesture with a cost-effective microcontroller-based system, convert it into coded form, and display it when the code matches one of the predefined codes.

  • Accept text messages typed by a normal person on a keyboard.

The wireless arrangement makes the device more comfortable for the disabled person to use. Wireless transmission and reception of signals are done with the help of an RF transceiver.


    1. Flex Sensor

      Signed letters are determined using a flex sensor on each finger. The flex sensors change their resistance based on the amount of bend in the sensor, as shown in Fig. 7 [12]. As a variable printed resistor, the flex sensor achieves a great form factor on a thin flexible substrate. When a sensor placed in the glove is bent, it produces a resistance output correlated to the bend radius: the smaller the radius, the higher the resistance value. The sensors require a 5 V input and output between 0 and 5 V. They are connected to the device via three-pin connectors (ground, supply, and output). In the device, the sensors support a sleep mode, which powers them down when not in use [13-14].
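The bend-to-resistance relation described above can be sketched as a simple calibration function. The endpoint resistances below are assumed typical values for illustration only (the paper does not give them); a real sensor must be calibrated per finger.

```python
# Sketch: map a flex-sensor resistance reading to an approximate bend angle.
# R_FLAT and R_BENT are assumed illustrative values, not figures from the paper.
R_FLAT = 25_000.0   # ohms with the finger straight (assumption)
R_BENT = 100_000.0  # ohms at a 90-degree bend (assumption)

def bend_angle(resistance_ohms: float) -> float:
    """Linearly interpolate the bend angle (degrees) from resistance,
    clamped to the 0-90 degree range."""
    frac = (resistance_ohms - R_FLAT) / (R_BENT - R_FLAT)
    return max(0.0, min(90.0, frac * 90.0))

print(bend_angle(25_000))   # straight finger -> 0.0
print(bend_angle(62_500))   # halfway -> 45.0
```

A linear fit is the simplest choice; real flex sensors are only approximately linear, so a lookup table per finger would be a natural refinement.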

      (Circuit sketch: an impedance buffer with Vout = Vin at the non-inverting input)

      Fig. 5 Basic Flex Circuit

      Figure 5 shows the basic flex sensor circuit, in which two or three sensors are connected. The outputs from the flex sensors are fed into op-amps in a non-inverting configuration to buffer and amplify their voltage [16]. The greater the degree of bending, the lower the output voltage.

      By the voltage divider rule, the output voltage is

      Vout = Vin · R1 / (R1 + R2),

      where R1 is the fixed input resistor at the non-inverting terminal and R2 is the flex sensor's resistance.
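The divider relation can be checked numerically. The 47 kΩ fixed resistor below is an assumed example value; the 5 V supply is the one stated in the text.

```python
# Sketch of the voltage-divider relation: Vout = Vin * R1 / (R1 + R2),
# with R1 a fixed resistor and R2 the flex sensor. As bending raises R2,
# Vout falls, matching the behavior described in the text.

def divider_vout(vin: float, r1: float, r2: float) -> float:
    """Divider output voltage for supply vin, fixed r1, sensor r2 (ohms)."""
    return vin * r1 / (r1 + r2)

# 5 V supply, assumed 47 kOhm fixed resistor:
flat = divider_vout(5.0, 47_000, 25_000)   # low sensor resistance -> higher Vout
bent = divider_vout(5.0, 47_000, 100_000)  # high sensor resistance -> lower Vout
print(round(flat, 2), round(bent, 2))
```

The key property is monotonicity: more bend means higher R2 and thus lower Vout, which is what the ADC stage later digitizes.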



      Fig. 4 Flex Sensor Offers Variable Resistance Reading

      Fig. 6 Characteristics: (1) Resistance vs. Bending, (2) Voltage vs. Resistance

      Fig. 7 Glove with Flex Sensors

    2. PIC Microcontroller

      The microcontroller is the heart of the device. It stores the required data and makes use of it whenever the person uses the device. The device helps a deaf and dumb person announce their requirement, so that a person nearby can understand their need and help them. PIC microcontrollers can be programmed in Assembly, C, or a combination of the two [19-21].

      Other high-level programming languages can be used, but embedded systems software is primarily written in C. All signals generated by the flex sensors are analog, and these signals must be digitized before they can be passed to the encoder. Therefore the PIC16F877A microcontroller is used as the main controller in this project.


      After the gesture recognition stage, the data is sent to the voice section, where it is matched against the stored data. If it matches, the result is given to the speaker and the display system [30-31].


      In this project a data glove is implemented to capture the hand gestures of a user; the glove is aimed at converting gestures into voice. The sensors in the Smart Glove capture the movement of the user, and the analog input is converted into a digital output using the voltage divider rule. The movement data is then given to the microcontroller for further processing, and the resulting gesture array is transmitted using an RF transmitter and receiver. At the receiver, recognized gestures are matched against the pre-stored data, and on a match the output is given to the speaker by the voice section.
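The "gesture array" mentioned above can be sketched as one bent/straight flag per finger derived from the digitized readings. The 10-bit scale and the midpoint threshold are assumptions for illustration, not values given in the paper.

```python
# Sketch: reduce five digitized finger readings to a gesture array of
# bent/straight flags before transmission. Since bending lowers the divider
# output voltage, a low ADC count means a bent finger.

BENT_THRESHOLD = 512  # midpoint of an assumed 10-bit ADC range

def gesture_array(adc_readings):
    """Return one flag per finger: 1 = bent, 0 = straight."""
    return [1 if r < BENT_THRESHOLD else 0 for r in adc_readings]

print(gesture_array([100, 900, 880, 120, 90]))  # -> [1, 0, 0, 1, 1]
```

A single threshold per finger keeps the transmitted frame small; finer gesture sets would quantize each finger into more than two levels.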



      The PIC16F877A has an inbuilt ADC module, which digitizes all analog signals from the sensors, and an inbuilt multiplexer for sensor signal selection. It supports both serial and parallel communication [22].
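The digitization the ADC performs can be modeled directly. The PIC16F877A's ADC is 10-bit, so a 0-5 V input maps to counts 0-1023; the 5 V reference matches the supply stated for the sensors.

```python
# Sketch of 10-bit analog-to-digital conversion over a 0-5 V range,
# as done by the PIC16F877A's inbuilt ADC module.

VREF = 5.0     # ADC reference voltage (the sensors' 5 V supply)
LEVELS = 1024  # 2**10 quantization levels for a 10-bit converter

def adc_count(voltage: float) -> int:
    """Quantize an analog voltage to a 10-bit count (0..1023)."""
    voltage = max(0.0, min(VREF, voltage))       # clamp to the input range
    return min(LEVELS - 1, int(voltage / VREF * LEVELS))

print(adc_count(2.5))  # mid-scale -> 512
print(adc_count(5.0))  # full-scale -> 1023
```

Each count therefore represents about 4.9 mV, which sets the resolution with which finger bend can be distinguished.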


      (Transmitter section of Fig. 8: Flex Sensors → Microcontroller → Encoder → RF Transmitter)

      The output from the PIC microcontroller is encoded by the encoder. The programmed address/data bits are transmitted together with the header bits via the RF link; this encoding is used to correct errors at the receiver end, if any occurred in transit. In the receiver, the frame is decoded by the decoder [23-25].
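The encoder's framing can be sketched as follows. The 4-bit header pattern and the even-parity check bit are illustrative assumptions; they are not the actual format of any particular encoder IC, and a real encoder would use its own sync and address scheme.

```python
# Sketch: frame address/data bits with header bits and a check bit so the
# receiver can reject a corrupted frame. Header pattern and parity scheme
# are assumptions for illustration.

HEADER = [1, 0, 1, 0]  # assumed sync/header pattern

def encode(address, data):
    """Build a frame: header + 4 address bits + data bits + parity bit."""
    parity = sum(address + data) % 2  # even-parity check bit
    return HEADER + address + data + [parity]

def decode(frame):
    """Return the data bits, or None if the frame fails its checks."""
    if frame[:4] != HEADER:
        return None                   # wrong or missing header
    body, parity = frame[4:-1], frame[-1]
    if sum(body) % 2 != parity:
        return None                   # corrupted in transit
    return body[4:]                   # strip the 4 address bits

frame = encode([0, 0, 0, 1], [1, 0, 1, 1])
print(decode(frame))  # -> [1, 0, 1, 1]
```

A single parity bit only detects odd numbers of bit flips; stronger schemes (checksums, repetition as in common RF encoder/decoder pairs) trade bandwidth for reliability.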


      The gesture manager is the principal part of the recognition system. It contains stored data to match against the incoming data: the system tries to match the incoming data with an existing posture.


      (Receiver section of Fig. 8: RF Receiver → Decoder → Gesture Recognition → Voice Section)

      The bend values of the fingers are read, and for each posture definition the distance to the current data is calculated. The position/orientation data is then compared in a likewise manner [26-29].
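The distance-based matching step can be sketched as a nearest-posture search. The posture definitions, bend values, and acceptance threshold below are assumed sample values, not the paper's actual gesture set.

```python
# Sketch of the gesture manager's matching: compute the distance from the
# incoming bend values (degrees per finger) to each stored posture
# definition and accept the nearest one if it is close enough.

POSTURES = {          # assumed sample postures: bend per finger, degrees
    "hello": [80, 5, 5, 5, 5],
    "yes":   [80, 80, 80, 80, 80],
    "no":    [5, 5, 5, 5, 5],
}
MAX_DISTANCE = 60.0   # acceptance threshold (assumption)

def match_posture(bend_values):
    """Return the name of the nearest stored posture, or None if no
    posture is within the acceptance threshold."""
    def dist(a, b):  # Euclidean distance between two bend vectors
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    name, d = min(((n, dist(bend_values, p)) for n, p in POSTURES.items()),
                  key=lambda t: t[1])
    return name if d <= MAX_DISTANCE else None

print(match_posture([78, 8, 4, 6, 5]))  # close to "hello"
```

Rejecting readings that are far from every stored posture prevents the voice section from speaking a word for an unintended hand position.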

      Fig. 8 Block Diagram of Smart Gloves



        Fig. 9 Flowchart of Smart Gloves: switch on the system; check whether there is any change in the resistance of a flex sensor; convert the analog signal into digital form; check whether the signal matches a predefined code; if so, drive the audio system/LCD. A keyboard provides the normal person's text input.
        • Disabled persons use these gloves to convert the signs they perform into speech.

        • With simple flex sensors, a user is able to interact with others in a more comfortable and easier manner. This makes it possible for users to interact not only within their own community but with others as well, and to live a normal life.

        • The end product will have a cheap and simple design, making it easy for users to interact with.

        • The system is capable of recognizing signs quickly; furthermore, a real-time recognition ratio of nearly 99% can be achieved.


    Sign language is a method of communication used by disabled persons. Here we convert sign language into text and speech so that communication is no longer limited to their own community; by using the data gloves, the communication barrier between the two communities is eliminated. With the data gloves, disabled persons, who number in the millions, can also grow in their careers and help the nation grow. Making their future better makes the nation better.



  1. Laura Dipietro, Angelo M. Sabatini and Paolo Dario A Survey of Glove-Based Systems and Their Applications, IEEE Transactions on Systems, Man and CyberneticsPart C: Applications and Review, Vol. 38, No. 4, pp. 461-482, July 2008.

  2. K.C. Sriharipriya, K.Aarthy, T.S.Keerthana, S.Menaga ,S. Monisha, Flex Sensor based Non-specific User Hand Gesture Recognition, International Journals of Innovative Research and Studies, Vol.2, No.5, pp.214-220, May 2013.

  3. Praveen Kumar S. Havalagi and Shruthi Urf Nivedita, The Amazing Gloves that give Voice to the Voiceless, International Journal of Advances in Engineering & Technology, Vol. 6, No. 1, pp. 471-480, March 2013.

  4. Olga Katzenelson and Solange Karsenty, A Sign-to-Speech Glove, IUI Workshop: Interacting with Smart Objects, Haifa, Israel, February 24, 2014.

  5. Shoaib Ahmed V., Hand Gesture Recognition and Voice Conversion System for Differentially Able Dumb People, thesis report, Department of Electronics and Communication, C. Abdul Hakeem College of Engineering and Technology, Melvisharam, Vellore, Tamil Nadu 632 509, India, 2012.

  6. Oya Aran, Ismail Ari, Lale Akarun, Bulent Sankur, Alexandre Benoit, Alice Caplier, Pavel Campr, Ana Huerta Carrillo and Francois-Xavier Fanard, SignTutor: An Interactive System for Sign Language Tutoring, IEEE Proceedings of 6th International Conference, Israel, Vol. 16, No. 1, pp. 81-93, November 2009.

  7. M. Ebrahim Al-Ahdal and Nooritawati Md Tahir, Review in Sign Language Recognition Systems IEEE Symposium on Computers & Informatics, USA, pp.52-57, May 2012.

  8. Carlos Pesqueira Fiel, Cesar Cota Castro, Victor Velarde Arvizu, Nun Pitalúa-Díaz, Diego Seuret Jiménez and Ricardo Rodríguez Carvajal, Design of Translator Glove for Deaf-Mute Alphabet, 3rd International Conference on Electric and Electronics (EEIC 2013), Mexico, pp. 01-07, November 2013.

  9. Xia Liu and Kikuo Fujimura, Hand Gesture Recognition using Depth Data Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition IEEE Computer Society ,Washington, DC, USA, pp. 529 534, 17-19 May 2004.

  10. Wen Gao, Gaolin Fang, Debin Zhao and Yiqiang Chen, Transition Movement Models for Large Vocabulary Continuous Sign Language Recognition, Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition, IEEE Computer Society, China, May 2004.

  11. Sylvie C.W. Ong and S. Ranganath, Automatic Sign Language Analysis: A Survey and the Future Beyond Lexical Meaning, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 27, No. 6, pp. 873-891, February 2005.

  12. Sushmita Mitra and Tinku Acharya, Gesture Recognition: A Survey, IEEE Transactions on Systems, Man and CyberneticsPart C: Applications and Review, Vol. 37, No. 3, pp. 311-324, May 2007.

  13. Ignazio Infantino, Riccardo Rizzo, and Salvatore Gaglio, A Framework for Sign Language Sentence Recognition by Commonsense Context, IEEE Transactions on Systems, Man, and Cybernetics-Part C: Applications and Review, Vol. 37, No.5, pp. 1034-1039, September 2007.

  14. Gaurav Pradhan , Balakrishnan Prabhakaran and Chuanjun Li, Hand-Gesture Computing for the Hearing and Speech Impaired IEEE, Vol.15, No.2, pp. 20-27, April – June 2008.

  15. Kaustubh Kalgaonkar and Bhiksha Raj, One- Handed Gesture Recognition using Ultrasonic Doppler Sonar , IEEE International Conference on Acoustics, Speech and Signal Processing, Taipei, pp. 1889-1892, 19-24 April 2009.

  16. Satjakarn Vutinuntakasame, V-ris Jaijongrak and Surapa Thiemjarus, An Assistive Body Sensor Network Glove for Speech- and Hearing- Impaired Disabilities, IEEE International Conference on Body Sensor Networks, Dallas, TX, pp. 7 12, 23-25 May 2011.

  17. Haoyun Xue and Shengfeng Qin, Mobile Motion Gesture Design for Deaf People, Proceedings of the 17th International Conference on Automation & Computing, University of Huddersfield Huddersfield, UK, pp. 46-50, 10 September 2011.

  18. Netchanok Tanyawiwat and Surapa Thiemjarus, Design of an Assistive CommunicationGlove using Combined Sensory Channels Proceedings of Ninth International Conference on Wearable and Implantable Body Sensor Networks, pp. 34- 39, London, 9-12 May 2012.

  19. Lingchen Chen, Feng Wang, Hui Deng and Kaifan Ji, A Survey on Hand Gesture Recognition, IEEE International Conference on Computer Sciences and Applications, Wuhan, China, pp. 313-316, 14-15 December.
  20. Yun Li, Xiang Chen, Jianxun Tian, Xu Zhang, Kongqiao Wang and Jihai Yang, Automatic Recognition of Sign Language Subwords based on Portable Accelerometer and EMG Sensors, IEEE Engineering in Medicine and Biology Society, Vol. 59, No. 10, pp. 2695-2704, October 2012.

  21. Ajinkya Raut, Vineeta Singh, Vikrant Rajput and Ruchika Mahale,Hand SignInterpreter, The International Journal of Engineering And Science (IJES), Vol. 1,No.2,pp. 19-25, December 5, 2012.

  22. Beifang Yi, Frederick C. Harris, Jr. and Sergiu M. Dascalu From Creating Virtual Gestures to Writing in Sign Languages, CHI Extended Abstracts, Portland, Oregon, USA, pp. 1885-1888, April 2-7 2005.

  23. Laura Dipietro, Angelo M. Sabatini and Paolo Dario A Survey of Glove-Based Systems and Their Applications, IEEE Transactions on Systems, Man and CyberneticsPart C: Applications and Review, Vol. 38, No. 4, pp. 461-482, July 2008.

  24. Ambika Gujrati, Kartigya Singh, Khushboo, Lovika Sora and Mrs. Ambikapathy, Hand-talk Gloves with Flex Sensor: A Review, International Journal of Engineering ScienceInvention ISSN, Vol. 2, No.4, pp. 43-46, April 2013.

  25. Subash chandar, Vignesh Kumar and Willson Amalraj Real Time Data Acquisition and Actuation via Network for Different Hand gestures using Cyber glove, International Journal of Engineering Research and Development ISSN, Vol. 2, No.1, pp. 50-54, July 2012

  26. David K. Sarji, HandTalk: Assistive Technology for the Deaf, Computer, Vol. 41, No. 7, pp. 84-86, July 2008.

  27. Sigal Berman, and Helman Stern, Sensors for Gesture Recognition Systems, IEEE Transactions on Systems, Man, and CyberneticsPart C: Applications and Review, Vol. 42, No.3, pp. 277-290, May 2012.

  28. Praveen Kumar S. Havalagi and Shruthi Urf Nivedita, The Amazing Gloves that give Voice to the Voiceless, International Journal of Advances in Engineering & Technology, Vol. 6, No. 1, pp. 471-480, March 2013.

  29. Vishal Nayakwadi and N. B. Pokale, Natural Hand Gestures Recognition System for Intelligent HCI: A Survey, International Journal of Computer Applications Technology and Research,Vol.3,No.1,pp. 10 19, 2014.

  30. Helen, Using Multiple Sensors for Mobile Sign Language Recognition, Proceedings of the 7th IEEE International Symposium on Wearable Computers, IEEE Computer Society, Washington, DC, USA, 2003.

  31. Nikhil T. Tarte, Sneha N. Bhadane and Purva S. Kulkarni, A Gesture Audio-Video Conferencing Application for the Ease of Communication between Normal Person and Deaf and Dumb Person, IJSETR, Vol. 3, No. 5, pp. 1487-1490, May 2014.
