A Study of Hand Gesture Technique to Communicate with a Robot

DOI : 10.17577/IJERTV4IS010047


Praveen Kumar Sharma,

Assistant Professor,

Department of Electronics and Communication,

B.K. Birla Institute of Engineering & Technology, Pilani, Rajasthan, India

    Rohan Sheoran, Student (BE),

    Department of Electronics & Communication Engineering,

B.K. Birla Institute of Engineering and Technology, Pilani, Rajasthan, India

    Dinesh Kumar Kathania, Student (BE),

    Department of Electronics and Communication Engineering,

    B.K. Birla Institute of Engineering and Technology, Pilani, Rajasthan, India

Abstract: Robots are the future of humankind. This paper presents a straightforward way to recognize hand gestures and bind them into a library for future use. It also reviews the background techniques used for hand gesture recognition, along with their advantages and disadvantages, and outlines future work on searching the library of pre-loaded hand gestures.

Keywords: Robot, HSV, Orientation Histogram Scheme, Library.

    1. INTRODUCTION

In the modern age, robots are one of the most active topics of research, as they can perform work with greater efficiency and precision than humans. Being robust, robots can work in environments that are very hard for humans to live in; in such places they lend a strong shoulder to human effort. Fully automated robots, however, are unlikely to be completed within the next five years or so. Robots are already used almost everywhere in society, in industry as well as in homes, and with the growing number of companies and research efforts in the field of robotics, almost every company is investing in the research opportunities that robotics provides.

Communication is the root of conveying intentions or commanding someone to do some work. In their initial phase of development, robots only perform the functions stored in them, monotonously, because robots have no intelligence of their own; a human component is therefore always needed. Robotics is no longer limited to manual interaction: a robot can also be instructed through a human channel, whether it is speech, code, retina, or hand gestures.

Hand gesture, that is, talking to someone using the hands, is one of the oldest ways of communication used by humans. Humans who are not able to hear or speak often use hand gestures to communicate with the people around them. The main purpose of this paper is therefore to make a system that can not only recognize particular hand gestures to perform certain tasks but also store them in a library for future use. The use of hand gestures will not only increase the usability of the robot for the person but also improve the response time of the robot for a particular task.

To perform the above task, a robot must have a camera and a processor that can detect the set of instructions being offered to the robot through hand gestures. This paper tries to make a system that can automatically store hand gestures and process them to give commands to the robot.

    2. BACKGROUND

Nowadays an increasing number of systems are being built that use hand gestures to recognize a command for a robot. One of the main problems in these systems has been that the machines could not perform complex human-robot interactions.

      1. Orientation Histogram Scheme

Some researchers designed an algorithm that extracted information from the input image and then calculated the vector positions on the hand. They did so in three basic steps: first, they searched for the hand block in the image using an edge-based hand-area search algorithm; then they extracted the hand block and analysed the vectors by applying a histogram algorithm; finally, they provided vectors for moving hands and extracted the hand-vector information from grey-scaled motion images [1]. This was a rather slow process, because the information was first extracted from the input, then the algorithm was applied to determine the sign made by the human, and only after a confirmation was sent to the processor about which hand sign it was did the action take place.
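As a rough illustration of the orientation histogram idea in [1] (not the referenced authors' code), the sketch below computes a simple orientation histogram from a small grey-scale image: gradient directions are estimated with finite differences and accumulated into a fixed number of angular bins. The image size, toy input, bin count, and gradient operator are illustrative assumptions.

```c
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define W    8   /* illustrative image width  */
#define H    8   /* illustrative image height */
#define BINS 16  /* number of orientation bins (assumption) */

/* Accumulate gradient orientations of a grey-scale image into a histogram. */
void orientation_histogram(const unsigned char img[H][W], double hist[BINS])
{
    for (int b = 0; b < BINS; b++) hist[b] = 0.0;

    for (int y = 1; y < H - 1; y++) {
        for (int x = 1; x < W - 1; x++) {
            double gx = (double)img[y][x + 1] - img[y][x - 1]; /* horizontal gradient */
            double gy = (double)img[y + 1][x] - img[y - 1][x]; /* vertical gradient   */
            double mag = sqrt(gx * gx + gy * gy);
            if (mag < 1e-6) continue;               /* skip flat regions        */
            double angle = atan2(gy, gx) + M_PI;    /* shift into 0 .. 2*pi     */
            int bin = (int)(angle / (2.0 * M_PI) * BINS) % BINS;
            hist[bin] += mag;                       /* magnitude-weighted vote  */
        }
    }
}

int main(void)
{
    unsigned char img[H][W] = {{0}};
    /* toy input: a bright vertical stripe, which produces horizontal gradients */
    for (int y = 0; y < H; y++)
        for (int x = 3; x < 5; x++)
            img[y][x] = 255;

    double hist[BINS];
    orientation_histogram(img, hist);
    for (int b = 0; b < BINS; b++)
        printf("bin %2d: %.1f\n", b, hist[b]);
    return 0;
}
```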

      2. Histogram and Library

Some researchers combined the histogram approach with a library: instead of using grey-scaled motion images, the captured image was compared against images already registered in the system [2]. The idea was the same, but with more certainty about the result and the command given to the robot: the researchers cross-referenced the vectors in the captured image with the already registered images in the system, and the command was then processed and completed. This method required more memory, as the library had to be kept on the computer.
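To make the cross-referencing step concrete, the sketch below compares a captured orientation histogram against a small set of registered entries and returns the index of the closest one, or -1 if nothing is close enough. It is a hedged illustration, not the implementation from [2]: the library here is assumed to store histograms rather than raw images, and the Euclidean distance and threshold are assumptions.

```c
#include <math.h>

#define BINS     16   /* must match the histogram computed at capture time */
#define LIB_SIZE 3    /* number of registered gestures (assumption)        */

/* Squared Euclidean distance between two orientation histograms. */
static double hist_distance(const double a[BINS], const double b[BINS])
{
    double d = 0.0;
    for (int i = 0; i < BINS; i++) {
        double diff = a[i] - b[i];
        d += diff * diff;
    }
    return d;
}

/* Return the index of the closest registered gesture, or -1 if none is near enough. */
int match_gesture(const double captured[BINS],
                  const double library[LIB_SIZE][BINS],
                  double threshold)
{
    int best = -1;
    double best_d = threshold;
    for (int g = 0; g < LIB_SIZE; g++) {
        double d = hist_distance(captured, library[g]);
        if (d < best_d) {
            best_d = d;
            best = g;
        }
    }
    return best;
}
```

Storing histograms instead of full registered images is a deliberate simplification here; it also hints at one way to reduce the memory cost noted above.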

      3. HSV Colour Techniques

Instead of grey-scaled images, some researchers used the HSV colour scheme as the major parameter for capturing, while others used HSV colour spaces as major parameters [3]. The main advantage of the HSV technique is that it starts from image samples in the RGB colour model and so retains colour information, unlike grey scale, which keeps only the intensity information from the captured image; with the HSV technique, more information about the environment of the system can be obtained. The main disadvantage of this approach is the difficulty of separating the relevant from the irrelevant information in the input image when processing the command to be given to the robot.
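The HSV-based approaches start by converting each RGB pixel to the HSV colour space. A minimal, standard RGB-to-HSV conversion is sketched below; the thresholds one would then apply to isolate skin-coloured regions are application specific and are not part of this sketch.

```c
#include <math.h>
#include <stdio.h>

/* Convert an 8-bit RGB pixel to HSV: h in degrees [0,360), s and v in [0,1]. */
void rgb_to_hsv(unsigned char r8, unsigned char g8, unsigned char b8,
                double *h, double *s, double *v)
{
    double r = r8 / 255.0, g = g8 / 255.0, b = b8 / 255.0;
    double max = r > g ? (r > b ? r : b) : (g > b ? g : b);
    double min = r < g ? (r < b ? r : b) : (g < b ? g : b);
    double delta = max - min;

    *v = max;                                 /* value = brightness            */
    *s = (max > 0.0) ? delta / max : 0.0;     /* saturation = colour purity    */

    if (delta == 0.0)   *h = 0.0;             /* grey pixel: hue undefined     */
    else if (max == r)  *h = 60.0 * fmod((g - b) / delta, 6.0);
    else if (max == g)  *h = 60.0 * ((b - r) / delta + 2.0);
    else                *h = 60.0 * ((r - g) / delta + 4.0);
    if (*h < 0.0) *h += 360.0;
}

int main(void)
{
    double h, s, v;
    rgb_to_hsv(200, 140, 120, &h, &s, &v);    /* a rough skin-like tone */
    printf("h=%.1f s=%.2f v=%.2f\n", h, s, v);
    return 0;
}
```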

    3. DESCRIPTION AND DESIGN APPROACH

The approach behind this paper is to create a global set of commands for controlling a robot or the systems described in previous research [4], [3]. To overcome the past problems, the focus of this paper is a self-generating library of hand gestures which can be manipulated by the administrator of the robot.

      1. Design Implementation

        Phase I: Image Capturing

First, the user gives a command with the hand; the gesture is captured by the camera, and the image from the robot's camera is processed and shown to the user. The image-capturing software is to be developed in C.
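The camera interface itself is hardware specific, so as a hedged stand-in for the C capture software the sketch below only loads an already captured frame from a binary grey-scale PGM (P5) file into memory, which is the form the later phases are assumed to consume. The file format choice and the lack of header-comment handling are illustrative assumptions.

```c
#include <stdio.h>
#include <stdlib.h>

/* Load a binary (P5) grey-scale PGM frame; returns malloc'd pixels or NULL.
   Comments inside the PGM header are not handled in this sketch. */
unsigned char *load_pgm(const char *path, int *width, int *height)
{
    FILE *f = fopen(path, "rb");
    if (!f) return NULL;

    int maxval;
    if (fscanf(f, "P5 %d %d %d", width, height, &maxval) != 3 || maxval > 255) {
        fclose(f);
        return NULL;
    }
    fgetc(f);  /* consume the single whitespace byte after the header */

    size_t n = (size_t)(*width) * (size_t)(*height);
    unsigned char *pixels = malloc(n);
    if (pixels && fread(pixels, 1, n, f) != n) {
        free(pixels);
        pixels = NULL;
    }
    fclose(f);
    return pixels;
}
```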

        Phase II: Image Processing

The image of the hand gesture is processed using the Orientation Histogram Scheme [1], [2]; the image is processed with the help of the resulting vectors.

        Phase III: The Gesture Recognition

If the gesture found by the system is not present in the system's library, it automatically asks the administrator for a definition of the gesture. If the gesture is not useful or the image was processed wrongly, the administrator can delete the gesture and Phase I can begin again. Gesture recognition can be implemented using a combined approach of MATLAB and C.
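Purely to illustrate this library-update logic on the C side (the MATLAB portion is not shown), the sketch below checks a processed histogram against the library and, when no match is found, asks the administrator for a gesture name and registers the new entry, or discards it if the administrator leaves the name empty. The data layout, sizes, and prompt are assumptions.

```c
#include <stdio.h>
#include <string.h>

#define BINS         16
#define MAX_GESTURES 64
#define NAME_LEN     32

struct gesture {
    char   name[NAME_LEN];
    double hist[BINS];
};

/* Recognise a histogram or, with the administrator's help, add it to the library.
   Returns the library index of the gesture, or -1 if it was discarded. */
int recognise_or_register(const double hist[BINS],
                          struct gesture library[], int *count,
                          double threshold)
{
    int idx = -1;
    for (int g = 0; g < *count; g++) {
        double d = 0.0;
        for (int i = 0; i < BINS; i++) {
            double diff = hist[i] - library[g].hist[i];
            d += diff * diff;
        }
        if (d < threshold) { idx = g; break; }
    }
    if (idx >= 0) return idx;                /* known gesture: Phase IV follows   */

    if (*count >= MAX_GESTURES) return -1;
    printf("Unknown gesture. Enter a name (empty line to discard): ");
    char name[NAME_LEN];
    if (!fgets(name, sizeof name, stdin) || name[0] == '\n')
        return -1;                           /* administrator rejected it: back to Phase I */
    name[strcspn(name, "\n")] = '\0';

    strncpy(library[*count].name, name, NAME_LEN - 1);
    library[*count].name[NAME_LEN - 1] = '\0';
    memcpy(library[*count].hist, hist, sizeof library[*count].hist);
    return (*count)++;                       /* index of the newly registered gesture */
}
```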

        Phase IV: Command

If the image is already registered in the library, the corresponding command is given to the robot; if a new command is generated, it is recognized through the updated library and then given to the robot.
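Once a gesture is matched to a library entry, this phase maps it to a robot command. A minimal dispatch sketch is shown below; the gesture names, command strings, and the send_to_robot stub are hypothetical placeholders for whatever interface the robot actually exposes.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical stub: a real system would transmit over the robot's control link. */
static void send_to_robot(const char *command)
{
    printf("-> robot: %s\n", command);
}

/* Map a recognised gesture name from the library to a robot command. */
void dispatch_command(const char *gesture_name)
{
    /* Illustrative gesture-to-command table; in practice this lives in the library. */
    static const char *table[][2] = {
        { "open_palm",   "STOP"         },
        { "point_left",  "TURN_LEFT"    },
        { "point_right", "TURN_RIGHT"   },
        { "fist",        "MOVE_FORWARD" },
    };
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++) {
        if (strcmp(gesture_name, table[i][0]) == 0) {
            send_to_robot(table[i][1]);
            return;
        }
    }
    fprintf(stderr, "no command bound to gesture '%s'\n", gesture_name);
}
```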

    4. APPLICATIONS

The system can be used to create different libraries for different purposes, such as:

        • Television Control using hand gesture [5].

        • Navigation

• DTAR [6]

• DSAC [7]

        • Medicine [8], [9], [10]

    5. FUTURE WORK

The main problem with this idea arises when the library of hand gestures becomes very large. With many commands in the library, cross-referencing a captured hand gesture against the stored images takes a long time. To overcome this and make the system faster and more accurate, different search techniques [11] can be used, and the library software can be designed to auto-generate a folder for the hand gestures the user employs most often, as sketched below. This would make the search process faster and better for the user.
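One simple way to realise the "most used gestures first" idea is to keep a usage counter on each library entry and search entries in descending order of use, so that frequent gestures are compared before rare ones. The sketch below shows only this reordering step, under the same assumed data layout as the earlier sketches, with a usage counter added to the entry.

```c
#include <stdlib.h>

#define BINS     16
#define NAME_LEN 32

struct gesture {
    char     name[NAME_LEN];
    double   hist[BINS];
    unsigned uses;   /* incremented each time this gesture is recognised */
};

/* qsort comparator: most frequently used gestures first. */
static int by_usage_desc(const void *a, const void *b)
{
    const struct gesture *ga = a, *gb = b;
    return (ga->uses < gb->uses) - (ga->uses > gb->uses);
}

/* Reorder the library so a linear matching loop reaches common gestures early. */
void promote_frequent_gestures(struct gesture library[], int count)
{
    qsort(library, (size_t)count, sizeof library[0], by_usage_desc);
}
```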

6. REFERENCES

1. Hyung-Ji Lee and Jae-Ho Chung, Hand Gesture Recognition using Orientation Histogram, in Proc. of the IEEE Region 10 Conference, vol. 2, pp. 1355-1358, 1999.

2. Hanning Zhou, D. J. Lin and T. S. Huang, Static Hand Gesture Recognition based on Local Orientation Histogram Feature Distribution Model, in Proc. of the CVPRW Conference, p. 161, 2004.

3. M. Manigandan and I. M. Jackin, Wireless Vision Based Mobile Robot Using Hand Gesture Recognition Through Perceptual Color Space, in Proc. of the 2010 International Conference on ACE, pp. 95-99, 2010.

4. C. Wang and K. Wang, Hand Posture Recognition using Adaboost with SIFT for Human Robot Interaction, in Proc. of the International Conference on Advanced Robotics, 2007.

5. W. T. Freeman and C. D. Weissman, Television control by hand gesture, in Proc. of the International Workshop on Automatic Face- and Gesture-Recognition, 1995.

6. A. Degani and H. Choset, DTAR: A Dynamic, Tube-Ascending Robot, in IEEE Transactions, vol. 27, no. 2, pp. 360-364, 2011.

7. A. Degani and H. Choset, DSAC: Dynamic, Single Actuated Climber: Local stability and bifurcations, in Proc. of the IEEE International Conference, 2010.

8. I. Kassim and L. Phee, Locomotion Techniques for Robotic Colonoscopy, in IEEE Engineering in Medicine and Biology Magazine, vol. 25, no. 3, pp. 49-56, 2006.

9. Wang Kundong and Yan Guozheng, A Wireless Robotic Endoscope for Gastrointestine, in IEEE Transactions, vol. 24, no. 1, 2008.

10. K. Ikeuchi and K. Yoshinaka, Locomotion of medical micro robot with spiral ribs using mucus, in Proc. of Micro Machine and Human Science, pp. 217-222, 1996.

11. A. W. M. Smeulders and M. Worring, Content-based image retrieval at the end of the early years, in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 12, pp. 1349-1380, 2000.
