Hand Gesture Recognition using Ant Colony Optimization Technique

DOI : 10.17577/IJERTCONV3IS19145


Vijayalakshmi. R, Meena. R. C

Department of Information Science & Engineering, T. John Institute of Technology, Bengaluru, Karnataka

Abstract:- As computers become more pervasive in society, natural human-computer interaction will have a positive impact on their use. Hence there has been growing interest in new approaches and technologies for bridging the human-computer barrier. Gestures have long been considered an interaction technique that can potentially deliver more natural, creative and intuitive ways to communicate with our computers. The use of hand gestures as a natural interface serves as a motivating force for research in gesture taxonomies.

This paper focuses on the three main phases of hand gesture recognition, i.e. detection, tracking and recognition. First, the hand is detected using skin filtering, and palm cropping is performed to extract only the palm portion of the hand. After palm extraction, the features of the hand are extracted using the ant colony optimization technique, and finally the input hand gesture is recognized using a suitable classifier.

INTRODUCTION:

The essential aim of a hand gesture recognition system is to create a natural interaction between human and computer, where the recognized gestures can be used for controlling a robot or conveying information.

Human-computer interaction, also called man-machine interaction, refers to the relation between humans and computers. Gestures are used for communication between humans and machines, as well as between people using sign language.

In this method of hand gesture recognition, skin filtering is done first: the RGB image is converted to an HSV image, because the HSV model separates color from brightness and is therefore less sensitive to lighting conditions, and then ant colony optimization is applied.

LITERATURE REVIEW:

We have studied much of the previous work done in this field by different researchers. Many approaches have been followed, such as vision based, data glove based, artificial neural network, fuzzy logic, genetic algorithm and Hidden Markov Model approaches. Some of the previous works are given below:

Many researchers [1][2][3] used vision based approaches for identifying hand gestures. Kapuscinski [1] found the skin colored region in the captured image; the image containing the desired hand region was then intensity normalized and its histogram computed. Hasan [8] applied a multivariate Gaussian distribution to recognize hand gestures using non-geometric features. The input hand image is segmented using two different methods [18]: skin color based segmentation using the HSV color model, and clustering based thresholding. Some operations are performed to capture the shape of the hand and extract hand features; a modified direction algorithm is adopted to find a relationship between statistical parameters of the data, and is used to find the slope and trend by finding the direction of the hand gesture, as shown in the figure:

Fig.1 Hand direction

The Gaussian distribution is applied to the segmented image and is oriented along the direction of the hand, as shown in the figure below:

Fig.2 Hand direction segment

After capturing the hand shape, two types of features are extracted to form the feature vector: local features and global features.

\mu_{pq} = \sum_{x}\sum_{y} (x - \mu_x)^p \, (y - \mu_y)^q \, f(x, y)    (1)

\mu_{pq}^{(k)} = \sum_{x}\sum_{y} (x^{(k)} - \mu_x^{(k)})^p \, (y^{(k)} - \mu_y^{(k)})^q \, f^{(k)}(x, y)    (2)

for all k ∈ {1, 2, 3, …, 88} and p, q ∈ {0, 1}
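For concreteness, the moment computation in equations (1) and (2) can be sketched in a few lines of NumPy, as below. This is only an illustrative reconstruction: the function names, the treatment of the k = 1..88 blocks as simple image strips, and the restriction of p and q to {0, 1} follow our reading of the equations rather than the original implementation in [8].

import numpy as np

def central_moment(f, p, q):
    """Geometric central moment mu_pq of a 2-D intensity array f (equation 1)."""
    ys, xs = np.mgrid[0:f.shape[0], 0:f.shape[1]]
    m00 = f.sum()
    if m00 == 0:
        return 0.0
    mu_x = (xs * f).sum() / m00          # centroid x
    mu_y = (ys * f).sum() / m00          # centroid y
    return ((xs - mu_x) ** p * (ys - mu_y) ** q * f).sum()

def block_features(f, blocks=88):
    """Local features: mu_pq with p, q in {0, 1} for each block (equation 2).

    The split into horizontal strips is only a stand-in for the paper's
    block layout, which is not specified in the text.
    """
    feats = []
    for block in np.array_split(f, blocks, axis=0):
        feats.extend(central_moment(block, p, q) for p in (0, 1) for q in (0, 1))
    return np.asarray(feats)

Concatenating the per-block moments gives the local part of the feature vector, while the same moments computed over the whole segmented image give the global part.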

Kulkarni recognized static postures of American Sign Language using a neural network algorithm. The input images are converted to the HSV color model, resized to 80×64, and some image pre-processing operations are applied to segment the hand from a uniform background; features are extracted using a histogram technique and the Hough algorithm.

Wysoski presented rotation invariant posture recognition using boundary histograms. Cameras were used to acquire the input image, a skin color detection filter was applied, and a clustering process followed to find the boundary of each group in the clustered image using a contour tracking algorithm.

OBJECTIVES:

By using suitable sensors (accelerometers and gyroscopes) worn on the body of a patient and reading the values from those sensors, robots can assist in patient rehabilitation. The best example is stroke rehabilitation.

Through the use of gesture recognition, "remote control with the wave of a hand" of various devices becomes possible. The signal must indicate not only the desired response, but also which device is to be controlled.

PROPOSED SYSTEM:

The block diagram of the proposed system is given below:

Fig.3 Proposed system

The first step involves capturing an image using a camera and converting the input RGB image to the HSV color space. This is done because the HSV model separates color from brightness, which makes it less sensitive to changes in lighting conditions. HSV stands for Hue (H), Saturation (S) and brightness (V, also denoted I or L in related models). The resulting image is filtered and smoothed, and finally a gray scale image is obtained.
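A minimal sketch of this step using OpenCV is given below, assuming a single BGR frame read from disk; the skin threshold values and the file name are illustrative placeholders, not parameters from the proposed system.

import cv2
import numpy as np

def skin_filter(frame_bgr):
    """Convert to HSV, keep skin-colored pixels, and smooth the mask."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Illustrative skin range; real systems tune these per camera and lighting.
    lower = np.array([0, 40, 60], dtype=np.uint8)
    upper = np.array([25, 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    mask = cv2.medianBlur(mask, 5)             # remove speckle noise
    mask = cv2.GaussianBlur(mask, (5, 5), 0)   # smooth the mask edges
    return mask                                # gray scale skin mask

frame = cv2.imread("gesture.jpg")              # hypothetical input image
skin = skin_filter(frame)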

After the skin has been extracted from the input image, hand cropping is done. As we consider only the gesture shown by the hand up to the wrist, it is important to remove the other skin regions.
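One simple way to realize this cropping, sketched here under the assumption that the largest skin-colored contour is the hand (OpenCV 4.x interface assumed), is to keep only the bounding box of that contour; removal of the forearm below the wrist would need an additional heuristic that the paper does not specify.

import cv2

def crop_hand(mask):
    """Crop the skin mask to the bounding box of its largest contour (assumed hand)."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return mask
    hand = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(hand)
    return mask[y:y + h, x:x + w]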

After extraction of the desired hand, only the edges of the hand in the image are extracted, so as to further reduce the time and computational complexity of the whole process. There are different edge detection techniques such as Sobel, Prewitt, Roberts, Gaussian, zero-cross and Canny, among which the Canny method was found to be the best of the techniques studied, as it uses two different thresholds, a low threshold and a high threshold, to detect strong and weak edges, thus reducing the inclusion of false edges while retaining valid edge points.
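The dual-threshold behavior described above maps directly onto OpenCV's Canny function; the two threshold values and the file name below are illustrative, not tuned values from this work.

import cv2

cropped = cv2.imread("cropped_hand.png", cv2.IMREAD_GRAYSCALE)  # hypothetical cropped palm
# Low and high thresholds: weak edges are kept only if connected to strong ones.
edges = cv2.Canny(cropped, 50, 150)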

Extraction of unique features for each gesture, independent of hand size and illumination, is important. The proposed system uses the ant colony optimization technique, as it provides advantages such as a positive feedback loop.
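Since the paper does not give the details of the ant colony stage, the following is only a rough sketch of how ant colony optimization is commonly applied to an image: artificial ants walk over the cropped hand image, move preferentially toward pixels with high local intensity variation, and deposit pheromone there, so the pheromone map accumulates on salient edge and contour pixels through the positive feedback loop mentioned above. All parameter values and function names are illustrative.

import numpy as np

def aco_feature_map(gray, n_ants=64, n_steps=300, alpha=1.0, beta=2.0,
                    rho=0.05, seed=0):
    """Build a pheromone map over a gray scale hand image with a simple ant colony.

    Ants prefer pixels with high gradient magnitude (edges); pheromone
    reinforcement is the positive-feedback loop, evaporation keeps it bounded.
    """
    rng = np.random.default_rng(seed)
    h, w = gray.shape
    # Heuristic information: gradient magnitude, normalized to (0, 1].
    gy, gx = np.gradient(gray.astype(float))
    eta = np.hypot(gx, gy)
    eta = eta / (eta.max() + 1e-9) + 1e-6
    tau = np.full((h, w), 1e-3)                        # initial pheromone
    ants = rng.integers(0, [h, w], size=(n_ants, 2))   # random start positions
    moves = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]          # 8-neighborhood
    for _ in range(n_steps):
        for a in range(n_ants):
            y, x = ants[a]
            nbrs = [(y + dy, x + dx) for dy, dx in moves
                    if 0 <= y + dy < h and 0 <= x + dx < w]
            weights = np.array([tau[p] ** alpha * eta[p] ** beta for p in nbrs])
            probs = weights / weights.sum()
            y, x = nbrs[rng.choice(len(nbrs), p=probs)]
            ants[a] = (y, x)
            tau[y, x] += eta[y, x]                     # deposit: positive feedback
        tau *= (1.0 - rho)                             # evaporation
    return tau                                         # high values = salient hand pixels

The resulting pheromone map, or statistics derived from it, can then serve as the feature representation passed to the classifier.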

LIMITATIONS:

There are many challenges associated with the accuracy and usefulness of gesture recognition software. For image-based gesture recognition there are limitations arising from the equipment used and from image noise. Images or video may not be captured under consistent lighting, or in the same location. Items in the background or distinct features of the users may make recognition more difficult.

The orientation histogram method has the problem that similar gestures might have different orientation histograms while different gestures may have similar orientation histograms. Besides that, such a method responds to whatever object dominates the image, even if it is not a hand gesture.

A neural network classifier has been applied for gesture classification, but it is time consuming, and as the amount of training data increases, the time required for classification also increases. The ant colony algorithm, by contrast, offers a natural metaphor, adaptivity, inherent parallelism and positive feedback.

In k-means, the prediction of K is difficult because the number of clusters must be fixed in advance, and different initial partitions result in different final clusters.

Hidden Markov Models make large assumptions about the data, a huge number of parameters needs to be set, and the required amount of training data is large.

CONCLUSION:

The steps used in this method for recognizing different hand gestures are skin filtering, edge detection, ant colony optimization and finally a proper classifier, where we use angle based classification to detect which symbol the test image belongs to. The experiment is to be performed with bare hands.
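The angle based classification is not detailed in the paper; one plausible reading, sketched below, is to describe each gesture by the angles of its extracted feature points around the palm centroid and assign the test image to the nearest stored template. The function names, the nearest-neighbor rule and the assumption of equal-length angle signatures are ours, not the authors'.

import numpy as np

def angle_signature(points, centroid):
    """Sorted angles (degrees) of feature points around the palm centroid."""
    d = points - centroid
    return np.sort(np.degrees(np.arctan2(d[:, 1], d[:, 0])))

def classify(test_points, test_centroid, templates):
    """Nearest-neighbor match of the test angle signature against gesture templates.

    `templates` maps a gesture label to a precomputed angle signature of the
    same length as the test signature.
    """
    sig = angle_signature(test_points, test_centroid)
    return min(templates, key=lambda g: np.linalg.norm(sig - templates[g]))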

Going forward, the ant colony optimization technique for hand gesture recognition is expected to be among the best methods compared with current ones.

REFERENCES:

[1]. T. Kapuscinski and M. Wysocki, Hand Gesture Recognition for Man-Machine Interaction, Second Workshop on Robot Motion and Control, October 18-20, 2001.

[2]. C. Yu, X. Wang, H. Huang, J. Shen and K. Wu, Vision-Based Hand Gesture Recognition Using Combinational Features, IEEE Sixth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, 2010, pp. 543-546.

[3]. P. S. Rajam and G. Balakrishnan, Real Time Indian Sign Language Recognition System to aid Deaf-dumb People, IEEE, 2011.

[4]. A. Malima, E. Ozgur and M. Cetin, A Fast Algorithm for Vision-Based Hand Gesture Recognition for Robot Control, IEEE, 2006.

[5]. Manigandan M and I. M. Jackin, Wireless Vision based Mobile Robot Control using Hand Gesture Recognition through Perceptual Color Space, IEEE International Conference on Advances in Computer Engineering, 2010.

[6]. D. Y. Huang, W. C. Hu and S. H. Chang, Vision-based Hand Gesture Recognition Using PCA + Gabor Filters and SVM, IEEE Fifth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, 2009.

[7]. E. Koh, J. Won and C. Bae, On-premise Skin Color Modeling Method for Vision-based Hand Tracking, The 13th IEEE

[8]. Mokhtar M. Hasan and Pramod K. Mishra, Robust Gesture Recognition Using Gaussian Distribution for Features Fitting, International Journal of Machine Learning and Computing, Vol. 2(3), 2012.

[9]. Matthew Turk and Mathias Kölsch, "Perceptual Interfaces", University of California, Santa Barbara, UCSB Technical Report 2003-33.

[10]. M. Porta, "Vision-based user interfaces: methods and applications", International Journal of Human-Computer Studies, 57:11, 27-73, 2002.

[11]. Afshin Sepehri, Yaser Yacoob and Larry S. Davis, "Employing the Hand as an Interface Device", Journal of Multimedia, Vol. 1, No. 2, pages 18-29.

[12]. K. Henriksen, J. Sporring and K. Hornbaek, "Virtual trackballs revisited", IEEE Transactions on Visualization and Computer Graphics, Volume 10, Issue 2, pages 206-216, 2004.

[13]. William Freeman and Craig Weissman, Television control by hand gestures, Mitsubishi Electric Research Laboratories, 1995.

[14]. Do Jun-Hyeong, Jung Jin-Woo, Sung Hoon Jung, Jang Hyoyoung and Bien Zeungnam, Advanced soft remote control system using hand gesture, Mexican International Conference on Artificial Intelligence, 2006.

[15]. K. Ouchi, N. Esaka, Y. Tamura, M. Hirahara and M. Doi, Magic Wand: an intuitive gesture remote control for home appliances, International Conference on Active Media Technology (AMT 2005), 2005.

[16]. Lars Bretzner, Ivan Laptev, Tony Lindeberg, Sören Lenman and Yngve Sundblad, "A Prototype System for Computer Vision Based Human Computer Interaction", Technical Report CVAP251, ISRN KTH NA/P-01/09-SE, Department of Numerical Analysis and Computer Science, KTH (Royal Institute of Technology), SE-100 44 Stockholm, Sweden, April 23-25, 2001.

[17]. Thomas G. Zimmerman, Jaron Lanier, Chuck Blanchard, Steve Bryson and Young Harvill, "A Hand Gesture Interface Device", http://portal.acm.org.

[18]. Yang Liu and Yunde Jia, A Robust Hand Tracking and Gesture Recognition Method for Wearable Visual Interfaces and Its Applications, Proceedings of the Third International Conference on Image and Graphics (ICIG 04), 2004.

[19]. Kue-Bum Lee, Jung-Hyun Kim and Kwang-Seok Hong, An Implementation of Multi-Modal Game Interface Based on PDAs, Fifth International Conference on Software Engineering Research, Management and Applications, 2007.

[20]. Per Malmestig and Sofie Sundberg, SignWiiver – implementation of sign language technology.
