
- **Open Access**
- **Authors:** Jitendra Suthar, Ashutosh Vyas
- **Paper ID:** IJERTCONV2IS03013
- **Volume & Issue:** ETRASCT – 2014 (Volume 2, Issue 03)
- **Published (First Online):** 30-07-2018
- **ISSN (Online):** 2278-0181
- **Publisher Name:** IJERT
- **License:** This work is licensed under a Creative Commons Attribution 4.0 International License

#### PCA Algorithm and its Application in Image Compression and Face Recognition


Jitendra Suthar

M. Tech. Scholar

Department of Computer Science and Engineering Jodhpur Institute of Engineering & Technology Jodhpur, Rajasthan jitendra.suthar@jietjodhpur.com

Ashutosh Vyas

Associate Professor

Department of Computer Science and Engineering Jodhpur Institute of Engineering & Technology Jodhpur, Rajasthan ashutosh.vyas@jietjodhpur.com

Abstract—PCA (Principal Component Analysis) is an algorithm widely used for face detection and recognition. It is first applied to a set of sample (training) face images, because face detection and recognition require the eigenvalues and eigenvectors of those trained images. PCA can also be applied to a group of colour images for compression.

The image is divided into a number of rows, according to the variation within the image. Each row acts as an input to the basic PCA algorithm. If the rows are highly similar, the Hotelling transform is applied to the rows of the given image. First, an algorithm is applied that finds the variation between the rows, so that the image can be divided into rows for the PCA calculation. For the best result, the covariance between the rows of the image matrix is computed.

Index Terms—Principal Component, eigenvalue, eigenvector, covariance matrix, Hotelling transform, standard deviation, variance, covariance.

INTRODUCTION

PCA (Principal Component Analysis) is an algorithm widely used for face detection and recognition. It is first applied to a set of sample (training) face images, because face detection and recognition require the eigenvalues and eigenvectors of those trained images. The algorithm takes sample images as input, computes a face vector for each input image, finds the average face vector over all inputs, computes the covariance matrix of the input face vectors, and finally finds the eigenvalues and eigenvectors of that covariance matrix. The PCA algorithm is implemented in the following steps:

Take the sample images as input.

Calculate a face vector for each input image.

After finding the face vectors, compute the average face vector over all input face vectors.

Calculate the covariance matrix of all input face vectors.

Calculate the eigenvalues and eigenvectors of the covariance matrix.
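The steps above can be sketched with NumPy as follows. This is a minimal illustration, not the authors' implementation; random arrays stand in for real training face images, and the small M×M covariance form is used for compactness:

```python
import numpy as np

# Step 1: sample images as input (random placeholders for real face images)
rng = np.random.default_rng(0)
images = [rng.random((8, 8)) for _ in range(5)]  # five 8x8 "faces"

# Step 2: face vector for each image (flatten each image to a 1-D vector)
face_vectors = np.stack([img.ravel() for img in images])  # shape (5, 64)

# Step 3: average face vector of all input face vectors
mean_face = face_vectors.mean(axis=0)

# Step 4: covariance matrix of the mean-adjusted face vectors
adjusted = face_vectors - mean_face
cov = adjusted @ adjusted.T / (len(images) - 1)  # small 5x5 matrix

# Step 5: eigenvalues and eigenvectors of the covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]        # order by eigenvalue, high to low
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
```

Computing the M×M matrix `adjusted @ adjusted.T` instead of the full 64×64 covariance is a common trick when there are far fewer images than pixels.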

Principal Component Analysis involves a mathematical procedure that transforms a number of correlated variables into a smaller number of uncorrelated variables. These transformed variables are known as the principal components. The first principal component accounts for as much of the variability in the data as possible, and each succeeding component accounts for as much of the remaining variability as possible. PCA is thus a technique for identifying patterns in data and representing the data in a way that highlights their similarities and differences. For high-dimensional data, where patterns are hard to find and the luxury of graphical representation is not available, PCA is a powerful analysis tool.

The other main advantage of PCA is that once these patterns have been found, the data can be compressed by reducing the number of dimensions without much loss of information.

PCA COMPUTATION

To implement the PCA algorithm using the covariance matrix, follow these steps.

First, the eigenvectors of the covariance matrix are found; next, they are ordered by eigenvalue, from highest to lowest. This gives the components in order of significance. Components of lesser significance can then be ignored. This loses some information, but if the discarded eigenvalues are small, not much is lost. If some components are left out, the final data set has fewer dimensions than the original.

To be precise, if the original data has n dimensions, then n eigenvectors and eigenvalues are calculated; choosing only the first p eigenvectors gives a final data set with only p dimensions. A feature vector is then constructed from the eigenvectors to be kept:

Feature Vector = (eig1, eig2, eig3, …, eigp)

Final Data = Row Feature Vector × Row Data Adjust

where "Row Feature Vector" is the matrix with the kept eigenvectors transposed so that they lie in its rows, with the most significant eigenvector at the top, and "Row Data Adjust" is the mean-adjusted data transposed: the data items lie in the columns, with each row holding a separate dimension. "Final Data" is the final data set, with data items in columns and dimensions along rows.
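The projection and its inverse can be sketched directly from these definitions. This is an illustrative example with random data, not the paper's code; `p = 2` is an arbitrary choice of kept components:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.random((4, 10))          # 4 dimensions (rows), 10 data items (columns)

# Mean-adjust each dimension (row): this is "Row Data Adjust"
mean = data.mean(axis=1, keepdims=True)
row_data_adjust = data - mean

# Eigendecomposition of the 4x4 covariance of the dimensions
cov = np.cov(row_data_adjust)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvecs = eigvecs[:, order]

# Keep the first p eigenvectors; as rows, they form "Row Feature Vector"
p = 2
row_feature_vector = eigvecs[:, :p].T

# Final Data = Row Feature Vector x Row Data Adjust
final_data = row_feature_vector @ row_data_adjust   # shape (2, 10): compressed

# Approximate reconstruction from the reduced representation
reconstruction = row_feature_vector.T @ final_data + mean
```

The compressed representation stores 2 values per data item instead of 4; the reconstruction is exact only in the directions of the kept eigenvectors.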

BACKGROUND MATHEMATICS FOR THE PCA ALGORITHM

It is useful to have a measure of how much the dimensions vary from the mean with respect to each other. Covariance is such a measure. Covariance is always measured between two dimensions; the covariance of a dimension with itself is the variance.

Eigenvectors: In mathematics, eigenvalues, eigenvectors, and eigenspaces are related concepts in linear algebra; they are properties of a matrix. In general, a matrix acts on a vector by changing both its magnitude and its direction. However, a matrix may act on certain vectors by changing only their magnitude and leaving their direction unchanged; these vectors are the eigenvectors of the matrix. Formally, if A is a linear transformation, a non-null vector X is an eigenvector of A if there is a scalar λ such that AX = λX.

The scalar λ is said to be an eigenvalue of A corresponding to the eigenvector X. For a symmetric matrix, such as a covariance matrix, the eigenvectors are mutually perpendicular, i.e. at right angles to each other, no matter how many dimensions there are. (Another word for perpendicular, in maths talk, is orthogonal.) This is important because it means the data can be expressed in terms of these perpendicular eigenvectors, instead of in terms of the x and y axes, which is exactly what the PCA computation above does.
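Both properties, AX = λX and the orthogonality of eigenvectors of a symmetric matrix, can be checked numerically on a small example matrix (chosen here purely for illustration):

```python
import numpy as np

# A symmetric 2x2 matrix (covariance matrices are symmetric)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is the eigensolver for symmetric matrices
eigvals, eigvecs = np.linalg.eigh(A)

# Each eigenvector x satisfies A x = lambda x for its eigenvalue lambda
for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ x, lam * x)

# The eigenvectors of a symmetric matrix are mutually orthogonal
assert np.allclose(eigvecs.T @ eigvecs, np.eye(2))
```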

Any newly captured human face can be considered a combination of these standard input faces. In the PCA method a person's face is stored not as a digital photograph but as a list of values, which takes much less space per person. The eigenfaces that are created appear as light and dark areas arranged in a particular pattern. This pattern determines how different features of a face are singled out to be evaluated and scored: there will be patterns for the symmetry of input faces, for any style of facial hair, for where the hairline is, and for the size of the nose or mouth. Other eigenfaces have patterns that are less simple to identify, and such an eigenface may look very little like a face. This technique is also used for handwriting analysis, lip reading, voice recognition, sign-language/hand-gesture interpretation, and medical image analysis.

METHOD OF IMAGE COMPRESSION USING THE PCA ALGORITHM

PCA is normally applied to a group of images; here it is applied to a single colour image. The image is divided into a number of rows, according to the variation within the image. Each row acts as an input to the basic PCA algorithm. If the rows are highly similar, the Hotelling transform is applied to the rows of the given image. First, an algorithm is applied that finds the variation between the rows, so that the image can be divided into rows for the PCA calculation. For the best result, the covariance between the rows of the image matrix is computed.

The covariance between two random variables is

cov(a, b) = E[(a − μa)(b − μb)], where μa = E[a], μb = E[b]

From this, the similarity between neighbouring rows of the image matrix is found:

similarity(i) = cov(row i, row i+1), where i = 1, 2, 3, …, n − 1

Take two neighbouring rows a and b of the given image matrix and find the similarity between them; if the similarity is very small, skip this row and find the similarity between the first and third rows, and so on, calculating the similarity between rows and discarding rows as appropriate.
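The neighbouring-row similarity can be sketched as a small helper; the function name `row_similarity` and the random test image are illustrative choices, not from the paper:

```python
import numpy as np

def row_similarity(image):
    """Covariance between each pair of neighbouring rows of an image matrix."""
    rows = np.asarray(image, dtype=float)
    n = rows.shape[0]
    # np.cov of two 1-D arrays returns a 2x2 matrix; [0, 1] is the covariance
    return np.array([np.cov(rows[i], rows[i + 1])[0, 1] for i in range(n - 1)])

rng = np.random.default_rng(2)
img = rng.random((6, 8))        # a 6x8 test "image"
sims = row_similarity(img)      # one similarity value per neighbouring row pair
```

Plotting `sims` against `i` gives the similarity curve described below.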

Figure 1: The rows of M * N matrix and their similarity values

To improve the efficiency of this algorithm in dividing the image into rows, a curve of the similarity values between the rows of the given image is generated. Suppose there are M monochrome images of size N × P. A colour image can be treated as three monochrome images: the red, green, and blue colour components of each pixel. Each N × P monochrome image is equivalent to an N × P matrix whose elements are the intensity values at the corresponding pixel locations. Let N × P = Q. By reshaping the matrices, each image can be expressed as a 1 × Q vector Fi. All the images are put into a matrix X whose elements are the intensity values of the input images.
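The reshaping into 1 × Q vectors Fi and the stacking into X can be sketched as follows, with random arrays standing in for the monochrome images:

```python
import numpy as np

# Suppose M monochrome images of size N x P (random placeholders here)
M, N, P = 3, 4, 5
Q = N * P
rng = np.random.default_rng(3)
images = rng.random((M, N, P))

# Reshape each N x P matrix into a 1 x Q vector F_i and stack them into X:
# row i of X holds the intensity values of image i
X = images.reshape(M, Q)
```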

Figure 2: A simple test image and the similarity value curve of the image.
