A Survey on Blue Eyes Technology


Anagha P A, Department of Computer Science, St. Joseph's College (Autonomous),

Irinjalakuda, Thrissur, Kerala

Geethu Wilson, Department of Computer Science, St. Joseph's College (Autonomous),

Irinjalakuda, Thrissur, Kerala

Abstract:- The BLUE EYES technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings. The basic idea behind this technology is to give the computer human-like perceptual power. We all have perceptual abilities: we can understand each other's feelings, for example by reading the emotion in a person's facial expression. Adding these human perceptual abilities to computers would enable computers to work alongside human beings as intimate partners.

Keywords: Bluetooth, speech recognition, DAU, CSU, BCI


    Imagine a world where humans interact naturally with computers. You are sitting in front of a personal computer that can listen, talk, or even scream aloud. It has the ability to gather data about you and interact with you through special techniques such as face recognition and speech recognition. It can even perceive your emotions at the touch of the mouse. It verifies your identity, feels your presence, and starts interacting with you. You ask the computer to dial your friend at his office; it senses the urgency of the situation through the mouse, dials your friend at his workplace, and establishes a connection. Human cognition depends primarily on the ability to perceive, interpret, and integrate audio-visual and sensory information. Adding such extraordinary perceptual abilities to computers would enable them to work together with people as intimate partners. Researchers are trying to add capabilities to computers that would allow them to interact like humans: recognize human presence, talk, listen, or even guess their feelings. The BLUE EYES technology aims at creating machines that have perceptual and sensory abilities like those of human beings. It uses a non-obtrusive sensing method, employing video cameras and microphones, to identify the user's actions through these imparted sensory abilities. The machine can understand what a user wants, where he is looking, and even his physical or emotional state.


    Figure 1. Software Analysis Diagram

    Looking after the working operators' physiological state is the main task of the Blue Eyes system software. The software performs real-time buffering of the incoming data, real-time physiological data analysis, and alarm triggering, so that it can react instantly to the operator's condition. The Blue Eyes software consists of several functional modules. The System Core facilitates the flow of data between the other system modules: it transfers raw data from the Connection Manager to the data analyzers, and processed data from the data analyzers to the GUI controls and other system modules. The Visualization module provides a user interface for the supervisors. It enables watching a preview of the selected video source and the related sound stream, together with monitoring of the working operator's physiological condition, and the supervisor is signaled instantly on the arrival of alarm messages. The Visualization module can also be set to an off-line mode, in which all the data is fetched from the database and the supervisor can reconstruct the course of the selected information.

    The BLUE EYES technology thus aims at creating computational machines with perceptual and sensory abilities like those of human beings, using non-obtrusive sensing (video cameras and microphones) to identify the user's actions. In the name BLUE EYES, "Blue" stands for Bluetooth, which enables the wireless communication, and "Eyes" because eye movement enables us to obtain a lot of interesting and useful information.

    In the off-line mode the supervisor can reconstruct the course of the operator's duty by watching all the recorded physiological parameters, alarms, and video and audio data. A set of custom-built GUI controls is used to present the physiological data.
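The data flow described above (the System Core shuttling samples from the Connection Manager to the data analyzers and on to the GUI controls) can be sketched as a small queue-based dispatcher. The class names, the pulse field, and the alarm range below are illustrative assumptions for this sketch, not details of the original system.

```python
import queue

class SystemCore:
    """Routes incoming operator samples to every registered consumer
    (data analyzers, GUI controls), mimicking the Blue Eyes core module."""
    def __init__(self):
        self.inbox = queue.Queue()   # filled by the Connection Manager
        self.consumers = []

    def register(self, consumer):
        self.consumers.append(consumer)

    def pump(self):
        # Deliver every queued sample to every consumer in turn.
        while not self.inbox.empty():
            sample = self.inbox.get()
            for consumer in self.consumers:
                consumer(sample)

class AlarmAnalyzer:
    """Flags physiological samples outside an assumed safe pulse range."""
    def __init__(self, low=50, high=120):
        self.low, self.high = low, high
        self.alarms = []

    def __call__(self, sample):
        if not self.low <= sample["pulse"] <= self.high:
            self.alarms.append(sample)

core = SystemCore()
analyzer = AlarmAnalyzer()
core.register(analyzer)
core.inbox.put({"operator": 1, "pulse": 72})    # normal sample
core.inbox.put({"operator": 1, "pulse": 140})   # abnormal -> alarm
core.pump()
```

In the real system the consumers would run concurrently and the alarm would be pushed to the supervisor's Visualization module; here the dispatch loop is kept synchronous for clarity.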


    Types of Emotion Sensors used in Blue Eyes Technology:

    • For Hand – Emotion Mouse:

    The major aim of a Brain-Computer Interface (BCI) is to develop a smart and adaptive computing system. Such projects should include speech recognition, eye tracking, facial recognition, gesture recognition, and similar software and hardware. Similarly, in Blue Eyes technology we need to build a system that has the ability to identify these perceptual abilities of human beings. In Blue Eyes, the machines have the ability to identify minor variations in the moods of human beings: a person may strike the keyboard hastily or softly depending on whether he is happy or angry. The Blue Eyes technology allows the machines to identify these minor emotional variations even from a single touch on the mouse or keyboard, and the machines then react to the users according to these emotional levels. This is done with the guidance of intelligent devices like the Emotion Mouse. The Emotion Mouse is a device that tracks the emotions of a user by a simple touch. It is designed to evaluate and identify the user's emotions, such as fear, surprise, anger, sadness, happiness, and disgust, while he or she is interacting with the computer. The main objective of the Emotion Mouse is to gather the user's physical and physiological information through a simple touch.
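As a rough illustration of how an Emotion Mouse might map touch measurements to an emotion, the sketch below classifies a (pressure, temperature, pulse) sample by its nearest centroid. The centroid values and emotion labels are invented for the example; a real system would learn them from calibrated physiological data and would normalize the features first.

```python
import math

# Illustrative feature vectors: (grip pressure 0-1, skin temperature in C,
# pulse in bpm). These centroid values are invented for the sketch, not
# measured data, and the features are deliberately left unnormalized.
EMOTION_CENTROIDS = {
    "calm":  (0.2, 33.0, 70),
    "angry": (0.9, 34.5, 95),
    "happy": (0.4, 33.5, 80),
}

def classify_touch(pressure, temperature, pulse):
    """Nearest-centroid guess at the user's emotion from one touch sample."""
    sample = (pressure, temperature, pulse)
    return min(EMOTION_CENTROIDS,
               key=lambda e: math.dist(sample, EMOTION_CENTROIDS[e]))
```

For example, a hard grip with an elevated pulse falls closest to the "angry" centroid, while a light touch at resting pulse maps to "calm".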

    • For Eye – Expression Glass:

    The Expression Glass is an alternative to the usually offered machine-vision face or eye recognition methods. By applying pattern recognition methods to facial muscle variations, the glass senses and identifies expressions of the user such as interest or confusion. The prototype of this glass uses piezoelectric sensors.


    Figure 2. Expression Glass
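A minimal sketch of the Expression Glass idea, assuming two normalized muscle-activity channels (brow and cheek) and an invented 0.5 activation threshold; a real device would use trained pattern recognition rather than fixed rules.

```python
def classify_expression(brow, cheek):
    """Toy rule over two piezoelectric channels: raised brow together with
    cheek activity is read as interest, brow activity alone as confusion.
    `brow` and `cheek` are assumed to be normalized readings in [0, 1];
    the 0.5 threshold is an assumption of this sketch."""
    if brow > 0.5 and cheek > 0.5:
        return "interest"
    if brow > 0.5:
        return "confusion"
    return "neutral"
```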

    MAGIC Pointing:

    Eye gaze tracking explores a new way of handling gaze for man-machine interfacing. Gaze tracking has long been considered an excellent pointing method for giving input to computers, but several drawbacks exist with the traditional eye gaze pointing techniques. To overcome these difficulties, another approach termed MAGIC (Manual and Gaze Input Cascaded) pointing was proposed. In this approach, pointing still appears to the user as a manual task, used for fine selection and manipulation, yet a large portion of the cursor movement is eliminated by warping the cursor to the eye gaze area, which surrounds the target. The selection and pointing of the cursor is primarily controlled by manual means but also guided by the gaze tracking mechanism; this is what is referred to as MAGIC pointing. The main aim of MAGIC pointing is to use gaze to warp the previous (home) position of the cursor to the vicinity of the target, reasonably close to where the user is looking, so as to reduce the cursor movement amplitude required for target selection. Once the cursor position is identified, only a small movement is needed for the user to click on the target with the manual input device. Two MAGIC pointing techniques, conservative and liberal in terms of cursor placement and target identification, were outlined, analyzed, and implemented with an eye tracker unit.
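The warping step can be sketched as follows. This is a toy model of the liberal variant, with an assumed 120-pixel threshold; it is not the implementation evaluated in the original MAGIC study.

```python
import math

WARP_THRESHOLD = 120  # pixels; assumed value, don't warp for small gaze shifts

def magic_warp(cursor, gaze):
    """Liberal MAGIC pointing: warp the cursor home position to the gaze
    area whenever the gaze lands far from the current cursor, leaving only
    a short manual movement for the final, precise selection."""
    if math.dist(cursor, gaze) > WARP_THRESHOLD:
        return gaze      # jump to the neighborhood of the target
    return cursor        # small offsets stay under manual control

# The user looks at a distant target; the cursor jumps near it,
# and the mouse is needed only for the last few pixels.
cursor = magic_warp((10, 10), (800, 600))
```

The conservative variant would instead wait for the first manual movement before warping, trading responsiveness for fewer distracting jumps.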

    1. Data Acquisition Unit

      The Data Acquisition Unit is the mobile part of the Blue Eyes system. Its main task is to fetch the physiological data from the sensor and to send it to the central system to be processed. To accomplish this task the device must manage wireless Bluetooth connections (connection establishment, authentication, and termination). Personal ID cards and PIN codes provide operator authorization. Communication with the operator is carried out using a simple 5-key keyboard, a small LCD display, and a beeper; when an exceptional situation is detected the device uses them to notify the operator. Voice data is transferred using a small headset, interfaced to the DAU with standard mini-jack plugs.

      The Data Acquisition Unit comprises several hardware modules:

      Atmel 89C52 microcontroller – system core

      Bluetooth module (based on ROK101008)

      HD44780 – small LCD display

      24C16 – I2C EEPROM (on a removable ID card)

      MC145483 13bit PCM codec

      Jazz Multisensor interface

      Beeper and LED indicators, 6 AA batteries and voltage level monitor
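How the DAU might frame a sensor sample for Bluetooth transport can be sketched with a simple packet format. The sync byte, field layout, and modulo-256 checksum below are assumptions of this sketch, not the actual Blue Eyes protocol.

```python
import struct

def frame_sample(operator_id, pulse, saccade_count):
    """Pack one physiological sample into a hypothetical DAU frame:
    a 0xAA sync byte, operator id (1 byte), pulse (2 bytes, big-endian),
    saccade count (1 byte), and a modulo-256 checksum of the body."""
    body = struct.pack(">BHB", operator_id, pulse, saccade_count)
    checksum = sum(body) % 256
    return b"\xaa" + body + bytes([checksum])

def parse_frame(frame):
    """Check the sync byte and checksum, then unpack; None if corrupted."""
    if frame[0] != 0xAA or sum(frame[1:-1]) % 256 != frame[-1]:
        return None
    return struct.unpack(">BHB", frame[1:-1])

# One sample from operator 3: pulse 72 bpm, 5 saccades in the window.
frame = frame_sample(3, 72, 5)
```

On the real hardware such frames would be written to the Bluetooth module's serial interface; the checksum lets the Central System Unit discard frames damaged in transit.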

    2. Central System Unit

    The Central System Unit hardware is the second peer of the wireless connection. The box contains a Bluetooth module (based on ROK101008) and a PCM codec for voice data transmission. The module is interfaced to a PC using parallel, serial, and USB cables. The audio data is accessible through standard mini-jack sockets. To program the operators' personal ID cards we developed a simple programming device, interfaced to a PC using the serial and PS/2 (power source) ports. Inside, there is an Atmel 89C2051 microcontroller, which handles the UART transmission and the I2C EEPROM (ID card) programming.


    Many approaches to blue eyes technology and human emotion recognition have been proposed in the last two decades. Mizna Rehman et al. [1] present a technique which identifies human emotions (happy, surprised, sad, or excited) using image processing, by extracting only the eye portion from the captured image and comparing it with images already stored in a database. The paper reports two results about this emotional sensory world. First, observation reveals that different eye colors and their intensities correspond to changes in emotion, without giving any information on shape; the technique was used to successfully recognize four different emotions from the eyes. S. R. Vinotha et al. [2] use a feature extraction technique to extract the eyes, a support vector machine (SVM) classifier, and a hidden Markov model (HMM) to build a human emotion recognition system. The proposed system analyzes the human eye region from video sequences: from the frames of the video stream the human eyes are extracted using the well-known Canny edge operator and classified using a non-linear SVM classifier. Finally, a standard learning tool, the Hidden Markov Model (HMM), is used for recognizing the emotions from the human eye expressions.
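The HMM recognition stage used in [2] can be illustrated with a standard Viterbi decode over a sequence of observed eye features. The two emotion states, the observation symbols, and all probabilities below are invented toy values, not parameters from the cited work.

```python
def viterbi(observations, states, start_p, trans_p, emit_p):
    """Return the most likely hidden emotion sequence for a sequence of
    observed eye-feature symbols (standard Viterbi algorithm)."""
    # prob[s] = best probability of any state path ending in s so far
    prob = {s: start_p[s] * emit_p[s][observations[0]] for s in states}
    path = {s: [s] for s in states}
    for obs in observations[1:]:
        new_prob, new_path = {}, {}
        for s in states:
            best_prev = max(states, key=lambda p: prob[p] * trans_p[p][s])
            new_prob[s] = prob[best_prev] * trans_p[best_prev][s] * emit_p[s][obs]
            new_path[s] = path[best_prev] + [s]
        prob, path = new_prob, new_path
    return path[max(states, key=lambda s: prob[s])]

# Toy two-state model: "wide" eyes suggest surprise, "narrow" suggests calm.
states = ["calm", "surprise"]
start_p = {"calm": 0.6, "surprise": 0.4}
trans_p = {"calm": {"calm": 0.7, "surprise": 0.3},
           "surprise": {"calm": 0.4, "surprise": 0.6}}
emit_p = {"calm": {"narrow": 0.8, "wide": 0.2},
          "surprise": {"narrow": 0.1, "wide": 0.9}}
decoded = viterbi(["narrow", "wide", "wide"], states, start_p, trans_p, emit_p)
```

In the full pipeline of [2], the observation symbols would come from SVM classifications of the Canny-extracted eye regions, one per video frame.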

    Mohammad Soleymani et al. [3] present a methodology for instantaneous detection of the user's emotions from facial expressions and electroencephalogram (EEG) signals. A set of videos with different emotional levels was shown to a group of people, and their physiological responses and facial expressions were recorded. Five annotators annotated the valence (from negative to positive) in the videos of the users' faces, and a continuous annotation of the arousal and valence dimensions was also taken for the stimulus videos. Continuous Conditional Random Fields (CCRF) and long short-term memory recurrent neural networks (LSTM-RNN) were used to detect emotions continuously and automatically. The analysis of the interference of facial muscle activity on the EEG signals shows that most of the emotionally valued content in the EEG features is a result of this interference; however, the statistical analysis showed that, in the presence of facial expressions, the EEG signals carry complementary information. T. Moriyama et al. [4] present a system capable of giving a detailed analysis of eye region images in terms of the position of the iris, the angle of eyelid opening, and the texture, shape, and complexity of the eyelids. The system uses an eye region model that parameterizes the motion and fine structure of an eye. The structural parameters represent the structural individuality of the eye, including the color and size of the iris; the complexity, boldness, and width of the eyelids; the width of the illumination reflection on the bulge; and the width of the bulge below the eye. The motion parameters represent the movement of the eye, including the 2D position of the iris and the up-down motion and position of the upper and lower eyelids. Renu Nagpal et al. [5] present the world's first publicly available dataset of labeled data recorded over the Internet of people naturally viewing online media. The AM-FED dataset contains: 1) more than 200 webcam videos recorded in real-world conditions; 2) more than 150,000 frames labeled for the presence of 10 symmetrical FACS action units, 4 asymmetric (unilateral) FACS action units, 2 head movements, smile, general expressiveness, feature tracker failures, and gender; 3) locations of 22 automatically detected landmark points; 4) baseline performance of detection algorithms on this dataset, together with baseline classifier outputs for smile; and 5) self-reported responses of familiarity with, liking of, and desire to watch again the stimulus videos. This represents a rich and extensively coded resource for researchers working in the domains of facial expression recognition, affective computing, psychology, and marketing. The videos in the dataset were recorded in real-world conditions; in particular, they exhibit non-uniform frame rates and non-uniform lighting, and the camera position relative to the viewer varies from video to video. In some cases the screen of the laptop is the only source of illumination. The videos contain viewers from a range of ages and cultures, some with glasses and facial hair. The dataset contains a large number of frames with labeled presence of facial action units and other annotations.


Recent research shows that the understanding and recognition of emotional expressions plays a very important role in the development and maintenance of social relationships. This paper gave an approach to creating computational machines that have perceptual and sensory abilities like those of human beings, enabling the computer to gather information about the user through special techniques like facial expression recognition and by considering biological factors like heart rhythm and body temperature. This makes it possible for computers and machines to detect the emotion of the human and respond.


  1. Mizna Rehman, Mamta Bachani, Sundas Memon, "Blue Eyes Technology," ©2013 IEEE.
  2. S. R. Vinotha, R. Arun and T. Arun, "Emotion Recognition from Human Eye Expression," International Journal of Research in Computer and Communication Technology, vol. 2, issue 4, April 2013.
  3. Mohammad Soleymani, Sadjad Asghari-Esfeden, Yun Fu, Maja Pantic, "Analysis of EEG Signals and Facial Expressions for Continuous Emotion Detection," ©2015 IEEE.

  4. T. Moriyama, T. Kanade, J. Xiao, and J. F. Cohn, "Meticulously Detailed Eye Region Model and Its Application to Analysis of Facial Images," IEEE Trans. Pattern Analysis and Machine Intelligence.
  5. D. McDuff, R. Kaliouby, T. Senechal, M. Amr, J. Cohn, R. W. Picard, "Affective-MIT Facial Expression Dataset (AM-FED): Naturalistic and Spontaneous Facial Expressions Collected In-the-Wild," 2013 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Portland, OR, USA, June 2013.
  6. Reshma P, Rincy M Rafi, "A Survey on Blue Eyes Technology."
  7. Manisha Kumavat, Garima Mathur, Nikita Susan Saju, "Research Paper on Blue Eyes Technology."
  8. D. D. Mondal, Arti Gupta, Tarang Soni, Neha Dandekar, "Blue Eyes Technology."
