Judging Human Behavior and Emotional Quotient using AI

DOI : 10.17577/IJERTCONV10IS01002


Akshat Sharma

Student, CSE Department, Chandigarh Engineering College, Chandigarh

Abstract:- Intelligence and emotions differentiate humans from animals. Emotion is part of a person's behaviour, and certain feelings can affect his or her performance; emotions can even prevent a person from producing an intelligent outcome. Therefore, when a computer aims to identify human behaviour, it should not only think and reason but also be able to recognize emotions. This paper presents a review of recent research that shows how AI or intelligent systems can judge or identify human emotions. The essential ways to identify human emotions are through facial expressions, body language, expressions used in text and, lastly, the way a person talks; this paper takes all of these into account. After gathering such information, a system can potentially use it to produce the best result for the subject.

Keywords:- Human behaviour, human intelligence, Artificial Intelligence

INTRODUCTION

When computers can read emotions by analyzing data, including facial expressions, gestures, tone of voice, force of keystrokes and more, to determine a person's emotional state and then react to it, we call this artificial emotional intelligence. This ability will allow humans and machines to interact in a much more natural way, very similar to how human-to-human interaction works. It will also allow machines to make better decisions, because knowing a person's emotions tells them how to react and what might be wrong with that person's situation.
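As a concrete illustration of the facial-expression part of this idea, the following minimal Python sketch detects a face with OpenCV and passes the crop to a pretrained classifier. The model file "emotion_cnn.h5", its 48x48 grayscale input and the seven-label set are illustrative assumptions, not anything specified in this paper.

import cv2
import tensorflow as tf

# Hypothetical label order matching the assumed pretrained model.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
emotion_model = tf.keras.models.load_model("emotion_cnn.h5")  # assumed model file

def read_emotion(frame_bgr):
    # Detect the first face in a BGR frame and return per-emotion probabilities.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None  # no face found, so no estimate
    x, y, w, h = faces[0]
    crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
    probs = emotion_model.predict(crop.reshape(1, 48, 48, 1), verbose=0)[0]
    return dict(zip(EMOTIONS, probs.tolist()))

The same pattern of sensor input, feature extraction and probabilistic classification applies to the other signals mentioned above, such as voice, gestures and keystroke force.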

This can help not only in human-to-machine interaction but also in various other fields: it could assist the police in interrogation, or be used for intelligence work in the army.

Humans have always been able to claim mastery over machines when it comes to understanding emotion, but with modern technology this will soon change. People working in the field of artificial emotional intelligence, also known as emotion AI or affective computing, say they are well on their way, and by 2025 this market's value is expected to reach more than $170 billion.

When it began: Affective computing began in 1995 in an MIT Media Lab, where cameras, microphones and physiological sensors gathered affective responses to identify emotion, and machines then responded to those emotions. This early work led to lab professor Rosalind Picard publishing Affective Computing. Today, a machine's adeptness at evaluating data can help it pick up on subtle emotive nuances that only some professionals can identify. Machines can even be more efficient at improving over time, because they keep learning from previous data.

LITERATURE SURVEY

Luke Stark and Jesse Hoey, in their paper The Ethics of Emotion in Artificial Intelligence Systems, developed a taxonomy of conceptual models and proxy data used for digital analysis of human emotional expression, and outlined how the combinations and permutations of these models and data affect their incorporation into artificial intelligence (AI) systems. They argued that conceptualizations of what emotions are, and of how they can be sensed, measured and transformed into data, shape the social implementation of AI systems.

Sahiti S. Magapu and Sashank Vaddiparty, in their paper The Study of Emotional Intelligence in Artificial Intelligence, discussed how Emotional Intelligence plays a role in Artificial Intelligence, its applications in various fields and the impact it will have on society. They said that even though AI has advanced in the past few years, it is still being perfected, as it is still developing the ability to learn and understand well enough to replicate human intelligence. Although this intelligent agent already rewards society greatly, it can be made more human-like by equipping it with Emotional Intelligence. With the help of Emotional Intelligence in Artificial Intelligence, we can allow it to widen its fields of knowledge and to provide further, more advanced solutions to complicated problems. An Emotional Artificial Intelligence could help close the barriers between a human and a machine; with this, it can help in fields like medicine, consultation and education, and provide new opportunities through equal treatment.

Eleanor Bird, Jasmin Fox-Skelly, Nicola Jenner, Ruth Larbey, Emma Weitkamp and Alan Winfield, in their paper The ethics of artificial intelligence: Issues and initiatives, examined the ethical implications and moral questions that arise from the development and implementation of artificial intelligence (AI) technologies. The paper compares the current main frameworks with the main ethical issues, and highlights gaps around mechanisms for fair benefit sharing, the assignment of responsibility, energy demands in the context of environmental and climate change, and the more complex and less certain implications of AI, such as those concerning human relationships and human emotion in any form.

Christoph Bartneck, Michael Lyons and Martin Saerbeck, in their paper The Relationship Between Emotion Models and Artificial Intelligence, reflected on the limitations of the OCC model specifically, and of emotion models in general, due to their dependency on artificial intelligence. They noted that emotions play an important role in most forms of natural human interaction, so we may expect computational methods for processing and expressing emotions to play a growing role in human-computer interaction. The OCC model has established itself as the standard model for emotion synthesis, and many developers of such characters believe that the OCC model is all they will ever need to equip their characters with emotions.

Monika Gautam, Mala Tandon and Amita Bajpai, in their paper Relevance of Emotional Intelligence for Happiness and Well-Being in the Era of Artificial Intelligence, talked about how machines with artificial intelligence act as online assistants and also help in the recognition of speech and language. No matter how fast these machines grow, they cannot take our place, because we possess emotional intelligence and a subconscious and unconscious mind that have not yet been perfectly replicated in artificially intelligent machines. The paper thus discussed the relevance of Emotional Intelligence in enhancing happiness and well-being in the present era of the rise of artificially intelligent machines.

Marc Schröder and Gary McKeown, in their paper Considering Social and Emotional Artificial Intelligence, described an implementation of Sensitive Artificial Listeners, which provides a hands-on example of technology with some emotional and social skills, and discussed the first elements of test methodologies for such technology. The paper also presented social and emotional intelligence as elements of human intelligence that are complementary to the intelligence assessed by the Turing Test, arguing that these elements are gaining importance as human users increasingly conceptualize machines as social entities.

Raymond Price and Rose Mary Cordova, in their paper Human Behavior Skills and Emotional Intelligence in Engineering, described the efforts of a college of engineering at a large mid-western university to improve the human behaviour skills and capabilities of undergraduate students through an emotional intelligence course. The past decade has been characterized by a series of changes in engineering education, including the incorporation of human behaviour skills into the list of learning outcomes required for engineering program accreditation. They described their approach, their conceptual model and some of the progress they had made to date.

PROPOSED WORK

These past years have been characterized by a series of changes in engineering education, including the incorporation of human behaviour skills into the list of learning outcomes required for engineering program accreditation. In modern times we require many positive changes in Artificial Intelligence, as well as in the ways Artificial Intelligence is used. One of the challenges in our way has been getting Artificial Intelligence to learn and to read human behaviour through varying means. Projects have been started to achieve such goals, but they have barely scratched the surface. What we actually need is not just one way but multiple ways of identifying human behaviour. Thus far, the AI systems we have built have been somewhat inconsistent and have relied on databases that are less than reliable. The things we can do to make these AI systems better are listed below; a sketch of how one record in such a database could be organized follows the list.

  1. With the help of people in the field of affective sciences, expand our databases of facial-expression data with more relevant samples, including flinches, simple to subtle expressions, all the slight tilts, and so on.

  2. Build databases of chat data covering varying as well as recurring emotions, which can be critically analyzed on the spot or later, for example during processing.

  3. Build databases of body-language data covering varying and recurring emotions, conveying open body language, closed body language, aggressive, hunched or confused postures, and so on.

  4. Build databases of speech data covering varying and recurring emotions, capturing, for example, emphasis on different words, breaks in sentences, sighs in between, speed of speaking, enthusiasm, pronunciation, voice breaks, and so on.
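A minimal sketch of how one labelled record spanning these four proposed data sources could be organized is shown below; the field names, label set and file-path conventions are illustrative assumptions, not a schema defined in this paper.

from dataclasses import dataclass
from typing import List, Optional

# Hypothetical label set; a real database would use whatever taxonomy the
# affective-science annotators agree on.
EMOTION_LABELS = ["happy", "sad", "angry", "fearful", "surprised", "neutral"]

@dataclass
class EmotionRecord:
    subject_id: str
    emotion_label: str                       # one of EMOTION_LABELS, assigned by annotators
    face_image_path: Optional[str] = None    # facial-expression sample (item 1)
    chat_text: Optional[str] = None          # chat / text sample (item 2)
    body_pose_path: Optional[str] = None     # body-language sample, e.g. pose keypoints (item 3)
    speech_audio_path: Optional[str] = None  # way-of-speaking sample (item 4)
    notes: str = ""                          # annotator remarks: flinches, tilts, sighs, etc.

    def modalities(self) -> List[str]:
        # Report which of the four proposed data sources this record covers.
        present = []
        if self.face_image_path:
            present.append("face")
        if self.chat_text:
            present.append("text")
        if self.body_pose_path:
            present.append("body")
        if self.speech_audio_path:
            present.append("speech")
        return present

Keeping all four sources under one label per subject is what would let a system cross-check, for example, a hunched posture against a flat tone of voice for the same recorded moment.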

Fig. 1 and Fig. 2: human facial expressions and AI reading human facial expressions, respectively.

Fig. 3 and Fig. 4: human body language and AI reading human body language, respectively.

To date there are countless AI systems that are not even close to being efficient: once they identify an emotion, they report it as a chance rather than with assurance, acknowledging that a different emotion or expression may be present, for example Happy 20%, Neutral 80%, and so on. At this stage humans are more efficient than such machines. If we take the above suggestions into account, we can build a more efficient combined AI that will be able to identify a person's emotional quotient to some extent, for instance by fusing the probabilistic outputs of the individual recognizers, as sketched below.
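One simple way to combine such probabilistic outputs is late fusion by weighted averaging, sketched below; the weights, threshold and example numbers are illustrative assumptions rather than values given in this paper.

def fuse_predictions(per_modality_probs, weights=None, min_confidence=0.6):
    # per_modality_probs: list of dicts mapping emotion -> probability, one per modality.
    if weights is None:
        weights = [1.0 / len(per_modality_probs)] * len(per_modality_probs)
    combined = {}
    for probs, weight in zip(per_modality_probs, weights):
        for emotion, p in probs.items():
            combined[emotion] = combined.get(emotion, 0.0) + weight * p
    best = max(combined, key=combined.get)
    if combined[best] < min_confidence:
        # Report uncertainty instead of guessing when no emotion is dominant.
        return "uncertain", combined[best]
    return best, combined[best]

# Example: the face model sees mostly neutral while the voice model hears mostly happy.
face_probs = {"happy": 0.20, "neutral": 0.80}
voice_probs = {"happy": 0.70, "neutral": 0.30}
print(fuse_predictions([face_probs, voice_probs]))  # ('uncertain', 0.55) with equal weights

Disagreement between modalities then surfaces as low combined confidence, which matches the observation above that single-cue systems can only report a chance rather than a certainty.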

If we move along the right track, these AI systems can help us in many fields in the long run.

CONCLUSION

This paper presents the ways through which Artificial Intelligence can grow, learn and judge the emotions of human beings through various means. These means include minute to strong facial expressions, body language, chats and the way one speaks. Expanding our databases with recurring data on various emotions is one of the ways through which AI will be able to judge human emotions accurately, unlike the previous generation of Artificial Intelligence systems. In the near future these advances will help us in various fields, for example in courts, in robotics and, of course, in the betterment of AI itself; that is why we need to implement the above suggestions for much faster growth of AI and of AI-to-human interaction.

REFERENCES

  1. Luke Stark and Jesse Hoey, The Ethics of Emotion in Artificial Intelligence Systems.

  2. Sahiti S. Magapu and Sashank Vaddiparty, The Study of Emotional Intelligence in Artificial Intelligence.

  3. Eleanor Bird, Jasmin Fox-Skelly, Nicola Jenner, Ruth Larbey, Emma Weitkamp and Alan Winfield, The ethics of artificial intelligence: Issues and initiatives.

  4. Christoph Bartneck, Michael Lyons and Martin Saerbeck, The Relationship Between Emotion Models and Artificial Intelligence.

  5. Monika Gautam, Mala Tandon and Amita Bajpai, Relevance of Emotional Intelligence for Happiness and Well-Being in the Era of Artificial Intelligence.

  6. Marc Schröder and Gary McKeown, Considering Social and Emotional Artificial Intelligence.

  7. Raymond Price and Rose Mary Cordova, Human Behavior Skills and Emotional Intelligence in Engineering.
