21 April 2018

Artificial Intelligence in Affective Computing

As a verb, affect means “to touch the feelings of, or move emotionally”, while affective means “relating to moods, feelings, and attitudes”. Thus affective computing is “the study and development of systems and devices that can recognize, interpret, process, and simulate human affects”. Affective computing is an interdisciplinary field which spans computer science, psychology, and cognitive science. It is sometimes called artificial emotional intelligence, or emotion AI. Emotional intelligence can be defined as “the capacity to be aware of, control, and express one's emotions, and to handle interpersonal relationships judiciously and empathetically”.

Sentiment analysis might be considered a primitive form of affective computing. Sentiment analysis may be defined as “the process of computationally identifying and categorizing opinions expressed in a piece of text”, especially in order to determine whether the writer's attitude is positive, negative, or neutral. Sentiment analysis may also be referred to as opinion mining, or emotion AI. SentiWordNet is a popular lexical resource for opinion mining. It assigns three sentiment scores, for positivity, negativity, and objectivity, to WordNet synsets, or sets of synonyms. Natural language processing toolkits are often used for sentiment analysis, such as GATE, LingPipe, NLTK, R-Project, RapidMiner, StanfordNLP, UIMA, and WEKA.
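As a concrete illustration, the short Python sketch below looks up SentiWordNet scores through NLTK's corpus reader. The example words and the simple averaging over synsets are illustrative choices rather than a prescribed method, and the required downloads may vary with the NLTK version.

import nltk
from nltk.corpus import sentiwordnet as swn

nltk.download('wordnet', quiet=True)        # SentiWordNet is keyed to WordNet synsets
nltk.download('sentiwordnet', quiet=True)

def word_sentiment(word):
    # Average the positive/negative scores over every synset of the word
    synsets = list(swn.senti_synsets(word))
    if not synsets:
        return None
    pos = sum(s.pos_score() for s in synsets) / len(synsets)
    neg = sum(s.neg_score() for s in synsets) / len(synsets)
    return {'positive': pos, 'negative': neg, 'objective': 1.0 - pos - neg}

print(word_sentiment('happy'))     # leans positive
print(word_sentiment('terrible'))  # leans negative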

In terms of natural language, the 2011 book Affective Computing and Sentiment Analysis: Emotion, Metaphor and Terminology, edited by Khurshid Ahmad, addresses the role of metaphor in affective computing. A metaphor describes one thing in terms of another, in other words a figure of speech that compares unlike things. Contributor Andrew Goatly looks at metaphor as a resource for conceptualisation and expression of emotion. For instance, emotions may be present in deep lexical semantics. Metaphoricity is the quality of being metaphorical, which contributor Carl Vogel maintains involves sense modulation. In a conversational agent, affect may be conveyed by metaphor, forming a kind of artificial or synthetic emotion.

In ‘affective dialog systems’, an ‘affect listener’ is a device which detects and adapts to the affective states of users, facilitating meaningful responses. The SEMAINE project was a well-known European Union initiative to create a non-verbally competent ‘sensitive artificial listener’. SAL, the SEMAINE sensitive artificial listener, was in effect a kind of ‘emotional agent’, or ‘emotion agent’, which could be termed an ‘affective interface’.
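To make the idea of an affect listener concrete, here is a toy Python sketch of a listener loop that detects the polarity of a user utterance and adapts its reply. It uses NLTK's VADER analyzer as a stand-in for the multimodal sensing a system like SEMAINE actually performed, and the canned replies are purely illustrative.

import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download('vader_lexicon', quiet=True)
sia = SentimentIntensityAnalyzer()

def listen_and_respond(utterance):
    # 'compound' runs from -1 (most negative) to +1 (most positive)
    compound = sia.polarity_scores(utterance)['compound']
    if compound >= 0.3:
        return "That sounds wonderful. Tell me more about it."
    if compound <= -0.3:
        return "I am sorry to hear that. What happened?"
    return "I see. How does that make you feel?"

print(listen_and_respond("I finally got the job I wanted!"))
print(listen_and_respond("My week has been awful."))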

Automated, or automatic, emotion recognition leverages techniques from signal processing, machine learning, and computer vision. Computers use different methods to interpret emotion, from Bayesian networks to Paul Ekman's ‘Facial Action Coding System’. A number of companies now offer automatic emotion recognition products, including affectiva.com (Emotion Recognition Software), eyeris.ai (Emotional AI and Face Analytics), imotions.com (Emotion Analysis Engine), nviso.ch (Emotion Recognition Software), and visagetechnologies.com (Face Tracking and Analysis).
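As a rough illustration of how the Bayesian and FACS threads can meet, the sketch below trains a naive Bayes classifier, a simple special case of a Bayesian network, on binary facial action unit features. The tiny training set is made up for the example and is not Ekman's actual coding.

import numpy as np
from sklearn.naive_bayes import BernoulliNB

# Columns: presence of AU6 (cheek raiser), AU12 (lip corner puller),
# AU4 (brow lowerer), AU15 (lip corner depressor) -- toy data, not real codings
X = np.array([
    [1, 1, 0, 0],
    [1, 1, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 0],
])
y = ['happy', 'happy', 'happy', 'sad', 'sad', 'sad']

model = BernoulliNB()
model.fit(X, y)

# Probability of each emotion for an unseen face showing AU6 and AU12
probs = model.predict_proba([[1, 1, 0, 0]])
print(dict(zip(model.classes_, probs[0].round(3))))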

From this overview, it becomes clear that there are two main aspects to affective computing: 1) emotion detection, or emotion recognition, and 2) artificial or synthetic emotion, or emotion synthesis. Facial expression analysis figures prominently in emotion detection; however, language can also be used for emotion recognition, in particular metaphor. Voice biometrics are also being used for detecting emotion, somewhat like polygraph biofeedback. Emotion may be synthesized facially in the form of an avatar, or talking head, not to mention an animatronic head. Emotional body language could be expressed by a humanoid robot. Natural language can also be used for the expression of computational affect, in the form of metaphor generation as a vehicle for emotion.
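On the synthesis side, a hypothetical sketch such as the following could map an emotion label to a set of facial action units for an avatar or animatronic head to render. The action unit lists are simplified prototypes and the renderer interface is assumed for illustration.

from typing import Dict, List

# Simplified emotion-to-action-unit prototypes; real systems use richer, graded codings
EMOTION_TO_AUS: Dict[str, List[int]] = {
    'happiness': [6, 12],        # cheek raiser + lip corner puller
    'sadness':   [1, 4, 15],     # inner brow raiser + brow lowerer + lip corner depressor
    'surprise':  [1, 2, 5, 26],  # brow raisers + upper lid raiser + jaw drop
    'anger':     [4, 5, 7, 23],  # brow lowerer + lid raiser/tightener + lip tightener
}

def synthesize_expression(emotion: str, intensity: float = 1.0) -> Dict[int, float]:
    # Return per-action-unit activation levels for a (hypothetical) renderer to apply
    return {au: round(intensity, 2) for au in EMOTION_TO_AUS.get(emotion, [])}

print(synthesize_expression('happiness', intensity=0.8))  # {6: 0.8, 12: 0.8}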

References:
