Affective computing: Giving computers a human touch
October 04, 2019

With the rise of AI in everyday life, the need for emotionally intelligent devices is growing. Computers are now an essential part of our day-to-day lives: they are present in our cars, our kitchen appliances, our phones, our bedrooms, and, in some cases, even inside our bodies. Yet for all they can do, they lack emotional intelligence.

How many times have you yelled at Siri for not understanding what you want, only to get the same results again? Wouldn't you love a refrigerator that locks its door after sensing that you are stress-eating? Or an app that can sense and flag whether a pilot is depressed or under stress? Or one that sends an alert when a truck driver is feeling drowsy?

Affective computing, also known as emotional AI, refers to IT systems and devices designed to determine human emotions, respond to the user based on what they perceive, and, in some cases, represent human emotions to users.

We humans have six basic emotions: joy, anger, surprise, disgust, fear, and sadness. There are also mixed emotions, such as guilt and sympathy, which arise from combinations of two or more basic emotions.

We express these emotions in many ways: through facial expressions, body movements, gestures, vocal behavior, and physiological signals such as heart rate and sweating.

Affective computing captures signals from human users through cameras, microphones, skin sensors, and other means, collecting information about facial expression, voice, tone, gestures, and other variables that can indicate emotional state. By evaluating these data points, the emotionally intelligent AI system interprets the user's emotional state. There are also multimodal systems that take more than one type of input, for example, combined facial and vocal affective recognition.
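
To make the multimodal idea concrete, here is a minimal sketch of late fusion, one common way to combine modalities: each modality produces a score per emotion, and the scores are merged before a label is picked. The two models, the dummy inputs, and the 60/40 weighting are illustrative assumptions, not a real system.

```python
import numpy as np

EMOTIONS = ["joy", "anger", "surprise", "disgust", "fear", "sadness"]

def face_model(frame):
    """Hypothetical face model: one probability per emotion.
    A real system would run a CNN over the frame; random scores stand in."""
    scores = np.random.rand(len(EMOTIONS))
    return scores / scores.sum()

def voice_model(audio):
    """Hypothetical voice model: one probability per emotion from audio."""
    scores = np.random.rand(len(EMOTIONS))
    return scores / scores.sum()

def fuse(face_probs, voice_probs, face_weight=0.6):
    """Late fusion: a weighted average of per-modality scores.
    The 0.6/0.4 weighting is an arbitrary choice for illustration."""
    fused = face_weight * face_probs + (1 - face_weight) * voice_probs
    return EMOTIONS[int(np.argmax(fused))]

frame, audio = np.zeros((224, 224, 3)), np.zeros(16000)  # dummy inputs
print(fuse(face_model(frame), voice_model(audio)))
```

Late fusion is only one design choice; a system can instead concatenate raw features from all modalities and train a single joint model (early fusion).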

Affective computing can be used in almost all parts of our lives, giving our interactions with machines an emotional touch.

Let's take the example of facial expressions. Action units (AUs) are the fundamental actions of individual muscles or groups of muscles. There are 45 AUs that combine to express all human emotions. For example, AU12 is the lip corner pull (the main component of a smile) and AU4 is the brow furrow (a strong indicator of negative emotion).
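
As an illustration of how AU detections might be turned into an emotion label, here is a toy rule-based classifier. The AU patterns below are simplified versions of common FACS-style heuristics (e.g., joy as AU6 plus AU12); real systems learn far subtler mappings from data.

```python
# Toy mapping from detected action units (AUs) to basic emotions.
# The patterns are simplified FACS-style heuristics, for illustration only.
EMOTION_RULES = {
    "joy":      {6, 12},     # cheek raiser + lip corner pull
    "sadness":  {1, 4, 15},  # inner brow raiser + brow furrow + lip corner depressor
    "surprise": {1, 2, 26},  # inner/outer brow raiser + jaw drop
    "anger":    {4, 5, 23},  # brow furrow + upper lid raiser + lip tightener
}

def classify(active_aus):
    """Return the emotion whose AU pattern best matches the detected AUs."""
    best, best_score = "neutral", 0.0
    for emotion, pattern in EMOTION_RULES.items():
        score = len(active_aus & pattern) / len(pattern)  # fraction of pattern present
        if score > best_score:
            best, best_score = emotion, score
    return best

print(classify({4, 6, 12}))  # AU6 + AU12 dominate -> "joy"
```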

The steps to create affective computing software are similar to those of any other AI project (a condensed sketch follows the list):

  1. Collect huge amounts of data: Like any AI application, affective computing needs lots and lots of data. For example, to create software that can recognize a particular facial expression and act accordingly, you will need thousands, if not millions, of images of people from different regions, skin tones, cultures, and so on to train your model. The more images you have, the more accurate your predictive model will be.
  2. Perform deep learning: In this step, you create a deep learning model, train it, and test it.
  3. Infer: In this step, production data is passed through the trained AI model.
  4. Take action: Finally, predefined actions are taken depending on the outcome of the inference.
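
Below is a condensed, hypothetical sketch of steps 2 through 4 using tf.keras. Random arrays stand in for the labeled dataset of step 1, and the tiny CNN, the 48x48 grayscale input size, and the final "action" are all illustrative assumptions rather than a recommended architecture.

```python
import numpy as np
import tensorflow as tf

EMOTIONS = ["joy", "anger", "surprise", "disgust", "fear", "sadness"]

# Step 1 stand-in: random "images" and labels instead of a real dataset.
x_train = np.random.rand(256, 48, 48, 1).astype("float32")
y_train = np.random.randint(0, len(EMOTIONS), size=256)

# Step 2: build, train, and (in practice) test a small CNN classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(48, 48, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(len(EMOTIONS), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(x_train, y_train, epochs=1, verbose=0)

# Step 3: infer on production data (a dummy frame here).
frame = np.random.rand(1, 48, 48, 1).astype("float32")
emotion = EMOTIONS[int(np.argmax(model.predict(frame, verbose=0)))]

# Step 4: take a predefined action based on the inferred emotion.
if emotion in ("anger", "sadness"):
    print("Detected", emotion, "- softening the assistant's tone")
```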

Affective Computing Research Framework (figure)

Challenges of affective computing:

Like any other AI application, affective computing has its challenges. These include, but are not limited to, a shortage of training databases, background noise in the captured signals, and multi-agent systems.

Use cases:

Humans are emotional creatures, and we tend to extend our emotions and feelings to the things close to us. Affective computing aims to integrate AI and emotion, and it can touch almost every aspect of our lives. In the fields below, however, it has proved exceptionally useful:

  1. Education – MIT has developed Tega, an educational companion robot that can detect boredom or confusion in a student and help take corrective action.
  2. Intelligent toys – Sony's Aibo robot dogs, available on the market, can sense their owners' emotions and react to them.
  3. Medical – Apps have been developed that enable a visually impaired person to understand the facial expressions of the person they are talking to.
  4. Autonomous cars – An app that can sense when a driver is tired, drowsy, intoxicated, or texting can help prevent accidents (a toy drowsiness check is sketched below).
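
As a taste of how such driver monitoring might work, here is a toy drowsiness check based on the eye aspect ratio (EAR), a widely used heuristic: when the eyes stay nearly closed for many consecutive frames, the driver is flagged. The landmark source (e.g., a face-landmark library), the threshold, and the frame count are assumptions for illustration.

```python
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye; EAR falls toward 0 as it closes."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

class DrowsinessMonitor:
    def __init__(self, ear_threshold=0.21, closed_frames=48):
        # Illustrative defaults: roughly two seconds of closed eyes at 24 fps.
        self.ear_threshold = ear_threshold
        self.closed_frames = closed_frames
        self.streak = 0

    def update(self, left_eye, right_eye):
        """Call once per video frame; returns True when drowsiness is suspected."""
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
        self.streak = self.streak + 1 if ear < self.ear_threshold else 0
        return self.streak >= self.closed_frames
```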

There are many other use cases, such as biometrics, virtual reality, and service robots, on which research is in progress.

To conclude: we have been working with computers for the past 50 years, communicating with them in their language. Now, with affective computing bringing emotional intelligence to machines, it is time for computers to interact with us in our language.