Emotion AI: Do we want AI to understand our feelings?

Companies are heavily investing in AI technologies to capitalize on machines being able to analyze human emotions.
    • Author: Quantumrun Foresight
    • September 6, 2022

    Insight summary



    Emotion artificial intelligence (AI) is transforming how machines understand and react to human emotions in healthcare, marketing, and customer service. Despite debates on its scientific basis and privacy concerns, this technology is evolving rapidly, with companies like Apple and Amazon integrating it into their products. Its growing use raises important questions about privacy, accuracy, and the potential for deepening biases, prompting a need for careful regulation and ethical considerations.



    Emotion AI context



    Artificial intelligence systems are learning to recognize human emotions and leverage that information in various sectors, from healthcare to marketing campaigns. For example, websites use emoticons to gauge how viewers respond to their content. However, is emotion AI everything that it claims to be? 



    Emotion AI (also known as affective computing or artificial emotional intelligence) is a subset of AI that measures, understands, simulates, and responds to human emotions. The discipline dates back to 1995, when MIT Media Lab professor Rosalind Picard published “Affective Computing.” According to the MIT Media Lab, emotion AI allows for more natural interaction between people and machines. Emotion AI attempts to answer two questions: what is the human’s emotional state, and how will they react? The answers heavily influence how machines provide services and products.



    Artificial emotional intelligence is often used interchangeably with sentiment analysis, but the two differ in how they collect data. Sentiment analysis focuses on language, such as determining people’s opinions about specific topics from the tone of their social media posts, blogs, and comments. Emotion AI, by contrast, relies on facial recognition and expressions to determine sentiment. Other affective computing inputs include voice patterns and physiological data such as changes in eye movement. Some experts consider sentiment analysis a subset of emotion AI, but one with fewer privacy risks.
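
    To make the distinction concrete, here is a minimal, hedged sketch contrasting the two kinds of input: text-only sentiment scoring versus multimodal emotion signals. The function names, word lists, and thresholds are illustrative assumptions standing in for trained models, not any vendor's actual system.

# Hypothetical sketch: text-based sentiment analysis vs. multimodal emotion AI.
# Word lists and thresholds are toy stand-ins for trained models.

def sentiment_from_text(post: str) -> float:
    """Score a post from -1 (negative) to +1 (positive) with a toy word list."""
    positive = {"great", "love", "excellent", "happy"}
    negative = {"terrible", "hate", "awful", "angry"}
    words = post.lower().split()
    score = sum(w in positive for w in words) - sum(w in negative for w in words)
    return max(-1.0, min(1.0, score / max(len(words), 1) * 5))

def emotion_from_signals(smile_intensity: float, voice_pitch_hz: float,
                         gaze_shift_rate: float) -> str:
    """Combine facial, vocal, and eye-movement features into a coarse label."""
    if voice_pitch_hz > 220 and gaze_shift_rate > 0.5:
        return "agitated"
    if smile_intensity > 0.5:
        return "content"
    return "neutral"

print(sentiment_from_text("I love this product, it is excellent"))            # ~1.0
print(emotion_from_signals(0.7, voice_pitch_hz=180.0, gaze_shift_rate=0.1))   # content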



    Disruptive impact



    In 2019, researchers from several universities, including Northeastern University in the US and the University of Glasgow, published studies revealing that emotion AI doesn’t have a solid scientific foundation. The research highlighted that whether humans or AI conduct the analysis, it is challenging to accurately predict emotional states from facial expressions. The researchers argue that expressions are not fingerprints that provide definitive and unique information about an individual.



    However, some experts don’t agree with this analysis. Hume AI founder Alan Cowen argued that modern algorithms are built on datasets and prototypes that accurately correspond to human emotions. Hume AI, which raised USD 5 million in investment funding, uses datasets of people from the Americas, Africa, and Asia to train its emotion AI system. 



    Other emerging players in the emotion AI field are HireVue, Entropik, Emteq, and Neurodata Labs. Entropik uses facial expressions, eye gaze, voice tones, and brainwaves to determine the impact of a marketing campaign. A Russian bank uses Neurodata Labs’ technology to analyze clients’ sentiment when they call customer service representatives. 



    Even Big Tech is starting to capitalize on the potential of emotion AI. In 2016, Apple purchased Emotient, a San Diego-based firm analyzing facial expressions. Alexa, Amazon’s virtual assistant, apologizes and clarifies its responses when it detects that its user is frustrated. Meanwhile, Microsoft’s speech recognition AI firm, Nuance, can analyze drivers’ emotions based on their facial expressions.
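
    The Alexa behavior described above follows a simple pattern: detect frustration from an upstream emotion model and switch to an apologetic, clarifying reply. The sketch below is a toy illustration of that pattern, not Amazon's implementation; the score, threshold, and wording are assumptions made for the example.

# Toy illustration of frustration-aware response selection (hypothetical).
# "frustration_score" is assumed to come from an upstream voice-emotion model.

def respond(answer: str, frustration_score: float, threshold: float = 0.7) -> str:
    """Return the plain answer, or an apologetic clarification when the
    user sounds frustrated (score assumed to lie in [0, 1])."""
    if frustration_score >= threshold:
        return "Sorry about that. To make sure I got it right: " + answer
    return answer

print(respond("The meeting is at 3 PM.", frustration_score=0.82))  # apologetic reply
print(respond("The meeting is at 3 PM.", frustration_score=0.10))  # plain reply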



    Implications of emotion AI



    Wider implications of emotion AI may include: 




    • Major technology corporations acquiring smaller companies specializing in AI, especially in emotion AI, to enhance their autonomous vehicle systems, resulting in safer and more empathetic interactions with passengers.

    • Customer support centers incorporating emotion AI to interpret vocal and facial cues, leading to more personalized and effective problem-solving experiences for consumers.

    • More funding flowing into affective computing, fostering collaborations between international academic and research organizations, thereby accelerating advancements in human-AI interaction.

    • Governments facing growing demands to create policies that govern the collection, storage, and application of facial and biological data.

    • A risk of deepening biases related to race and gender due to flawed or biased emotion AI, requiring stricter standards for AI training and deployment in public and private sectors.

    • Increased consumer reliance on emotion AI-enabled devices and services, leading to more emotionally intelligent technology becoming integral in daily life.

    • Educational institutions integrating emotion AI into e-learning platforms, adapting teaching methods based on students' emotional responses to enhance learning experiences.

    • Healthcare providers utilizing emotion AI to better understand patient needs and emotions, improving diagnosis and treatment outcomes.

    • Marketing strategies evolving to use emotion AI, allowing companies to tailor advertisements and products more effectively to individual emotional states.

    • Legal systems possibly adopting emotion AI to assess witness credibility or emotional states during trials, raising ethical and accuracy concerns.



    Questions to consider




    • Would you consent to have emotion AI apps scan your facial expressions and voice tone to anticipate your emotions?

    • What are the possible risks of AI potentially misreading emotions?


    Insight references

    The following popular and institutional links were referenced for this insight:

    MIT Sloan School of Management, “Emotion AI, explained”