Emotion analytics: Can machines understand how we feel?

Image credit: iStock

Tech companies are developing artificial intelligence models to decode the sentiment behind words and facial expressions.
    • Author: Quantumrun Foresight
    • October 10, 2023

    Insight summary

    Emotion analytics uses artificial intelligence to gauge human emotions from speech, text, and physical cues. The technology primarily focuses on customer service and brand management by adapting chatbot responses in real-time. Another controversial application is in recruitment, where body language and voice are analyzed to make hiring decisions. Despite its potential, the technology has garnered criticism for a lack of scientific basis and potential privacy issues. Implications include more tailored customer interactions, but also the possibility of more lawsuits and ethical concerns.

    Emotion analytics context

    Emotion analytics, also known as sentiment analysis, allows artificial intelligence (AI) to understand how a user feels by analyzing their speech and sentence structure. This feature enables chatbots to determine consumers' attitudes, opinions, and emotions toward businesses, products, services, or other subjects. The main technology that powers emotion analytics is natural language understanding (NLU).

    NLU refers to computer software comprehending input in the form of sentences, whether delivered as text or speech. With this capability, computers can understand commands without the formalized syntax that characterizes programming languages. NLU also allows machines to respond to humans in natural language, creating bots that can interact with users without supervision. 
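The mapping from free-form text to a machine-actionable command can be illustrated with a toy sketch. Real NLU systems use trained models rather than keyword matching, and the intents and keywords below are invented for illustration:

```python
import re

# Toy stand-in for NLU intent detection: map a free-form utterance to the
# intent whose keyword set it overlaps most. The intents and keywords are
# hypothetical; production systems learn this mapping from data.
INTENT_KEYWORDS = {
    "check_order": {"order", "tracking", "shipped", "delivery"},
    "refund": {"refund", "return", "reimburse"},
    "greeting": {"hello", "hi", "hey"},
}

def detect_intent(utterance: str) -> str:
    """Return the best-matching intent, or 'unknown' if nothing matches."""
    words = set(re.findall(r"[a-z]+", utterance.lower()))
    best_intent, best_hits = "unknown", 0
    for intent, keywords in INTENT_KEYWORDS.items():
        hits = len(words & keywords)
        if hits > best_hits:
            best_intent, best_hits = intent, hits
    return best_intent

print(detect_intent("Hi, where is my order? It never shipped"))  # check_order
```

Even this crude version shows the key property of NLU: the user types however they like, and the machine recovers a structured command without formal syntax.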

    Advanced emotion analysis solutions also use acoustic measurements, observing how fast someone speaks, the tension in their voice, and changes in stress signals over the course of a conversation. A key benefit of emotion analysis is that, compared with other methods, it does not need extensive data to process and customize a chatbot conversation around user reactions. Natural language processing (NLP) techniques are then used to measure the intensity of the emotions, assigning numerical scores to identified sentiments.
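The idea of assigning numerical intensity scores to sentiments can be sketched with a minimal lexicon-based scorer. This is an assumption-laden toy, not how any particular vendor works: the word weights and intensifier multipliers below are invented, and real NLP systems use trained models instead of fixed lexicons:

```python
import re

# Hypothetical lexicon of sentiment-bearing words with invented weights
# in [-1, 1], plus modifiers that scale the intensity of the next word.
LEXICON = {
    "great": 0.8, "love": 0.9, "good": 0.5,
    "bad": -0.5, "terrible": -0.9, "hate": -0.8,
}
INTENSIFIERS = {"very": 1.5, "really": 1.3, "slightly": 0.5}

def sentiment_score(text: str) -> float:
    """Average the weights of matched words, scaled by any intensifier
    immediately preceding them; result is clamped to [-1, 1]."""
    tokens = re.findall(r"[a-z]+", text.lower())
    scores = []
    for i, tok in enumerate(tokens):
        if tok in LEXICON:
            weight = LEXICON[tok]
            if i > 0 and tokens[i - 1] in INTENSIFIERS:
                weight *= INTENSIFIERS[tokens[i - 1]]
            scores.append(max(-1.0, min(1.0, weight)))
    return sum(scores) / len(scores) if scores else 0.0

print(sentiment_score("The support was really terrible"))  # strongly negative
print(sentiment_score("I love this product, it's great"))  # strongly positive
```

A chatbot can then branch on the sign and magnitude of this score, which is what "measuring the intensity of the emotions" amounts to in practice.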

    Disruptive impact

    Most brands use emotional analytics in customer support and management. Bots scan social media posts and mentions of the brand online to gauge the ongoing sentiment towards its products and services. Some chatbots are trained to respond immediately to complaints or direct users to human agents to handle their concerns. Emotion analysis allows chatbots to interact with users more personally by adapting in real-time and making decisions based on the user's mood. 
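The escalation behavior described above can be sketched as a small state machine. The thresholds, the streak count, and the reply strings are all assumptions for illustration; a production bot would tune these against real conversations:

```python
from dataclasses import dataclass

# Hypothetical escalation logic: adapt the reply to the user's inferred
# mood and hand off to a human agent after repeated negative turns.
NEGATIVE_THRESHOLD = -0.4   # assumed cutoff for a "negative" turn
ESCALATE_AFTER = 2          # consecutive negative turns before handoff

@dataclass
class Conversation:
    negative_streak: int = 0

    def respond(self, sentiment: float) -> str:
        """Pick a reply based on a per-turn sentiment score in [-1, 1]."""
        if sentiment < NEGATIVE_THRESHOLD:
            self.negative_streak += 1
        else:
            self.negative_streak = 0
        if self.negative_streak >= ESCALATE_AFTER:
            return "Connecting you to a human agent."
        if sentiment < NEGATIVE_THRESHOLD:
            return "I'm sorry about the trouble. Let me help fix this."
        return "Glad to help! Anything else?"

convo = Conversation()
print(convo.respond(-0.7))  # apologetic reply
print(convo.respond(-0.8))  # second negative turn: human handoff
```

This captures the real-time adaptation the article describes: the bot's behavior changes turn by turn as the user's measured mood changes.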

    Another, more controversial use of emotion analytics is in recruitment. Employed primarily in the US and South Korea, the software analyzes interviewees through their body language and facial movements, often without their knowledge. One company that has received much criticism for its AI-driven recruitment technology is US-based HireVue. The firm uses machine learning algorithms to analyze a candidate's eye movements, clothing, and voice details to build a profile.

    In 2020, the Electronic Privacy Information Center (EPIC), a research organization focusing on privacy issues, filed a complaint with the Federal Trade Commission against HireVue, stating that its practices do not promote equality and transparency. Nonetheless, several companies still rely on the technology for their recruitment needs. According to the Financial Times, AI recruitment software saved Unilever 50,000 hours' worth of hiring work in 2019. 

    News publication Spiked called emotion analytics a "dystopian technology" projected to be worth USD $25 billion by 2023. Critics insist that there is no science behind emotion recognition: the technology disregards the complexities of human consciousness and instead relies on superficial cues. In particular, facial recognition technology does not account for cultural context or the many ways people mask their true feelings, such as by feigning happiness or excitement.

    Implications of emotion analytics

    Wider implications of emotion analytics may include: 

    • Large companies employing emotion analytics software to monitor employees and fast-track hiring decisions, a practice that may invite more lawsuits and complaints.
    • Chatbots that offer different responses and options based on users' perceived emotions. However, misreading a customer's mood can leave clients even more discontented.
    • More tech companies investing in emotion recognition software that can be used in public spaces, including retail stores.
    • Virtual assistants that can recommend movies, music, and restaurants based on their users’ feelings.
    • Civil rights groups filing complaints against facial recognition technology developers for privacy violations.

    Questions to comment on

    • How accurate do you think emotion analytics tools can be?
    • What are the other challenges of teaching machines to understand human emotions?
