Emotion recognition: Cashing in on people’s emotions

Image credit: iStock

Companies race to develop emotion recognition technologies that can accurately identify potential customers' feelings at any given moment.
    • Author: Quantumrun Foresight
    • February 2, 2023

    Much has been said about the limitations of facial recognition technology in accurately identifying emotions and mental states. Still, companies insist that emotions are quantifiable and that artificial intelligence (AI) can one day crack the code on humanity's complex feelings.



    Emotion recognition context



    Emotion recognition technology analyzes signals such as facial expressions, voice, and heart rate to detect emotional states. However, the wide range of use cases for this type of technology presents significant privacy risks. For example, Hyundai Motor filed a 2019 patent for an electronic device that would attach to someone's skin and measure their bio-signals to infer what emotions they were feeling; a display would then show a color associated with the detected sentiment. Even though bio-signals can pinpoint emotions more accurately than facial analysis alone, this method was considered too invasive.



    In the recruitment sector, HireVue and other companies boast that their software can analyze a candidate's face and body posture to determine not only their "employability score" but also whether they would work well on a team. These assessments have the potential to significantly shape a candidate's future. In regions where AI-assisted hiring is rapidly gaining popularity, such as the US and South Korea, coaches teach new graduates and job seekers how to interview with an algorithm. Additionally, this technology is being used in schools for children and has even been studied for its ability to detect dishonesty in courtroom videos.



    However, very few of these claims have any scientific evidence to back them up. No reliable studies (as of 2022) show that analyzing body posture or facial expressions can help identify the best employees or students. Even so, the market for emotion recognition is expected to reach USD 25 billion by 2023, prompting significant backlash from those who believe the technology could lead to discrimination, similar to problems seen with predictive sentencing or housing algorithms.



    Disruptive impact



    Even with its ethical issues, investments in emotion technology are still increasing. One of the more promising developments in the field is multimodal recognition, which scans several bodily signals simultaneously. For example, in 2019, Kia Motors unveiled Real-time Emotion Adaptive Driving (READ), which uses cameras and sensors to analyze facial expressions, heartbeats, skin conduction, and breathing to understand the physical and emotional state of occupants in real time. Based on these readings, the system customizes interior features, such as lighting, music, seat vibrations, and scents, for passenger comfort.
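To make the multimodal idea concrete, the sketch below shows one common approach, late fusion: each modality (face, heart rate, skin conductance, breathing) produces its own score, and a weighted average combines them before the system picks a cabin response. The modality names, weights, and response thresholds are illustrative assumptions for this sketch, not Kia's actual READ design.

```python
# Hypothetical per-modality "stress" scores in [0, 1], e.g. produced by
# separate models for facial expression, heart rate, skin conductance,
# and breathing. Weights are illustrative, not from any real system.
MODALITY_WEIGHTS = {
    "face": 0.4,
    "heart_rate": 0.3,
    "skin_conductance": 0.2,
    "breathing": 0.1,
}

def fuse_scores(scores: dict) -> float:
    """Late fusion: weighted average of per-modality scores.

    Missing modalities (e.g. a sensor dropout) are simply skipped,
    and the remaining weights are renormalized.
    """
    total_w = sum(MODALITY_WEIGHTS[m] for m in scores)
    return sum(MODALITY_WEIGHTS[m] * s for m, s in scores.items()) / total_w

def cabin_response(stress: float) -> dict:
    """Map the fused score to illustrative cabin adjustments."""
    if stress > 0.7:
        return {"lighting": "soft blue", "music": "calm", "seat": "gentle massage"}
    if stress > 0.4:
        return {"lighting": "neutral", "music": "ambient", "seat": "off"}
    return {"lighting": "bright", "music": "driver playlist", "seat": "off"}

# Example: the breathing sensor has dropped out, so only three scores arrive.
stress = fuse_scores({"face": 0.8, "heart_rate": 0.9, "skin_conductance": 0.6})
print(round(stress, 2), cabin_response(stress))
```

Late fusion is attractive here because each sensor can fail independently; the renormalization step lets the system degrade gracefully instead of stalling on a missing signal.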



    In 2021, the vocal biomarker startup Sonde Health created an app to detect a wide range of medical conditions simply by recording and analyzing a 30-second audio clip. By detecting changes or nuances in the user's voice, such as smoothness, control, liveliness, energy range, and clarity, the app can pinpoint depression, stress, anxiety, or fatigue. 
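As a rough illustration of how an app might quantify qualities like energy range and clarity from a short clip, the sketch below frames a 30-second signal and computes a few toy acoustic features. The feature names and the synthetic "recording" are assumptions for this sketch; Sonde Health's actual models are proprietary and far richer.

```python
import numpy as np

def vocal_features(signal: np.ndarray, sr: int = 16000) -> dict:
    """Compute a few simple acoustic features from a mono audio signal.

    These toy features (energy, energy range, zero-crossing rate) loosely
    mirror the kinds of vocal qualities -- liveliness, energy range,
    clarity -- that vocal-biomarker apps describe analyzing.
    """
    frame = sr // 50  # 20 ms frames
    n = len(signal) // frame
    frames = signal[: n * frame].reshape(n, frame)

    rms = np.sqrt((frames ** 2).mean(axis=1))  # loudness per frame
    # Zero-crossing rate: fraction of adjacent samples that change sign.
    zcr = (np.abs(np.diff(np.sign(frames), axis=1)) > 0).mean(axis=1)

    return {
        "mean_energy": float(rms.mean()),              # overall vocal energy
        "energy_range": float(rms.max() - rms.min()),  # dynamic range
        "mean_zcr": float(zcr.mean()),                 # rough clarity/noisiness proxy
    }

# Synthetic 30-second "recording": a 220 Hz tone with slowly varying
# amplitude plus a little noise, standing in for a voice sample.
sr = 16000
t = np.linspace(0, 30, 30 * sr, endpoint=False)
clip = (0.5 + 0.4 * np.sin(2 * np.pi * 0.2 * t)) * np.sin(2 * np.pi * 220 * t)
clip += 0.01 * np.random.default_rng(0).standard_normal(len(t))

print(vocal_features(clip, sr))
```

A real system would feed features like these (and many more) into a trained classifier; the hard part is the clinical validation, not the signal processing.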



    Investments have been increasing in voice biomarkers for mental healthcare. In 2021, mental health AI startup Ellipsis Health secured USD 26 million to expand its depression and anxiety voice test, and Kintsugi, another vocal biomarker startup, received USD 8 million in funding. Combined with the USD 90-million investment in therapy chatbot developer Woebot, it is clear that investors see a future for emotion-detecting AI in healthcare.



    Implications of emotion recognition



    Wider implications of emotion recognition may include: 




    • More recruiting agencies using facial emotion recognition technology to assess and categorize applicants.

    • Screen devices or billboards with cameras gaining the ability to adjust the advertisements they display based on the detected emotional state of the owner or passerby.

    • Increasing backlash from ethicists and civic groups against the use of emotion recognition, which could lead to lawsuits against firms that discriminate through this technology.

    • More startups experimenting with how biomarkers can be used to detect emotions, particularly in healthcare.

    • Senior care and mental health facilities using this tech to more effectively monitor the emotional states of their patients and provide more timely care during periods of detected distress.

    • The service and entertainment industry using emotion recognition technologies to provide hyper-personalized experiences, such as virtual reality (VR) environments.

    • Select governments integrating this tech into existing urban CCTV networks to monitor the real-time emotional states of their domestic populations, potentially applicable to predictive policing and riot control.

    • Pressure on governments to regulate emotion technology, particularly in hiring, policing, and public surveillance.



    Questions to comment on




    • If you have tried using emotion recognition technology, how accurate was it?

    • What are the other potential challenges of relying on emotion recognition tech to make significant decisions?

