As companies experiment with incorporating AI everywhere, one unexpected trend is that they are using AI to help their newfound army of bots better understand human emotions.
It's a field known as “emotional AI,” and the technology is predicted to be a growing trend, according to a new Enterprise SaaS Emerging Tech Research report from PitchBook.
Here's why: When companies deploy AI assistants for their executives and employees, and make AI chatbots their front-line sales and customer service reps, how can the AI perform well if it can't understand the difference between an angry “What does that mean?” and a confused “What does that mean?”
Emotion AI claims to be the more sophisticated sibling of sentiment analysis, a pre-AI technique that tries to extract human emotion from text-based interactions, especially on social media. Emotion AI, by contrast, is multimodal: it uses sensors for vision, voice, and other inputs, combining machine learning and psychology to try to detect human emotions during an interaction.
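To make the contrast concrete, the older text-only technique can be as simple as counting emotionally charged words. Here is a minimal, illustrative sketch of lexicon-based sentiment analysis; the word list and scores are invented for the example, not drawn from any real sentiment lexicon:

```python
# Toy lexicon-based sentiment scorer, illustrating the pre-AI
# "sentiment analysis" approach: sum the polarity of known
# emotionally charged words in a piece of text.
# The lexicon below is invented for illustration.

POLARITY = {
    "love": 1, "great": 1, "happy": 1, "thanks": 1,
    "hate": -1, "awful": -1, "angry": -1, "broken": -1,
}

def sentiment_score(text: str) -> int:
    """Sum the polarity of every known word in the text."""
    # Crude tokenization: lowercase, strip basic punctuation, split on spaces.
    cleaned = text.lower().replace("!", " ").replace(".", " ").replace(",", " ")
    return sum(POLARITY.get(word, 0) for word in cleaned.split())

print(sentiment_score("I love this, thanks!"))       # positive score
print(sentiment_score("This is awful and broken."))  # negative score
```

Emotion AI's pitch is that signals like tone of voice and facial expression carry information this kind of word counting misses entirely, such as the difference between the angry and confused "What does that mean?" above.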
Major AI cloud providers offer services that give developers access to emotion AI capabilities, such as Microsoft Azure Cognitive Services' Emotion API and Amazon Web Services' Rekognition service (the latter of which has been mired in controversy for years).
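These services typically return a list of candidate emotions with confidence scores rather than a single label, leaving it to the developer to decide what counts as "angry." The sketch below shows one way to pick the dominant emotion from a response shaped like AWS Rekognition's documented DetectFaces output (`FaceDetails` → `Emotions` → `Type`/`Confidence`); the sample payload itself is invented, and a real call would require AWS credentials and an image:

```python
# Selecting the dominant emotion from a Rekognition-style DetectFaces
# response. The structure follows AWS's documented response format;
# the sample values are made up for illustration.

sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "CALM", "Confidence": 22.1},
                {"Type": "ANGRY", "Confidence": 71.4},
                {"Type": "CONFUSED", "Confidence": 6.5},
            ]
        }
    ]
}

def dominant_emotion(response: dict) -> str:
    """Return the highest-confidence emotion label for the first detected face."""
    emotions = response["FaceDetails"][0]["Emotions"]
    return max(emotions, key=lambda e: e["Confidence"])["Type"]

print(dominant_emotion(sample_response))  # ANGRY
```

Note that the confidence scores express how sure the model is about its classification, not how intensely the person feels the emotion, a distinction AWS itself draws in its documentation.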
Emotion AI delivered as a cloud service is not new, but the proliferation of bots in the workplace makes it more promising than ever in the business world, according to PitchBook.
“With the widespread adoption of AI assistants and fully automated human-machine interactions, emotion AI is expected to enable more human-like interpretations and responses,” Derek Hernandez, senior emerging technologies analyst at PitchBook, wrote in the report.
“Cameras and microphones are an integral part of the hardware side of emotion AI. They can be built into your laptop or phone, or placed individually in your physical space. Additionally, wearable hardware could provide another avenue for using emotion AI beyond these devices,” Hernandez tells TechCrunch (which is why a customer service chatbot may request access to your camera).
To that end, a growing number of startups have launched to make it happen, including Uniphore (which has raised $610 million in total, including $400 million in 2022 led by NEA), as well as MorphCast, Voicesense, Superceed, Siena AI, audEERING and Opsis, each of which has raised modest amounts from a range of VCs, PitchBook estimates.
Of course, emotion AI is very much a Silicon Valley approach: using technology to solve problems that arise from humans and their use of technology.
But even if most AI bots eventually acquire some form of automated empathy, that doesn't mean the empathy will actually work in practice.
In fact, the last time emotion AI garnered much interest in Silicon Valley—around 2019, when much of the AI/ML world was still focused on computer vision rather than generative language or art—researchers poured cold water on the idea. That year, a team of researchers published a meta-review of studies concluding that human emotions can't reliably be determined from facial movements. In other words, the idea that we can teach an AI to detect human emotions by having it mimic the ways humans try to detect them (reading faces, body language, tone of voice) is somewhat misguided in its premise.
The idea could also run into regulatory roadblocks, such as the European Union's AI Act, which bans computer-vision emotion detection systems in certain uses, such as education. (Some state laws, such as Illinois' BIPA, also ban the unauthorized collection of biometric data.)
All of this offers a glimpse into the AI-everywhere future that Silicon Valley is frantically building right now. These AI bots will try to understand emotions so they can handle customer service, sales, HR, and any other task a human might assign them. Or maybe they won't be very good at any task that really requires that ability, and we're looking at an office life filled with Siri-level AI bots, circa 2023. Which would be worse: that, or a mandatory bot that guesses everyone's emotions in real time during a meeting?