As businesses experiment with embedding AI throughout their operations, a surprising trend has emerged: companies are turning to a new type of AI to help their bots understand human emotions better. This technology, known as "emotion AI," is highlighted in PitchBook's latest Enterprise SaaS Emerging Tech Research report, which predicts its rise in the business world.

"I'm not convinced Emotion AI can accurately detect human emotions. The 2019 study that found human emotions cannot be accurately determined raises serious concerns." — Chris TechJourney (@ChrisTechVC) September 1, 2024

An emotion-aware AI, for example, should be able to distinguish between an angry "What do you mean by that?" and a confused "What do you mean by that?" The technology positions itself as a more sophisticated sibling to sentiment analysis, which has traditionally attempted to extract emotional meaning from text-based interactions, particularly on social media.

"I'm not convinced Emotion AI is the solution to creating more human-like interactions. We need to focus on developing more transparent and predictable AI policies, not just pouring resources into trendy tech." — Alejandro Fernández (@GatoYTech) September 1, 2024

Major cloud providers, such as Microsoft Azure and Amazon Web Services, already offer developers access to emotion AI capabilities. Deploying such technologies has not been without controversy, however, particularly in the case of Amazon's Rekognition service, which has faced privacy concerns over the years.

"Interesting topic! I'm concerned about the privacy implications of Emotion AI, especially with big players like Microsoft and Amazon involved. Let's ensure we regulate this tech before it's too late." — DataGuardJiro (@jiroonguard) September 2, 2024

According to PitchBook, the proliferation of AI assistants and fully automated human-machine interactions means emotion AI could enable more human-like interpretations and responses.
Emotion AI relies heavily on hardware such as cameras and microphones, which can be embedded in laptops, phones, and even wearables.

In response to the perceived potential of emotion AI, several startups have entered the market, including Uniphore, which has raised $610 million to date ($400 million of it in a 2022 round led by NEA). Other companies, such as MorphCast, Voicesense, Superceed, Siena AI, audEERING, and Opsis, have also raised capital from various venture capitalists, according to PitchBook's estimates.

There are notable criticisms and challenges, however. Emotion AI, while touted as a technological advancement, represents a very Silicon Valley approach: using technology to solve a problem created by technology itself. And even if AI bots eventually gain some form of automated empathy, that doesn't necessarily mean the technology will be effective.

The last time emotion AI attracted massive interest was around 2019, when a team of researchers conducted a meta-review of studies and concluded that human emotion cannot be reliably determined from facial movements alone. This suggests that the whole premise of teaching AI to detect human emotions by mimicking human methods, such as interpreting facial expressions, body language, and tone of voice, may be inherently flawed.

The future of emotion AI could be further complicated by regulatory constraints. The European Union's AI Act bans the use of computer-vision emotion-detection systems in specific contexts, such as education. Meanwhile, certain U.S. state laws, such as Illinois' Biometric Information Privacy Act (BIPA), prohibit the collection of biometric data without consent, potentially limiting the deployment of emotion AI technologies.

As Silicon Valley continues to build toward an AI-everywhere future, businesses face a choice.
They can push forward with AI bots that attempt to understand human emotions in roles such as customer service, sales, and HR, or they may end up with bots that are simply not adept at tasks requiring genuine emotional intelligence. The result could be an office environment where AI bots function at the level of a 2023-era Siri, or something worse: a management-mandated bot attempting to interpret employees' emotions in real time during meetings.

While emotion AI offers new possibilities for enhancing AI-human interactions, its effectiveness, ethical implications, and regulatory compliance will remain points of consideration for businesses looking to adopt the technology.

For more news and trends, visit AI News on our website.
The logic behind Emotion AI is that as AI assistants are increasingly deployed to serve executives, employees, and customers, they must be capable of differentiating emotional contexts.
Emotion AI is described as multimodal, meaning it uses various inputs—visual, audio, and others—combined with machine learning algorithms and psychological insights to detect emotions during interactions.
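To make the "multimodal" idea concrete, here is a minimal, hypothetical sketch of the late-fusion step such a system might use: each modality (say, voice and face) produces its own emotion-score distribution, and a weighted average combines them into one verdict. Real emotion AI products use trained models per modality; the scores, labels, and function name below are illustrative stand-ins, not any vendor's actual API.

```python
def fuse_emotion_scores(modality_scores, weights=None):
    """Combine per-modality emotion distributions via a weighted average.

    modality_scores: dict mapping modality name -> {emotion: score}
    weights: optional dict mapping modality name -> weight (defaults to 1.0)
    """
    labels = set()
    for scores in modality_scores.values():
        labels.update(scores)
    if weights is None:
        weights = {m: 1.0 for m in modality_scores}
    total_weight = sum(weights[m] for m in modality_scores)
    # Average each emotion's score across modalities, treating a
    # missing label in one modality as a score of zero.
    return {
        label: sum(
            weights[m] * modality_scores[m].get(label, 0.0)
            for m in modality_scores
        ) / total_weight
        for label in labels
    }

# Example: the same utterance reads as "angry" to the voice model
# but "confused" to the facial model; fusion picks the stronger signal.
scores = {
    "voice": {"angry": 0.7, "confused": 0.2, "neutral": 0.1},
    "face": {"angry": 0.2, "confused": 0.6, "neutral": 0.2},
}
fused = fuse_emotion_scores(scores)
top_emotion = max(fused, key=fused.get)
```

This is the simplest possible fusion strategy; production systems typically learn the combination (e.g., with a classifier over concatenated features) rather than hand-weighting it, but the sketch shows why conflicting signals across modalities are the hard part.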
While the concept of Emotion AI isn’t entirely new, its potential future in business looks more promising now due to the increasing use of AI bots in the workforce.
‘Emotion AI’ Enters Business Software, Sparking Concerns Over Ethics and Privacy!
Key Takeaways:
As Derek Hernandez, senior analyst of emerging technology at PitchBook, notes, “With the proliferation of AI assistants and fully automated human-machine interactions, emotion AI promises to enable more human-like interpretations and responses.”
Hernandez explains, “Cameras and microphones are integral parts of the hardware side of emotion AI. These can be on a laptop, phone, or individually located in a physical space. Additionally, wearable hardware will likely provide another avenue to employ emotion AI beyond these devices.” This is why a customer service chatbot might request access to a camera or microphone during an interaction.