Multimodal sentiment analysis combining language, audio, visual, and physiological signals.
A multimodal neural network predicts user sentiment from features such as text, audio, and visual data. In a new study, researchers from Japan also account for physiological signals in sentiment estimation while the user talks with the system, greatly improving the system’s performance.
Image courtesy: Shogo Okada from JAIST
Researchers integrate biological signals with gold-standard machine learning methods to enable emotionally intelligent speech dialog systems
Artificial intelligence (AI) is at the forefront of modern technology. Making AI “emotionally intelligent” could open doors to more natural human-machine interactions. To do this, the system needs to pick up on the user’s sentiment during a dialog. Physiological signals could provide a direct route to such sentiments. Now, researchers from Japan take things to the next level with an AI whose sentiment-sensing capabilities are comparable to those of humans.
Speech and language recognition technology is a rapidly developing field, which has led to the emergence of novel speech dialog systems, such as Amazon Alexa and Siri. A significant milestone in the development of dialog AI systems is the addition of emotional intelligence. A system able to recognize the emotional states of the user, in addition to understanding language, would generate a more empathetic response, leading to a more immersive experience for the user.
“Multimodal sentiment analysis” is a group of methods that constitute the gold standard for an AI dialog system with sentiment detection. These methods can automatically analyze a person’s psychological state from their speech, tone of voice, facial expression, and posture, and are crucial for human-centered AI systems. The technique could potentially realize an emotionally intelligent AI with beyond-human capabilities, which understands the user’s sentiment and generates a response accordingly.
However, current emotion estimation methods focus only on observable information and do not account for the information contained in unobservable signals, such as physiological signals. Such signals are a potential gold mine of emotions that could improve the sentiment estimation performance tremendously.
In a new study published in the journal IEEE Transactions on Affective Computing, physiological signals were added to multimodal sentiment analysis for the first time by researchers from Japan, a collaborative team comprising Associate Professor Shogo Okada from Japan Advanced Institute of Science and Technology (JAIST) and Prof. Kazunori Komatani from the Institute of Scientific and Industrial Research at Osaka University. “Humans are very good at concealing their feelings. The internal emotional state of a user is not always accurately reflected by the content of the dialog, but since it is difficult for a person to consciously control their biological signals, such as heart rate, it may be useful to use these for estimating their emotional state. This could make for an AI with sentiment estimation capabilities that are beyond human,” explains Dr. Okada.
The team analyzed 2,468 exchanges between a dialog AI and 26 participants to estimate the level of enjoyment the user experienced during the conversation. Each user was then asked to assess how enjoyable or boring they found the conversation to be. The team used the multimodal dialogue dataset “Hazumi1911,” which uniquely combines speech recognition, vocal tone sensing, and facial expression and posture detection with skin potential, a form of physiological response sensing.
“On comparing all the separate sources of information, the biological signal information proved to be more effective than voice and facial expression. When we combined the language information with biological signal information to estimate the self-assessed internal state while talking with the system, the AI’s performance became comparable to that of a human,” comments an excited Dr. Okada.
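The idea of combining language information with physiological signals can be sketched as a simple fusion experiment: features from each modality are concatenated and fed to a single classifier, and accuracy is compared against a language-only baseline. The sketch below is illustrative only, using synthetic data and a plain logistic regression; the feature definitions and classifier are assumptions, not the authors’ actual pipeline or the Hazumi1911 features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-exchange features for two modalities (purely illustrative):
# "lang" stands in for language features, "physio" for skin-potential statistics.
n = 200
lang = rng.normal(size=(n, 4))
physio = rng.normal(size=(n, 3))
# Synthetic "enjoyment" labels that depend on both modalities.
y = (lang[:, 0] + 0.8 * physio[:, 0] > 0).astype(float)

def train_logreg(X, y, lr=0.1, steps=500):
    """Plain gradient-descent logistic regression (no external ML library)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))      # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)      # gradient step on log-loss
    return w

def accuracy(X, y, w):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return float(((Xb @ w > 0) == (y == 1)).mean())

# Fusion: concatenate modality features before classification.
fused = np.hstack([lang, physio])
w_lang = train_logreg(lang, y)
w_fused = train_logreg(fused, y)

print(f"language only:            {accuracy(lang, y, w_lang):.2f}")
print(f"language + physiological: {accuracy(fused, y, w_fused):.2f}")
```

Because the synthetic labels depend on both modalities, the fused classifier can recover the full decision rule while the language-only model cannot, mirroring (in toy form) the study’s finding that adding biological signals improves estimation.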
These findings suggest that the detection of physiological signals in humans, which typically remain hidden from view, could pave the way for highly emotionally intelligent AI-based dialog systems, making for more natural and satisfying human-machine interactions. Moreover, emotionally intelligent AI systems could help identify and monitor mental illness by sensing changes in daily emotional states. They could also come in handy in education, where the AI could gauge whether the learner is interested in and excited about a topic of discussion, or bored, prompting changes in teaching strategy and more efficient educational services.
Original Article: Physiological Signals Could be the Key to “Emotionally Intelligent” AI, Scientists Say