
Multimodal sentiment analysis combining language, audio, visual, and physiological signals.
A multimodal neural network predicts user sentiment from features such as text, audio, and visual data. In a new study, researchers from Japan also account for physiological signals captured while the user talks with the system, greatly improving estimation performance.
Image courtesy: Shogo Okada from JAIST
Researchers integrate biological signals with gold-standard machine learning methods to enable emotionally intelligent speech dialog systems
Artificial intelligence (AI) is at the forefront of modern technology. Making AI “emotionally intelligent” could open doors to more natural human-machine interactions, but to do this, a system needs to pick up on the user’s sentiment during a dialog. Physiological signals could provide a direct route to such sentiments. Now, researchers from Japan have taken things to the next level with an AI whose sentiment-sensing capabilities are comparable to those of humans.
Speech and language recognition technology is a rapidly developing field, which has led to the emergence of novel speech dialog systems such as Amazon Alexa and Siri. A significant milestone in the development of dialog AI systems is the addition of emotional intelligence. A system that recognizes the emotional states of the user, in addition to understanding language, could generate more empathetic responses, leading to a more immersive experience for the user.
“Multimodal sentiment analysis” is a group of methods that constitute the gold standard for an AI dialog system with sentiment detection. These methods automatically infer a person’s psychological state from their speech, vocal tone (“voice color”), facial expression, and posture, and are crucial for human-centered AI systems. The technique could realize an emotionally intelligent AI with beyond-human capabilities: one that understands the user’s sentiment and generates a response accordingly.
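To make the idea concrete, here is a minimal sketch of early-fusion multimodal sentiment estimation in Python. It is an illustration only, not the model used in the study: the feature dimensions, modality names, and classifier choice are all placeholder assumptions.

```python
# Minimal early-fusion sketch of multimodal sentiment estimation.
# Illustration only, not the study's actual model; all dimensions
# and modality names are placeholder assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_exchanges = 200  # hypothetical number of dialog exchanges

# One feature vector per modality, per exchange (placeholder dimensions).
text_feats = rng.normal(size=(n_exchanges, 64))    # e.g., utterance embeddings
audio_feats = rng.normal(size=(n_exchanges, 32))   # e.g., prosody / "voice color"
visual_feats = rng.normal(size=(n_exchanges, 32))  # e.g., facial expression, posture
labels = rng.integers(0, 2, size=n_exchanges)      # e.g., enjoying vs. not enjoying

# Early fusion: concatenate all modalities into one vector per exchange,
# then train a single classifier on the fused representation.
fused = np.concatenate([text_feats, audio_feats, visual_feats], axis=1)
clf = LogisticRegression(max_iter=1000).fit(fused, labels)
print("training accuracy:", clf.score(fused, labels))
```

Early fusion simply concatenates per-modality features into one vector; more sophisticated systems learn a separate encoder per modality and combine them later in the pipeline.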
However, current emotion estimation methods focus only on observable information and do not account for unobservable signals, such as physiological signals. Such signals are a potential gold mine of emotional cues that could improve sentiment estimation performance tremendously.
In a new study published in the journal IEEE Transactions on Affective Computing, a collaborative team from Japan, comprising Associate Professor Shogo Okada from Japan Advanced Institute of Science and Technology (JAIST) and Prof. Kazunori Komatani from the Institute of Scientific and Industrial Research at Osaka University, added physiological signals to multimodal sentiment analysis for the first time. “Humans are very good at concealing their feelings. The internal emotional state of a user is not always accurately reflected by the content of the dialog, but since it is difficult for a person to consciously control their biological signals, such as heart rate, it may be useful to use these for estimating their emotional state. This could make for an AI with sentiment estimation capabilities that are beyond human,” explains Dr. Okada.
The team analyzed 2,468 exchanges between a dialog AI and 26 participants to estimate the level of enjoyment the user experienced during the conversation. After the dialog, each user was asked to assess how enjoyable or boring they found the conversation. The team used the multimodal dialogue dataset “Hazumi1911,” which uniquely combines speech recognition, voice color sensors, and facial expression and posture detection with skin potential sensing, a form of physiological response measurement.
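Conceptually, each exchange in such a corpus pairs per-modality features with the user’s self-assessed label. A hypothetical record might look like the following; the field names and shapes are invented for illustration and are not the actual Hazumi1911 schema.

```python
# Hypothetical per-exchange record for a Hazumi1911-style corpus.
# Field names and shapes are illustrative assumptions only,
# not the dataset's actual schema.
from dataclasses import dataclass
import numpy as np

@dataclass
class Exchange:
    transcript: str             # recognized user utterance (text modality)
    prosody: np.ndarray         # acoustic / "voice color" features
    face_posture: np.ndarray    # facial-expression and posture features
    skin_potential: np.ndarray  # physiological response samples
    enjoyment: int              # user's self-assessed sentiment label

ex = Exchange(
    transcript="That sounds fun!",
    prosody=np.zeros(32),
    face_posture=np.zeros(32),
    skin_potential=np.zeros(100),
    enjoyment=1,
)
```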
“On comparing all the separate sources of information, the biological signal information proved to be more effective than voice and facial expression. When we combined the language information with biological signal information to estimate the self-assessed internal state while talking with the system, the AI’s performance became comparable to that of a human,” comments an excited Dr. Okada.
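The comparison Dr. Okada describes is essentially a modality ablation: estimate the label from each source alone, then from language and physiological features combined. Below is a hedged sketch of that evaluation loop; the inputs are random placeholders, so the printed scores carry no meaning beyond demonstrating the procedure.

```python
# Modality-ablation sketch: score each feature source alone, then
# language + physiological signals together. Illustrative only; the
# data here is random, so the numbers themselves are meaningless.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
modalities = {
    "language": rng.normal(size=(n, 64)),
    "audio": rng.normal(size=(n, 32)),
    "visual": rng.normal(size=(n, 32)),
    "physiological": rng.normal(size=(n, 16)),  # e.g., skin-potential features
}
y = rng.integers(0, 2, size=n)

def score(X):
    # Mean 5-fold cross-validated accuracy for a simple linear classifier.
    return cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()

for name, X in modalities.items():
    print(f"{name} alone: {score(X):.3f}")

combo = np.concatenate([modalities["language"], modalities["physiological"]], axis=1)
print(f"language + physiological: {score(combo):.3f}")
```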
These findings suggest that detecting physiological signals in humans, which typically remain hidden from view, could pave the way for highly emotionally intelligent AI-based dialog systems, making for more natural and satisfying human-machine interactions. Moreover, emotionally intelligent AI systems could help identify and monitor mental illness by sensing changes in daily emotional states. They could also come in handy in education, where the AI could gauge whether the learner is interested and excited about a topic of discussion, or bored, and adjust its teaching strategy accordingly, leading to more efficient educational services.
Original Article: Physiological Signals Could be the Key to “Emotionally Intelligent” AI, Scientists Say