via Penn State
An individual may bring their hands to their face when feeling sad or jump into the air when feeling happy. Human body movements convey emotions, which play a crucial role in everyday communication, according to a team led by Penn State researchers. Combining computing, psychology and performing arts, the researchers developed an annotated human movement dataset that may improve the ability of artificial intelligence to recognize the emotions expressed through body language.
The work — led by James Wang, distinguished professor in the College of Information Sciences and Technology (IST), and carried out primarily by Chenyan Wu, a graduating doctoral student in Wang's group — was published today (Oct. 13) in the print edition of Patterns and featured on the journal's cover.
“People often move using specific motor patterns to convey emotions, and those body movements carry important information about a person’s emotions or mental state,” Wang said. “By describing specific movements common to humans using their foundational patterns, known as motor elements, we can establish the relationship between these motor elements and bodily expressed emotion.”
According to Wang, augmenting machines’ understanding of bodily expressed emotion may help enhance communication between assistive robots and children or elderly users; provide psychiatric professionals with quantitative diagnostic and prognostic assistance; and bolster safety by preventing mishaps in human-machine interactions.
“In this work, we introduced a novel paradigm for bodily expressed emotion understanding that incorporates motor element analysis,” Wang said. “Our approach leverages deep neural networks — a type of artificial intelligence — to recognize motor elements, which are subsequently used as intermediate features for emotion recognition.”
The team created a dataset of the way body movements indicate emotion — body motor elements — using 1,600 human video clips. Each video clip was annotated using Laban Movement Analysis (LMA), a method and language for describing, visualizing, interpreting and documenting human movement.
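To make the annotation scheme concrete, the record below sketches what one labeled clip might look like. The field names and the specific LMA motor elements shown are illustrative assumptions, not the dataset's actual schema; the real inventory of motor elements and emotion categories may differ.

```python
# Hypothetical shape of one annotated video clip. LMA motor elements are
# treated as binary labels (present/absent in the clip), alongside a
# bodily expressed emotion label. All names here are illustrative.
clip_annotation = {
    "clip_id": "clip_0001",
    "lma_elements": {          # Laban Movement Analysis motor elements
        "arm_to_upper_body": 1,
        "sink": 0,
        "spread": 1,
        "advance": 0,
    },
    "emotion": "happiness",    # bodily expressed emotion label
}

# Multi-hot vector of motor elements, as a network would consume it
lma_vector = list(clip_annotation["lma_elements"].values())
```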
Wu then designed a dual-branch, dual-task movement analysis network capable of using the labeled dataset to produce predictions for both bodily expressed emotion and LMA labels for new images or videos.
“Emotion and LMA element labels are related, and the LMA labels are easier for deep neural networks to learn,” Wu said.
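The design described above — a shared backbone with one branch predicting LMA motor elements and a second branch using those predictions as intermediate features for emotion recognition — can be sketched in miniature. This is a toy forward pass with made-up dimensions and randomly initialized weights, not the researchers' actual architecture, which uses deep neural networks trained on the annotated dataset.

```python
import math
import random

random.seed(0)

# Illustrative dimensions: flattened 2-D pose input, hidden layer,
# LMA motor-element labels, and emotion categories. All are assumptions.
N_POSE, N_HIDDEN, N_LMA, N_EMOTION = 34, 16, 28, 7

def rand_matrix(rows, cols):
    return [[random.gauss(0, 0.1) for _ in range(cols)] for _ in range(rows)]

W_backbone = rand_matrix(N_POSE, N_HIDDEN)
W_lma = rand_matrix(N_HIDDEN, N_LMA)
W_emotion = rand_matrix(N_HIDDEN + N_LMA, N_EMOTION)

def matvec(W, x):
    """Compute x @ W for a single input vector x."""
    return [sum(xi * W[i][j] for i, xi in enumerate(x)) for j in range(len(W[0]))]

def sigmoid(v):
    return [1.0 / (1.0 + math.exp(-z)) for z in v]

def softmax(v):
    m = max(v)
    e = [math.exp(z - m) for z in v]
    s = sum(e)
    return [z / s for z in e]

def forward(pose):
    """Dual-branch forward pass: returns (lma_probs, emotion_probs)."""
    h = [math.tanh(z) for z in matvec(W_backbone, pose)]   # shared features
    lma = sigmoid(matvec(W_lma, h))                        # branch 1: LMA elements
    # Branch 2 consumes the LMA predictions as intermediate features,
    # mirroring the idea that LMA labels aid emotion recognition.
    emotion = softmax(matvec(W_emotion, h + lma))
    return lma, emotion

pose = [random.gauss(0, 1) for _ in range(N_POSE)]
lma_probs, emotion_probs = forward(pose)
```

Training such a model would optimize both branches jointly (a multi-label loss on the LMA branch and a classification loss on the emotion branch), which is what makes the easier-to-learn LMA labels useful as a stepping stone.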
According to Wang, LMA makes it possible to analyze motor elements and emotions while simultaneously building a “high-precision” dataset, demonstrating that human movement and emotional expression can be learned effectively.
“Incorporating LMA features has effectively enhanced body-expressed emotion understanding,” Wang said. “Extensive experiments using real-world video data revealed that our approach significantly outperformed baselines that considered only rudimentary body movement, showing promise for further advancements in the future.”