Though algorithms are increasingly being deployed in all facets of life, a new USC study has found that they fail basic tests as truth detectors.
Most algorithms have probably never heard the Eagles’ song, “Lyin’ Eyes.” Otherwise, they’d do a better job of recognizing duplicity.
Computers aren’t very good at discerning misrepresentation, and that’s a problem as the technologies are increasingly deployed in society to render decisions that shape public policy, business and people’s lives.
It turns out that algorithms fail basic tests as truth detectors, according to researchers at the USC Institute for Creative Technologies who study expression and the complexities of reading emotions. The research team completed a pair of studies whose findings undermine both popular psychology and common AI expression-recognition techniques, which assume that facial expressions reveal what people are thinking.
“Both people and so-called ‘emotion reading’ algorithms rely on a folk wisdom that our emotions are written on our face,” said Jonathan Gratch, director for virtual human research at ICT and a professor of computer science at the USC Viterbi School of Engineering. “This is far from the truth. People smile when they are angry or upset, they mask their true feelings, and many expressions have nothing to do with inner feelings, but reflect conversational or cultural conventions.”
Gratch and colleagues presented the findings today at the 8th International Conference on Affective Computing and Intelligent Interaction in Cambridge, England.
New study analyzes facial expressions in social situations
Of course, people know that people can lie with a straight face. Poker players bluff. Job applicants fake interviews. Unfaithful spouses cheat. And politicians can cheerfully utter false statements.
Yet algorithms aren’t so good at catching duplicity, even as machines are increasingly deployed to read human emotions and inform life-changing decisions. For example, the Department of Homeland Security invests in such algorithms to predict potential threats. Some nations use mass surveillance to monitor communications data. Algorithms are also used in focus groups and marketing campaigns, to screen loan applicants, and to hire people for jobs.
“We’re trying to undermine the folk psychology view that people have that if we could recognize people’s facial expressions, we could tell what they’re thinking,” said Gratch, who is also a professor of psychology. “Think about how people used polygraphs back in the day to see if people were lying. There were misuses of the technology then, just like misuses of facial expression technology today. We’re using naïve assumptions about these techniques because there’s no association between expressions and what people are really feeling based on these tests.”
To test this, Gratch and fellow researchers Su Lei and Rens Hoegen at ICT, along with Brian Parkinson and Danielle Shore at the University of Oxford, examined spontaneous facial expressions in social situations. In one study, they developed a game that 700 people played for money and captured how people’s expressions impacted their decisions and how much they earned. They then allowed subjects to review their own behavior and provide insights into how they were using expressions to gain advantage and whether their expressions matched their feelings.
Using several novel approaches, the team examined the relationships between spontaneous facial expressions and key events during the game. They adopted a technique from psychophysiology called “event-related potentials” to address the extreme variability in facial expressions and used computer vision techniques to analyze those expressions. To represent facial movements, they used a recently proposed method called facial factors, which captures many nuances of facial expressions without the difficulties that modern analysis techniques pose.
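The core of the borrowed event-related-potentials idea is averaging a noisy signal in a window time-locked to repeated events, so that consistent responses survive and unrelated variability cancels out. Below is a minimal sketch of that averaging step applied to a facial-expression intensity track; the data, function name, and window size are invented for illustration and are not the authors’ actual pipeline.

```python
# Hedged sketch: event-aligned averaging of an expression-intensity
# signal (e.g. a per-frame smile score from a face tracker), analogous
# to event-related potentials in psychophysiology. All names and
# parameters here are hypothetical, not taken from the study.
import numpy as np

def event_aligned_average(signal, event_frames, window):
    """Average a 1-D signal in a window centered on each event frame."""
    half = window // 2
    segments = [
        signal[f - half : f + half]
        for f in event_frames
        if f - half >= 0 and f + half <= len(signal)
    ]
    # Averaging across events suppresses activity not locked to them.
    return np.mean(segments, axis=0)

# Toy data: noise plus a brief "smile burst" after each reward event.
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 0.1, 1000)
events = [100, 300, 500, 700, 900]
for f in events:
    signal[f : f + 20] += 1.0  # simulated expression response
avg = event_aligned_average(signal, events, window=100)
# avg peaks just after its midpoint (the event), near-zero before it.
```

In the averaged trace, the samples before the event hover near zero while the post-event window shows the consistent response, which is exactly the property that makes the technique useful for very variable facial data.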
The scientists found that smiles were the only expressions consistently provoked, regardless of the reward or fairness of outcomes. Additionally, participants were fairly inaccurate in perceiving facial emotion and particularly poor at recognizing when expressions were regulated. The findings show that people smile for many reasons, not just happiness, and that context is important when evaluating facial expressions.
“These discoveries emphasize the limits of technology use to predict feelings and intentions,” Gratch said. “When companies and governments claim these capabilities, the buyer should beware because often these techniques have simplistic assumptions built into them that have not been tested scientifically.”
When attempting to read emotions, context is king
Prior research shows that people will draw conclusions about others’ intentions and likely actions based solely on their expressions. While past studies have used automatic expression analysis to infer states such as boredom, depression and rapport, less is known about how accurate perceptions of expression actually are. These recent findings highlight the importance of contextual information when reading others’ emotions and support the view that facial expressions communicate more than we might believe.
Learn more: Emotion-reading algorithms cannot predict intentions via facial expressions