Androids, or robots with humanlike features, are often more appealing to people than those that resemble machines — but only up to a certain point. Many people experience an uneasy feeling in response to robots that are nearly lifelike, and yet somehow not quite “right.” The feeling of affinity can plunge into one of repulsion as a robot’s human likeness increases, a zone known as “the uncanny valley.”
The journal Perception published new insights by Emory psychologists into the cognitive mechanisms underlying this phenomenon.
Since the uncanny valley was first described, a common hypothesis has emerged to explain it. Known as mind-perception theory, it proposes that when people see a robot with humanlike features, they automatically attribute a mind to it. According to this theory, it is the growing sense that a machine has a mind that produces the creepy feeling.
“We found that the opposite is true,” says Wang Shensheng, first author of the new study, who did the work as a graduate student at Emory and recently received his PhD in psychology. “It’s not the first step of attributing a mind to an android but the next step of ‘dehumanizing’ it by subtracting the idea of it having a mind that leads to the uncanny valley. Instead of just a one-shot process, it’s a dynamic one.”
The findings have implications for both the design of robots and for understanding how we perceive one another as humans.
“Robots are increasingly entering the social domain for everything from education to healthcare,” Wang says. “How we perceive them and relate to them is important both from the standpoint of engineers and psychologists.”
“At the core of this research is the question of what we perceive when we look at a face,” adds Philippe Rochat, Emory professor of psychology and senior author of the study. “It’s probably one of the most important questions in psychology. The ability to perceive the minds of others is the foundation of human relationships.”
The research may help in unraveling the mechanisms involved in mind-blindness — the inability to distinguish between humans and machines — such as in cases of extreme autism or some psychotic disorders, Rochat says.
Co-authors of the study include Yuk Fai Cheong and Daniel Dilks, both associate professors of psychology at Emory.
Anthropomorphizing, or projecting human qualities onto objects, is common. “We often see faces in a cloud, for instance,” Wang says. “We also sometimes anthropomorphize machines that we’re trying to understand, like our cars or a computer.”
Naming one’s car or imagining that a cloud is an animated being, however, is not normally associated with an uncanny feeling, Wang notes. That led him to hypothesize that something other than just anthropomorphizing may occur when viewing an android.
To tease apart the potential roles of mind perception and dehumanization in the uncanny valley phenomenon, the researchers conducted experiments focused on the temporal dynamics of the process. Participants were shown three types of images — human faces, mechanical-looking robot faces and android faces that closely resembled humans — and asked to rate each for perceived animacy, or “aliveness.” The exposure time of each image was systematically manipulated on a millisecond timescale as participants rated its animacy.
The results showed that perceived animacy decreased significantly as a function of exposure time for android faces, but not for mechanical-looking robot faces or human faces. For android faces, perceived animacy dropped between 100 and 500 milliseconds of viewing time. That timing is consistent with previous research showing that people begin to distinguish between human and artificial faces around 400 milliseconds after stimulus onset.
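The pattern the researchers describe can be sketched with a small simulation. All numbers below are hypothetical, invented only to mimic the qualitative finding (animacy ratings fall with longer exposure for android faces but stay flat for human and mechanical faces); they are not the study’s data.

```python
# Illustrative sketch of the study's qualitative result: perceived animacy
# (hypothetical 0-10 ratings) as a function of image exposure time, for
# three face types. Every value here is invented for illustration only.

exposure_ms = [50, 100, 250, 500, 1000]

# Hypothetical mean animacy ratings per condition.
ratings = {
    "human":      [9.0, 9.1, 9.0, 9.1, 9.0],   # high and flat
    "mechanical": [2.1, 2.0, 2.1, 2.0, 2.1],   # low and flat
    "android":    [8.5, 8.2, 6.0, 4.1, 4.0],   # drops between 100-500 ms
}

def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

for face, ys in ratings.items():
    print(f"{face:10s} animacy slope: {slope(exposure_ms, ys):+.5f} per ms")
```

A clearly negative slope for the android condition alone, with near-zero slopes for the other two, is the kind of interaction between face type and exposure time that the study reports.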
A second set of experiments manipulated both the exposure time and the amount of detail in the images, ranging from a minimal sketch of the features to a fully blurred image. The results showed that removing details from the images of the android faces decreased the perceived animacy along with the perceived uncanniness.
“The whole process is complicated but it happens within the blink of an eye,” Wang says. “Our results suggest that at first sight we anthropomorphize an android, but within milliseconds we detect deviations and dehumanize it. And that drop in perceived animacy likely contributes to the uncanny feeling.”