Can machines think? That’s what renowned mathematician Alan Turing sought to find out in the 1950s when he devised an imitation game to test whether a human interrogator could tell a human from a machine based solely on conversation, stripped of physical cues.
The Turing test was introduced to determine whether a machine can exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. Turing was chiefly concerned with whether machines could match humans’ intellectual capacities.
But there is more to being human than intellectual prowess, so researchers from the Center for Complex Systems and Brain Sciences (CCSBS) in the Charles E. Schmidt College of Science at Florida Atlantic University set out to answer the question: “How does it ‘feel’ to interact behaviorally with a machine?”
They created the equivalent of an “emotional” Turing test, developing a virtual partner that elicits emotional responses from its human counterpart while the pair engages in behavioral coordination in real time.
Results of the study, titled “Enhanced Emotional Responses during Social Coordination with a Virtual Partner,” were recently published in the International Journal of Psychophysiology. The researchers designed the virtual partner so that its behavior is governed by mathematical models of human-to-human interaction, enabling humans to interact with a mathematical description of their social selves.
“Our study shows that humans exhibited greater emotional arousal when they thought the virtual partner was a human and not a machine, even though in all cases, it was a machine that they were interacting with,” said Mengsen Zhang, lead author and a Ph.D. student in FAU’s CCSBS. “Maybe we can think of intelligence in terms of coordinated motion within and between brains.”
The virtual partner is a key part of a paradigm developed at FAU called the Human Dynamic Clamp – a state-of-the-art human machine interface technology that allows humans to interact with a computational model that behaves very much like humans themselves. In simple experiments, the model – on receiving input from human movement – drives an image of a moving hand which is displayed on a video screen. To complete the reciprocal coupling, the subject sees and coordinates with the moving image as if it were a real person observed through a video circuit. This social “surrogate” can be precisely tuned and controlled – both by the experimenter and by the input from the human subject.
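The study's actual equations are not reproduced in this article, but coordination dynamics of the kind Kelso's group pioneered is classically summarized by the Haken-Kelso-Bunz (HKB) equation for the relative phase between two rhythmically moving partners. The sketch below is a hypothetical illustration of that standard model, not the Human Dynamic Clamp itself; the parameter names (`a`, `b`, `delta_omega`) follow conventional HKB notation.

```python
import math

def simulate_hkb(phi0, delta_omega=0.0, a=1.0, b=0.5, dt=0.01, steps=5000):
    """Euler-integrate the HKB relative-phase equation:
        dphi/dt = delta_omega - a*sin(phi) - 2*b*sin(2*phi)
    phi is the relative phase between the two movements (0 = in-phase,
    pi = anti-phase). This is an illustrative sketch, not the study's model.
    """
    phi = phi0
    for _ in range(steps):
        dphi = delta_omega - a * math.sin(phi) - 2 * b * math.sin(2 * phi)
        phi += dphi * dt
    return phi

# With b/a = 0.5 the system is bistable, mirroring the classic finger-
# coordination experiments: both in-phase and anti-phase are attractors.
print(round(simulate_hkb(1.0), 3))  # starts near in-phase, settles at 0.0
print(round(simulate_hkb(2.5), 3))  # starts near anti-phase, settles at pi
```

In the Human Dynamic Clamp the machine side of such a coupled system is computed in real time from the human's movement, which is what makes the interaction reciprocal rather than scripted.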
“The behaviors that gave rise to that distinctive emotional arousal were simple finger movements, not cues such as facial expressions that are known to convey emotion,” said Emmanuelle Tognoli, Ph.D., co-author and associate research professor in FAU’s CCSBS. “So the findings are rather startling at first.”
Tognoli is quick to point out that it is not so much about how fanciful the partner appears or how emotionally expressive it is, since fingers usually have little in the way of tears or laughter. Instead, it is a matter of how well the virtual partner relates its behavior to the human’s: a competence for social coordination written into its mathematical equations.
The mathematical models that govern the virtual partner’s behavior are grounded in four decades of empirical and theoretical research at FAU led by co-author J.A. Scott Kelso, Ph.D., the Glenwood and Martha Creech Eminent Chair in Science, and founder of FAU’s CCSBS.
Kelso stresses that the key idea behind the Human Dynamic Clamp is the symmetry between the human and the machine, the fact that they are governed by the same laws of coordination dynamics.
“In reality, humans’ interactions with their milieu, including other human beings, are continuous and reciprocal,” said Kelso. “By putting time and reciprocity back in the investigation of emotion and social interaction, the Human Dynamic Clamp affords the opportunity to explore parameter ranges and perturbations that are out of reach of traditional experimental designs. It is a step forward for investigations aimed at understanding complex social behavior.”
The study shows that behavioral interaction and emotion continuously feed into each other, suggesting that coordination of movement could make a useful contribution to rehabilitation. Movement coordination disorders are often found in patients with schizophrenia and autism spectrum disorders, who also suffer from social and emotional dysfunction.
“Artificial Intelligence has been grounded in an algorithmic approach to human cognition. We are now bringing the social and emotional dimensions to the table as well,” said Guillaume Dumas, Ph.D., co-author, and a former post-doctoral member of FAU’s CCSBS who is currently with the Institut Pasteur in Paris.
The researchers anticipate that the virtual partner will soon be developed into the prototype of a cooperative machine that can be used for therapeutic purposes. This type of application might benefit many patients afflicted with social and emotional disorders.
“FAU has nurtured the Center for Complex Systems and Brain Sciences for 30 years, and this work is led by one of our outstanding doctoral students who is advancing our understanding of the orders and disorders that take place in our society and in our brains,” said Daniel C. Flynn, Ph.D., FAU’s vice president for research.