Can machines think? That’s what renowned mathematician Alan Turing sought to understand back in the 1950s when he created an imitation game to find out if a human interrogator could tell a human from a machine based solely on conversation deprived of physical cues.
The Turing test was introduced to determine a machine’s ability to exhibit intelligent behavior equivalent to, or even indistinguishable from, that of a human. Turing was chiefly concerned with whether machines could match humans’ intellectual capacities.
But there is more to being human than intellectual prowess, so researchers from the Center for Complex Systems and Brain Sciences (CCSBS) in the Charles E. Schmidt College of Science at Florida Atlantic University set out to answer the question: “How does it ‘feel’ to interact behaviorally with a machine?”
They created the equivalent of an “emotional” Turing test, developing a virtual partner able to elicit emotional responses from its human partner while the pair engages in behavioral coordination in real time.
Results of the study, titled “Enhanced Emotional Responses during Social Coordination with a Virtual Partner,” were recently published in the International Journal of Psychophysiology. The researchers designed the virtual partner so that its behavior is governed by mathematical models of human-to-human interaction, enabling humans to interact with a mathematical description of their social selves.
“Our study shows that humans exhibited greater emotional arousal when they thought the virtual partner was a human and not a machine, even though in all cases, it was a machine that they were interacting with,” said Mengsen Zhang, lead author and a Ph.D. student in FAU’s CCSBS. “Maybe we can think of intelligence in terms of coordinated motion within and between brains.”
The virtual partner is a key part of a paradigm developed at FAU called the Human Dynamic Clamp – a state-of-the-art human-machine interface technology that allows humans to interact with a computational model that behaves very much like humans themselves. In simple experiments, the model – on receiving input from human movement – drives an image of a moving hand displayed on a video screen. To complete the reciprocal coupling, the subject sees and coordinates with the moving image as if it were a real person observed through a video circuit. This social “surrogate” can be precisely tuned and controlled – both by the experimenter and by the input from the human subject.
“The behaviors that gave rise to that distinctive emotional arousal were simple finger movements, not events like facial expressions for example, known to convey emotion,” said Emmanuelle Tognoli, Ph.D., co-author and associate research professor in FAU’s CCSBS. “So the findings are rather startling at first.”
Tognoli is quick to point out that it is not so much about how lifelike the partner appears or how emotionally expressive it is; fingers, after all, have little in the way of tears or laughter. Instead, it is a matter of how well the virtual partner relates its behavior to the human’s – a competence for social coordination written into its mathematical equations.
The mathematical models that govern the virtual partner’s behavior are grounded in four decades of empirical and theoretical research at FAU led by co-author J.A. Scott Kelso, Ph.D., the Glenwood and Martha Creech Eminent Chair in Science, and founder of FAU’s CCSBS.
Kelso stresses that the key idea behind the Human Dynamic Clamp is the symmetry between the human and the machine, the fact that they are governed by the same laws of coordination dynamics.
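The coordination dynamics Kelso refers to are often summarized by the extended Haken-Kelso-Bunz (HKB) equation, which describes how the relative phase between two rhythmic movements evolves over time. As a minimal, illustrative sketch (the parameter values below are assumptions for demonstration, not those used in the study), the phase dynamics can be simulated in a few lines:

```python
import math

def hkb_phase_dynamics(phi, delta_omega=0.0, a=1.0, b=0.5):
    """Rate of change of the relative phase phi under the extended HKB model.

    delta_omega: difference between the two movers' natural frequencies.
    a, b: coupling strengths (illustrative values, not from the study).
    """
    return delta_omega - a * math.sin(phi) - 2 * b * math.sin(2 * phi)

def simulate(phi0, steps=5000, dt=0.01):
    """Euler-integrate the phase equation from an initial relative phase."""
    phi = phi0
    for _ in range(steps):
        phi += dt * hkb_phase_dynamics(phi)
    return phi

# Starting within the basin of in-phase coordination, the relative phase
# relaxes toward phi = 0 (moving in synchrony); with these parameters
# anti-phase (phi = pi) is also a stable pattern, a hallmark of the model.
final_phi = simulate(phi0=1.0)
```

Because the same equation governs both the human’s and the model’s behavior, coupling a person to the virtual partner amounts to closing this loop in real time, which is the symmetry Kelso describes.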
“In reality, humans’ interactions with their milieu, including other human beings, are continuous and reciprocal,” said Kelso. “By putting time and reciprocity back in the investigation of emotion and social interaction, the Human Dynamic Clamp affords the opportunity to explore parameter ranges and perturbations that are out of reach of traditional experimental designs. It is a step forward for investigations aimed at understanding complex social behavior.”
The study shows that behavioral interaction and emotion continuously feed into each other, suggesting that movement coordination could make a useful contribution to rehabilitation. Movement coordination disorders are often found in patients with schizophrenia and autism spectrum disorders, who also suffer from social and emotional dysfunctions.
“Artificial Intelligence has been grounded in an algorithmic approach to human cognition. We are now bringing the social and emotional dimensions to the table as well,” said Guillaume Dumas, Ph.D., co-author, and a former post-doctoral member of FAU’s CCSBS who is currently with the Institut Pasteur in Paris.
The researchers anticipate that the virtual partner will soon be developed into the prototype of a cooperative machine that can be used for therapeutic purposes. This type of application might benefit many patients afflicted with social and emotional disorders.
“FAU has nurtured the Center for Complex Systems and Brain Sciences for 30 years, and this work is led by one of our outstanding doctoral students who is advancing our understanding of the orders and disorders that take place in our society and in our brains,” said Daniel C. Flynn, Ph.D., FAU’s vice president for research.