From Apple’s Siri to Honda’s robot Asimo, machines seem to be getting better and better at communicating with humans.
But some neuroscientists caution that today’s computers will never truly understand what we’re saying because they do not take into account the context of a conversation the way people do.
Specifically, say University of California, Berkeley, postdoctoral fellow Arjen Stolk and his Dutch colleagues, machines don’t develop a shared understanding of the people, place and situation – often including a long social history – that is key to human communication. Without such common ground, a computer cannot help but be confused.
“People tend to think of communication as an exchange of linguistic signs or gestures, forgetting that much of communication is about the social context, about who you are communicating with,” Stolk said.
The word “bank,” for example, would be interpreted one way if you’re holding a credit card but a different way if you’re holding a fishing pole. Without context, making a “V” with two fingers could mean victory, the number two, or “these are the two fingers I broke.”
“All these subtleties are quite crucial to understanding one another,” Stolk said, perhaps more so than the words and signals that computers and many neuroscientists focus on as the key to communication. “In fact, we can understand one another without language, without words and signs that already have a shared meaning.”
Babies and parents, not to mention strangers lacking a common language, communicate effectively all the time, based solely on gestures and a shared context they build up over even a short time.
Stolk argues that scientists and engineers should focus more on the contextual aspects of mutual understanding, basing his argument on experimental evidence from brain scans that humans achieve nonverbal mutual understanding using unique computational and neural mechanisms. Some of the studies Stolk has conducted suggest that a breakdown in mutual understanding is behind social disorders such as autism.
“This shift in understanding how people communicate without any need for language provides a new theoretical and empirical foundation for understanding normal social communication, and provides a new window into understanding and treating disorders of social communication in neurological and neurodevelopmental disorders,” said Dr. Robert Knight, a UC Berkeley professor of psychology in the campus’s Helen Wills Neuroscience Institute and a professor of neurology and neurosurgery at UCSF.
Stolk and his colleagues discuss the importance of conceptual alignment for mutual understanding in an opinion piece appearing Jan. 11 in the journal Trends in Cognitive Sciences.
Brain scans pinpoint site for ‘meeting of minds’
To explore how brains achieve mutual understanding, Stolk created a game that requires two players to communicate its rules to each other solely by game movements, without talking or even seeing one another, eliminating the influence of language and gesture. He then placed each player in an fMRI (functional magnetic resonance imaging) scanner and recorded their brain activity as they communicated nonverbally with one another via computer.
He found that the same regions of the brain – located in the poorly understood right temporal lobe, just above the ear – became active in both players during attempts to communicate the rules of the game. Critically, the superior temporal gyrus of the right temporal lobe maintained a steady baseline activity throughout the game but became more active when one player suddenly understood what the other player was trying to communicate. The brain’s right hemisphere is more involved in abstract thought and social interactions than the left hemisphere.
“These regions in the right temporal lobe increase in activity the moment you establish a shared meaning for something, but not when you communicate a signal,” Stolk said. “The better the players got at understanding each other, the more active this region became.”
This means that both players are building a similar conceptual framework in the same area of the brain, constantly testing one another to make sure their concepts align, and updating only when new information changes that mutual understanding. The results were reported in 2014 in the Proceedings of the National Academy of Sciences.
“It is surprising,” said Stolk, “that for both the communicator, who has static input while she is planning her move, and the addressee, who is observing dynamic visual input during the game, the same region of the brain becomes more active over the course of the experiment as they improve their mutual understanding.”
Robots’ statistical reasoning
Robots and computers, on the other hand, converse based on a statistical analysis of a word’s meaning, Stolk said. If you usually use the word “bank” to mean a place to cash a check, then that will be the assumed meaning in a conversation, even when the conversation is about fishing.
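The contrast Stolk draws can be made concrete with a minimal sketch (hypothetical data and sense labels, not any particular system or library): a purely frequency-based reasoner always returns the speaker’s most common sense of “bank,” while a context-aware variant lets situational cues such as a fishing pole override that statistic.

```python
from collections import Counter

# Hypothetical usage history: how often this speaker has meant each sense of "bank".
usage_history = Counter({"financial institution": 42, "river edge": 3})

def statistical_sense(word_history: Counter) -> str:
    """Pick the sense the speaker uses most often, regardless of context."""
    sense, _count = word_history.most_common(1)[0]
    return sense

def contextual_sense(word_history: Counter, context: set) -> str:
    """Toy context check: let cues like 'fishing pole' override raw frequency."""
    if {"fishing", "fishing pole", "river"} & context:
        return "river edge"
    return statistical_sense(word_history)

# The statistical reasoner says "financial institution" even mid-fishing-trip;
# the context-aware version does not.
print(statistical_sense(usage_history))                   # financial institution
print(contextual_sense(usage_history, {"fishing pole"}))  # river edge
```

The point of the sketch is only that the first function, like the systems Stolk describes, never consults the shared situation at all.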