Computers learn to understand humans better by modelling them
Researchers from Aalto University, the University of Birmingham and the University of Oslo present results paving the way for computers to learn psychologically plausible models of individuals simply by observing them. In a newly published conference article, the researchers show that just by observing how long a user takes to click menu items, one can infer a model that reproduces similar behavior and accurately estimates characteristics of that user’s visual system, such as fixation durations.
Despite significant breakthroughs in artificial intelligence, it has been notoriously hard for computers to understand why a user behaves the way she does. Cognitive models that describe an individual’s capabilities and goals can explain, and hence predict, individual behavior much better, even in new circumstances. However, learning these models from the indirect data available in practice has been out of reach.
“The benefit of our approach is that a much smaller amount of data is needed than for ‘black box’ methods. Previous methods for performing this type of tuning have either required extensive manual labor or a large amount of very accurate observation data, which has limited the applicability of these models until now,” explains doctoral student Antti Kangasrääsiö from Aalto University.
The method is based on Approximate Bayesian Computation (ABC), a machine learning method developed to infer very complex models from observations, with applications in climate science and epidemiology, among other fields. It paves the way for the automatic inference of complex models of human behavior from naturalistic observations. This could be useful in human-robot interaction, or in assessing individual capabilities automatically, for example to detect symptoms of cognitive decline.
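To make the idea concrete, here is a minimal sketch of ABC rejection sampling applied to a toy menu-selection model. Everything in it is a hypothetical simplification for illustration: the simulator, the uniform prior, the summary statistic and the tolerance are all assumptions, not the researchers’ actual cognitive model or inference setup.

```python
import random

MENU_LEN = 8    # items per menu (assumed)
MOTOR_MS = 300  # fixed motor time per selection, in ms (assumed)

def simulate_selection_times(fixation_ms, n_trials, rng):
    """Toy user simulator: the user scans a random number of menu items
    before finding the target; each scanned item costs one fixation."""
    return [rng.randint(1, MENU_LEN) * fixation_ms + MOTOR_MS
            for _ in range(n_trials)]

def mean(xs):
    return sum(xs) / len(xs)

def abc_rejection(observed, n_samples=5000, tolerance_ms=30, seed=1):
    """ABC rejection sampling: draw candidate fixation durations from a
    prior, simulate selection times with each candidate, and keep the
    candidates whose summary statistic (mean selection time) falls
    within the tolerance of the observed summary."""
    rng = random.Random(seed)
    obs_summary = mean(observed)
    accepted = []
    for _ in range(n_samples):
        candidate = rng.uniform(100, 600)  # prior over fixation duration (ms)
        simulated = simulate_selection_times(candidate, len(observed), rng)
        if abs(mean(simulated) - obs_summary) < tolerance_ms:
            accepted.append(candidate)
    return accepted

# Pretend-observed data from a user with a 250 ms fixation duration.
observed = simulate_selection_times(250, n_trials=200, rng=random.Random(42))
posterior = abc_rejection(observed)
print(f"accepted {len(posterior)} samples, "
      f"posterior mean fixation = {mean(posterior):.0f} ms")
```

The accepted candidates approximate a posterior over the latent fixation duration: no likelihood is ever evaluated, only forward simulations are compared to the data, which is what makes ABC applicable to complex cognitive simulators.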
“We will be able to infer a model of a person that also simulates how that person learns to act in totally new circumstances,” Professor of Machine Learning at Aalto University Samuel Kaski says.
“We’re excited about the prospects of this work in the field of intelligent user interfaces,” says Antti Oulasvirta, Professor of User Interfaces at Aalto University.
“In the future, the computer will be able to understand humans in somewhat the same way humans understand each other. It can then much better predict not only the benefits of a potential change but also its costs to an individual user, a capability that adaptive interfaces have lacked,” he continues.