
THE R2-D2 ROBOT FROM STAR WARS DOESN’T COMMUNICATE IN HUMAN LANGUAGE, YET IT IS CAPABLE OF SHOWING ITS INTENTIONS. FOR HUMAN-ROBOT INTERACTION, A ROBOT DOES NOT HAVE TO BE A TRUE ‘HUMANOID’, PROVIDED THAT ITS SIGNALS ARE DESIGNED IN THE RIGHT WAY, SAYS UT RESEARCHER DAPHNE KARREMAN.
A human being can only communicate with a robot if that robot has many human characteristics: that is the common assumption. But mimicking natural movements and expressions is complicated, and some of our nonverbal communication is not really suitable for robots; wide arm gestures, for example. Humans prove capable of responding in a social way even to machines that look like machines. We have a natural tendency to translate machine movements and signals into human terms: two simple lenses on a machine can be enough to make people wave at it.
BEYOND R2-D2
Knowing that, the challenge is to design intuitive signals. In her research, Daphne Karreman focused on a robot functioning as a guide in a museum or a zoo. If the robot doesn’t have arms, can it still point at something the visitors should look at? Using speech, written language, a screen, images projected on a wall and specific movements, the robot has quite a number of ‘modalities’ that humans don’t have. Add playing with light and colour, and even a ‘low-anthropomorphic’ robot can be equipped with strong communication skills. That goes well beyond R2-D2, which communicates in beeps that first need to be translated. Karreman’s PhD thesis is therefore entitled ‘Beyond R2-D2’.
IN THE WILD
Karreman analysed a large amount of video data to see how humans respond to a robot. Until now, this type of research was mainly done in controlled lab settings, without other people present, or after the test subjects had been told what was going to happen. In this case, the robot was introduced ‘in the wild’ and in an unstructured way: people could simply come across it in the Real Alcázar palace in Seville, for example, and decide for themselves whether they wanted to be guided by a robot. What makes them keep their distance? Do people recognize what the robot is capable of?
VIDEO TOOL
To analyse these video data, Karreman developed a tool called the Data Reduction Event Analysis Method (DREAM). The robot, called Fun Robotic Outdoor Guide (FROG), has a screen, communicates using spoken language and light signals, and has a small pointer on its ‘head’. All by itself, FROG recognizes whether people are interested in interaction and guidance. Thanks to the DREAM tool, it is possible for the first time to analyse and classify human-robot interaction in a fast and reliable way. Unlike other methods, DREAM does not interpret all signals immediately; instead, it compares the annotations of several independent ‘coders’ to arrive at a reliable and reproducible result.
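The thesis does not publish DREAM’s internals, but comparing several coders for reliability is conventionally quantified with an inter-coder agreement statistic such as Cohen’s kappa. As an illustration only (the event labels and both coders’ annotations below are invented, not taken from Karreman’s data), a minimal sketch:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Agreement between two coders' labels, corrected for chance agreement."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of events both coders labelled identically.
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(freq_a) | set(freq_b)
    p_expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical event codes for ten video segments of visitor behaviour.
a = ["approach", "ignore", "wave", "approach", "ignore",
     "approach", "wave", "ignore", "approach", "ignore"]
b = ["approach", "ignore", "wave", "approach", "approach",
     "approach", "wave", "ignore", "ignore", "ignore"]
print(round(cohens_kappa(a, b), 2))  # → 0.69
```

Values near 1 mean the coding scheme is applied consistently; values near 0 mean the coders agree no more often than chance, so the annotations would not be reproducible.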
How many people show interest? Do they join the robot for the entire tour, and do they respond as expected? This can be evaluated with questionnaires, but that puts the robot in a special position: people primarily come to visit the exhibition or zoo, not to meet a robot. With the DREAM tool, spontaneous interaction becomes more visible, and robot behaviour can then be optimized.
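The questions above reduce to simple aggregations over coded observations: of the visitors seen on video, what fraction showed interest, joined the tour, and stayed until the end. A minimal sketch, with invented visitor records standing in for the coded video data:

```python
from dataclasses import dataclass

@dataclass
class Visit:
    visitor_id: int
    showed_interest: bool
    joined_tour: bool
    completed_tour: bool

# Hypothetical coded observations (not Karreman's actual data).
visits = [
    Visit(1, True, True, True),
    Visit(2, True, False, False),
    Visit(3, False, False, False),
    Visit(4, True, True, False),
    Visit(5, True, True, True),
]

# Each stage is a filter over the previous one, giving a simple funnel.
interested = [v for v in visits if v.showed_interest]
joined = [v for v in interested if v.joined_tour]
completed = [v for v in joined if v.completed_tour]

print(f"interest rate:   {len(interested) / len(visits):.0%}")
print(f"join rate:       {len(joined) / len(interested):.0%}")
print(f"completion rate: {len(completed) / len(joined):.0%}")
```

Because the events are coded from unobtrusive video rather than questionnaires, such rates describe spontaneous behaviour, which is exactly what the article says DREAM makes visible.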
Learn more: ROBOT DOESN’T HAVE TO BE HUMAN LOOK-ALIKE