THE R2-D2 ROBOT FROM STAR WARS DOESN’T COMMUNICATE IN HUMAN LANGUAGE BUT IS, NEVERTHELESS, CAPABLE OF SHOWING ITS INTENTIONS. FOR HUMAN-ROBOT INTERACTION, THE ROBOT DOES NOT HAVE TO BE A TRUE ‘HUMANOID’, PROVIDED THAT ITS SIGNALS ARE DESIGNED IN THE RIGHT WAY, SAYS UT RESEARCHER DAPHNE KARREMAN.
Humans will only be able to communicate with a robot if it has many human characteristics. That, at least, is the common idea. But mimicking natural movements and expressions is complicated, and some of our nonverbal communication is not really suitable for robots: wide arm gestures, for example. Humans prove capable of responding in a social way even to machines that look like machines. We have a natural tendency to translate machine movements and signals into the human world. Two simple lenses on a machine can be enough to make people wave at it.
Knowing that, the challenge is to design intuitive signals. In her research, Daphne Karreman focused on a robot functioning as a guide in a museum or a zoo. If the robot doesn’t have arms, can it still point to something the visitors should look at? Using speech, written language, a screen, images projected on a wall and specific movements, the robot has quite a number of ‘modalities’ that humans don’t have. Add to this the use of light and colour, and even a ‘low-anthropomorphic’ robot can be equipped with strong communication skills. It goes way beyond R2-D2, which communicates using beeps that first need to be translated. Karreman’s PhD thesis is therefore entitled ‘Beyond R2-D2’.
IN THE WILD
Karreman analysed a huge amount of video data to see how humans respond to a robot. Until now, this type of research had mainly been done in controlled lab settings, without other people present, or after the test subjects were told what was going to happen. In this case, the robot was introduced ‘in the wild’ and in an unstructured way: people could simply come across the robot in the Real Alcázar Palace in Seville, for example, and decide for themselves whether they wanted to be guided by it. What makes them keep their distance? Do people recognize what the robot is capable of?
To analyse these video data, Karreman developed a tool called the Data Reduction Event Analysis Method (DREAM). The robot, called Fun Robotic Outdoor Guide (FROG), has a screen, communicates using spoken language and light signals, and has a small pointer on its ‘head’. Entirely on its own, FROG recognizes whether people are interested in interaction and guidance. Thanks to the powerful DREAM tool, it is possible for the first time to analyse and classify human-robot interaction in a fast and reliable way. Unlike other methods, DREAM does not interpret all signals immediately; instead, it compares the work of several ‘coders’ to reach a reliable and reproducible result.
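The thesis does not publish DREAM’s code, but its idea of comparing several coders is closely related to standard inter-rater reliability analysis. As a minimal sketch (with hypothetical event labels, not Karreman’s actual coding scheme), agreement between two annotators of the same video can be quantified with Cohen’s kappa, which corrects raw agreement for chance:

```python
from collections import Counter


def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two annotators' label sequences."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of events both coders labelled identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement by chance, from each coder's label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)


# Hypothetical labels from two coders watching the same visitor-robot video
a = ["approach", "ignore", "approach", "wave", "ignore", "approach"]
b = ["approach", "ignore", "wave", "wave", "ignore", "approach"]
print(round(cohens_kappa(a, b), 2))  # prints 0.75
```

A kappa near 1 means the coders agree far beyond chance, which is the kind of reproducibility check a multi-coder method like DREAM relies on.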
How many people show interest? Do they join the robot for the entire tour, and do they respond as expected? This could be evaluated using questionnaires, but that places the robot in a special position: people primarily come to visit the exhibition or zoo, not to meet a robot. With the DREAM tool, spontaneous interaction becomes more visible, and robot behaviour can be optimized accordingly.
Learn more: ROBOT DOESN’T HAVE TO BE HUMAN LOOK-ALIKE