From a robot’s perspective, humans are normally a nuisance: when robots and humans have to work together, problems often follow.
Researchers in CogIMon, a new project starting at Bielefeld University, want to teach robots how to interact with humans and work together to accomplish tasks. CogIMon stands for “cognitive compliant interaction in motion.” The research group is working on humanoid as well as industrial robots. The project is coordinated by Professor Dr. Jochen Steil of CoR-Lab, the research institute for cognition and robotics at Bielefeld University. The joint project, carried out with six other international partners, will run from 2015 to 2018 and is funded with 7 million euros from Horizon 2020, the European Union’s framework programme for research and innovation.
“The goal of CogIMon is to teach robots to understand the forces at play when moving objects and to react appropriately to changes in weight or in contact with the object while carrying it,” explains Jochen Steil. Humans have no problem estimating the weight of an object: they can read how heavy something is from another person’s body language, which makes it easy to adjust the force they exert when lifting. Robots currently lack this ability. “Robots can measure their own force and regulate it to a certain degree. They can stop their movements or ease off, but they have not yet been able to understand forces or actively control them to take part in a joint effort. We want to change that.”
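The “easing off” Steil describes is usually achieved through compliance control, where the robot behaves like a virtual mass–spring–damper rather than rigidly holding a position. The sketch below illustrates the basic idea in one dimension; the function name, gains, and scenario are illustrative assumptions, not taken from the CogIMon project.

```python
# Minimal 1-D admittance-control sketch: instead of holding its position
# rigidly, the robot yields to an external force according to a virtual
# mass-spring-damper law. All names and parameters are illustrative.

def admittance_step(x, v, f_ext, dt,
                    mass=2.0, damping=8.0, stiffness=20.0, x_rest=0.0):
    """One Euler step of M*a + D*v + K*(x - x_rest) = f_ext."""
    a = (f_ext - damping * v - stiffness * (x - x_rest)) / mass
    v = v + a * dt
    x = x + v * dt
    return x, v

# A steady 10 N push displaces the virtual end effector until the
# virtual spring balances it (10 N / 20 N/m = 0.5 m).
x, v = 0.0, 0.0
for _ in range(10000):
    x, v = admittance_step(x, v, f_ext=10.0, dt=0.01)
print(round(x, 2))
```

Raising the virtual stiffness makes the robot hold its pose more firmly; lowering it makes the robot give way more readily, which is the behavior a human partner experiences as “compliant.”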
“Understanding active forces is a big challenge because it entails complex, highly skilled interaction that combines abilities from a number of different areas. Perception, the ability to move objects, and controlling flexibility and body motion are a few examples,” says Steil. At this time, there is little theory to explain how robots can move objects together with humans. For this reason, project partners in Italy and Great Britain are conducting basic research using interaction experiments with humans, while Steil’s group is developing new control and programming methods for the robots. A classic example of moving an object together is a human and a robot, or two robots, carrying a table. Here it is important to coordinate forces: one carrier leads the way, the other follows. When the roles of leader and follower change, each partner must predict the other’s motions and adjust its own movements accordingly.
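The table-carrying example can be sketched as a tiny leader–follower simulation, assuming a rigid object in one dimension: only the leader knows the goal, and the follower simply yields in the direction of the force it feels through the object. Every name and gain below is an illustrative assumption, not the project’s actual control scheme.

```python
# Leader-follower sketch for carrying an object together. The follower
# has no plan of its own: it senses the force transmitted through the
# object (modelled as viscous coupling between the two hands) and
# accelerates in that direction. Gains and names are illustrative.

def simulate(steps=4000, dt=0.005):
    obj_x = 0.0          # shared object position (m)
    follower_v = 0.0     # follower's hand velocity (m/s)
    target = 1.0         # only the leader knows the goal position
    for _ in range(steps):
        leader_v = 1.0 * (target - obj_x)            # leader steers toward goal
        # follower feels a force when its hand lags or outruns the leader's
        f_felt = 10.0 * (leader_v - follower_v)      # viscous coupling (N)
        follower_v += 2.0 * f_felt * dt              # yield in the force's direction
        obj_x += 0.5 * (leader_v + follower_v) * dt  # object follows both hands
    return obj_x

print(round(simulate(), 2))  # the pair settles at the leader's goal
```

Because the follower reacts only to sensed force, swapping who holds the goal swaps the roles without any explicit negotiation, which is the kind of implicit role exchange the project investigates.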
To research human interaction with humanoid robots, the researchers can draw on the humanoid robot prototype COMAN (COmpliant huMANoid platform). COMAN was developed at the Italian Institute of Technology in Genoa. It is 95 cm tall and weighs 31 kg. For the CogIMon project, it must “grow” by about 25 percent so that it can also interact with human adults. In the future, COMAN is supposed to learn how to “read” human body language. This could allow the robot to be used, for instance, in physical rehabilitation, where it could help train motor skills and coordination by playing catch with patients. While throwing and catching the ball, the robot would have to react directly to its human partners and even fake a throw. The group interaction should be open enough that both humans and robots could leave and rejoin the group at any time without causing confusion.
Read more: Carrying a Table Together with a Robot