A robot in Cornell’s Personal Robotics Lab has learned to foresee human action and adjust accordingly.
Seeing a person carrying a bowl toward the refrigerator, a robot identifies the objects in the scene. Knowing that bowls are storable and refrigerators are places to store things, it projects possible trajectories for the bowl, and decides to open the refrigerator door.
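The affordance reasoning described above can be sketched as a simple lookup: the robot checks what the carried object affords and what the destination affords, and picks a helpful action when they match. The object and location tables below are invented for illustration, not the lab's actual representation.

```python
# Illustrative sketch of affordance-based helping. The affordance
# tables are hypothetical examples, not the robot's real knowledge base.

AFFORDANCES = {
    "bowl": {"storable", "pourable"},
    "cup": {"storable", "drinkable"},
}
LOCATIONS = {
    "refrigerator": {"stores"},
}

def helpful_action(carried_object, destination):
    """If the carried object is storable and the destination stores
    things, the helpful move is to open the destination's door."""
    if "storable" in AFFORDANCES.get(carried_object, set()) \
            and "stores" in LOCATIONS.get(destination, set()):
        return f"open {destination} door"
    return "wait"

print(helpful_action("bowl", "refrigerator"))  # → "open refrigerator door"
```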
In one test, the robot was programmed to refill a person's cup when it was nearly empty. To do this, the robot must plan its movements in advance and then follow the plan. But if a human sitting at the table happens to raise the cup and drink from it, a robot following a fixed plan might pour a drink into a cup that isn't there. When the robot sees the human reaching for the cup, however, it can anticipate the human action and avoid the mistake.
Hema S. Koppula, a Cornell graduate student in computer science, and Ashutosh Saxena, assistant professor of computer science, will describe their work at the International Conference on Machine Learning, June 18-21 in Atlanta, and at the Robotics: Science and Systems conference, June 24-28 in Berlin, Germany.
From a database of 120 3-D videos of people performing common household activities, the robot has been trained to identify human activities by tracking the movements of the body — reduced to a symbolic skeleton for easy calculation — to break them down into sub-activities like reaching, carrying, pouring or drinking, and to associate those activities with objects. Since each person performs tasks a little differently, the robot can build a model that is general enough to match new events.
“We extract the general principles of how people behave,” said Saxena. “Drinking coffee is a big activity, but there are several parts to it.” The robot builds a “vocabulary” of such small parts that it can put together in various ways to recognize a variety of big activities, he explained.
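The "vocabulary of small parts" idea can be illustrated with a toy recognizer: each big activity is stored as a sequence of sub-activities, and an observed prefix of sub-activities is matched against the vocabulary. The labels and sequences here are invented for the sketch, not the lab's actual training data.

```python
# A minimal sketch of recognizing a "big activity" from a sequence of
# sub-activities. The vocabulary entries are hypothetical examples.

ACTIVITY_VOCABULARY = {
    "drinking coffee": ["reach", "grasp", "lift", "drink", "place"],
    "putting away":    ["reach", "grasp", "carry", "open", "place"],
}

def recognize(observed):
    """Return the activity whose sub-activity sequence best matches the
    observed prefix (longest-common-prefix heuristic)."""
    def prefix_len(seq):
        n = 0
        for a, b in zip(observed, seq):
            if a != b:
                break
            n += 1
        return n
    return max(ACTIVITY_VOCABULARY,
               key=lambda name: prefix_len(ACTIVITY_VOCABULARY[name]))

print(recognize(["reach", "grasp", "carry"]))  # → "putting away"
```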
Observing a new scene with its Microsoft Kinect 3-D camera, the robot identifies the activities it sees, considers what uses are possible with the objects in the scene and how those uses fit with the activities; it then generates a set of possible continuations into the future — such as eating, drinking, cleaning, putting away — and finally chooses the most probable. As the action continues, it constantly updates and refines its predictions.
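The anticipate-and-update loop described above can be sketched as a Bayes-style filter: the robot keeps a probability over candidate continuations, reweights each hypothesis by how well it predicts the newly observed sub-activity, and renormalizes. The likelihood numbers below are invented purely for illustration.

```python
# A hedged sketch of the prediction-refinement loop: maintain a belief
# over possible continuations, update it after each observed sub-activity.
# All probabilities here are made up for the example.

CONTINUATIONS = ["eating", "drinking", "cleaning", "putting away"]

# P(observed sub-activity | intended continuation) — illustrative only.
LIKELIHOOD = {
    "eating":       {"reach": 0.4, "lift": 0.4, "carry": 0.2},
    "drinking":     {"reach": 0.3, "lift": 0.5, "carry": 0.2},
    "cleaning":     {"reach": 0.2, "lift": 0.1, "carry": 0.7},
    "putting away": {"reach": 0.2, "lift": 0.1, "carry": 0.7},
}

def update(prior, observation):
    """One Bayes-style step: weight each hypothesis by how well it
    predicts the observation, then renormalize to sum to 1."""
    posterior = {c: prior[c] * LIKELIHOOD[c].get(observation, 1e-6)
                 for c in prior}
    total = sum(posterior.values())
    return {c: p / total for c, p in posterior.items()}

# Start from a uniform belief and refine it as sub-activities arrive.
belief = {c: 1 / len(CONTINUATIONS) for c in CONTINUATIONS}
for obs in ["reach", "carry"]:
    belief = update(belief, obs)
print(belief)
```

After seeing "reach" then "carry", the carrying-compatible hypotheses (cleaning, putting away) dominate the belief, mirroring how the robot's predictions sharpen as the action unfolds.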