A robot in Cornell’s Personal Robotics Lab has learned to foresee human action and adjust accordingly.
Seeing a person carrying a bowl toward the refrigerator, a robot identifies the objects in the scene. Knowing that bowls are storable and refrigerators are places to store things, it projects possible trajectories for the bowl, and decides to open the refrigerator door.
In one test, the robot was programmed to refill a person's cup when it was nearly empty. To do this, the robot must plan its movements in advance and then follow the plan. But if a human sitting at the table happens to lift the cup and drink from it, the robot could end up pouring a drink into a cup that isn't there. When the robot sees the human reaching for the cup, it can anticipate the action and avoid the mistake. In another test, the robot observed a human carrying an object toward a refrigerator and helpfully opened the refrigerator door.
Hema S. Koppula, a Cornell graduate student in computer science, and Ashutosh Saxena, assistant professor of computer science, will describe their work at the International Conference on Machine Learning, June 18-21 in Atlanta, and the Robotics: Science and Systems conference, June 24-28 in Berlin, Germany.
From a database of 120 3-D videos of people performing common household activities, the robot has been trained to identify human activities by tracking the movements of the body — reduced to a symbolic skeleton for easy calculation — breaking them down into sub-activities like reaching, carrying, pouring or drinking, and to associate the activities with objects. Since each person performs tasks a little differently, the robot can build a model that is general enough to match new events.
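The idea of breaking an activity into ordered sub-activities and matching new observations against that "vocabulary" can be sketched roughly as follows. This is a minimal illustration, not the lab's actual model: the activity templates and sub-activity labels here are made up for the example, and the real system uses learned statistical models over skeleton and object features rather than hand-written lists.

```python
# Hypothetical activity templates: each "big activity" is an ordered
# sequence of sub-activities (reaching, carrying, pouring, ...).
ACTIVITY_MODELS = {
    "drinking": ["reaching", "moving", "drinking", "placing"],
    "storing":  ["reaching", "carrying", "opening", "placing"],
    "pouring":  ["reaching", "moving", "pouring", "placing"],
}

def match_score(observed, template):
    """Fraction of observed sub-activities that appear, in order, in the template."""
    it = iter(template)  # 'sub in it' consumes the iterator, enforcing order
    hits = sum(1 for sub in observed if sub in it)
    return hits / len(observed) if observed else 0.0

def classify(observed):
    """Pick the activity whose template best explains the observed sub-activities."""
    return max(ACTIVITY_MODELS, key=lambda a: match_score(observed, ACTIVITY_MODELS[a]))

print(classify(["reaching", "carrying", "opening"]))  # prints "storing"
```

Because the templates describe sub-activity order rather than exact timing, a partial observation of a new person's slightly different execution can still match a known activity.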
“We extract the general principles of how people behave,” said Saxena. “Drinking coffee is a big activity, but there are several parts to it.” The robot builds a “vocabulary” of such small parts that it can put together in various ways to recognize a variety of big activities, he explained.
Observing a new scene with its Microsoft Kinect 3-D camera, the robot identifies the activities it sees, considers what uses are possible with the objects in the scene and how those uses fit with the activities; it then generates a set of possible continuations into the future — such as eating, drinking, cleaning, putting away — and finally chooses the most probable. As the action continues, it constantly updates and refines its predictions.
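The anticipate-then-refine loop described above can be sketched as a simple Bayesian update: keep a probability over the possible future activities and reweight it as each new sub-activity is observed. The prior, the two hypotheses, and the likelihood numbers below are invented for illustration; the actual system learns these from its training videos.

```python
def normalize(dist):
    """Rescale a dict of scores so they sum to 1."""
    total = sum(dist.values())
    return {k: v / total for k, v in dist.items()}

# Assumed P(observed sub-activity | future activity) -- illustrative values only.
LIKELIHOOD = {
    "drinking": {"reaching": 0.5, "lifting": 0.4, "carrying": 0.1},
    "storing":  {"reaching": 0.4, "lifting": 0.1, "carrying": 0.5},
}

def update(belief, observation):
    """One Bayes step: weight each hypothesis by how well it predicts the observation."""
    posterior = {act: p * LIKELIHOOD[act].get(observation, 1e-6)
                 for act, p in belief.items()}
    return normalize(posterior)

belief = normalize({"drinking": 1.0, "storing": 1.0})  # uniform prior
for obs in ["reaching", "carrying"]:                    # sub-activities seen so far
    belief = update(belief, obs)

prediction = max(belief, key=belief.get)
print(prediction)  # prints "storing" -- carrying shifts the belief toward storing
```

Each new observation sharpens the distribution, which is how the robot can commit early to a likely continuation (open the fridge) yet revise if the person does something unexpected.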