A robot in Cornell’s Personal Robotics Lab has learned to foresee human action and adjust accordingly.
Seeing a person carrying a bowl toward the refrigerator, a robot identifies the objects in the scene. Knowing that bowls are storable and refrigerators are places to store things, it projects possible trajectories for the bowl, and decides to open the refrigerator door.
In one test, the robot was programmed to refill a person’s cup when it was nearly empty. To do this, the robot must plan its movements in advance and then follow the plan. But if the person sitting at the table raises the cup and drinks from it, the robot might pour a drink into a cup that is no longer there. When the robot sees the human reaching for the cup, however, it can anticipate the action and avoid the mistake. In another test, the robot observed a human carrying an object toward a refrigerator and helpfully opened the refrigerator door.
Hema S. Koppula, Cornell graduate student in computer science, and Ashutosh Saxena, assistant professor of computer science, will describe their work at the International Conference on Machine Learning, June 18-21 in Atlanta, and the Robotics: Science and Systems conference June 24-28 in Berlin, Germany.
From a database of 120 3-D videos of people performing common household activities, the robot has been trained to identify human activities by tracking the movements of the body — reduced to a symbolic skeleton for easy calculation — to break them down into sub-activities like reaching, carrying, pouring or drinking, and to associate those activities with objects. Since each person performs tasks a little differently, the robot builds a model that is general enough to match new events.
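The pairing of sub-activities with objects can be pictured as a small affordance table. The sketch below is purely illustrative — the object names, property labels and lookup function are invented for this example, whereas the actual system learns such associations statistically from the video database:

```python
# Illustrative affordance table: which object properties each object has.
# These entries are made up for the example, not taken from the Cornell data.
AFFORDANCES = {
    "pourable": {"cup", "bowl", "kettle"},
    "drinkable-from": {"cup", "mug"},
    "storable": {"bowl", "box", "milk"},
}

def plausible_sub_activities(obj):
    """List sub-activities consistent with an object's affordances."""
    # Each sub-activity requires one object property (a toy assumption).
    required = {
        "pour": "pourable",
        "drink": "drinkable-from",
        "store": "storable",
    }
    return sorted(act for act, prop in required.items()
                  if obj in AFFORDANCES[prop])

print(plausible_sub_activities("bowl"))  # ['pour', 'store']
```

A bowl is pourable and storable but not drunk from directly, so only "pour" and "store" survive the lookup — the same kind of filtering that lets the robot rule out implausible continuations of a scene.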
“We extract the general principles of how people behave,” said Saxena. “Drinking coffee is a big activity, but there are several parts to it.” The robot builds a “vocabulary” of such small parts that it can put together in various ways to recognize a variety of big activities, he explained.
Observing a new scene with its Microsoft Kinect 3-D camera, the robot identifies the activities it sees, considers what uses are possible with the objects in the scene and how those uses fit with the activities; it then generates a set of possible continuations into the future — such as eating, drinking, cleaning, putting away — and finally chooses the most probable. As the action continues, it constantly updates and refines its predictions.
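The anticipation loop described above — generate candidate continuations, score them, pick the most probable, then refine as more is observed — can be sketched with a toy frequency model. The training sequences and function below are hypothetical stand-ins; the real system performs probabilistic inference over learned activity models, not a lookup over a hand-written list:

```python
from collections import Counter

# Hypothetical sub-activity sequences standing in for the learned model;
# the real robot is trained on 120 RGB-D videos, not this toy list.
TRAINING_SEQUENCES = [
    ["reach", "carry", "pour"],
    ["reach", "carry", "place"],
    ["reach", "drink"],
    ["reach", "carry", "pour"],
    ["reach", "carry", "pour"],
]

def predict_next(observed):
    """Return the most frequent next sub-activity given the observed prefix."""
    counts = Counter(
        seq[len(observed)]
        for seq in TRAINING_SEQUENCES
        if seq[: len(observed)] == observed and len(seq) > len(observed)
    )
    return counts.most_common(1)[0][0] if counts else None

# The prediction is refined as more of the action is observed.
print(predict_next(["reach"]))           # 'carry'
print(predict_next(["reach", "carry"]))  # 'pour'
```

After seeing only a reach, "carry" is the best guess; once the carry is observed too, the prediction sharpens to "pour" — mirroring how the robot constantly updates its forecast as the human acts.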