
Soft robots may not be in touch with human feelings, but they are getting better at feeling human touch.
Cornell researchers have created a low-cost method for soft, deformable robots to detect a range of physical interactions, from pats to punches to hugs, without relying on touch at all. Instead, a USB camera located inside the robot captures the shadow movements of hand gestures on the robot’s skin and classifies them with machine-learning software.
The group’s paper, “ShadowSense: Detecting Human Touch in a Social Robot Using Shadow Image Classification,” was published Dec. 17 in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. The paper’s lead author is doctoral student Yuhan Hu.
The new ShadowSense technology is the latest project from the Human-Robot Collaboration and Companionship Lab, led by the paper’s senior author, Guy Hoffman, associate professor and the Mills Family Faculty Fellow in the Sibley School of Mechanical and Aerospace Engineering.

“Touch is such an important mode of communication for most organisms, but it has been virtually absent from human-robot interaction. One of the reasons is that full-body touch used to require a massive number of sensors, and was therefore not practical to implement,” Hoffman said. “This research offers a low-cost alternative.”
The technology originated as part of a collaboration with Hadas Kress-Gazit, professor in the Sibley School of Mechanical and Aerospace Engineering, and Kirstin Petersen, assistant professor of electrical and computer engineering, to develop inflatable robots that could guide people to safety during emergency evacuations. Such a robot would need to be able to communicate with humans in extreme conditions and environments. Imagine a robot physically leading someone down a noisy, smoke-filled corridor by detecting the pressure of the person’s hand.
Rather than installing a large number of contact sensors – which would add weight and complex wiring to the robot, and would be difficult to embed in a deforming skin – the team took a counterintuitive approach. In order to gauge touch, they looked to sight.
“By placing a camera inside the robot, we can infer how the person is touching it and what the person’s intent is just by looking at the shadow images,” Hu said. “We think there is interesting potential there, because there are lots of social robots that are not able to detect touch gestures.”
The prototype robot, designed by Petersen’s Collective Embodied Intelligence Lab, consists of a soft inflatable bladder of nylon skin stretched around a cylindrical skeleton, roughly four feet in height, mounted on a mobile base. Under the robot’s skin is a USB camera, which connects to a laptop. The researchers developed a neural-network-based algorithm that uses previously recorded training data to distinguish between six touch gestures – touching with a palm, punching, touching with two hands, hugging, pointing and not touching at all – with an accuracy of 87.5% to 96%, depending on the lighting.
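The paper's actual classifier is a neural network trained on recorded shadow images; as a much-simplified illustration of the same idea, the sketch below binarizes a grayscale frame from the internal camera, summarizes the shadow region with two handcrafted features, and assigns the nearest gesture centroid. All function names, the threshold value, and the feature choices are illustrative assumptions, not the authors' method.

```python
import numpy as np

# The six gestures reported in the paper.
GESTURES = ["palm", "punch", "two_hands", "hug", "point", "no_touch"]

def shadow_features(frame, thresh=60):
    """Binarize a grayscale camera frame and summarize the shadow region.

    Returns a 2-vector: fraction of skin in shadow, and spatial spread
    of the shadow pixels (both illustrative features, not the paper's).
    """
    mask = frame < thresh          # shadows show up as dark pixels
    area = mask.mean()             # fraction of the skin that is shadowed
    if mask.any():
        ys, xs = np.nonzero(mask)
        spread = (ys.std() + xs.std()) / frame.shape[0]
    else:
        spread = 0.0
    return np.array([area, spread])

def classify(frame, centroids):
    """Nearest-centroid gesture labeling (a stand-in for the trained network).

    `centroids` maps gesture name -> mean feature vector computed from
    previously recorded training frames.
    """
    f = shadow_features(frame)
    scored = ((np.linalg.norm(f - c), name) for name, c in centroids.items())
    return min(scored)[1]
```

In practice the centroids (or, in the paper, the network weights) come from the recorded training data; a real pipeline would also normalize for lighting, which is why the reported accuracy varies between 87.5% and 96%.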
The robot can be programmed to respond to certain touches and gestures, such as rolling away or issuing a message through a loudspeaker. And the robot’s skin has the potential to be turned into an interactive screen.
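Once a gesture is classified, mapping it to a behavior is straightforward; a minimal sketch of such a response table follows. The gesture names echo the six classes above, but the actions themselves are hypothetical examples, not behaviors described in the paper.

```python
# Hypothetical gesture -> behavior table; action names are illustrative.
RESPONSES = {
    "palm": "stop",
    "punch": "roll_away",
    "two_hands": "hold_position",
    "hug": "play_greeting",
    "point": "turn_toward_pointer",
    "no_touch": "idle",
}

def respond(gesture: str) -> str:
    """Return the robot's reaction to a classified touch gesture."""
    # Unknown or low-confidence classifications fall back to idling.
    return RESPONSES.get(gesture, "idle")
```

A real controller would also debounce noisy classifications over several frames before triggering an action such as rolling away or playing a loudspeaker message.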
By collecting enough data, a robot could be trained to recognize an even wider vocabulary of interactions, custom-tailored to fit the robot’s task, Hu said.
The robot doesn’t even have to be a robot. ShadowSense technology can be incorporated into other materials, such as balloons, turning them into touch-sensitive devices.
“While the technology has certain limitations, for example requiring a line of sight from the camera to the robot’s skin, these constraints could actually spark a new approach to social robot design that would support a visual touch sensor like the one we proposed,” Hoffman said. “In the future, we would like to experiment with using optical devices such as lenses and mirrors to enable additional form factors.”
In addition to providing a simple solution to a complicated technical challenge, and making robots more user-friendly to boot, ShadowSense offers a comfort that is increasingly rare in these high-tech times: privacy.
“If the robot can only see you in the form of your shadow, it can detect what you’re doing without taking high fidelity images of your appearance,” Hu said. “That gives you a physical filter and protection, and provides psychological comfort.”
The ability to physically interact and understand a person’s movements and moods could ultimately be just as important to the person as it is to the robot.
“Touch interaction is a very important channel in terms of human-human interaction. It is an intimate modality of communication,” Hu said. “And that’s not easily replaceable.”
The paper was co-authored by former research intern Sara Maria Bejarano of the University of Los Andes, Colombia.
Original Article: Soft robots use camera and shadows to sense human touch