Electrical engineers at the University of California San Diego have developed a faster collision detection algorithm that uses machine learning to help robots avoid moving objects and weave through complex, rapidly changing environments in real time. The algorithm, dubbed “Fastron,” runs up to 8 times faster than existing collision detection algorithms.
A team of engineers, led by Michael Yip, a professor of electrical and computer engineering and member of the Contextual Robotics Institute at UC San Diego, will present the new algorithm at the first annual Conference on Robot Learning, Nov. 13 to 15 at Google headquarters in Mountain View, Calif. The invitation-only conference brings together top machine learning scientists, and Yip’s team will deliver one of the long talks during the three-day event.
The team envisions that Fastron will be broadly useful for robots that operate in human environments where they must be able to work with moving objects and people fluidly. One application they are exploring in particular is robot-assisted surgeries using the da Vinci Surgical System, in which a robotic arm would autonomously perform assistive tasks (suction, irrigation or pulling tissue back) without getting in the way of the surgeon-controlled arms or the patient’s organs.
“This algorithm could help a robot assistant cooperate in surgery in a safe way,” Yip said.
The team also envisions that Fastron can be used for robots that work at home for assisted living applications, as well as in computer graphics for the gaming and movie industries, where collision checking is often a computational bottleneck.
A problem with existing collision detection algorithms is that they are very computation-heavy. They spend a lot of time specifying all the points in a given space—the specific 3D geometries of the robot and obstacles—and performing collision checks on every single point to determine whether two bodies are intersecting at any given time. The computation gets even more demanding when obstacles are moving.
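The brute-force approach described above can be sketched in a few lines. This toy example is our own illustration, not code from the study: both bodies are discretized into point sets and every pair is tested for overlap, so the cost grows with the product of the two point counts and must be redone whenever anything moves.

```python
# Naive collision checking: discretize the robot and the obstacle into
# point sets and test every pair for overlap. The geometry here is a toy
# example; the point counts and tolerance are illustrative assumptions.

def bodies_intersect(robot_pts, obstacle_pts, tol=0.1):
    """Exhaustive pairwise check: O(len(robot_pts) * len(obstacle_pts))."""
    for r in robot_pts:
        for o in obstacle_pts:
            if abs(r[0] - o[0]) < tol and abs(r[1] - o[1]) < tol:
                return True
    return False

robot = [(i * 0.1, 0.0) for i in range(10)]   # a thin robot link, sampled as points
obstacle = [(0.45, 0.0)]                      # a point obstacle sitting on the link
print(bodies_intersect(robot, obstacle))      # True
```

For moving obstacles, this entire double loop must be re-run at every time step, which is exactly the computational load Fastron is designed to avoid.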
To lighten the computational load, Yip and his team in the Advanced Robotics and Controls Lab (ARClab) at UC San Diego developed a minimalistic approach to collision detection. The result was Fastron, an algorithm that uses machine learning strategies—which are traditionally used to classify objects—to classify collisions versus non-collisions in dynamic environments. “We actually don’t need to know all the specific geometries and points. All we need to know is whether the robot’s current position is in collision or not,” said Nikhil Das, an electrical engineering Ph.D. student in Yip’s group and the study’s first author.
Fastron simulation: the autonomous arm (blue arm) reaches the target configuration (wireframe arm) while avoiding the motions of a human-controlled arm (red arm). Image courtesy of ARClab at UC San Diego.
The Fastron algorithm
The name Fastron comes from combining Fast and Perceptron, a machine learning technique for performing classification. An important feature of Fastron is that it updates its classification boundaries very quickly to accommodate moving scenes, something that has been broadly challenging for the machine learning community.
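For readers unfamiliar with the technique named here, a minimal perceptron can be sketched as follows. This is a generic textbook perceptron, not Fastron itself, and the toy data points are our own illustration: the classifier learns a linear boundary separating +1 (collision) from -1 (collision-free) samples.

```python
# A minimal perceptron, the classifier that lends Fastron half of its name.
# Labels: +1 = in collision, -1 = collision-free. The sample points below
# are toy values for illustration, not data from the Fastron study.

def train_perceptron(points, labels, epochs=50):
    """Learn weights w and bias b so that sign(w.x + b) matches each label."""
    w = [0.0] * len(points[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(points, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:            # misclassified: nudge the boundary
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Toy configuration samples: the "obstacle" occupies the region x0 > 1.
points = [(0.0, 0.0), (0.5, 1.0), (2.0, 0.5), (3.0, 1.5)]
labels = [-1, -1, 1, 1]
w, b = train_perceptron(points, labels)
print([predict(w, b, p) for p in points])  # recovers the training labels
```

The key property Fastron exploits is that a perceptron-style update touches only the misclassified points, which makes incremental retraining cheap when the scene changes.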
Fastron’s active learning strategy works using a feedback loop. It starts out by creating a model of the robot’s configuration space, or C-space, which is the space showing all possible positions the robot can attain. Fastron models the C-space using just a sparse set of points, consisting of a small number of so-called collision points and collision-free points. The algorithm then defines a classification boundary between the collision and collision-free points—this boundary is essentially a rough outline of where the abstract obstacles are in the C-space. As obstacles move, the classification boundary changes. Rather than performing collision checks on each point in the C-space, as is done with other algorithms, Fastron intelligently selects checks near the boundaries. Once it classifies the collisions and non-collisions, the algorithm updates its classifier and then continues the cycle.
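The feedback loop above can be sketched in code. Everything in this sketch is an assumption for illustration: the ground-truth checker, the nearest-neighbor proxy classifier, the uniform sampling, and all helper names are ours, whereas the real Fastron uses a perceptron-style classifier and selects its checks near the current boundary rather than uniformly.

```python
# Sketch of a Fastron-style feedback loop (illustrative assumptions only):
# keep a sparse set of labeled configurations, refresh a few labels each
# time step as the obstacle moves, and answer queries from the cheap model
# instead of re-running the exact geometric check everywhere.
import random

random.seed(0)
OBSTACLE = [(0.5, 0.5)]

def ground_truth_in_collision(q):
    """Stand-in for the exact (expensive) geometric collision check:
    a toy disc obstacle of radius 0.3 centered at OBSTACLE[0]."""
    cx, cy = OBSTACLE[0]
    return (q[0] - cx) ** 2 + (q[1] - cy) ** 2 < 0.3 ** 2

def classify(model, q):
    """Cheap proxy: label q (+1 collision / -1 free) by its nearest labeled point."""
    nearest = min(model, key=lambda p: (p[0][0] - q[0]) ** 2 + (p[0][1] - q[1]) ** 2)
    return nearest[1]

model = []                        # sparse set of (configuration, label) pairs
for step in range(20):            # the feedback loop
    OBSTACLE[0] = (0.5 + 0.01 * step, 0.5)    # the obstacle drifts right
    # Sample and check a few configurations; Fastron would concentrate
    # these checks near the current classification boundary instead.
    for _ in range(10):
        q = (random.random(), random.random())
        label = 1 if ground_truth_in_collision(q) else -1
        model.append((q, label))
    model = model[-100:]          # keep the model sparse
print(classify(model, (0.7, 0.5)))
```

The point of the cycle is that the exact checker is called only a handful of times per step, while most queries hit the lightweight classifier.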
Because Fastron’s models are simpler, the researchers set its collision checks to be more conservative. Since just a few points represent the entire space, Das explained, it’s not always certain what’s happening in the space between two points, so the team developed the algorithm to predict a collision in that space. “We leaned toward making a risk-averse model and essentially padded the workspace obstacles,” Das said. This ensures that the robot can be tuned to be more conservative in sensitive environments like surgery, or for robots that work at home for assisted living.
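One common way to realize this kind of risk-averse padding is to shift the decision threshold so that any configuration not confidently collision-free is treated as a collision. The score function and margin value below are illustrative assumptions, not details from the study.

```python
# Risk-averse labeling: require a positive margin before trusting a
# "collision-free" score. The margin is a tunable assumption standing in
# for the obstacle padding described in the article.

def conservative_label(score, margin=0.25):
    """score > 0 means 'looks free'; demand score > margin before saying so."""
    return "free" if score > margin else "collision"

print(conservative_label(0.9))    # clearly free          -> free
print(conservative_label(0.1))    # ambiguous, padded     -> collision
print(conservative_label(-0.6))   # clearly in collision  -> collision
```

Raising the margin enlarges the padded region around obstacles, trading some workspace for the extra safety needed in settings like surgery.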
The team has so far demonstrated the algorithm in computer simulations of robots and obstacles. Moving forward, the researchers are working to further improve Fastron’s speed and accuracy. Their goal is to implement Fastron in robotic surgery and homecare robot settings.