
A new algorithm helps robots predict the paths people take in structured environments such as factory floors, a tool that may enable humans and machines to work together in close proximity.
In 2018, researchers at MIT and the auto manufacturer BMW were testing ways in which humans and robots might work in close proximity to assemble car parts. In a replica of a factory floor setting, the team rigged up a robot on rails, designed to deliver parts between work stations. Meanwhile, human workers crossed its path every so often to work at nearby stations.
The robot was programmed to stop momentarily if a person passed by. But the researchers noticed that the robot would often freeze in place, overly cautious, long before a person had crossed its path. If this took place in a real manufacturing setting, such unnecessary pauses could accumulate into significant inefficiencies.
The team traced the problem to a limitation in the trajectory alignment algorithms used by the robot's motion-predicting software. While the algorithms could reasonably predict where a person was headed, their poor time alignment meant they couldn't anticipate how long that person would spend at any point along the predicted path, including, in this case, how long it would take a person to stop, double back, and cross the robot's path again.
Now, members of that same MIT team have come up with a solution: an algorithm that accurately aligns partial trajectories in real time, allowing motion predictors to anticipate the timing of a person's motion. When they applied the new algorithm to the BMW factory floor experiments, they found that, instead of freezing in place, the robot simply rolled on and was safely out of the way by the time the person walked by again.
“This algorithm builds in components that help a robot understand and monitor stops and overlaps in movement, which are a core part of human motion,” says Julie Shah, associate professor of aeronautics and astronautics at MIT. “This technique is one of the many ways we’re working on robots better understanding people.”
Shah and her colleagues, including project lead and graduate student Przemyslaw “Pem” Lasota, will present their results this month at the Robotics: Science and Systems conference in Germany.

Clustered up
To enable robots to predict human movements, researchers typically borrow algorithms from music and speech processing. These algorithms are designed to align two complete time series, or sets of related data, such as an audio track of a musical performance and a scrolling video of that piece’s musical notation.
Researchers have used similar alignment algorithms to sync up real-time and previously recorded measurements of human motion, to predict where a person will be, say, five seconds from now. But unlike music or speech, human motion can be messy and highly variable. Even for repetitive movements, such as reaching across a table to screw in a bolt, one person may move slightly differently each time.
Existing algorithms typically take in streaming motion data, in the form of dots representing the position of a person over time, and compare the trajectory of those dots to a library of common trajectories for the given scenario. An algorithm maps a trajectory in terms of the relative distance between dots.
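To make the distance-only approach concrete, here is a minimal sketch of how such a matcher might compare the latest observed position against a library of reference trajectories using spatial distance alone. This is an illustration under assumed data, not the team's actual implementation; the reference library, trajectory values, and function names are hypothetical.

```python
import numpy as np

def closest_index_by_distance(reference, observed_point):
    """Index of the reference point nearest to the observed point,
    using spatial (Euclidean) distance only."""
    dists = np.linalg.norm(reference - observed_point, axis=1)
    idx = int(np.argmin(dists))
    return idx, float(dists[idx])

def align_by_distance(reference_library, partial_trajectory):
    """Pick the reference trajectory, and the progress index along it,
    whose nearest point best matches the latest observed position.
    With distance alone, a pause (many near-identical points) or an
    out-and-back overlap is ambiguous, which is the failure mode
    described in the article."""
    latest = partial_trajectory[-1]
    best = None
    for name, reference in reference_library.items():
        idx, dist = closest_index_by_distance(reference, latest)
        if best is None or dist < best[2]:
            best = (name, idx, dist)
    return best  # (reference name, progress index, spatial distance)

# Hypothetical reference: walk out along a line, then return the same way.
out_leg = np.linspace([0.0, 0.0], [1.0, 0.0], 25)
reference_library = {"cross_and_return": np.vstack([out_leg, out_leg[::-1]])}

# A person observed partway along the line: distance alone cannot say
# whether they are heading out or already coming back.
partial = np.array([[0.0, 0.0], [0.2, 0.0], [0.4, 0.0]])
print(align_by_distance(reference_library, partial))
```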
But Lasota says algorithms that predict trajectories based on distance alone can get easily confused in certain common situations, such as temporary stops, in which a person pauses before continuing on their path. While paused, dots representing the person’s position can bunch up in the same spot.
“When you look at the data, you have a whole bunch of points clustered together when a person is stopped,” Lasota says. “If you’re only looking at the distance between points as your alignment metric, that can be confusing, because they’re all close together, and you don’t have a good idea of which point you have to align to.”
The same goes with overlapping trajectories — instances when a person moves back and forth along a similar path. Lasota says that while a person’s current position may line up with a dot on a reference trajectory, existing algorithms can’t differentiate between whether that position is part of a trajectory heading away, or coming back along the same path.
“You may have points close together in terms of distance, but in terms of time, a person’s position may actually be far from a reference point,” Lasota says.
It’s all in the timing
As a solution, Lasota and Shah devised a “partial trajectory” algorithm that aligns segments of a person’s trajectory in real time with a library of previously collected reference trajectories. Importantly, the new algorithm aligns trajectories in both distance and timing, and in so doing is able to accurately anticipate stops and overlaps in a person’s path.
“Say you’ve executed this much of a motion,” Lasota explains. “Old techniques will say, ‘this is the closest point on this representative trajectory for that motion.’ But since you only completed this much of it in a short amount of time, the timing part of the algorithm will say, ‘based on the timing, it’s unlikely that you’re already on your way back, because you just started your motion.’”
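As a rough illustration of that idea, one way to combine the two cues is to score each point on a reference trajectory by its spatial distance to the observed position plus a penalty for how far its timestamp is from the time elapsed so far. The sketch below is an assumption-laden stand-in, not the published algorithm; the cost weighting, timestamps, and names are hypothetical.

```python
import numpy as np

def align_with_timing(ref_points, ref_times, observed_point, elapsed_time,
                      time_weight=1.0):
    """Score every reference point by spatial distance plus a penalty on how
    far its timestamp is from the time elapsed in the observed motion, then
    return the best-scoring progress index. Using both terms disambiguates
    pauses and out-and-back overlaps that distance alone cannot resolve."""
    spatial = np.linalg.norm(ref_points - observed_point, axis=1)
    temporal = np.abs(ref_times - elapsed_time)
    cost = spatial + time_weight * temporal
    idx = int(np.argmin(cost))
    return idx, float(cost[idx])

# Hypothetical out-and-back reference: out over 5 seconds, back over the next 5.
out_leg = np.linspace([0.0, 0.0], [1.0, 0.0], 25)
ref_points = np.vstack([out_leg, out_leg[::-1]])
ref_times = np.linspace(0.0, 10.0, len(ref_points))

# The person is observed partway out only 2 seconds after starting.
# Spatially, the outbound and return legs match equally well; the timing
# term says they are still on the way out, so an early index wins.
idx, _ = align_with_timing(ref_points, ref_times, np.array([0.4, 0.0]), 2.0)
print(idx, round(ref_times[idx], 2))
```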
The team tested the algorithm on two human motion datasets: one in which a person intermittently crossed a robot’s path in a factory setting (these data were obtained from the team’s experiments with BMW), and another in which the group had previously recorded the hand movements of participants reaching across a table to install a bolt, which a robot would then secure by brushing sealant over it.
For both datasets, the team’s algorithm was able to make better estimates of a person’s progress through a trajectory, compared with two commonly used partial trajectory alignment algorithms. Furthermore, the team found that when they integrated the alignment algorithm with their motion predictors, the robot could more accurately anticipate the timing of a person’s motion. In the factory floor scenario, for example, they found the robot was less prone to freezing in place, and instead smoothly resumed its task shortly after a person crossed its path.
While the algorithm was evaluated in the context of motion prediction, it can also be used as a preprocessing step for other techniques in the field of human-robot interaction, such as action recognition and gesture detection. Shah says the algorithm will be a key tool in enabling robots to recognize and respond to patterns of human movements and behaviors. Ultimately, this can help humans and robots work together in structured environments, such as factory settings and even, in some cases, the home.
“This technique could apply to any environment where humans exhibit typical patterns of behavior,” Shah says. “The key is that the [robotic] system can observe patterns that occur over and over, so that it can learn something about human behavior. This is all in the vein of work on the robot better understanding aspects of human motion, to be able to collaborate with us better.”
Learn more: Algorithm tells robots where nearby humans are headed