Brown University researchers have developed software that lets users control robots over the internet with off-the-shelf virtual reality hardware.
Even as autonomous robots get better at doing things on their own, there will still be plenty of circumstances where humans need to step in and take control. New software developed by Brown University computer scientists enables users to control robots remotely using virtual reality, immersing them in a robot’s surroundings even when they are miles away.
The software connects a robot’s arms and grippers as well as its onboard cameras and sensors to off-the-shelf virtual reality hardware via the internet. Using handheld controllers, users can control the position of the robot’s arms to perform intricate manipulation tasks just by moving their own arms. Users can step into the robot’s metal skin and get a first-person view of the environment, or can walk around the robot to survey the scene in the third person — whichever is easier for accomplishing the task at hand. The data transferred between the robot and the virtual reality unit is compact enough to be sent over the internet with minimal lag, making it possible for users to guide robots from great distances.
“We think this could be useful in any situation where we need some deft manipulation to be done, but where people shouldn’t be,” said David Whitney, a graduate student at Brown who co-led the development of the system. “Three examples we were thinking of specifically were in defusing bombs, working inside a damaged nuclear facility or operating the robotic arm on the International Space Station.”
Whitney co-led the work with Eric Rosen, an undergraduate student at Brown. Both work in Brown’s Humans to Robots lab, which is led by Stefanie Tellex, an assistant professor of computer science. A paper describing the system and evaluating its usability was presented this week at the International Symposium on Robotics Research in Chile.
Even highly sophisticated robots are often remotely controlled using some fairly unsophisticated means — often a keyboard or something like a video game controller and a two-dimensional monitor. That works fine, Whitney and Rosen say, for tasks like driving a wheeled robot around or flying a drone, but can be problematic for more complex tasks.
“For things like operating a robotic arm with lots of degrees of freedom, keyboards and game controllers just aren’t very intuitive,” Whitney said. And mapping a three-dimensional environment onto a two-dimensional screen could limit one’s perception of the space the robot inhabits.
Whitney and Rosen thought virtual reality might offer a more intuitive and immersive option. Their software links together a Baxter research robot with an HTC Vive, a virtual reality system that comes with hand controllers. The software uses the robot’s sensors to create a point-cloud model of the robot itself and its surroundings, which is transmitted to a remote computer connected to the Vive. Users can see that space in the headset and virtually walk around inside it. At the same time, users see live high-definition video from the robot’s wrist cameras for detailed views of manipulation tasks to be performed.
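A key step the article describes is shrinking the robot’s point-cloud model enough to stream it over the internet without distracting lag. A common way to do this is voxel-grid downsampling, which keeps one representative point per small cube of space. The sketch below is a hypothetical illustration of that general technique, not the Brown team’s actual code; the voxel size and point counts are assumptions for the example.

```python
# Hypothetical sketch of voxel-grid downsampling, a standard way to
# compress a point cloud before streaming it. Not the Brown system's code.
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Reduce a point cloud by averaging all points in each occupied voxel."""
    # Map each 3-D point to the integer coordinate of the voxel it falls in.
    voxels = np.floor(points / voxel_size).astype(np.int64)
    # Find the unique occupied voxels and which voxel each point belongs to.
    _, inverse, counts = np.unique(
        voxels, axis=0, return_inverse=True, return_counts=True
    )
    # Average the points inside each voxel to get one representative point.
    sums = np.zeros((counts.size, 3))
    np.add.at(sums, inverse, points)
    return sums / counts[:, None]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated dense scan: 100,000 points in a 1-meter cube (illustrative).
    cloud = rng.uniform(0.0, 1.0, size=(100_000, 3))
    small = voxel_downsample(cloud, voxel_size=0.05)
    print(f"{cloud.shape[0]} points -> {small.shape[0]} points")
```

With a 5 cm voxel over a 1 m cube there are at most 8,000 occupied voxels, so the cloud shrinks by more than an order of magnitude while preserving the coarse geometry a remote viewer needs.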
For their study, the researchers showed that they could create an immersive experience for users while keeping the data load small enough to be carried over the internet without a distracting lag. A user in Providence, R.I., for example, was able to perform a manipulation task — stacking plastic cups one inside the other — using a robot 41 miles away in Cambridge, Mass.
In additional studies, 18 novice users completed the cup-stacking task 66 percent faster in virtual reality than with a traditional keyboard-and-monitor interface. Users also reported enjoying the virtual interface more and found the manipulation tasks less demanding than with the keyboard and monitor.
Rosen thinks the increased speed in performing the task was due to the intuitiveness of the virtual reality interface.
“In VR, people can just move the robot like they move their bodies, and so they can do it without thinking about it,” Rosen said. “That lets people focus on the problem or task at hand without the increased cognitive load of trying to figure out how to move the robot.”
The researchers plan to continue developing the system. The first iteration focused on a fairly simple manipulation task with a robot that was stationary in the environment. They’d like to try more complex tasks and later combine manipulation with navigation. They’d also like to experiment with mixed autonomy, where the robot does some tasks on its own and the user takes over for other tasks.
The researchers have made the system freely available on the web. They hope other robotics researchers might give it a try and take it in new directions of their own.
Learn more: Software enables robots to be controlled in virtual reality