Bringing remote users to life in real time with a robotic proxy

Mose Sakashita, a doctoral student in the field of information science, with the ReMotion robot.

Cornell researchers have developed a robot called ReMotion that occupies physical space on a remote user’s behalf, automatically mirroring the user’s movements in real time and conveying key body language that is lost in standard virtual environments.

“Pointing gestures, the perception of another’s gaze, intuitively knowing where someone’s attention is – in remote settings, we lose these nonverbal, implicit cues that are very important for carrying out design activities,” said Mose Sakashita, a doctoral student in the field of information science.

Sakashita is the lead author of “ReMotion: Supporting Remote Collaboration in Open Space with Automatic Robotic Embodiment,” which he presented at the Association for Computing Machinery CHI Conference on Human Factors in Computing Systems in Hamburg, Germany, in April. “With ReMotion, we show that we can enable rapid, dynamic interactions through the help of a mobile, automated robot.”

With further development, ReMotion could be deployed in virtual collaborative environments as well as in classrooms and other educational settings, Sakashita said.

The idea for ReMotion came out of Sakashita’s experience as a teaching assistant for a popular rapid prototyping course in the spring 2020 semester, which was held largely online due to COVID-19. Confined with students to a virtual learning environment, Sakashita came to understand that physical movement is vital in collaborative design projects: teammates lean in to survey parts of the prototype; they inspect circuits, troubleshoot faulty code together and then may draw up solutions on a nearby whiteboard.

This range of motion is all but lost in a virtual environment, as are the subtle ways collaborators communicate through body language and expressions, he said.

“It was super challenging to teach. There are so many tasks that are involved when you’re doing a hands-on design activity,” Sakashita said. “The kind of instinctive, dynamic transitions we make – like gesturing or addressing a collaborator – are too dynamic to simulate through Zoom.”

The lean, nearly six-foot-tall ReMotion device is outfitted with a monitor for a head, omnidirectional wheels for feet and game-engine software for brains. It automatically mirrors the remote user’s movements thanks to another Cornell-made device, NeckFace, which the remote user wears to track head and body movements. The motion data is then transmitted to the ReMotion robot in real time.
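The article describes this pipeline only at a high level: a wearable tracker captures pose, the data streams over the network, and the robot maps it to movement. A minimal sketch of such a pipeline might look like the following (the message format, the UDP transport, and all function names are illustrative assumptions, not the authors' actual implementation):

```python
import socket
import struct

def encode_pose(yaw, pitch, x, y):
    """Pack one pose sample (head yaw/pitch plus floor position) into
    a compact 16-byte binary message, network byte order."""
    return struct.pack("!4f", yaw, pitch, x, y)

def decode_pose(msg):
    """Unpack a pose message back into (yaw, pitch, x, y)."""
    return struct.unpack("!4f", msg)

class PoseSender:
    """On the remote user's side: stream pose samples over UDP
    (a hypothetical transport chosen here for low latency)."""
    def __init__(self, host, port):
        self.addr = (host, port)
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send(self, yaw, pitch, x, y):
        self.sock.sendto(encode_pose(yaw, pitch, x, y), self.addr)

def pose_to_wheel_command(x, y, robot_x, robot_y, gain=1.0):
    """On the robot's side: a toy proportional controller that drives
    the omnidirectional base toward the user's mirrored floor position."""
    return (gain * (x - robot_x), gain * (y - robot_y))
```

A receiver on the robot would decode each datagram and feed the pose to a controller, e.g. `pose_to_wheel_command(*decode_pose(msg)[2:], robot_x, robot_y)`; the fixed-size binary format keeps per-sample overhead low enough for the continuous, real-time mirroring the article describes.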

Telepresence robots are not new, but remote users generally need to steer them manually, distracting from the task at hand, researchers said. Other options, such as virtual reality and mixed reality collaboration, can also require an active role from the user, and headsets may limit peripheral awareness, researchers added.

In a small study of about a dozen participants, nearly all reported a heightened sense of co-presence and behavioral interdependence when using ReMotion compared to an existing telerobotic system. Participants also reported significantly higher shared attention among remote collaborators.

In its current form, ReMotion only works with two users in a one-on-one remote environment, and each user must occupy physical spaces of identical size and layout. In future work, ReMotion developers intend to explore asymmetrical scenarios, like a single remote team member collaborating virtually via ReMotion with multiple teammates in a larger room.


Original Article: I, robot: Remote proxy collaborates on your behalf

More from: Cornell University 

 

 
