Controlling Robots with your Thoughts

SINTEF scientist Ingrid Schjølberg demonstrating her three-fingered robotic grasper. Teaching robots new ways of grasping will greatly benefit the manufacturing industry, she says. Courtesy of Thor Nielsen.

Angel Perez Garcia can make a robot move exactly as he wants via the electrodes attached to his head.

“I use the movements of my eyes, eyebrows and other parts of my face,” he says. “With my eyebrows I can select which of the robot’s joints I want to move,” smiles Angel, a Master’s student at NTNU.

Facial grimaces generate strong electrical activity (EEG signals) across the scalp, and the same happens when Angel concentrates on a symbol, such as a flashing light, on a computer monitor. In both cases the electrodes pick up this activity. The signals are then interpreted by a processor, which in turn sends a message to the robot to make it move in a pre-defined way.
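The pipeline described here — a classified signal in, a pre-defined motion out — can be sketched as a simple lookup. Everything below (the event labels and the command names) is invented for illustration; the article does not describe the actual NTNU classifier or robot interface.

```python
# Illustrative sketch, not the NTNU system: each recognised EEG event
# (a facial movement or a flashing-light selection) maps to one
# pre-programmed robot command. All names here are hypothetical.
COMMAND_MAP = {
    "eyebrow_raise": "select_next_joint",
    "focus_light_1": "rotate_joint_positive",
    "focus_light_2": "rotate_joint_negative",
    "blink_double":  "stop",
}

def interpret(event_label: str) -> str:
    """Translate a classified EEG event into a robot command.

    Unrecognised events fall back to holding position, so noise in the
    signal never triggers an unintended movement.
    """
    return COMMAND_MAP.get(event_label, "hold_position")
```

The essential point is that the robot side stays simple: all the difficulty lives in the classifier that turns raw electrode readings into one of a small set of discrete events.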

“I can focus on a selection of lights on the screen. The robot’s movements depend on which light I select and the type of activity generated in my brain”, says Angel. “The idea of controlling a robot simply by using our thoughts (EEG brainwave activity), is fascinating and futuristic”, he says.

A school for robots

Angel Garcia is not alone in developing new ways of manoeuvring robots. Robot training is currently a major focus of the cybernetics community at NTNU and SINTEF.

In the robotics hall, fellow student Signe Moe is guiding a robot by moving her arms, while SINTEF researcher and supervisor Ingrid Schjølberg is using a new training programme to try to get her three-fingered robot to grasp objects in new ways.

“Why all this enthusiasm for training?”

“Well, everyone knows about industrial robots used on production lines to pick up and assemble parts”, says Schjølberg. “They are pre-programmed and relatively inflexible, and carry out repeated and identical movements of specialised graspers adapted to the parts in question”, she says.

“So you are developing something new?”

“We can see that industries encounter major problems every time a new part is brought in and has to be handled on the production line”, she says. “The replacement of graspers and the robot’s guidance programme is a complex process, and we want to make this simpler. We want to be able to programme robots more intuitively and not just in the traditional way using a panel with buttons pressed by an operator.”

https://www.youtube.com/watch?v=JgdnMFKELAA

“We want you to move over here”

Signe Moe’s task has thus been to find out how a robot can be trained to imitate human movements. She has solved this with a system in which she guides the robot using a Kinect camera of the type found in gaming consoles.

“Now it’s possible for anyone to guide the robot”, says Moe. “Not long ago some 6th grade pupils visited us here at the robotics hall. They were all used to playing video games, so they had no problems in guiding the robot”, she says.

To demonstrate, she stands about a metre and a half in front of the camera. “Firstly, I hold up my right hand and make a click in the air. This causes the camera to register me and trace the movements of my hand”, says Moe. “Now, when I move my hand up and to the right, you can see that the robot imitates my movements”, she says.

“It looks simple enough, but what happens if…?”

“The Kinect camera has built-in algorithms which can trace the movements of my hand”, she says. “All we have to do is to transpose these data to define the position we want the robot to assume, and set up a communications system between the sensors in the camera and the robot”, she explains. “In this way the robot receives a reference along the lines of ‘we want you to move over here’, and an in-built regulator then computes how it can achieve the movement and how much power the motors require to carry it out”, says Moe.
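The loop Moe describes can be sketched in a few lines: a tracked hand position is scaled into a target (the “reference”) for the robot, and a proportional regulator computes an effort from the position error. The scaling factor and gain below are invented; a real system would also work in joint space via an inverse-kinematics step.

```python
# Minimal sketch of the imitation loop, assuming the Kinect already
# delivers a hand position in metres. Scale and gain are hypothetical.

def hand_to_reference(hand_xyz, scale=0.5):
    """Map a camera-frame hand position to a robot target position."""
    x, y, z = hand_xyz
    return (x * scale, y * scale, z * scale)

def p_regulator(target, current, kp=2.0):
    """Proportional regulator: effort grows with the position error."""
    return tuple(kp * (t - c) for t, c in zip(target, current))

# One cycle of the loop: camera reading -> reference -> motor effort.
reference = hand_to_reference((0.4, 0.8, 1.2))
effort = p_regulator(reference, (0.0, 0.0, 0.0))
```

In practice this cycle runs many times per second, so the robot continuously chases the moving reference rather than jumping to it in one step.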

New learning using camera images and sensors

Ingrid Schjølberg is demonstrating her three-fingered robotic grasper. Teaching robots new ways of grasping will greatly benefit the manufacturing industry, and this is why researchers are testing out new approaches.

“We are combining sensors in the robotic hand with Kinect images to identify the part which has to be picked up and handled”, says Schjølberg. “In this way the robot can teach itself the best ways of adapting its grasping action”, she says. “It is trying out different grips in just the same way as we humans do when picking up an unfamiliar object. We’ve developed some pre-determined criteria for what defines a good and bad grip”, she explains. “The robot is testing out several different grips, and is praised or scolded for each attempt by means of a points score”, smiles Schjølberg.
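The trial-and-error grasping with a points score can be sketched as a scored random search: try a grip, score it against the criteria, and remember the best one found so far. The grip parameters and the scoring criterion below are invented placeholders for the pre-determined criteria Schjølberg mentions.

```python
# Sketch of the "praised or scolded" idea: score candidate grips and
# keep the best. Grip parameters and the criterion are hypothetical.
import random

def score_grip(finger_spread, force):
    """Pretend criterion: reward moderate spread and firm, gentle force.

    The score peaks at 0.0 for the (invented) ideal grip (0.5, 0.3)
    and goes negative the further a grip is from it.
    """
    return -(finger_spread - 0.5) ** 2 - (force - 0.3) ** 2

random.seed(0)
best_grip, best_score = None, float("-inf")
for _ in range(100):                      # try many random grips
    grip = (random.random(), random.random())
    s = score_grip(*grip)
    if s > best_score:                    # "praise": keep better grips
        best_grip, best_score = grip, s
```

A real learner would use the scores to bias future attempts toward promising grips rather than sampling blindly, but the praise-and-scold scoring loop is the core of the idea.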

