EPFL scientists have successfully tested new neuroprosthetic technology that combines robotic control with users’ voluntary control, opening avenues in the new interdisciplinary field of shared control for neuroprosthetic technologies.
EPFL scientists are developing new approaches for improved control of robotic hands – in particular for amputees – that combine individual finger control with automation for more robust grasping and manipulation. This interdisciplinary proof of concept, bridging neuroengineering and robotics, was successfully tested on three amputees and seven healthy subjects. The results are published in today’s issue of Nature Machine Intelligence.
The technology merges two concepts from different fields. Implementing them together for robotic hand control had never been done before, and the work contributes to the emerging field of shared control in neuroprosthetics.
One concept, from neuroengineering, involves deciphering intended finger movement from muscular activity on the amputee’s stump, enabling individual finger control of the prosthetic hand – something that had never been done before. The other, from robotics, allows the robotic hand to help take hold of objects and maintain contact with them for robust grasping.
“When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react,” explains Aude Billard who leads EPFL’s Learning Algorithms and Systems Laboratory. “The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.”
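The reflex described above – pressure sensors detect a slipping object and the hand tightens its grip before the user can react – can be sketched in a few lines. This is a toy illustration, not the authors' controller: the sensor values, slip threshold, and grip increment are all illustrative assumptions.

```python
# Illustrative slip reflex: if pressure on any fingertip sensor drops
# faster than a threshold between two samples, tighten the grip.
SLIP_RATE_THRESHOLD = -0.5  # pressure change per sample (assumed value)
GRIP_INCREMENT = 0.1        # fraction of maximum grip force (assumed value)

def detect_slip(prev_pressures, curr_pressures, threshold=SLIP_RATE_THRESHOLD):
    """Return True if any finger's pressure is falling fast enough to suggest slip."""
    return any(c - p < threshold for p, c in zip(prev_pressures, curr_pressures))

def stabilize(grip_force, slipping, increment=GRIP_INCREMENT):
    """Tighten the grip when slip is detected; otherwise hold the current force."""
    return min(1.0, grip_force + increment) if slipping else grip_force
```

In a real prosthesis this loop would run at a high rate on the hand's embedded controller, which is what lets it respond within the 400 milliseconds Billard describes.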
How shared control works
The algorithm first learns how to decode user intention and translates this into finger movement of the prosthetic hand. The amputee must perform a series of hand movements in order to train the algorithm that uses machine learning. Sensors placed on the amputee’s stump detect muscular activity, and the algorithm learns which hand movements correspond to which patterns of muscular activity. Once the user’s intended finger movements are understood, this information can be used to control individual fingers of the prosthetic hand.
“Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements,” says Katie Zhuang, first author of the publication.
Next, the scientists engineered the algorithm so that robotic automation kicks in when the user tries to grasp an object. The algorithm tells the prosthetic hand to close its fingers when an object is in contact with sensors on the surface of the prosthetic hand. This automatic grasping is an adaptation from a previous study for robotic arms designed to deduce the shape of objects and grasp them based on tactile information alone, without the help of visual signals.
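The hand-off described here – decoded user intent drives the fingers until a surface sensor reports contact, at which point automation closes the grasp – can be sketched as a simple arbitration rule. This is a hedged simplification of the shared-control idea, not the study's implementation; the command scale and sensor layout are assumptions.

```python
# Illustrative shared control: pass the user's decoded per-finger commands
# through until any contact sensor fires, then let automation close the hand.
def shared_control(user_finger_commands, contact_sensors, close_command=1.0):
    """user_finger_commands: decoded intent per finger, 0.0 (open) to 1.0 (closed).
    contact_sensors: booleans from the hand's surface sensors."""
    if any(contact_sensors):
        # Automation takes over: close every finger for a robust grasp.
        return [close_command] * len(user_finger_commands)
    # No contact yet: the user retains individual finger control.
    return list(user_finger_commands)
```

The design choice is that the machine never overrides the user until the object itself is touched, so voluntary control and automation operate at different phases of the grasp.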
Many engineering challenges remain before the algorithm can be implemented in a commercially available prosthetic hand for amputees. For now, the algorithm is still being tested on a robot provided by an external party.
“Our shared approach to control robotic hands could be used in several neuroprosthetic applications such as bionic hand prostheses and brain-to-machine interfaces, increasing the clinical impact and usability of these devices,” says Silvestro Micera, EPFL’s Bertarelli Foundation Chair in Translational Neuroengineering, and Professor of Bioelectronics at Scuola Superiore Sant’Anna in Italy.