Robots and prosthetic devices may soon have a sense of touch equivalent to, or better than, the human skin with the Asynchronous Coded Electronic Skin (ACES), an artificial nervous system developed by a team of NUS researchers.
The new electronic skin system has ultra-high responsiveness and robustness to damage, and can be paired with any kind of sensor skin layers to function effectively as an electronic skin.
The innovation, achieved by Assistant Professor Benjamin Tee and his team from NUS Materials Science and Engineering, was first reported in the prestigious scientific journal Science Robotics on 18 July 2019.
Faster than the human sensory nervous system
“Humans use our sense of touch to accomplish almost every daily task, such as picking up a cup of coffee or shaking hands. Without it, we would even lose our sense of balance when walking. Similarly, robots need to have a sense of touch in order to interact better with humans, but robots today still cannot feel objects very well,” explained Asst Prof Tee, who has been working on electronic skin technologies for over a decade in hopes of giving robots and prosthetic devices a better sense of touch.
Drawing inspiration from the human sensory nervous system, the NUS team spent a year and a half developing a sensor system that could potentially perform better. While the ACES electronic nervous system detects signals much as the human sensory nervous system does, it is made up of a network of sensors connected via a single electrical conductor, unlike the nerve bundles in the human skin. It also differs from existing electronic skins, whose interlinked wiring systems can make them sensitive to damage and difficult to scale up.
Elaborating on the inspiration, Asst Prof Tee, who also holds appointments in the NUS Electrical and Computer Engineering, NUS Institute for Health Innovation & Technology, N.1 Institute for Health and the Hybrid Integrated Flexible Electronic Systems programme, said, “The human sensory nervous system is extremely efficient, and it works all the time to the extent that we often take it for granted. It is also very robust to damage. Our sense of touch, for example, does not get affected when we suffer a cut. If we can mimic how our biological system works and make it even better, we can bring about tremendous advancements in the field of robotics where electronic skins are predominantly applied.”
ACES can detect touches more than 1,000 times faster than the human sensory nervous system. For example, it is capable of differentiating physical contact between different sensors in less than 60 nanoseconds — the fastest ever achieved for an electronic skin technology — even with large numbers of sensors. ACES-enabled skin can also accurately identify the shape, texture and hardness of objects within 10 milliseconds, ten times faster than the blinking of an eye. This is enabled by the high fidelity and capture speed of the ACES system.
The ACES platform can also be designed to achieve high robustness to physical damage, an important property for electronic skins because they frequently come into physical contact with the environment. Unlike the interconnection schemes used in existing electronic skins, ACES connects all of its sensors to a common electrical conductor, with each sensor operating independently. This allows ACES-enabled electronic skins to continue functioning as long as at least one connection remains between a sensor and the conductor, making them less vulnerable to damage.
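The idea of many independent sensors sharing one conductor can be illustrated in simplified code. The sketch below is purely hypothetical and is not the actual ACES implementation: it assumes each sensor is assigned a unique pseudo-random pulse signature that it emits onto the shared line when touched, and a receiver identifies which sensors fired by correlating the combined line signal against every known signature. Because each sensor transmits independently, losing any one sensor does not affect the others.

```python
import random

SIG_LEN = 64       # pulses per signature (illustrative value)
NUM_SENSORS = 100  # sensors sharing a single conductor

rng = random.Random(42)

# Assign each sensor a sparse pseudo-random pulse signature (0/1 train).
signatures = [[1 if rng.random() < 0.2 else 0 for _ in range(SIG_LEN)]
              for _ in range(NUM_SENSORS)]

def line_signal(active_sensors):
    """Superpose the signatures of all currently touched sensors,
    as they would sum on the single shared conductor."""
    signal = [0] * SIG_LEN
    for s in active_sensors:
        for t in range(SIG_LEN):
            signal[t] += signatures[s][t]
    return signal

def decode(signal, threshold=0.9):
    """Correlate the line signal with each known signature; a
    near-complete match means that sensor likely fired."""
    hits = []
    for s, pattern in enumerate(signatures):
        pulse_slots = [t for t in range(SIG_LEN) if pattern[t] == 1]
        matched = sum(1 for t in pulse_slots if signal[t] > 0)
        if pulse_slots and matched / len(pulse_slots) >= threshold:
            hits.append(s)
    return hits

# Three sensors are touched simultaneously; the receiver recovers them
# from the single combined signal on the shared conductor.
touched = [3, 41, 77]
decoded = decode(line_signal(touched))
```

Note that in this toy model a dense superposition of many simultaneous touches could cause occasional misidentification; the real system's nanosecond-scale asynchronous pulses make such collisions far less likely than this simplification suggests.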
Smart electronic skins for robots and prosthetics
ACES has a simple wiring system and remarkable responsiveness even with increasing numbers of sensors. These key characteristics will facilitate the scale-up of intelligent electronic skins for Artificial Intelligence (AI) applications in robots, prosthetic devices and other human machine interfaces.
“Scalability is a critical consideration as big pieces of high performing electronic skins are required to cover the relatively large surface areas of robots and prosthetic devices,” explained Asst Prof Tee. “ACES can be easily paired with any kind of sensor skin layers, for example, those designed to sense temperatures and humidity, to create high performance ACES-enabled electronic skin with an exceptional sense of touch that can be used for a wide range of purposes,” he added.
For instance, pairing ACES with the transparent, self-healing and water-resistant sensor skin layer also recently developed by Asst Prof Tee’s team creates an electronic skin that can self-repair, like the human skin. This type of electronic skin can be used to develop more realistic prosthetic limbs that will help disabled individuals restore their sense of touch.
Other potential applications include developing more intelligent robots that can perform disaster recovery tasks or take over mundane operations such as packing of items in warehouses. The NUS team is therefore looking to further apply the ACES platform on advanced robots and prosthetic devices in the next phase of their research.