The ACES developed by Asst Prof Tee (far left) and his team responds 1,000 times faster than the human sensory nervous system
Robots and prosthetic devices may soon have a sense of touch equivalent to, or better than, human skin with the Asynchronous Coded Electronic Skin (ACES), an artificial nervous system developed by a team of NUS researchers.
The new electronic skin system has ultra-high responsiveness and robustness to damage, and can be paired with any kind of sensor skin layers to function effectively as an electronic skin.
The innovation, achieved by Assistant Professor Benjamin Tee and his team from NUS Materials Science and Engineering, was first reported in the prestigious scientific journal Science Robotics on 18 July 2019.
Faster than the human sensory nervous system
“Humans use our sense of touch to accomplish almost every daily task, such as picking up a cup of coffee or making a handshake. Without it, we will even lose our sense of balance when walking. Similarly, robots need to have a sense of touch in order to interact better with humans, but robots today still cannot feel objects very well,” explained Asst Prof Tee, who has been working on electronic skin technologies for over a decade in hopes of giving robots and prosthetic devices a better sense of touch.
Drawing inspiration from the human sensory nervous system, the NUS team spent a year and a half developing a sensor system that could potentially perform better. While the ACES electronic nervous system detects signals like the human sensory nervous system does, it is made up of a network of sensors connected via a single electrical conductor, unlike the nerve bundles in human skin. It also differs from existing electronic skins, whose interlinked wiring systems can make them sensitive to damage and difficult to scale up.
Elaborating on the inspiration, Asst Prof Tee, who also holds appointments in NUS Electrical and Computer Engineering, the NUS Institute for Health Innovation & Technology, the N.1 Institute for Health and the Hybrid Integrated Flexible Electronic Systems programme, said, “The human sensory nervous system is extremely efficient, and it works all the time to the extent that we often take it for granted. It is also very robust to damage. Our sense of touch, for example, does not get affected when we suffer a cut. If we can mimic how our biological system works and make it even better, we can bring about tremendous advancements in the field of robotics where electronic skins are predominantly applied.”
ACES can detect touches more than 1,000 times faster than the human sensory nervous system. For example, it is capable of differentiating physical contact between different sensors in less than 60 nanoseconds — the fastest ever achieved for an electronic skin technology — even with large numbers of sensors. ACES-enabled skin can also accurately identify the shape, texture and hardness of objects within 10 milliseconds, ten times faster than the blinking of an eye. This is enabled by the high fidelity and capture speed of the ACES system.
The ACES platform can also be designed to achieve high robustness to physical damage, an important property for electronic skins because they come into frequent physical contact with the environment. Unlike the wiring used to interconnect sensors in existing electronic skins, all the sensors in ACES can be connected to a common electrical conductor, with each sensor operating independently. This allows ACES-enabled electronic skins to continue functioning as long as there is one connection between a sensor and the conductor, making them less vulnerable to damage.
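One way to picture this shared-conductor design is as an event code: each sensor might, for example, stamp its transmissions with a unique pulse signature so that a receiver can tell overlapping touches apart even though they travel on the same wire. The short Python sketch below illustrates that idea only in principle; the signature scheme, the correlation decoder and all parameters are illustrative assumptions rather than details of the actual ACES system.

```python
# Toy sketch (not the NUS implementation): every sensor shares one conductor and
# announces touches with its own pulse signature; a decoder separates overlapping
# transmissions. Sensor count, signature length and the threshold are assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_SENSORS = 8   # assumed number of sensors wired to the single shared conductor
SIG_LEN = 64    # assumed length of each sensor's unique pulse signature

# Assign each sensor a pseudo-random +/-1 pulse signature.
signatures = rng.choice([-1.0, 1.0], size=(N_SENSORS, SIG_LEN))

def emit(line, sensor_id, t):
    """Superimpose a sensor's signature onto the shared line starting at sample t."""
    line[t:t + SIG_LEN] += signatures[sensor_id]

# Two sensors fire at overlapping times on the same conductor.
line = np.zeros(1024)
emit(line, 2, 100)
emit(line, 5, 120)

def decode(line, threshold=0.8):
    """Report (sensor, time) pairs whose signature correlates strongly with the line."""
    hits = []
    for sensor_id, sig in enumerate(signatures):
        corr = np.correlate(line, sig, mode="valid") / SIG_LEN
        peak = int(np.argmax(corr))
        if corr[peak] >= threshold:
            hits.append((sensor_id, peak))
    return hits

print(decode(line))  # typically [(2, 100), (5, 120)]: both touches recovered
```

Because each sensor's signature can be recognised on its own, losing any single sensor's connection leaves the decoding of the others untouched, which mirrors the robustness property described above.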

Smart electronic skins for robots and prosthetics
ACES has a simple wiring system and remarkable responsiveness even with increasing numbers of sensors. These key characteristics will facilitate the scale-up of intelligent electronic skins for Artificial Intelligence (AI) applications in robots, prosthetic devices and other human-machine interfaces.
“Scalability is a critical consideration as big pieces of high performing electronic skins are required to cover the relatively large surface areas of robots and prosthetic devices,” explained Asst Prof Tee. “ACES can be easily paired with any kind of sensor skin layers, for example, those designed to sense temperatures and humidity, to create high performance ACES-enabled electronic skin with an exceptional sense of touch that can be used for a wide range of purposes,” he added.
For instance, pairing ACES with the transparent, self-healing and water-resistant sensor skin layer also recently developed by Asst Prof Tee’s team creates an electronic skin that can self-repair, like human skin. This type of electronic skin can be used to develop more realistic prosthetic limbs that will help disabled individuals restore their sense of touch.
Other potential applications include developing more intelligent robots that can perform disaster recovery tasks or take over mundane operations such as packing items in warehouses. The NUS team is therefore looking to apply the ACES platform to advanced robots and prosthetic devices in the next phase of their research.
Learn more: Exceptional sense of touch for robots, prosthetics