Fig. 1
Optical microscopy images of the 3D polymer wiring between a top electrode (TE) and three bottom electrodes (BEs), shown at vertical distances of z = 0 and 100 µm from the surface of the glass substrate.
Credit: 2023 Naruki Hagiwara et al., Advanced Functional Materials
Researchers from Japan have developed a technique for growing conductive polymer wires between electrodes, aiming to realize artificial neural networks that overcome the limits of traditional computer hardware.
The development of neural networks to create artificial intelligence in computers was originally inspired by how biological systems work. These ‘neuromorphic’ networks, however, run on hardware that looks nothing like a biological brain, which limits performance. Now, researchers from Osaka University and Hokkaido University plan to change this by creating neuromorphic ‘wetware’.
While neural-network models have achieved remarkable success in applications such as image generation and cancer diagnosis, they still lag far behind the general processing abilities of the human brain. In part, this is because they are implemented in software using traditional computer hardware that is not optimized for the millions of parameters and connections that these models typically require.
Neuromorphic wetware, based on memristive devices, could address this problem. A memristive device is one whose resistance depends on its history of applied voltage and current. In this approach, electropolymerization is used to link electrodes immersed in a precursor solution with wires made of conductive polymer. The resistance of each wire is then tuned using small voltage pulses, resulting in a memristive device.
“The potential to create fast and energy-efficient networks has been shown using 1D or 2D structures,” says senior author Megumi Akai-Kasaya. “Our aim was to extend this approach to the construction of a 3D network.”
The researchers were able to grow polymer wires from a common polymer mixture called ‘PEDOT:PSS’, which is highly conductive, transparent, flexible, and stable. A 3D structure of top and bottom electrodes was first immersed in a precursor solution. The PEDOT:PSS wires were then grown between selected electrodes by applying a square-wave voltage on these electrodes, mimicking the formation of synaptic connections through axon guidance in an immature brain.
Once a wire was formed, its characteristics, especially its conductance, were controlled using small voltage pulses applied to one electrode, which alter the electrical properties of the film surrounding the wire.
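The memristive behavior described above can be illustrated with a toy model (this is a hypothetical sketch for intuition, not the authors' device physics): each small voltage pulse nudges a wire's conductance up or down, continuously and reversibly, within physical bounds.

```python
# Toy memristive "wire": conductance is set by the history of applied
# voltage pulses. Illustrative only -- not the paper's device model.

class PolymerWire:
    def __init__(self, g=0.5, g_min=0.0, g_max=1.0, rate=0.05):
        self.g = g                          # conductance (arbitrary units)
        self.g_min, self.g_max = g_min, g_max
        self.rate = rate                    # sensitivity to each pulse

    def pulse(self, voltage):
        """A positive pulse raises conductance; a negative pulse lowers it."""
        self.g += self.rate * voltage
        # Clamp to physical limits of the wire.
        self.g = max(self.g_min, min(self.g_max, self.g))
        return self.g

wire = PolymerWire()
for _ in range(5):
    wire.pulse(+1.0)                        # potentiation
print(round(wire.g, 2))                     # 0.75
for _ in range(3):
    wire.pulse(-1.0)                        # depression: the change is reversible
print(round(wire.g, 2))                     # 0.6
```

The key property mirrored here is that the state (conductance) is a function of pulse history rather than of the instantaneous voltage, which is what makes such a device trainable.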
“The process is continuous and reversible,” explains lead author Naruki Hagiwara, “and this characteristic is what enables the network to be trained, just like software-based neural networks.”
The fabricated network was used to demonstrate unsupervised Hebbian learning (i.e., connections between neurons that frequently fire together are strengthened over time). What's more, the researchers were able to precisely control the conductance values of the wires so that the network could complete its tasks. Spike-based learning, another approach to neural networks that more closely mimics the processes of biological neural networks, was also demonstrated by controlling the diameter and conductivity of the wires.
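The Hebbian rule mentioned above can be sketched in a few lines (a minimal, generic formulation, Δw = η·x·y, not the paper's training scheme): weights on inputs that are repeatedly active together with the output grow, while weights on silent inputs stay put.

```python
# Minimal Hebbian update applied to an array of "wire" conductances.
# Illustrative only; the clamp stands in for a wire's physical limits.
import numpy as np

def hebbian_step(w, x, eta=0.1, w_max=1.0):
    """Strengthen each weight in proportion to correlated pre/post activity."""
    y = float(np.dot(w, x))           # post-synaptic activity
    w = w + eta * x * y               # "fire together, wire together"
    return np.clip(w, 0.0, w_max)     # conductance stays within bounds

w = np.array([0.2, 0.2, 0.2])
x = np.array([1.0, 1.0, 0.0])         # inputs 0 and 1 fire together; input 2 is silent
for _ in range(20):
    w = hebbian_step(w, x)
print(w)  # weights of the co-active inputs have grown; the silent one is unchanged
```

Note the rule is unsupervised: no target output is ever provided, only the correlation between input and output activity drives the change, which is what the conductance tuning of the polymer wires emulates.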
Next, by fabricating a chip with a larger number of electrodes and using microfluidic channels to supply the precursor solution to each electrode, the researchers hope to build a larger and more powerful network. Overall, the approach demonstrated in this study is a big step toward the realization of neuromorphic wetware and toward closing the gap between the cognitive abilities of humans and computers.
Original Article: Growing bio-inspired polymer brains for artificial neural networks
More from: Osaka University | Hokkaido University