Sander Bohté (photo: Inge Hoogland)
Thanks to a mathematical breakthrough, AI applications such as speech recognition, gesture recognition and ECG classification can become a hundred to a thousand times more energy efficient. As a result, far more elaborate AI will fit on a chip, allowing applications that previously had to run in the cloud to run on a smartphone or smartwatch.
Running the AI on local devices makes applications both more robust and more privacy-friendly: more robust, because a network connection to the cloud is no longer necessary, and more privacy-friendly, because data can be stored and processed locally.
The mathematical breakthrough was achieved by researchers at Centrum Wiskunde & Informatica (CWI), the Dutch national research center for mathematics and computer science, together with the IMEC/Holst Research Centre in Eindhoven. The results were published in a paper by Bojian Yin, Federico Corradi, and Sander M. Bohté at the International Conference on Neuromorphic Systems. The underlying mathematical algorithms have been released as open source.
Under the supervision of Sander Bohté, CWI researcher and professor of cognitive neurobiology at the University of Amsterdam (UvA), the team developed a learning algorithm for so-called spiking neural networks. Such networks have existed for some time, but they are very difficult to handle mathematically, which has so far made them hard to put into practice. The new algorithm is groundbreaking in two ways: the neurons in the network need to communicate far less frequently, and each individual neuron has to perform fewer calculations.
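To make the idea of sparse communication concrete, here is a minimal sketch (not the authors' published code) of a leaky integrate-and-fire neuron, the standard model unit in spiking networks: the membrane potential leaks toward rest, integrates its input, and the neuron emits a binary spike only when the potential crosses a threshold, so messages are sent only occasionally. The parameter values are illustrative assumptions.

```python
def simulate_lif(inputs, tau=10.0, threshold=1.0, dt=1.0):
    """Return the binary spike train produced by an input current trace."""
    v = 0.0                 # membrane potential
    decay = dt / tau        # fraction of potential that leaks away per step
    spikes = []
    for i in inputs:
        v = v * (1.0 - decay) + i * dt   # leaky integration of the input
        if v >= threshold:
            spikes.append(1)             # emit a spike: a short pulse
            v = 0.0                      # reset the potential after spiking
        else:
            spikes.append(0)             # stay silent: no communication
    return spikes

# A constant weak input makes the neuron fire only now and then,
# in contrast with a classical neuron, which outputs a value every step.
spikes = simulate_lif([0.15] * 50)
print(sum(spikes), "spikes over", len(spikes), "timesteps")
```

Because most timesteps produce no spike, a chip running such a network only spends energy on the rare steps where a pulse is actually sent.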
“The combination of these two breakthroughs makes AI algorithms a thousand times more energy efficient than standard neural networks, and a factor of a hundred more energy efficient than current state-of-the-art neural networks,” says principal investigator Sander Bohté.
Inspired by the human brain
Bohté’s inspiration and motivation come from the incredibly energy-efficient way the human brain processes information, using only about 20 watts. Computers that mimic the brain’s neuronal networks have produced wonderful applications in recent years – ranging from image recognition, speech recognition and automatic translation to medical diagnoses – but require up to a million times more energy than the human brain.
The spiking neural networks developed by Bohté and his research team differ from those already integrated in AI applications. “Communication between neurons in classical neural networks is continuous and easy to handle mathematically. Spiking neurons behave more like those in the human brain and communicate only sparingly, with short pulses. This, however, means the signals are discontinuous and much more difficult to handle mathematically.”
New type of computer chip
To run spiking neural networks efficiently in the real world, a new type of chip is needed. Bohté says prototypes are already being developed. “All kinds of companies are working hard to make this happen, like our project partner IMEC/Holst Centre.”
Bohté’s methods can train spiking neural networks comprising up to a few thousand neurons, fewer than typical classical neural networks but sufficient for many applications such as speech recognition, ECG classification and gesture recognition. The next challenge is to scale these networks up to 100,000 or a million neurons, which would broaden the range of possible applications even further.