The NeuRRAM chip is not only twice as energy efficient as state-of-the-art, it’s also versatile and delivers results that are just as accurate as conventional digital chips.
(Image credit: David Baillot/University of California San Diego.)
Stanford engineers created a more efficient and flexible AI chip, which could bring the power of AI into tiny edge devices
AI-powered edge computing is already pervasive in our lives. Devices like drones, smart wearables, and industrial IoT sensors are equipped with AI-enabled chips so that computing can occur at the “edge” of the internet, where the data originates. This allows real-time processing and helps keep sensitive data private.
However, AI functionalities on these tiny edge devices are limited by the energy provided by a battery. Therefore, improving energy efficiency is crucial. In today’s AI chips, data processing and data storage happen at separate places – a compute unit and a memory unit. The frequent data movement between these units consumes most of the energy during AI processing, so reducing the data movement is the key to addressing the energy issue.
Stanford University engineers have come up with a potential solution: a novel resistive random-access memory (RRAM) chip that does the AI processing within the memory itself, thereby eliminating the separation between the compute and memory units. Their “compute-in-memory” (CIM) chip, called NeuRRAM, is about the size of a fingertip and does more work on limited battery power than current chips can.
“Having those calculations done on the chip instead of sending information to and from the cloud could enable faster, more secure, cheaper, and more scalable AI going into the future, and give more people access to AI power,” said H.-S. Philip Wong, the Willard R. and Inez Kerr Bell Professor in the School of Engineering.
“The data movement issue is similar to spending eight hours in commute for a two-hour workday,” added Weier Wan, a recent Stanford graduate who led the project. “With our chip, we are showing a technology to tackle this challenge.”
They presented NeuRRAM in a recent article in the journal Nature. While compute-in-memory has been around for decades, this chip is the first to actually demonstrate a broad range of AI applications on hardware, rather than through simulation alone.
Putting computing power on the device
To overcome the data movement bottleneck, the researchers implemented compute-in-memory (CIM), a chip architecture that performs AI computation directly within memory rather than in separate compute units. The memory technology NeuRRAM uses is resistive random-access memory (RRAM), a type of non-volatile memory – memory that retains data even when power is off – that has begun to appear in commercial products. RRAM can store large AI models in a small area footprint and consumes very little power, making it well suited to small, low-power edge devices.
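The arithmetic behind RRAM compute-in-memory can be sketched in a few lines: each neural-network weight is stored as a cell conductance in a crossbar array, input activations are applied as voltages on the rows, and the physics of the array (Ohm’s law per cell, Kirchhoff’s current law per column) sums the products into output currents – a matrix-vector multiply without moving the weights at all. A minimal numerical model, with illustrative values that are in no way the actual NeuRRAM design:

```python
import numpy as np

# Hypothetical 4x3 RRAM crossbar: each cell stores a weight as a
# conductance in siemens. Values are illustrative only.
G = np.array([
    [1e-6, 5e-6, 2e-6],
    [3e-6, 1e-6, 4e-6],
    [2e-6, 2e-6, 1e-6],
    [5e-6, 3e-6, 2e-6],
])

# Input activations applied as row voltages (volts).
V = np.array([0.2, 0.5, 0.1, 0.3])

# Ohm's law gives each cell's current (I = G * V); Kirchhoff's current
# law sums the currents down each column. The result is the
# matrix-vector product, computed in one analog step:
I = V @ G  # one summed output current per column, in amps

print(I)
```

In a conventional chip, every one of those multiply-accumulate operations would require fetching a weight from a separate memory unit; in the crossbar, the weights never move, which is where the energy savings come from.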
Even though the concept of CIM chips is well established, and the idea of implementing AI computing in RRAM isn’t new, “this is one of the first instances to integrate a lot of memory right onto the neural network chip and present all benchmark results through hardware measurements,” said Wong, who is a co-senior author of the Nature paper.
The architecture of NeuRRAM allows the chip to perform analog in-memory computation at low power and in a compact-area footprint. It was designed in collaboration with the lab of Gert Cauwenberghs at the University of California, San Diego, who pioneered low-power neuromorphic hardware design. The architecture also enables reconfigurability in dataflow directions, supports various AI workload mapping strategies, and can work with different kinds of AI algorithms – all without sacrificing AI computation accuracy.
To show the accuracy of NeuRRAM’s AI abilities, the team tested how it performed on a range of tasks. The chip was 99% accurate at handwritten-digit recognition on the MNIST dataset, 85.7% accurate at image classification on the CIFAR-10 dataset, and 84.7% accurate at Google speech command recognition, and it achieved a 70% reduction in image-reconstruction error on a Bayesian image recovery task.
“Efficiency, versatility, and accuracy are all important aspects for broader adoption of the technology,” said Wan. “But to realize them all at once is not simple. Co-optimizing the full stack from hardware to software is the key.”
“Such full-stack co-design is made possible with an international team of researchers with diverse expertise,” added Wong.
Fueling edge computations of the future
Right now, NeuRRAM is a physical proof-of-concept but needs more development before it’s ready to be translated into actual edge devices.
But this combined efficiency, accuracy, and ability to do different tasks showcases the chip’s potential. “Maybe today it is used to do simple AI tasks such as keyword spotting or human detection, but tomorrow it could enable a whole different user experience. Imagine real-time video analytics combined with speech recognition all within a tiny device,” said Wan. “To realize this, we need to continue improving the design and scaling RRAM to more advanced technology nodes.”
“This work opens up several avenues of future research on RRAM device engineering, and programming models and neural network design for compute-in-memory, to make this technology scalable and usable by software developers,” said Priyanka Raina, assistant professor of electrical engineering and a co-author of the paper.
If successful, RRAM compute-in-memory chips like NeuRRAM have almost unlimited potential. They could be embedded in crop fields to do real-time AI calculations for adjusting irrigation systems to current soil conditions. Or they could turn augmented reality glasses from clunky headsets with limited functionality to something more akin to Tony Stark’s viewscreen in the Iron Man and Avengers movies (without intergalactic or multiverse threats – one can hope).
If mass produced, these chips would be cheap enough, adaptable enough, and low power enough that they could be used to advance technologies already improving our lives, said Wong, like in medical devices that allow home health monitoring.
They can be used to solve global societal challenges as well: AI-enabled sensors would play a role in tracking and addressing climate change. “By having these kinds of smart electronics that can be placed almost anywhere, you can monitor the changing world and be part of the solution,” Wong said. “These chips could be used to solve all kinds of problems from climate change to food security.”