Chip-architecture breakthrough accelerates path to exascale computing; helps computers tackle complex, cognitive tasks such as pattern recognition and sensory processing
Lawrence Livermore National Laboratory (LLNL) today announced it will receive a first-of-a-kind brain-inspired supercomputing platform for deep learning developed by IBM Research. Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a hearing aid battery – a mere 2.5 watts of power.
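The quoted platform figures are consistent with an array of 16 TrueNorth chips, given the per-chip numbers stated later in this article (1 million neurons and 256 million synapses per chip). A quick back-of-envelope check, using only figures from the article:

```python
# Per-chip figures quoted for IBM TrueNorth in this article.
CHIP_NEURONS = 1_000_000
CHIP_SYNAPSES = 256_000_000

# Platform-level figure quoted for the LLNL system.
PLATFORM_NEURONS = 16_000_000

chips = PLATFORM_NEURONS // CHIP_NEURONS
platform_synapses = chips * CHIP_SYNAPSES

print(chips)              # 16 chips
print(platform_synapses)  # 4,096,000,000 -- the "4 billion synapses" quoted above
```

Note that 16 × 256 million is 4.096 billion, so the article's "4 billion synapses" is a rounded figure.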
The brain-like, neural network design of the IBM Neuromorphic System is able to infer complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips.
The new system will be used to explore new computing capabilities important to the National Nuclear Security Administration’s (NNSA) missions in cybersecurity, stewardship of the nation’s nuclear weapons stockpile and nonproliferation. NNSA’s Advanced Simulation and Computing (ASC) program will evaluate machine-learning applications, deep-learning algorithms and architectures and conduct general computing feasibility studies. ASC is a cornerstone of NNSA’s Stockpile Stewardship Program to ensure the safety, security and reliability of the nation’s nuclear deterrent without underground testing.
“Neuromorphic computing opens very exciting new possibilities and is consistent with what we see as the future of the high performance computing and simulation at the heart of our national security missions,” said Jim Brase, LLNL deputy associate director for Data Science. “The potential capabilities neuromorphic computing represents and the machine intelligence that these will enable will change how we do science.”
The technology represents a fundamental departure from the computer design that has been prevalent for the past 70 years, and could be a powerful complement in the development of next-generation supercomputers able to perform at exascale speeds, roughly 50 times faster than today’s most advanced petaflop (quadrillion floating point operations per second) systems. Like the human brain, neurosynaptic systems require significantly less electrical power and volume than conventional architectures.
“The low power consumption of these brain-inspired processors reflects industry’s desire and a creative approach to reducing power consumption in all components for future systems as we set our sights on exascale computing,” said Michel McCoy, LLNL program director for Weapon Simulation and Computing.
“The delivery of this advanced computing platform represents a major milestone as we enter the next era of cognitive computing,” said Dharmendra Modha, IBM fellow and chief scientist of Brain-inspired Computing, IBM Research. “We value our partnerships with the national labs. In fact, prior to design and fabrication, we simulated the IBM TrueNorth processor using LLNL’s Sequoia supercomputer. This collaboration will push the boundaries of brain-inspired computing to enable future systems that deliver unprecedented capability and throughput, while minimizing the capital, operating and programming costs – keeping our nation at the leading edge of science and technology.”
A single TrueNorth processor consists of 5.4 billion transistors wired together to create an array of 1 million digital neurons that communicate with one another via 256 million electrical synapses. Running in real time, it consumes 70 milliwatts of power while delivering 46 billion synaptic operations per second – orders of magnitude less energy than a conventional computer running inference on the same neural network. TrueNorth was originally developed under the auspices of the Defense Advanced Research Projects Agency’s (DARPA) Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program, in collaboration with Cornell University.
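The chip-level figures above imply an energy cost per synaptic operation, which is what makes the efficiency claim concrete. A minimal sketch of that arithmetic, using only the numbers quoted in this article:

```python
# Figures quoted above for a single TrueNorth chip.
POWER_W = 0.070  # 70 milliwatts, running in real time
SYNAPTIC_OPS_PER_SEC = 46e9  # 46 billion synaptic operations per second

# Energy per synaptic operation = power / throughput.
energy_per_op_joules = POWER_W / SYNAPTIC_OPS_PER_SEC

print(f"{energy_per_op_joules * 1e12:.2f} pJ per synaptic operation")
# -> roughly 1.5 picojoules per synaptic operation
```

At about 1.5 pJ per operation, the chip sits several orders of magnitude below the energy a conventional processor typically spends per equivalent operation, which is the basis of the comparison in the text.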