Chip-architecture breakthrough accelerates path to exascale computing; helps computers tackle complex, cognitive tasks such as pattern recognition and sensory processing
Lawrence Livermore National Laboratory (LLNL) today announced it will receive a first-of-a-kind brain-inspired supercomputing platform for deep learning developed by IBM Research. Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a hearing aid battery – a mere 2.5 watts of power.
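The platform-level figures above line up with the per-chip TrueNorth specifications cited later in the article (1 million neurons and 256 million synapses per processor). A quick back-of-the-envelope check, assuming the platform simply aggregates whole chips, shows the implied chip count and per-chip power budget:

```python
# Scaling arithmetic implied by the article's figures.
# Per-chip numbers come from the TrueNorth description later in the piece.
NEURONS_PER_CHIP = 1_000_000
SYNAPSES_PER_CHIP = 256_000_000

platform_neurons = 16_000_000               # "16 million neurons"
chips = platform_neurons // NEURONS_PER_CHIP
platform_synapses = chips * SYNAPSES_PER_CHIP

print(chips)                                 # 16 chips
print(platform_synapses / 1e9)               # ~4.1 billion synapses ("4 billion")
print(2.5 / chips * 1000)                    # ~156 mW per chip within the 2.5 W budget
```

The 16-chip total recovers the stated 4 billion synapses almost exactly (16 × 256 million = 4.096 billion), which suggests the headline numbers are straight multiples of the single-chip design.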
The brain-like, neural network design of the IBM Neuromorphic System can perform complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips.
The new system will be used to explore new computing capabilities important to the National Nuclear Security Administration’s (NNSA) missions in cybersecurity, stewardship of the nation’s nuclear weapons stockpile and nonproliferation. NNSA’s Advanced Simulation and Computing (ASC) program will evaluate machine-learning applications, deep-learning algorithms and architectures and conduct general computing feasibility studies. ASC is a cornerstone of NNSA’s Stockpile Stewardship Program to ensure the safety, security and reliability of the nation’s nuclear deterrent without underground testing.
“Neuromorphic computing opens very exciting new possibilities and is consistent with what we see as the future of high-performance computing and simulation at the heart of our national security missions,” said Jim Brase, LLNL deputy associate director for Data Science. “The potential capabilities neuromorphic computing represents, and the machine intelligence it will enable, will change how we do science.”
The technology represents a fundamental departure from the computer design that has prevailed for the past 70 years, and could be a powerful complement in the development of next-generation supercomputers able to perform at exascale speeds, roughly 50 times faster than today’s most advanced petaflop-class systems (a petaflop is a quadrillion floating-point operations per second). Like the human brain, neurosynaptic systems require significantly less electrical power and volume.
“The low power consumption of these brain-inspired processors reflects industry’s desire and a creative approach to reducing power consumption in all components for future systems as we set our sights on exascale computing,” said Michel McCoy, LLNL program director for Weapon Simulation and Computing.
“The delivery of this advanced computing platform represents a major milestone as we enter the next era of cognitive computing,” said Dharmendra Modha, IBM fellow and chief scientist of Brain-inspired Computing, IBM Research. “We value our partnerships with the national labs. In fact, prior to design and fabrication, we simulated the IBM TrueNorth processor using LLNL’s Sequoia supercomputer. This collaboration will push the boundaries of brain-inspired computing to enable future systems that deliver unprecedented capability and throughput, while minimizing the capital, operating and programming costs – keeping our nation at the leading edge of science and technology.”
A single TrueNorth processor consists of 5.4 billion transistors wired together to create an array of 1 million digital neurons that communicate with one another via 256 million electrical synapses. It consumes 70 milliwatts of power running in real time and delivers 46 billion synaptic operations per second, using orders of magnitude less energy than a conventional computer running inference on the same neural network. TrueNorth was originally developed under the auspices of the Defense Advanced Research Projects Agency’s (DARPA) Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program, in collaboration with Cornell University.
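The per-chip figures above imply a striking energy cost per operation. A rough calculation, taking the stated 70 milliwatts and 46 billion synaptic operations per second at face value:

```python
# Rough energy-per-operation estimate from the stated TrueNorth figures.
power_w = 70e-3          # 70 milliwatts of power in real time
ops_per_s = 46e9         # 46 billion synaptic operations per second

energy_per_op_j = power_w / ops_per_s
print(energy_per_op_j)   # ~1.5e-12 J, i.e. about 1.5 picojoules per synaptic operation
```

At roughly 1.5 picojoules per synaptic operation, the chip operates several orders of magnitude below the per-operation energy of a conventional processor running the same workload, consistent with the article's efficiency claim.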