While computers have become smaller and more powerful, and supercomputers and parallel computing have become standard practice, computing is about to hit a wall in energy consumption and miniaturization. Now, Penn State researchers have designed a 2D device that can provide more than yes-or-no answers and could be more brainlike than current computing architectures.
“Complexity scaling is also in decline owing to the non-scalability of traditional von Neumann computing architecture and the impending ‘Dark Silicon’ era that presents a severe threat to multi-core processor technology,” the researchers note in today’s (Sept 13) online issue of Nature Communications.
The Dark Silicon era, already upon us to some extent, refers to the inability to power up all, or even most, of the devices on a computer chip at once, because doing so would generate too much heat. Von Neumann architecture is the standard structure of most modern computers and relies on a digital approach — "yes" or "no" answers — in which program instructions and data are stored in the same memory and share the same communications channel.
“Because of this, data operations and instruction acquisition cannot be done at the same time,” said Saptarshi Das, assistant professor of engineering science and mechanics. “For complex decision-making using neural networks, you might need a cluster of supercomputers trying to use parallel processors at the same time — a million laptops in parallel — that would take up a football field. Portable healthcare devices, for example, can’t work that way.”
The solution, according to Das, is to create brain-inspired, analog, statistical neural networks that do not rely on devices that are simply on or off, but provide a range of probabilistic responses that are then compared with the learned database in the machine. To do this, the researchers developed a Gaussian field-effect transistor that is made of 2D materials — molybdenum disulfide and black phosphorus. These devices are more energy efficient and produce less heat, which makes them ideal for scaling up systems.
“The human brain operates seamlessly on 20 watts of power,” said Das. “It is more energy efficient, containing 100 billion neurons, and it doesn’t use von Neumann architecture.”
The researchers note that it isn't just energy and heat that have become problems; it is also becoming difficult to fit more devices into ever smaller spaces.
“Size scaling has stopped,” said Das. “We can only fit approximately 1 billion transistors on a chip. We need more complexity like the brain.”
The idea of probabilistic neural networks has been around since the 1980s, but until now it lacked suitable devices for implementation.
“Similar to the working of a human brain, key features are extracted from a set of training samples to help the neural network learn,” said Amritanand Sebastian, graduate student in engineering science and mechanics.
The researchers tested their neural network on human electroencephalograms (EEGs), graphical representations of brain waves. After being fed many example EEGs, the network could analyze a new EEG signal and determine whether the subject was sleeping.
“We don’t need as extensive a training period or base of information for a probabilistic neural network as we need for an artificial neural network,” said Das.
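In the classical formulation of a probabilistic neural network, there is no lengthy iterative training at all: each training sample simply contributes a Gaussian kernel, and a new input is assigned to whichever class accumulates the highest summed kernel response. A minimal NumPy sketch of that idea follows; the two-dimensional "awake"/"asleep" feature vectors here are synthetic toy data invented for illustration, not the EEG features used in the paper:

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=1.0):
    """Classify x by summing a Gaussian kernel centred on every
    training sample, one sum per class (a Parzen-window estimate),
    and returning the class with the highest average response."""
    scores = {}
    for label in np.unique(train_y):
        samples = train_X[train_y == label]
        d2 = np.sum((samples - x) ** 2, axis=1)       # squared distances
        scores[label] = np.mean(np.exp(-d2 / (2 * sigma ** 2)))
    return max(scores, key=scores.get)

# Toy 2-D feature clusters standing in for "awake" (0) and "asleep" (1)
rng = np.random.default_rng(0)
awake = rng.normal([2.0, 2.0], 0.3, size=(20, 2))
asleep = rng.normal([-2.0, -2.0], 0.3, size=(20, 2))
X = np.vstack([awake, asleep])
y = np.array([0] * 20 + [1] * 20)

print(pnn_classify(np.array([1.8, 2.1]), X, y))    # near the "awake" cluster -> 0
print(pnn_classify(np.array([-2.2, -1.9]), X, y))  # near the "asleep" cluster -> 1
```

Because classification is just a kernel sum over stored examples, adding a new training sample requires no retraining pass, which is one reason such networks need less training than conventional artificial neural networks.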
The researchers see statistical neural network computing having applications in medicine, because diagnostic decisions are not always 100% yes or no. They also realize that for the best impact, medical diagnostic devices need to be small, portable and use minimal energy.
Das and colleagues call their device a Gaussian synapse. It is based on a two-transistor setup in which the molybdenum disulfide is an electron conductor, while the black phosphorus conducts through missing electrons, or holes. The device is essentially two variable resistors in series, and the combination produces a transfer curve with two decaying tails that matches a Gaussian function.
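The series combination can be illustrated with a toy circuit model: let the n-type MoS2 channel's conductance rise with gate voltage while the p-type black phosphorus channel's conductance falls, then combine the two as series resistors. Current flows only where both channels conduct, so the result is a bell-shaped curve with two tails. The sigmoid parameters below are invented for illustration and are not fitted to the devices in the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy transfer curves versus a shared gate voltage Vg (arbitrary units):
# the n-type MoS2 branch turns ON as Vg rises, while the p-type
# black phosphorus branch turns OFF.
Vg = np.linspace(-5, 5, 201)
G_n = sigmoid(2 * (Vg + 1))    # electron-conducting branch
G_p = sigmoid(-2 * (Vg - 1))   # hole-conducting branch

# Two variable resistors in series: conductances combine harmonically,
# so the series conductance is throttled by whichever branch is OFF.
# The result peaks at an intermediate gate voltage and decays on both
# sides -- a Gaussian-like curve.
G_series = G_n * G_p / (G_n + G_p)

peak = Vg[np.argmax(G_series)]
print(f"peak near Vg = {peak:.1f}")  # maximum at the crossover point, Vg = 0.0
```

The symmetric choice of sigmoids puts the peak exactly at the crossover of the two branches; in the physical device, tuning the individual transistors would shift the peak position and width of the Gaussian response.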