Computers and artificial intelligence continue to usher in major changes in the way people shop. It is relatively easy to train a robot’s brain to create a shopping list, but what about ensuring that the robotic shopper can easily tell the difference between the thousands of products in the store?
Purdue University researchers and experts in brain-inspired computing think part of the answer may be found in magnets. The researchers have developed a process that uses magnetic devices with brain-like networks to program and teach devices such as personal robots, self-driving cars and drones to generalize better about different objects.
“Our stochastic neural networks try to mimic certain activities of the human brain and compute through a connection of neurons and synapses,” said Kaushik Roy, Purdue’s Edward G. Tiedemann Jr. Distinguished Professor of Electrical and Computer Engineering. “This allows the computer brain to not only store information but also to generalize well about objects and then make inferences to perform better at distinguishing between objects.”
The switching dynamics of a nano-magnet are similar to the electrical dynamics of neurons. Magnetic tunnel junction devices exhibit switching behavior that is stochastic in nature.
This stochastic switching behavior mirrors the sigmoid activation of a neuron. Such magnetic tunnel junctions can also be used to store synaptic weights.
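The sigmoid-like behavior described above can be sketched in a few lines. This is a minimal illustration, not the researchers' actual device model: the scale parameter `i0` and the use of input "current" as the drive variable are assumptions made for the example.

```python
import math
import random

def switching_probability(current, i0=1.0):
    """Sigmoid-shaped switching probability of a stochastic
    magnetic tunnel junction as a function of input drive.
    i0 is a hypothetical scale constant, not a measured value."""
    return 1.0 / (1.0 + math.exp(-current / i0))

def mtj_neuron(current, rng=random):
    """Fire (switch) stochastically: the device switches with
    probability given by the sigmoid, mimicking a spiking neuron."""
    return rng.random() < switching_probability(current)
```

At zero drive the device switches half the time, and the probability saturates toward 0 or 1 for strong negative or positive drive, which is exactly the behavior a sigmoid neuron needs.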
The Purdue group proposed a new stochastic training algorithm for synapses using spike-timing-dependent plasticity (STDP), termed Stochastic-STDP, a form of plasticity that has been experimentally observed in the rat hippocampus. The inherent stochasticity of the magnet was used to switch magnetization states probabilistically, following the proposed algorithm, to learn representations of different objects.
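A stochastic-STDP update of this flavor can be sketched as follows. This is a simplified illustration under stated assumptions, not the published algorithm: the exponential timing window, the constants `a` and `tau`, and the binary (0/1) weight encoding are all hypothetical choices for the example.

```python
import math
import random

def stdp_switch_prob(delta_t, a=0.9, tau=20.0):
    """Probability that the binary synapse switches, decaying
    exponentially with the pre/post spike-time gap |delta_t| (ms).
    a and tau are hypothetical fitting constants."""
    return a * math.exp(-abs(delta_t) / tau)

def stochastic_stdp_update(weight, delta_t, rng=random):
    """Stochastic-STDP sketch for a binary magnetic synapse:
    potentiate (set weight to 1) when the post-synaptic spike
    follows the pre-synaptic spike (delta_t > 0), depress
    (set to 0) otherwise -- each only with a timing-dependent
    probability, mimicking the magnet's stochastic switching."""
    if rng.random() < stdp_switch_prob(delta_t):
        return 1 if delta_t > 0 else 0
    return weight
```

Because each update only happens with some probability, repeated presentations of correlated spike pairs gradually drive the synapse toward the right state, which is how the stochastic switching of the magnet doubles as a learning rule.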
The trained synaptic weights, encoded deterministically in the magnetization state of the nano-magnets, are then used during inference. Advantageously, the use of high-energy-barrier magnets (30-40 kT, where k is the Boltzmann constant and T is the operating temperature) not only allows compact stochastic primitives but also enables the same device to serve as a stable memory element that meets data-retention requirements. However, the barrier height of the nano-magnets used to perform sigmoid-like neuronal computations can be lowered to 20 kT for higher energy efficiency.
“The big advantage with the magnet technology we have developed is that it is very energy-efficient,” said Roy, who leads Purdue’s Center for Brain-inspired Computing Enabling Autonomous Intelligence. “We have created a simpler network that represents the neurons and synapses while compressing the amount of memory and energy needed to perform functions similar to brain computations.”
Roy said the brain-like networks have other uses in solving difficult problems as well, including combinatorial optimization problems such as the traveling salesman problem and graph coloring. The proposed stochastic devices can act as "natural annealers," helping the algorithms escape local minima.
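The annealing role described above can be illustrated with a toy example. The sketch below uses a software random-number generator where the hardware would use the noisy switching of nanomagnets, and solves MAX-CUT rather than the problems named in the article; the cooling schedule and parameters are assumptions for illustration.

```python
import math
import random

def anneal_max_cut(edges, n, steps=5000, t0=2.0, seed=0):
    """Toy simulated annealing for MAX-CUT on n nodes.
    Stochastic bit flips (here from a seeded RNG; in hardware,
    from noisy nanomagnet switching) let the search accept
    occasional downhill moves and escape local minima.
    Returns the best cut size found."""
    rng = random.Random(seed)
    spins = [rng.choice([0, 1]) for _ in range(n)]

    def cut(s):
        # Number of edges whose endpoints fall on opposite sides.
        return sum(1 for u, v in edges if s[u] != s[v])

    cur = cut(spins)
    best = cur
    for step in range(steps):
        # Linear cooling schedule; small floor keeps temp positive.
        temp = t0 * (1.0 - step / steps) + 1e-3
        i = rng.randrange(n)
        spins[i] ^= 1                      # stochastic flip
        new = cut(spins)
        # Always accept improvements; accept worsening moves
        # with a temperature-dependent probability.
        if new >= cur or rng.random() < math.exp((new - cur) / temp):
            cur = new
            best = max(best, cur)
        else:
            spins[i] ^= 1                  # reject: undo the flip
    return best
```

On a triangle graph the optimal cut is 2, which the annealer finds easily; the same accept-with-probability step is what a hardware "natural annealer" would provide for free through device noise.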