Artificial neural networks are computing systems inspired by the biological neural networks that constitute animal brains. Like their biological counterparts, they can learn (be trained) by processing examples and forming probability-weighted associations, then apply that information to other tasks.
Depending on age, humans need 7 to 13 hours of sleep per 24 hours. During this time, a lot happens: heart rate, breathing and metabolism ebb and flow; hormone levels adjust; the body relaxes. Not so the brain.
“The brain is very busy when we sleep, repeating what we have learned during the day,” said Maxim Bazhenov, PhD, professor of medicine and a sleep researcher at University of California San Diego School of Medicine. “Sleep helps reorganize memories and presents them in the most efficient way.”
In previously published work, Bazhenov and colleagues reported how sleep builds rational memory, the ability to remember arbitrary or indirect associations between objects, people or events, and protects against forgetting old memories.
Artificial neural networks leverage the architecture of the human brain to improve numerous technologies and systems, from basic science and medicine to finance and social media. In some respects, such as computational speed, they have achieved superhuman performance, but they fail in one key aspect: when artificial neural networks learn sequentially, new information overwrites previous information, a phenomenon called catastrophic forgetting.
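Catastrophic forgetting is easy to reproduce in miniature. The following toy sketch (a hypothetical single-weight linear model, not anything from the study) trains by gradient descent on one task, then on a conflicting second task, and shows the first task being overwritten:

```python
# Toy demonstration of catastrophic forgetting (illustrative model,
# not from the paper): a single linear unit y = w*x trained by
# gradient descent on task A, then on task B, with no interleaving.

def train(w, data, lr=0.1, epochs=200):
    """Fit y = w*x by stochastic gradient descent on squared error."""
    for _ in range(epochs):
        for x, y in data:
            w -= lr * 2 * (w * x - y) * x
    return w

task_a = [(1.0, 2.0), (2.0, 4.0)]    # target mapping: y = 2x
task_b = [(1.0, -1.0), (2.0, -2.0)]  # conflicting target: y = -x

w = train(0.0, task_a)
err_a_before = sum((w * x - y) ** 2 for x, y in task_a)

w = train(w, task_b)                 # sequential training on task B
err_a_after = sum((w * x - y) ** 2 for x, y in task_a)

print(err_a_before < 1e-6)  # task A was learned
print(err_a_after > 1.0)    # task A has been overwritten
```

After sequential training the weight has converged to the task-B solution, so performance on task A collapses, which is the failure mode the article describes.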
“In contrast, the human brain learns continuously and incorporates new data into existing knowledge,” said Bazhenov, “and it typically learns best when new training is interleaved with periods of sleep for memory consolidation.”
Writing in the November 18, 2022 issue of PLOS Computational Biology, senior author Bazhenov and colleagues discuss how biological models may help mitigate the threat of catastrophic forgetting in artificial neural networks, boosting their utility across a spectrum of research interests.
The scientists used spiking neural networks that artificially mimic natural neural systems: Instead of information being communicated continuously, it is transmitted as discrete events (spikes) at certain time points.
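The discrete-event behavior of a spiking network can be sketched with a leaky integrate-and-fire neuron, the standard textbook unit for such models (the parameters here are illustrative, not taken from the study):

```python
# Sketch of a leaky integrate-and-fire neuron: input is integrated
# into a membrane potential that decays over time, and a discrete
# spike is emitted only when the potential crosses a threshold.

def lif_run(inputs, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron spikes."""
    v = 0.0            # membrane potential
    spikes = []
    for t, i_in in enumerate(inputs):
        v = leak * v + i_in       # leaky integration of input current
        if v >= threshold:        # threshold crossing -> discrete event
            spikes.append(t)
            v = 0.0               # reset after spiking
    return spikes

# Constant weak drive: information is transmitted as discrete events
# at certain time points, not as a continuous signal.
print(lif_run([0.3] * 20))  # → [3, 7, 11, 15, 19]
```

Unlike a conventional artificial neuron, which outputs a continuous value every step, this unit communicates only through the timing of its spikes.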
They found that when the spiking networks were trained on a new task, but with occasional off-line periods that mimicked sleep, catastrophic forgetting was mitigated. Like the human brain, said the study authors, “sleep” for the networks allowed them to replay old memories without explicitly using old training data.
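The benefit of offline periods can be illustrated with a toy two-weight linear model (again hypothetical, not the paper's spiking network). Here the "sleep" phase crudely replays a stored old-task example; in the study itself, replay arose spontaneously, without any stored training data:

```python
# Toy comparison (illustrative linear model): purely sequential
# training on a new task overwrites the old one, while interleaving
# offline replay phases preserves both.

def sgd_step(w, x, y, lr=0.1):
    """One gradient step on squared error for y_hat = w[0]*x[0] + w[1]*x[1]."""
    err = w[0] * x[0] + w[1] * x[1] - y
    return [w[0] - lr * 2 * err * x[0], w[1] - lr * 2 * err * x[1]]

def loss(w, x, y):
    return (w[0] * x[0] + w[1] * x[1] - y) ** 2

task_a = ([1.0, 0.5], 1.0)    # old task (overlapping features)
task_b = ([0.5, 1.0], -1.0)   # new task

# Learn task A first.
w = [0.0, 0.0]
for _ in range(500):
    w = sgd_step(w, *task_a)

# Sequential: train on task B only -> task A is overwritten.
w_seq = list(w)
for _ in range(500):
    w_seq = sgd_step(w_seq, *task_b)

# Interleaved: task B updates alternate with offline "sleep" replay.
w_int = list(w)
for _ in range(500):
    w_int = sgd_step(w_int, *task_b)
    w_int = sgd_step(w_int, *task_a)   # offline replay phase

print(loss(w_seq, *task_a) > 1.0)   # catastrophic forgetting
print(loss(w_int, *task_a) < 1e-6)  # old task preserved
```

The interleaved run converges to weights that satisfy both tasks, mirroring the mitigation the study reports.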
Memories are represented in the human brain by patterns of synaptic weight — the strength or amplitude of a connection between two neurons.
“When we learn new information,” said Bazhenov, “neurons fire in a specific order and this strengthens the synapses between them. During sleep, the spiking patterns learned during our awake state are repeated spontaneously. It’s called reactivation or replay.
“Synaptic plasticity, the capacity to be altered or molded, is still in place during sleep and it can further enhance synaptic weight patterns that represent the memory, helping to prevent forgetting or to enable transfer of knowledge from old to new tasks.”
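The replay mechanism Bazhenov describes can be sketched with a simple Hebbian update (a toy model, not the study's implementation): neurons that were co-active during learning are reactivated during "sleep", and the rule further strengthens the weights between them while leaving unrelated connections untouched.

```python
# Toy Hebbian replay (illustrative): during "sleep", a previously
# learned activity pattern is reactivated, and weights between
# co-active neurons are strengthened ("fire together, wire together").

def hebbian_replay(weights, pattern, lr=0.05, steps=10):
    """Strengthen weights between co-active neurons in a replayed pattern."""
    for _ in range(steps):
        for i, pre in enumerate(pattern):
            for j, post in enumerate(pattern):
                if i != j and pre and post:
                    # Bounded Hebbian update: weight grows toward 1.
                    weights[i][j] += lr * (1 - weights[i][j])
    return weights

w = [[0.0, 0.2, 0.0],
     [0.2, 0.0, 0.0],
     [0.0, 0.0, 0.0]]
memory = [1, 1, 0]   # neurons 0 and 1 were co-active during learning

w = hebbian_replay(w, memory)
print(round(w[0][1], 3))   # strengthened well above its initial 0.2
print(w[0][2])             # unrelated connection untouched → 0.0
```

Repeated replay consolidates the weight pattern that represents the memory, which is the sense in which sleep "enhances synaptic weight patterns" in the quote above.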
When Bazhenov and colleagues applied this approach to artificial neural networks, they found that it helped the networks avoid catastrophic forgetting.
“It meant that these networks could learn continuously, like humans or animals. Understanding how the human brain processes information during sleep can help to augment memory in human subjects. Augmenting sleep rhythms can lead to better memory.
“In other projects, we use computer models to develop optimal strategies to apply stimulation during sleep, such as auditory tones, that enhance sleep rhythms and improve learning. This may be particularly important when memory is non-optimal, such as when memory declines in aging or in some conditions like Alzheimer’s disease.”