
Artificial neural networks are computing systems inspired by the biological neural networks that constitute animal brains. Like their biological counterparts, they can learn (be trained) by processing examples and forming probability-weighted associations, then apply that information to other tasks.
Depending on age, humans need 7 to 13 hours of sleep per 24 hours. During this time, a lot happens: Heart rate, breathing and metabolism ebb and flow; hormone levels adjust; the body relaxes. Not so the brain.
“The brain is very busy when we sleep, repeating what we have learned during the day,” said Maxim Bazhenov, PhD, professor of medicine and a sleep researcher at University of California San Diego School of Medicine. “Sleep helps reorganize memories and presents them in the most efficient way.”
In previously published work, Bazhenov and colleagues reported how sleep builds rational memory, the ability to remember arbitrary or indirect associations between objects, people or events, and protects against forgetting old memories.
Artificial neural networks leverage the architecture of the human brain to improve numerous technologies and systems, from basic science and medicine to finance and social media. In some ways, they have achieved superhuman performance, such as computational speed, but they fail in one key aspect: When artificial neural networks learn sequentially, new information overwrites previous information, a phenomenon called catastrophic forgetting.
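To make catastrophic forgetting concrete, here is a minimal sketch (assuming a toy logistic-regression "network" and two conflicting linear tasks, neither of which comes from the study): after the weights are retrained on the second task, accuracy on the first typically falls back toward chance.

```python
# Minimal sketch of catastrophic forgetting (illustrative; not the study's model).
# A single logistic-regression "network" is trained on task A and then on task B;
# because task B's gradients overwrite the shared weights, accuracy on task A
# typically falls back toward chance.
import numpy as np

rng = np.random.default_rng(0)

def make_task(w_true, n=500):
    """A linearly separable 2-D task defined by a ground-truth weight vector."""
    X = rng.normal(size=(n, 2))
    y = (X @ w_true > 0).astype(float)
    return X, y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(w, X, y, lr=0.5, epochs=200):
    """Full-batch gradient descent on the logistic loss."""
    for _ in range(epochs):
        p = sigmoid(X @ w)
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return np.mean((sigmoid(X @ w) > 0.5) == y)

# Two tasks with conflicting decision boundaries.
X_a, y_a = make_task(np.array([1.0, 1.0]))
X_b, y_b = make_task(np.array([1.0, -1.0]))

w = np.zeros(2)
w = train(w, X_a, y_a)
print("Task A accuracy after training on A:", accuracy(w, X_a, y_a))

w = train(w, X_b, y_b)   # sequential training on B, with no access to task A data
print("Task A accuracy after training on B:", accuracy(w, X_a, y_a))
```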
“In contrast, the human brain learns continuously and incorporates new data into existing knowledge,” said Bazhenov, “and it typically learns best when new training is interleaved with periods of sleep for memory consolidation.”
Writing in the November 18, 2022 issue of PLOS Computational Biology, senior author Bazhenov and colleagues discuss how biological models may help mitigate the threat of catastrophic forgetting in artificial neural networks, boosting their utility across a spectrum of research interests.
The scientists used spiking neural networks that artificially mimic natural neural systems: Instead of information being communicated continuously, it is transmitted as discrete events (spikes) at certain time points.
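A common building block of such networks is the leaky integrate-and-fire neuron, sketched below with generic, illustrative parameters rather than the study's specific model: the membrane potential integrates its input and, whenever it crosses a threshold, the neuron emits a discrete spike and resets.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, a standard building block of
# spiking networks; parameters are generic and purely illustrative.
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Integrate the input current over time; emit a spike (1) whenever the
    membrane potential crosses threshold, then reset the potential."""
    v = v_rest
    spikes = []
    for current in input_current:
        # Leaky integration: the potential decays toward rest and is driven by input.
        v += dt / tau * (-(v - v_rest) + current)
        if v >= v_thresh:
            spikes.append(1)   # a discrete event at this time step
            v = v_reset
        else:
            spikes.append(0)
    return np.array(spikes)

# A constant drive above threshold produces a regular spike train: information
# is carried by the timing of discrete events, not by continuous activations.
spike_train = lif_neuron(np.full(200, 1.5))
print("Spikes emitted:", int(spike_train.sum()), "over", spike_train.size, "time steps")
```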
They found that when the spiking networks were trained on a new task, but with occasional off-line periods that mimicked sleep, catastrophic forgetting was mitigated. Like the human brain, said the study authors, “sleep” for the networks allowed them to replay old memories without explicitly using old training data.
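The shape of that schedule can be sketched as alternating phases: a supervised "wake" phase on the current task's labeled data, then a label-free "sleep" phase in which the network is driven by spontaneous noise and updated with a local Hebbian rule. The toy network, tasks and update rules below are assumptions made only to show the structure of the protocol, not the paper's spiking implementation or its results.

```python
# Toy sketch of the wake/sleep training schedule described above. The network,
# tasks and learning rules are assumptions chosen for brevity; the study itself
# used spiking networks and a more elaborate sleep-replay procedure.
import numpy as np

rng = np.random.default_rng(1)
n_inputs, n_outputs = 8, 2
W = rng.normal(scale=0.1, size=(n_outputs, n_inputs))  # "synaptic" weights

def wake(W, X, Y, lr=0.1):
    """Supervised 'wake' phase: delta-rule updates from the current task's labels."""
    for x, y in zip(X, Y):
        W = W + lr * np.outer(y - W @ x, x)
    return W

def sleep(W, steps=500, lr=0.002):
    """Off-line 'sleep' phase: no labels and no stored training data. The network
    is driven by noise, and a Hebbian update (post * pre) preferentially
    reinforces responses shaped by the weights already learned (a crude
    stand-in for replay)."""
    for _ in range(steps):
        x = rng.normal(size=n_inputs)
        post = np.tanh(W @ x)
        W = np.clip(W + lr * np.outer(post, x), -1.0, 1.0)
    return W

# Tasks arrive sequentially; each wake phase is followed by sleep instead of
# revisiting old data.
for task_id in range(3):
    w_task = rng.normal(size=(n_outputs, n_inputs))   # toy definition of this task
    X = rng.normal(size=(200, n_inputs))
    Y = np.sign(X @ w_task.T)                         # toy labels for this task
    W = wake(W, X, Y)
    W = sleep(W)
```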
Memories are represented in the human brain by patterns of synaptic weight — the strength or amplitude of a connection between two neurons.
“When we learn new information,” said Bazhenov, “neurons fire in a specific order and this strengthens the synapses between them. During sleep, the spiking patterns learned during our awake state are repeated spontaneously. It’s called reactivation or replay.
“Synaptic plasticity, the capacity to be altered or molded, is still in place during sleep and it can further enhance synaptic weight patterns that represent the memory, helping to prevent forgetting or to enable transfer of knowledge from old to new tasks.”
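One common formalization of that kind of timing-dependent plasticity is spike-timing-dependent plasticity (STDP), sketched below with generic constants (the study's exact learning rule is not reproduced here): a synapse strengthens when its presynaptic neuron tends to fire just before the postsynaptic one, as it would during replay of a learned sequence, and weakens for the reverse ordering.

```python
# Illustrative spike-timing-dependent plasticity (STDP) update for one synapse,
# with generic constants; this is a sketch of the concept, not the study's rule.
import numpy as np

def stdp_update(w, pre_spike_times, post_spike_times,
                a_plus=0.01, a_minus=0.012, tau=20.0, w_max=1.0):
    """Pre-before-post spike pairs potentiate the weight, post-before-pre pairs
    depress it, with an influence that decays exponentially in the timing gap."""
    dw = 0.0
    for t_pre in pre_spike_times:
        for t_post in post_spike_times:
            gap = t_post - t_pre
            if gap > 0:      # presynaptic neuron fired first: potentiation
                dw += a_plus * np.exp(-gap / tau)
            elif gap < 0:    # postsynaptic neuron fired first: depression
                dw -= a_minus * np.exp(gap / tau)
    return float(np.clip(w + dw, 0.0, w_max))

# A synapse whose presynaptic neuron consistently fires just before the
# postsynaptic one, as during replay of a learned sequence, is strengthened.
w = 0.5
w = stdp_update(w, pre_spike_times=[10, 50, 90], post_spike_times=[15, 55, 95])
print("Weight after replayed pre-before-post pairings:", round(w, 4))
```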
When Bazhenov and colleagues applied this approach to artificial neural networks, they found that it helped the networks avoid catastrophic forgetting.
“It meant that these networks could learn continuously, like humans or animals. Understanding how the human brain processes information during sleep can help to augment memory in human subjects. Augmenting sleep rhythms can lead to better memory.
“In other projects, we use computer models to develop optimal strategies to apply stimulation during sleep, such as auditory tones, that enhance sleep rhythms and improve learning. This may be particularly important when memory is non-optimal, such as when memory declines in aging or in some conditions like Alzheimer’s disease.”
Original Article: Artificial Neural Networks Learn Better When They Spend Time Not Learning at All