Researchers have shown how to write any magnetic pattern desired onto nanowires, which could help computers mimic how the brain processes information.
Much current computer hardware, such as hard drives, uses magnetic memory devices. These rely on magnetic states – the directions in which microscopic magnets point – to encode and read information.
Exotic magnetic states – such as a point where three south poles meet – represent complex systems. These may act in a similar way to many complex systems found in nature, such as the way our brains process information.
Computing systems designed to process information in similar ways to our brains are known as ‘neural networks’. There are already powerful software-based neural networks – for example, one recently beat the human champion at the game ‘Go’ – but their efficiency is limited because they run on conventional computer hardware.
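To make the idea of a neural network concrete, here is a minimal illustrative sketch (not from the research described here): a tiny feedforward network with hand-chosen weights and step activations that computes the XOR function, a classic example of a problem a single artificial neuron cannot solve but a small network can. All weights and thresholds below are chosen by hand for illustration.

```python
def step(x):
    """Step activation: fires (1) if the weighted input exceeds zero."""
    return 1 if x > 0 else 0

def xor_network(x1, x2):
    """A 2-2-1 feedforward network computing XOR with fixed, hand-set weights."""
    h1 = step(x1 + x2 - 0.5)   # hidden unit 1: acts as OR
    h2 = step(x1 + x2 - 1.5)   # hidden unit 2: acts as AND
    return step(h1 - h2 - 0.5)  # output: OR and not AND, i.e. XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_network(a, b))
```

In software, every one of these weighted sums is computed numerically by a conventional processor; a hardware neural network would instead let the physics of the device – here, interacting magnetic nanowires – carry out the equivalent computation directly.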
Now, researchers from Imperial College London have devised a method for writing magnetic information in any pattern desired, using a very small magnetic probe called a magnetic force microscope.
With this new writing method, arrays of magnetic nanowires may be able to function as hardware neural networks – potentially more powerful and efficient than software-based approaches.
The team, from the Departments of Physics and Materials at Imperial, demonstrated their system by writing patterns that have never been seen before. They published their results today in Nature Nanotechnology.
Dr Jack Gartside, first author from the Department of Physics, said: “With this new writing method, we open up research into ‘training’ these magnetic nanowires to solve useful problems. If successful, this will bring hardware neural networks a step closer to reality.”
As well as applications in computing, the method could be used to study fundamental aspects of complex systems, by creating magnetic states that are far from optimal (such as three south poles together) and seeing how the system responds.