Even the most powerful computers are still no match for the human brain when it comes to pattern recognition, risk management, and other similarly complex tasks. Recent advances in optical neural networks, however, are closing that gap by simulating the way neurons respond in the human brain.
In a key step toward making large-scale optical neural networks practical, researchers have demonstrated a first-of-its-kind multilayer all-optical artificial neural network. Artificial neural networks can tackle complex problems that are impractical for traditional computational approaches, but current designs demand extensive computational resources that are both time-consuming and energy intensive. This has spurred great interest in developing practical optical artificial neural networks, which are faster and consume less power than those based on traditional computers.
In Optica, The Optical Society’s journal for high-impact research, researchers from The Hong Kong University of Science and Technology, Hong Kong, detail their two-layer all-optical neural network and successfully apply it to a complex classification task.
“Our all-optical scheme could enable a neural network that performs optical parallel computation at the speed of light while consuming little energy,” said Junwei Liu, a member of the research team. “Large-scale, all-optical neural networks could be used for applications ranging from image recognition to scientific research.”
Building an all-optical network
In conventional hybrid optical neural networks, optical components typically perform the linear operations, while the nonlinear activation functions (the functions that simulate the way neurons in the human brain respond) are implemented electronically. This split exists because nonlinear optics typically requires high-power lasers that are difficult to integrate into an optical neural network.
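The division of labor described above can be sketched in software: the matrix multiplication corresponds to the linear step that optics handles well, while the nonlinear activation is the step that hybrid designs offload to electronics. In this minimal sketch, the sigmoid is an illustrative stand-in, not the actual optical response function used by the researchers, and all weights are random placeholders:

```python
import numpy as np

def linear_layer(x, W, b):
    # Linear operation: in hybrid designs this is performed optically,
    # e.g., with interferometer meshes or spatial light modulators.
    return W @ x + b

def activation(z):
    # Nonlinear activation: the step hybrid networks offload to
    # electronics. Sigmoid is a stand-in for a neuron-like response.
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.random(16)                 # 16 input signals
W = rng.standard_normal((2, 16))   # placeholder weights
b = np.zeros(2)

y = activation(linear_layer(x, W, b))
print(y.shape)  # (2,)
```

An all-optical design replaces the electronic `activation` step with a physical light-induced nonlinearity, which is what the cold-atom approach below provides.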
To overcome this challenge, the researchers used cold atoms with electromagnetically induced transparency to perform nonlinear functions. “This light-induced effect can be achieved with very weak laser power,” said Shengwang Du, a member of the research team. “Because this effect is based on nonlinear quantum interference, it might be possible to extend our system into a quantum neural network that could solve problems intractable by classical methods.”
To confirm the capability and feasibility of the new approach, the researchers constructed a two-layer fully connected all-optical neural network with 16 inputs and two outputs. They used this network to classify the ordered and disordered phases of the Ising model, a statistical model of magnetism. The results showed that the all-optical neural network was as accurate as a well-trained computer-based neural network.
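A conventional software analogue of this experiment gives a sense of the task. The sketch below is an assumption-laden toy, not the researchers' setup: it fakes "ordered" configurations as mostly aligned spins and "disordered" ones as random spins on a 4x4 lattice (16 inputs, matching the optical network), then trains a small two-layer fully connected network with gradient descent:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_configs(n, ordered):
    # Toy stand-in for 4x4 Ising configurations (16 spins, values +/-1).
    # Ordered phase: nearly all spins share one sign; disordered: random.
    if ordered:
        sign = rng.choice([-1, 1], size=(n, 1))
        return np.where(rng.random((n, 16)) < 0.95, sign, -sign).astype(float)
    return rng.choice([-1, 1], size=(n, 16)).astype(float)

X = np.vstack([sample_configs(200, True), sample_configs(200, False)])
y = np.array([0] * 200 + [1] * 200)  # 0 = ordered, 1 = disordered

# Two-layer fully connected network: 16 inputs -> 8 hidden -> 2 outputs.
W1 = rng.standard_normal((16, 8)) * 0.1; b1 = np.zeros(8)
W2 = rng.standard_normal((8, 2)) * 0.1;  b2 = np.zeros(2)

def forward(X):
    h = np.tanh(X @ W1 + b1)  # nonlinear activation between the layers
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, p / p.sum(axis=1, keepdims=True)

lr = 0.2
for _ in range(500):  # plain full-batch gradient descent
    h, p = forward(X)
    grad = p.copy()
    grad[np.arange(len(y)), y] -= 1  # softmax cross-entropy gradient
    grad /= len(y)
    dh = grad @ W2.T * (1 - h**2)    # backprop through tanh
    W2 -= lr * (h.T @ grad); b2 -= lr * grad.sum(axis=0)
    W1 -= lr * (X.T @ dh);   b1 -= lr * dh.sum(axis=0)

acc = (forward(X)[1].argmax(axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

The separation hinges on the nonlinearity: ordered samples cluster at magnetization near +1 or -1 while disordered samples sit near 0, so no purely linear classifier can split them, which mirrors why an optical network needs a nonlinear activation stage at all.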
Optical neural networks at larger scales
The researchers plan to expand the all-optical approach to large-scale all-optical deep neural networks with complex architectures designed for specific practical applications such as image recognition. This will help demonstrate that the scheme works at larger scales.
“Although our work is a proof-of-principle demonstration, it shows that it may become possible in the future to develop optical versions of artificial intelligence,” said Du. “The next generation of artificial intelligence hardware will be intrinsically much faster and exhibit lower power consumption compared to today’s computer-based artificial intelligence,” added Liu.
Learn more: Researchers Demonstrate All-Optical Neural Network for Deep Learning