A massively parallel amplitude-only Fourier neural network
Researchers invent an optical convolutional neural network accelerator for machine learning
SUMMARY
Researchers at the George Washington University, together with researchers at the University of California, Los Angeles, and the deep-tech venture startup Optelligence LLC, have developed an optical convolutional neural network accelerator capable of processing large amounts of information, on the order of petabytes, per second. This innovation, which harnesses the massive parallelism of light, heralds a new era of optical signal processing for machine learning with numerous applications, including self-driving cars, 5G networks, data centers, biomedical diagnostics, data security, and more.
THE SITUATION
Global demand for machine learning hardware is dramatically outpacing current computing power supplies. State-of-the-art electronic hardware, such as graphics processing units and tensor processing unit accelerators, helps mitigate this, but is intrinsically challenged by serial, iterative data processing and by delays imposed by wiring and circuit constraints. Optical alternatives to electronic hardware could speed up machine learning by processing information in a non-iterative way. However, photonic machine learning is typically limited by the number of components that can be placed on photonic integrated circuits, which restricts interconnectivity, while free-space spatial light modulators are restricted to slow programming speeds.
THE SOLUTION
To overcome these limitations, the researchers replaced the spatial light modulators with digital mirror-based technology, yielding a system over 100 times faster. The non-iterative timing of this processor, combined with rapid programmability and massive parallelization, enables this optical machine learning system to outperform even top-of-the-line graphics processing units by over one order of magnitude, with room for further optimization beyond the initial prototype.
Unlike the current paradigm in electronic machine learning hardware, which processes information sequentially, this processor uses Fourier optics, a concept of frequency filtering that allows the required convolutions of the neural network to be performed as much simpler element-wise multiplications using the digital mirror technology.
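The principle at work here is the convolution theorem: a convolution in the spatial domain becomes an element-wise multiplication in the Fourier domain. The optical system performs the Fourier transforms essentially for free with lenses; the sketch below illustrates the same mathematical equivalence numerically with NumPy (the array sizes and random data are illustrative, not from the paper).

```python
import numpy as np

# Convolution theorem demo: convolving two 2-D arrays (circularly) is
# equivalent to multiplying their Fourier transforms element-wise and
# transforming back. This is the mathematical trick the optical
# accelerator exploits, with lenses performing the transforms.
rng = np.random.default_rng(0)
image = rng.standard_normal((8, 8))   # toy "input image"
kernel = rng.standard_normal((8, 8))  # toy convolution kernel

# Fourier route: FFT both arrays, multiply element-wise, inverse FFT.
fourier_result = np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(kernel)).real

# Direct route: explicit circular-convolution sum, for comparison.
n, m = image.shape
direct_result = np.zeros_like(image)
for i in range(n):
    for j in range(m):
        for k in range(n):
            for l in range(m):
                direct_result[i, j] += image[k, l] * kernel[(i - k) % n, (j - l) % m]

# The two routes agree to numerical precision.
print(np.allclose(fourier_result, direct_result))  # True
```

The direct sum costs O(N^4) operations for an N x N image, while the Fourier route costs O(N^2 log N) electronically, and effectively a single pass through the optics in the researchers' system.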