Building on concepts such as quantum entanglement, quantum computers promise a wealth of machine learning applications.
(Photo: Keystone/Science Photo Library)
Future quantum computers should be capable of super-fast and reliable computation, but today this remains a major challenge. Now, computer scientists led by ETH Zurich have conducted an early exploration of reliable quantum machine learning.
Anyone who collects mushrooms knows that it is better to keep the poisonous and the non-poisonous ones apart. Not to mention what would happen if someone ate the poisonous ones. In such “classification problems”, which require us to distinguish objects from one another and assign them to classes based on their characteristics, computers can already provide useful support to humans.
Intelligent machine learning methods can recognise patterns or objects and automatically pick them out of data sets. For example, they could pick out those pictures from a photo database that show non-toxic mushrooms. Particularly with very large and complex data sets, machine learning can deliver valuable results that humans could not obtain at all, or only with far more time. However, for certain computational tasks, even the fastest computers available today reach their limits. This is where the great promise of quantum computers comes into play: that one day they will perform super-fast calculations that classical computers cannot solve in a useful period of time.
The reason for this “quantum supremacy” lies in physics: quantum computers calculate and process information by exploiting certain states and interactions that occur within atoms or molecules or between elementary particles.
The fact that quantum states can superpose and entangle gives quantum computers access to a fundamentally richer set of processing logic. For instance, unlike classical computers, quantum computers do not calculate with binary codes or bits, which process information only as 0 or 1, but with quantum bits or qubits, which correspond to the quantum states of particles. The crucial difference is that qubits can realise not only one state – 0 or 1 – per computational step, but also a state in which both superpose. This more general manner of information processing in turn allows for a drastic computational speed-up on certain problems.
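The difference between a bit and a qubit can be made concrete in a few lines of linear algebra. The following sketch (not from the article; a standard textbook illustration) represents a qubit as a unit vector and uses the Hadamard gate to put it into an equal superposition of 0 and 1:

```python
import numpy as np

# A classical bit is either 0 or 1; a qubit is a unit vector in C^2.
ket0 = np.array([1, 0], dtype=complex)  # the state |0>
ket1 = np.array([0, 1], dtype=complex)  # the state |1>

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# On measurement, each outcome occurs with probability equal to the
# squared amplitude (the Born rule).
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]
```

Unlike a bit, which is definitely 0 or definitely 1, the state `psi` carries both possibilities at once until it is measured.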
Translating classical wisdom into the quantum realm
These speed advantages of quantum computing are also an opportunity for machine learning applications – after all, quantum computers could process the huge amounts of data that machine learning methods need to improve the accuracy of their results much faster than classical computers.
However, to really exploit the potential of quantum computing, one has to adapt the classical machine learning methods to the peculiarities of quantum computers. For example, the algorithms, i.e. the mathematical calculation rules that describe how a classical computer solves a certain problem, must be formulated differently for quantum computers. Developing well-functioning “quantum algorithms” for machine learning is not entirely trivial, because there are still a few hurdles to overcome along the way.
On the one hand, this is due to the quantum hardware. At ETH Zurich, researchers currently have quantum computers that work with up to 17 qubits (see “ETH Zurich and PSI found Quantum Computing Hub” of 3 May 2021). However, if quantum computers are to realise their full potential one day, they might need thousands to hundreds of thousands of qubits.
Quantum noise and the inevitability of errors
One challenge that quantum computers face concerns their vulnerability to error. Today’s quantum computers operate with a very high level of “noise”, as errors or disturbances are known in technical jargon. For the American Physical Society, this noise is “the major obstacle to scaling up quantum computers”. No comprehensive solution exists for both correcting and mitigating errors. No way has yet been found to produce error-free quantum hardware, and today’s quantum computers with 50 to 100 qubits are too small to implement error-correcting software or algorithms.
To a certain extent, one has to live with the fact that errors in quantum computing are in principle unavoidable, because the quantum states on which the concrete computational steps are based can only be distinguished and quantified with probabilities. What can be achieved, on the other hand, are procedures that keep noise and perturbations in check to such a degree that the calculations nevertheless deliver reliable results. Computer scientists refer to a reliably functioning calculation method as “robust” and in this context also speak of the necessary “error tolerance”.
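The idea that an inherently noisy computation can still deliver a reliable answer can be illustrated with a toy model (my own sketch, not the researchers’ method): each run of a computation is flipped by noise with some probability, yet repetition and a majority vote recover the correct result.

```python
import random

random.seed(0)

# Toy model: the ideal computation outputs 1, but each run ("shot")
# is flipped by noise with probability p_noise.
def noisy_shot(ideal=1, p_noise=0.1):
    return ideal ^ 1 if random.random() < p_noise else ideal

# Repeating the computation and taking a majority vote keeps the
# overall result reliable despite the per-run errors.
shots = [noisy_shot() for _ in range(1001)]
majority = 1 if sum(shots) > len(shots) / 2 else 0
print(majority)  # 1: the noisy runs still yield the correct answer
```

With a 10% per-run error rate, roughly 100 of the 1001 shots come out wrong, but the aggregate answer is still correct – a simple instance of bounding the effect of noise rather than eliminating it.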
This is exactly what the research group led by Ce Zhang, ETH computer science professor and member of the ETH AI Center, has recently explored – somewhat “accidentally”, during an endeavour to reason about the robustness of classical distributions for the purpose of building better machine learning systems and platforms. Together with Professor Nana Liu from Shanghai Jiao Tong University and Professor Bo Li from the University of Illinois at Urbana-Champaign, they have developed a new approach. This allows them to prove the robustness conditions of certain quantum-based machine learning models, for which the quantum computation is guaranteed to be reliable and the result to be correct. The researchers have published their approach, which is one of the first of its kind, in the scientific journal “npj Quantum Information”.
Protection against errors and hackers
“When we realised that quantum algorithms, like classical algorithms, are prone to errors and perturbations, we asked ourselves how we can estimate these sources of errors and perturbations for certain machine learning tasks, and how we can guarantee the robustness and reliability of the chosen method,” says Zhikuan Zhao, a postdoc in Ce Zhang’s group. “If we know this, we can trust the computational results, even if they are noisy.”
The researchers investigated this question using quantum classification algorithms as an example – after all, errors in classification tasks are tricky because they can affect the real world, for example if poisonous mushrooms were classified as non-toxic. Perhaps most importantly, using the theory of quantum hypothesis testing, which allows quantum states to be distinguished and was inspired by other researchers’ recent work applying hypothesis testing in the classical setting, the ETH researchers determined a threshold above which the assignments of the quantum classification algorithm are guaranteed to be correct and its predictions robust.
With their robustness method, the researchers can even verify whether the classification of an erroneous, noisy input yields the same result as that of a clean, noiseless input. From their findings, they have also developed a protection scheme that can be used to specify the error tolerance of a computation – regardless of whether an error has a natural cause or results from manipulation in a hacking attack. Their robustness concept thus covers both natural errors and adversarial ones.
“The method can also be applied to a broader class of quantum algorithms,” says Maurice Weber, a doctoral student with Ce Zhang and the first author of the publication. Since the impact of error in quantum computing increases as the system size rises, he and Zhao are now conducting research on this problem. “We are optimistic that our robustness conditions will prove useful, for example, in conjunction with quantum algorithms designed to better understand the electronic structure of molecules.”
Original Article: Early endeavours on the path to reliable quantum machine learning