Building on concepts such as quantum entanglement, quantum computers promise a wealth of machine learning applications.
(Photo: Keystone/Science Photo Library)
Future quantum computers should be capable of super-fast and reliable computation, but achieving this remains a major challenge. Computer scientists led by ETH Zurich have now conducted an early exploration of reliable quantum machine learning.
Anyone who collects mushrooms knows that it is better to keep the poisonous and the non-poisonous ones apart, not to mention what would happen if someone ate the poisonous ones. In such “classification problems”, which require us to distinguish objects from one another and assign them to classes on the basis of their characteristics, computers can already provide useful support to humans.
Intelligent machine learning methods can recognise patterns or objects and automatically pick them out of data sets. For example, they could pick out those pictures from a photo database that show non-toxic mushrooms. Particularly with very large and complex data sets, machine learning can deliver valuable results that humans would not be able to find, or only with much more time. However, for certain computational tasks, even the fastest computers available today reach their limits. This is where the great promise of quantum computers comes into play: that one day they will perform super-fast calculations that classical computers cannot complete in a useful period of time.
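The kind of classification task described above can be sketched in a few lines of Python. This is a deliberately minimal toy, with made-up features and a nearest-centroid rule standing in for a full machine learning model:

```python
# Toy classification task: label a mushroom as toxic or edible from two
# hypothetical features (cap width, stem length). A nearest-centroid
# rule stands in for a real machine learning model.

def centroid(points):
    """Component-wise mean of a list of 2-D feature vectors."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def classify(sample, toxic, edible):
    """Assign the sample to the class whose centroid is closer."""
    ct, ce = centroid(toxic), centroid(edible)
    dt = (sample[0] - ct[0]) ** 2 + (sample[1] - ct[1]) ** 2
    de = (sample[0] - ce[0]) ** 2 + (sample[1] - ce[1]) ** 2
    return "toxic" if dt < de else "edible"

# Invented training examples for illustration only.
toxic_examples = [(8.0, 12.0), (9.0, 11.0)]
edible_examples = [(3.0, 4.0), (2.0, 5.0)]

print(classify((8.5, 11.5), toxic_examples, edible_examples))  # -> toxic
print(classify((2.5, 4.5), toxic_examples, edible_examples))   # -> edible
```

Real systems learn far richer decision rules from far more data, but the shape of the problem is the same: features in, class label out.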
The reason for this “quantum supremacy” lies in physics: quantum computers calculate and process information by exploiting certain states and interactions that occur within atoms or molecules or between elementary particles.
The fact that quantum states can superpose and entangle gives quantum computers access to a fundamentally richer set of processing logic. Unlike classical computers, quantum computers do not calculate with bits, which represent information only as 0 or 1, but with quantum bits or qubits, which correspond to the quantum states of particles. The crucial difference is that per computational step a qubit can realise not only one state, 0 or 1, but also a state in which both superpose. These more general modes of information processing in turn allow for a drastic computational speed-up on certain problems.
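Superposition can be illustrated with a small classical simulation (this is ordinary linear algebra on a laptop, not a quantum computation): a single qubit is a two-component state vector, and a Hadamard gate turns the definite state |0⟩ into an equal superposition of |0⟩ and |1⟩:

```python
import numpy as np

# A single qubit as a 2-component state vector over basis states |0> and |1>.
ket0 = np.array([1.0, 0.0])

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
psi = H @ ket0

# Measurement probabilities are the squared amplitudes (Born rule):
# the qubit is found as 0 or 1 with probability 1/2 each.
probs = np.abs(psi) ** 2
print(probs)  # -> [0.5 0.5]
```

Simulating n qubits this way needs a vector of 2^n amplitudes, which is exactly why classical computers cannot keep up as systems grow, and why genuine quantum hardware is interesting.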
Translating classical wisdom into the quantum realm
These speed advantages of quantum computing are also an opportunity for machine learning applications – after all, quantum computers could process the huge amounts of data that machine learning methods need to improve the accuracy of their results much faster than classical computers can.
However, to really exploit the potential of quantum computing, one has to adapt the classical machine learning methods to the peculiarities of quantum computers. For example, the algorithms, i.e. the mathematical calculation rules that describe how a classical computer solves a certain problem, must be formulated differently for quantum computers. Developing well-functioning “quantum algorithms” for machine learning is not entirely trivial, because there are still a few hurdles to overcome along the way.
On the one hand, this is due to the quantum hardware. At ETH Zurich, researchers currently have quantum computers that work with up to 17 qubits (see “ETH Zurich and PSI found Quantum Computing Hub” of 3 May 2021). However, if quantum computers are to realise their full potential one day, they might need thousands to hundreds of thousands of qubits.
Quantum noise and the inevitability of errors
One challenge that quantum computers face concerns their vulnerability to error. Today’s quantum computers operate with a very high level of “noise”, as errors and disturbances are known in technical jargon. For the American Physical Society, this noise is “the major obstacle to scaling up quantum computers”. No comprehensive solution exists for either correcting or mitigating errors: no way has yet been found to produce error-free quantum hardware, and quantum computers with 50 to 100 qubits are too small to run error-correction software or algorithms.
To a certain extent, one has to live with the fact that errors in quantum computing are in principle unavoidable, because the quantum states on which the concrete computational steps are based can only be distinguished and quantified with probabilities. What can be achieved, on the other hand, are procedures that limit noise and perturbations enough that the calculations nevertheless deliver reliable results. Computer scientists refer to a reliably functioning calculation method as “robust” and in this context also speak of the necessary “error tolerance”.
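The idea that unavoidable, probabilistic errors can still yield reliable answers can be sketched with a toy classical model (the noise rate and repetition scheme here are hypothetical illustrations, not the researchers' method): repeat a noisy computation many times and take a majority vote, and the aggregate result is far more trustworthy than any single run.

```python
import random

random.seed(0)  # fixed seed so the demonstration is repeatable

# Toy model of a noisy computation: the true answer is 1, but each
# individual run is flipped with some noise probability.
def noisy_run(true_answer=1, noise=0.2):
    return true_answer ^ (random.random() < noise)

# Repetition with a majority vote limits the effect of noise: even
# though roughly one run in five is wrong, the vote is almost surely right.
def majority_vote(runs=101):
    ones = sum(noisy_run() for _ in range(runs))
    return 1 if ones > runs // 2 else 0

print(majority_vote())  # almost surely 1, despite 20% noise per run
```

Error handling on real quantum hardware is much subtler than a repetition vote, but the goal is the same: bound the influence of noise so that the final result can be trusted.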
This is exactly what the research group led by Ce Zhang, ETH computer science professor and member of the ETH AI Center, has recently explored, somewhat “accidentally”, during an endeavour to reason about the robustness of classical distributions for the purpose of building better machine learning systems and platforms. Together with Professor Nana Liu from Shanghai Jiao Tong University and Professor Bo Li from the University of Illinois at Urbana-Champaign, they have developed a new approach. It allows them to prove the robustness conditions of certain quantum-based machine learning models, for which the quantum computation is guaranteed to be reliable and the result to be correct. The researchers have published their approach, which is one of the first of its kind, in the scientific journal “npj Quantum Information”.
Protection against errors and hackers
“When we realised that quantum algorithms, like classical algorithms, are prone to errors and perturbations, we asked ourselves how we can estimate these sources of errors and perturbations for certain machine learning tasks, and how we can guarantee the robustness and reliability of the chosen method,” says Zhikuan Zhao, a postdoc in Ce Zhang’s group. “If we know this, we can trust the computational results, even if they are noisy.”
The researchers investigated this question using quantum classification algorithms as an example – after all, errors in classification tasks are tricky because they can affect the real world, for example if poisonous mushrooms were classified as non-toxic. Perhaps most importantly, inspired by other researchers’ recent work applying hypothesis testing in the classical setting, the ETH researchers used the theory of quantum hypothesis testing – which allows quantum states to be distinguished – to determine a threshold above which the assignments of the quantum classification algorithm are guaranteed to be correct and its predictions robust.
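The flavour of such a robustness certificate can be conveyed with a classical toy check (the margin rule below is a hypothetical stand-in, not the paper's quantum hypothesis-testing bound): if the gap between the top two class probabilities exceeds the largest probability shift a bounded perturbation could cause, no such perturbation can change the predicted label.

```python
# Illustrative robustness check. A prediction is "certified" if the gap
# between the top two class probabilities exceeds twice the maximum
# probability shift a bounded perturbation could cause; the factor of 2
# accounts for one class losing mass while the other gains it. This
# rule is a hypothetical stand-in for the paper's quantum threshold.

def certify(probs, max_shift):
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    (top_label, p1), (_, p2) = ranked[0], ranked[1]
    robust = (p1 - p2) > 2 * max_shift
    return top_label, robust

# Confident prediction: certified even if noise shifts probabilities by 0.1.
print(certify({"edible": 0.9, "toxic": 0.1}, max_shift=0.1))   # ('edible', True)
# Borderline prediction: cannot be certified under the same noise level.
print(certify({"edible": 0.55, "toxic": 0.45}, max_shift=0.1)) # ('edible', False)
```

The quantum setting replaces these probability gaps with distinguishability measures on quantum states, but the logic of a threshold-based guarantee is the same.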
With their robustness method, the researchers can even verify whether the classification of an erroneous, noisy input yields the same result as that of a clean, noiseless input. From their findings, they have also developed a protection scheme that can be used to specify the error tolerance of a computation; their robustness concept applies regardless of whether an error has a natural cause or is the result of manipulation in a hacking attack.
“The method can also be applied to a broader class of quantum algorithms,” says Maurice Weber, a doctoral student with Ce Zhang and the first author of the publication. Since the impact of error in quantum computing increases as the system size rises, he and Zhao are now conducting research on this problem. “We are optimistic that our robustness conditions will prove useful, for example, in conjunction with quantum algorithms designed to better understand the electronic structure of molecules.”
Original Article: Early endeavours on the path to reliable quantum machine learning