On the Horizon: Quantum Machine Learning

As computers using quantum-mechanical effects begin to appear, a range of very difficult problems may soon be solvable. It all begins with the qubit.

For thousands of years, people have been asking questions and providing answers — some of them profound, others absurd. Humans are inquisitive creatures by nature, and our curiosity has empowered us to come quite far.

In his iconic 1979 book, The Hitchhiker’s Guide to the Galaxy, Douglas Adams unveils the answer to what he calls the Ultimate Question of Life, the Universe and Everything:

What do you get if you multiply six by nine?

“Six by nine. Forty two.”                                                        

“That’s it. That’s all there is.”

“I always thought there is something fundamentally wrong with the universe.”

There is something fundamentally wrong here. Adams provides an answer, but we don’t know the question. Is it possible, knowing the answer, to reverse-engineer the problem and come up with the question? In Adams’s book, the sole purpose of Earth and humanity is to solve this riddle. Such inverse problems are, in fact, extremely difficult, even when we can be confident that the answer is correct. Indeed, these problems constitute the backbone of modern learning algorithms, in which we infer model parameters from observed data. To assess such models’ accuracy and robustness, large volumes of data must be processed, a computationally intensive task.

How much computational power would we need to arrive at the Ultimate Question of Life, the Universe and Everything?

To handle this question, or at least a class of open NP (nondeterministic polynomial time) problems (decision problems whose answers can be verified quickly but that, as far as we know, cannot be solved efficiently by a conventional computer, and that are significant for humanity), we need to reach the next level of supercomputing. All of these tasks, whether in quantum machine learning, artificial intelligence and the Internet of Things (IoT), aerospace engineering, portfolio management, or cybersecurity and cryptography, can be reduced to optimization problems, which generally are too big for conventional digital computers to handle. Neatly parallel problems, which can be broken down and solved very rapidly in parallel, rarely exist in a nonlinear world, and modern computing can deliver only a small portion of what is needed, unless we are cosmically patient and willing to wait millions of years for solutions to emerge.

However, there is an answer on the horizon and it looks like a qubit.

The qubit, or quantum bit, is the key informational unit of quantum computing, which appears likely to become the Holy Grail of quantitative research in years to come. So what is this “sorcery” that is going to dwarf everything we have seen so far in the technology world? Is this how we crack complex cryptography algorithms with ease? Is this how we come up with space travel beyond the limitations of the speed of light? Is quantum-powered artificial intelligence going to put an end to the world as we know it today? Will it provide the Question to Adams’s Ultimate Answer?

We are not attempting to answer these questions in this paper. What we aim to do is present our insights on the topic because we believe in the power of shared knowledge and the wisdom of the crowd. It is only when everyone begins asking the Ultimate Question that the answer will eventually come.

The Importance of Quantum Computing

What makes people think that quantum computing can master complex problems? It’s natural to ask whether quantum computing is really much more powerful than existing supercomputing technologies, which are delivering huge gains compared with just a decade ago. There is no point in developing a sophisticated technology if it brings nothing new to the table. But quantum computing is a very promising candidate not only to revolutionize our computational world but to reveal some of nature’s most hidden secrets.

Before today’s digital computer there was the analog computer, which essentially was a special-purpose machine. Analog computing manipulates potential differences using basic electrical components such as operational amplifiers, which in combination perform complex mathematical tasks. This type of computing is especially well suited to large-scale dynamic systems because analog simulations are tailor-made for specific applications and therefore can execute at a high computational efficiency. Until the early 1950s, analog machines were heavily used in the aircraft and nuclear industries, but they have very limited application today because of the rapid evolution of digital high-performance computing (HPC). However, analog computing may be about to make a remarkable comeback as quantum analog simulators begin to compete with their digital counterparts.

Consider the conventional digital computer, a device that solves problems by operating on data expressed in a binary code — essentially, a combination of zeros (electrical signals switched off) and ones (electrical signals on). These are the two states of the basic information unit called the bit, which is physically generated, stored and manipulated by electrical components such as capacitors and transistors in a digital computer. Following a set of instructions embedded in its memory, the computer manipulates data and performs tasks such as opening an email or writing a document.

In 1942, John Atanasoff, an American-born scientist of Bulgarian origin, and his assistant Clifford Berry built the first automatic electronic digital computer, the Atanasoff–Berry Computer (ABC), a special-purpose machine designed to solve systems of linear equations. Today the digital supercomputers used in industrial research, such as China’s Sunway TaihuLight and the U.S.’s IBM Sequoia, are built on HPC architectures whose processing elements target parallel, throughput-intensive workloads — for example, NVIDIA’s general-purpose graphics-processing units (GPGPUs), Intel’s Xeon Phi many-core processors and field-programmable gate arrays (FPGAs). HPC currently tackles problems that demand very high computing intensity in terms of floating-point operations per second (FLOPS), memory utilization, storage and network usage. Real-world industrial applications include, but are not limited to, nuclear power research, bioinformatics, aerospace engineering, blockchain technology and powerful machine learning libraries such as Google’s TensorFlow; HPC cloud computing is also a major driver of the software industry.

But a significant number of extremely challenging tasks of great scientific importance are simply not solvable with existing computing solutions. As we become ever more deeply immersed in the digital era, cybersecurity will grow only more important in the years to come. Biometric authentication and encrypted communication are needed to protect enterprise data and communications systems that are matters of national security. Further, because the IoT is already becoming widespread, there is growing demand for a highly efficient architecture able to process enormous amounts of data at lightning speed. Machine learning classifiers that identify key subgroups in data, with applications to medical problems, fraud and anomaly detection, are thirsty for quantum power. That is why, to make the next industrial leap, we need quantum computing.

Quantum Computing Basics

Let’s briefly review the fundamentals of quantum computing, without going deeply into quantum mechanics itself, in part because of a truth once uttered by Nobel Prize–winning physicist Richard Feynman: “I think I can safely say that nobody understands quantum mechanics.”

As quants, we believe in numbers and the Feynman–Kac formula. Named in part after Polish-American mathematician Mark Kac, this formula offers a way of solving parabolic partial differential equations by simulating the random paths of a stochastic process. It is a ubiquitous formula in quantitative finance, particularly with respect to Monte Carlo methods in derivatives pricing, and thus it’s a good example of an application of quantum computing.
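To make this concrete, here is a minimal statement of the formula in one common special case (constant discount rate r and no source term), the form most familiar from derivatives pricing; the notation here is ours:

```latex
% Feynman--Kac, special case: u solves the backward parabolic PDE
%   du/dt + mu(x,t) du/dx + (1/2) sigma^2(x,t) d^2u/dx^2 - r u = 0,
%   u(x,T) = psi(x),
% and admits the probabilistic representation below.
\[
  u(x,t) = e^{-r(T-t)}\,\mathbb{E}\!\left[\psi(X_T)\,\middle|\,X_t = x\right],
  \qquad
  dX_s = \mu(X_s,s)\,ds + \sigma(X_s,s)\,dW_s .
\]
```

A Monte Carlo pricer estimates exactly this expectation: It simulates many random paths of X and averages the discounted payoff.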

What is the advantage of quantum computing? It is the ability to process an exponentially large number of states simultaneously. As we’ve noted, the bit is the main unit of information in classical computers and assumes only two values, either zero or one. When you enter the quantum world, you find the qubit, which can assume not only the stable states |0> and |1> (the bit equivalents of 0 and 1) but a multitude of intermediate superposition states.
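In standard Dirac notation, those intermediate states are complex-weighted blends of the two basis states:

```latex
\[
  |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
  \qquad \alpha,\beta \in \mathbb{C},
  \qquad |\alpha|^2 + |\beta|^2 = 1 .
\]
```

Measuring the qubit yields 0 with probability |α|² and 1 with probability |β|²; the classical bit values correspond to the special cases (α, β) = (1, 0) and (0, 1).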

This seems bizarre, the stuff of fiction, but it is quite real. The phenomenon is due to the quantum-mechanical principle known as quantum superposition, by which multiple states can coexist at any given time. Think of the Schrödinger’s cat thought experiment, in which the cat must be regarded as simultaneously dead and alive until the box is opened and the system is observed.

By stringing qubits together, we can describe an exponential space with a linear number of qubits, in which each state in the space occurs with a specific probability. Assuming that quantum computing is error-free, adding a qubit doubles the computing power.
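A small NumPy sketch (ours, purely illustrative) shows why each added qubit doubles the state space: an n-qubit register is described by 2^n complex amplitudes, built up by repeated tensor products.

```python
# Illustrative only: the amplitude vector of an n-qubit register doubles
# in length with every qubit, which is the "exponential space from a
# linear number of qubits" described above.
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)   # one qubit in equal superposition

state = np.array([1.0])
for n in range(1, 6):
    state = np.kron(state, plus)           # tensor on one more qubit
    print(f"{n} qubit(s) -> {state.size} amplitudes")
# prints 2, 4, 8, 16, 32 amplitudes: each qubit doubles the state space
```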

From a computing point of view, this is not as weird as it sounds: Essentially, you are running the full cross product of digital, deterministic simulation threads at once. A quantum computer’s probabilistic architecture lets you explore the whole set of scenarios simultaneously; it goes beyond parallelism for a class of problems known as BQP (bounded-error quantum polynomial time). The quantum computer solves the problem when the outcomes probabilistically converge on what Adams might call the Answer.

There are multiple ways to implement a qubit physically: the polarization of a photon (|0> is horizontal and |1> is vertical); the nuclear-spin direction in a uniform magnetic field; or the electron-spin direction (|0> is up and |1> is down).

But how do we perform a computation on a quantum computer?

Although every quantum computer builds on the same qubit arithmetic, quantum computing models do not all work the same way. There are a number of ways to model a quantum computation: circuit-based, adiabatic, one-way and topological. Here we will tackle only the circuit-based and adiabatic models. Whereas a classical computer uses logic gates to perform computations, a circuit-based quantum computer uses quantum gates. A quantum gate takes one or more qubits and transforms them until each reaches the desired “excited” superposition state; the particular resulting combination of qubits encodes the problem at hand. After the last gate is applied, the qubits are read electronically and collapse (it is as if the world we have built collapses), revealing the solution.
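To ground the gate picture, here is a minimal state-vector simulation in NumPy, a sketch of our own rather than any vendor’s API: a Hadamard gate puts one qubit of a two-qubit register into superposition, a CNOT gate entangles the pair and sampling a measurement models the collapse on readout.

```python
# A minimal state-vector sketch of the circuit model (our illustration,
# not any vendor's API). We simulate a two-qubit register: a Hadamard
# gate creates superposition, a CNOT gate entangles the pair, and
# sampling a measurement models the collapse on readout.
import numpy as np

ket0 = np.array([1.0, 0.0])                   # basis state |0>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I = np.eye(2)                                 # identity (leave qubit alone)
CNOT = np.array([[1, 0, 0, 0],                # flips the second qubit
                 [0, 1, 0, 0],                # when the first qubit is |1>
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.kron(ket0, ket0)      # start in |00>
state = np.kron(H, I) @ state    # H on qubit 1: (|00> + |10>) / sqrt(2)
state = CNOT @ state             # entangle:     (|00> + |11>) / sqrt(2)

# Readout: outcome probabilities are squared amplitudes; drawing one
# outcome is the "collapse" that reveals the answer.
probs = np.abs(state) ** 2
outcome = np.random.choice(4, p=probs)
print(f"collapsed to |{outcome:02b}> (p = {probs[outcome]:.2f})")
```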

The most popular enterprise model today is adiabatic quantum computing, in which the system evolves slowly from an easily prepared initial configuration toward one whose lowest-energy state encodes the optimal solution. Adiabatic computing can be implemented using quantum annealing, which exploits the quantum phenomenon of tunneling. This type of computation is used in the quantum computers built by Vancouver-based D-Wave Systems — in particular, in D-Wave’s newest model, the 2000Q, the only commercially available quantum computer to date.

Adiabatic computing focuses on solving what’s known as a quadratic unconstrained binary optimization (QUBO) problem. A large-scale QUBO is intractable for a classical computer but relatively straightforward for an adiabatic quantum computer. Let’s say there are 60 binary variables to consider and that it takes a second to calculate the energy of a particular combination. With 60 bits, it would take more time than the age of the observable universe to find the minimum over all 2^60 possible combinations. With 60 qubits, the problem is solved instantaneously.
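As a concrete picture of the QUBO formulation, here is a toy brute-force search in Python; the three-variable matrix Q is made up for illustration, and the exhaustive loop over 2^n assignments is precisely the exponential cost that makes the 60-variable case hopeless for a classical machine.

```python
# Toy QUBO: minimize E(x) = x^T Q x over binary vectors x.
# Q is a hypothetical 3-variable example; brute force enumerates all
# 2^n assignments, which is the exponential wall described above.
import itertools
import numpy as np

Q = np.array([[-1.0,  2.0,  0.0],
              [ 0.0, -1.0,  2.0],
              [ 0.0,  0.0, -1.0]])

best_x, best_e = None, float("inf")
for bits in itertools.product([0, 1], repeat=Q.shape[0]):  # 2^n candidates
    x = np.array(bits)
    e = x @ Q @ x                                          # QUBO energy
    if e < best_e:
        best_x, best_e = x, e

print(f"minimum energy {best_e} at x = {best_x}")
```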

Because not every problem arrives stated as this kind of optimization task, you must first properly reformulate it as a QUBO to reap the benefits of the quantum speed-up. Quantum computing is quite task-specific, and there is no clear answer to the question of which problems can be solved. But we know there is a class of problems of great significance that can be tackled — namely, optimization, which underpins many engineering and business applications, including robotics, image classification, natural language processing (of news text, for example) and the search for patterns in biometric and financial data. The latest major breakthrough achieved by adiabatic computing is calculating nonconflicting optimal trajectories for air traffic management: The D-Wave 2000Q takes one second to solve the air traffic problem, with a 99 percent probability of producing a correct answer.
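To illustrate the reformulation step, here is one classic mapping (our example, not the air traffic encoding): splitting a set of numbers into two groups with sums as close as possible becomes a QUBO by writing the imbalance as (S − 2s·x)² and using x_i² = x_i for binary x.

```python
# Number partitioning recast as a QUBO (illustrative example).
# Split the numbers s into two groups with sums as close as possible:
# the imbalance (S - 2*s.x)^2 folds into a matrix Q whose
# minimum-energy binary vector x labels the two groups.
import itertools
import numpy as np

s = np.array([3, 1, 4, 2, 2])    # hypothetical numbers to split
S = s.sum()

Q = 4 * np.outer(s, s)                              # quadratic terms
Q[np.diag_indices_from(Q)] = 4 * s * s - 4 * S * s  # linear terms on diagonal

best = min(itertools.product([0, 1], repeat=len(s)),
           key=lambda bits: np.array(bits) @ Q @ np.array(bits))

group_a = s[np.array(best) == 1]
print(f"group A = {group_a} (sum {group_a.sum()}), group B sum = {S - group_a.sum()}")
```

The minimum-energy x directly labels which group each number belongs to; in real applications, the same pattern of folding penalty terms into Q is how constraints are encoded as well.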

In another interesting configuration, a quantum computer is able to adapt dynamically in response to new data: You can feed a large volume of already labeled data to the computer, which then learns a model that reacts appropriately and labels unseen data accurately. In this way, we can achieve supervised machine learning directly on a quantum computer. Although this is still only theoretically possible, it would immensely accelerate machine learning innovation and propel us into a new age of artificial intelligence.

Quantum Machine Learning

Conventional machine learning techniques use mathematical algorithms to search for patterns in a dataset. With the amount of digital data growing exponentially, the predictive analytics powered by these algorithms can capture hidden and unanticipated traits, surpassing human scale and accuracy. In recent years, these techniques have become powerful tools for many different applications. Further boosted by the rapid adoption of powerful digital supercomputing, machine learning reigns over the modern tech industry, even as certain application domains remain out of reach because of their enormous computational complexity.

The good news is that recent advances in quantum computing have raised hopes that near-term quantum devices can expediently solve computationally intractable problems in simulation, optimization and machine learning. The opportunities that quantum computing raises for machine learning are difficult to overstate: Quantum machine learning will speed up information processing far beyond what is currently possible and will breed innovative research such as quantum deep learning.

At the heart of quantum deep learning is the Harrow-Hassidim-Lloyd (HHL) algorithm, which solves a linear system of equations in time proportional to the logarithm of the input data size. Why is this important? Many fundamental learning techniques employ linear algebra, including regression analysis, support vector machines and Gaussian processes. Solving a linear system in logarithmic time means the algorithm can be applied to exponentially larger datasets with only a linear increase in running time, which is very practical. The more data fed to the computer, the better the results produced by these machine learning methods. As a result, quantum computing is likely to drive machine learning into a new dimension, where data size is no longer of much computational concern.
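For context, this is the problem HHL targets and the usual complexity comparison, with the standard caveats: the matrix must be sparse and well conditioned, and the algorithm outputs a quantum state proportional to the solution rather than the classical solution vector itself.

```latex
\[
  A\vec{x} = \vec{b}, \qquad
  \text{classical (Gaussian elimination): } O(n^3), \qquad
  \text{HHL: } \tilde{O}\!\left(\log(n)\,s^{2}\kappa^{2}/\epsilon\right),
\]
% n: system size; s: sparsity; kappa: condition number; epsilon: precision.
```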

In another application, Berkeley, California–based Rigetti Computing has successfully implemented a clustering algorithm on quantum chips. Clustering is another very important class of machine learning algorithms, one that organizes data into groups based on similarity. Quantum computing can deliver a significant performance boost when the computationally intensive parts of the algorithm are delegated to the quantum chip. A good example of this approach is the use of Grover’s search algorithm to speed up k-medians, a specific clustering algorithm. Grover’s approach delivers a quadratic speedup when searching for an element in an unstructured collection. Imagine that we have a million drawers and are searching for a specific item — say, a red sock. If we’re lucky, only one step is required to find the sock, in the first drawer we open, but we might have to take a million steps before we find it. A classical search must open 500,000 drawers on average, whereas with Grover’s algorithm the number of steps scales with the square root of a million and is guaranteed to be no more than about 1,000.
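The drawer arithmetic can be checked with a small state-vector simulation, a sketch of ours rather than Rigetti’s implementation: repeated rounds of the oracle (flip the marked amplitude) and the diffusion step (invert about the mean) concentrate probability on the marked item in roughly (π/4)√N steps.

```python
# Illustrative Grover search over N = 2^10 = 1,024 "drawers" with one
# marked item. After ~(pi/4)*sqrt(N) oracle+diffusion rounds, nearly all
# probability sits on the marked index -- the quadratic speedup in action.
import numpy as np

n = 10
N = 2 ** n
marked = 637                                 # arbitrary "red sock" index

state = np.full(N, 1 / np.sqrt(N))           # uniform superposition
rounds = int(np.pi / 4 * np.sqrt(N))         # ~25 rounds for N = 1,024

for _ in range(rounds):
    state[marked] *= -1                      # oracle: phase-flip the marked item
    state = 2 * state.mean() - state         # diffusion: inversion about the mean

print(f"{rounds} rounds; P(marked) = {state[marked] ** 2:.3f}")  # ~0.999
```

For a million drawers the same arithmetic gives about (π/4) × 1,000 ≈ 785 rounds, consistent with the “no more than about 1,000 steps” figure above.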

Deep learning is yet another major machine learning subclass that could benefit enormously from quantum power. Thanks to deep learning, machine learning has reached human accuracy levels in previously intractable tasks such as computer vision, speech recognition and natural language processing. At the heart of every deep learning solution is a neural network that mimics how a human brain learns and adapts to new data. Quantum neural networks (QNNs) are still in their infancy, but intensive research is ongoing.

Despite major challenges, quantum machine learning applications are improving rapidly, much like quantum computing itself. Certainly, it is no overstatement to say that the huge interest in machine learning and artificial intelligence — and in their extremely large application domain — is fueling this progress. Tech leaders such as Google and IBM Corp. are funding billion-dollar quantum research projects because they believe in quantum power and are determined to make the quantum future happen.

Challenges of Quantum Computing

Of course, there are hurdles to overcome before we can make the quantum future happen. Quantum engineers have to master the physical qubit, as computing is impossible if the qubit cannot remain in superposition long enough to perform a useful computation. A qubit leaves the classical world for the quantum one for only a fraction of a second (say, about 2.4 milliseconds) before returning to normal. Superposition is a fragile physical state, and its duration depends heavily on the sterility and stability of the system. Any “noise” distorts the qubit’s excitation, rendering the processed signal useless. There has been significant progress on this issue in recent years, and, as you may guess, quantum engineering is converging on the right solution, or perhaps a few solutions.

Quantum stability with reliable manipulation has already been achieved for small numbers of qubits, but scaling up is a challenge. Essentially, the issue is the number of qubits that can be operated by a single processor. Some companies, such as Rigetti Computing, prefer stability over scale, but the parity barrier of 20 qubits, where quantum performance matches that of classical computing, was shattered some time ago. And we are about to break the quantum supremacy barrier of roughly 50 qubits: Google is working on a 49-qubit computer, a team from Harvard University has successfully tested a 51-qubit machine, and IBM is testing a 56-qubit system. A 56-qubit register is like playing 2^56 scenarios of your favorite TV series at once; it is huge.

With respect to quantum stability and scaling, what matters, obviously, is the infrastructure. A high-end classical supercomputer is already tricky to maintain, given the resources it requires, such as large amounts of electrical power. But power may not be the main issue with quantum infrastructure: It is testing very general physical limits. In a fragile, sterilized environment, every detail matters; noise must be eliminated to achieve maximum qubit longevity and stability.

Game Changing and Paradigm Shifts

Humanity is curious by nature; curiosity drives our evolution and progress. Man has long dreamed of revealing nature’s secrets. But to tackle those unknowns, we require greater computing power, which seems to mean that we require quantum magic. That magic itself is a piece of nature’s fantastic puzzle. As Feynman said, “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly, it’s a wonderful problem, because it doesn’t look so easy.”

With IBM leading the quantum race, with the promise of Google introducing a viable quantum computer in the next several years and with all the human brainpower focused on making this scientific breakthrough happen, quantum computing is almost surely a game changer. When this happens, the world will experience a paradigm shift. And yet we do not know how to define success or whether the paradigm shift is going in the right direction.

What we do know is that we should not retreat. It is bound to be a very interesting journey to six by nine. And to 42.


References

Esma Aïmeur, Gilles Brassard and Sébastien Gambs. “Quantum Speed-up for Unsupervised Learning.” Machine Learning 90, no. 2 (2013).

Peter Dockrill. “Scientists Just Broke a New Record for Quantum Computing Stability.” Science Alert (2016).

D-Wave Systems. “Quantum Computing Primer.”

Larry Hardesty. “Long Live the Qubit!” MIT News (2011).

Larry Hardesty. “Qubits with Staying Power.” MIT News (2015).

Mike McRae. “We Are About to Cross the ‘Quantum Supremacy’ Limit in Computing.” Science Alert (2017).

Cade Metz. “The Race to Sell True Quantum Computers Begins Before They Really Exist.” Wired (2017).

J. S. Otterbach, R. Manenti, N. Alidoust, A. Bestwick, M. Block, B. Bloom, S. Caldwell, N. Didier, E. Schuyler Fried, S. Hong, P. Karalekas, C. B. Osborn, A. Papageorge, E. C. Peterson, G. Prawiroatmodjo, N. Rubin, Colm A. Ryan, D. Scarabelli, M. Scheer, E. A. Sete, P. Sivarajah, Robert S. Smith, A. Staley, N. Tezak, W. J. Zeng, A. Hudson, Blake R. Johnson, M. Reagor, M. P. da Silva and C. Rigetti. “Unsupervised Machine Learning on a Hybrid Quantum Computer” (2017).

Patrick Rebentrost, Masoud Mohseni and Seth Lloyd. “Quantum Support Vector Machine for Big Data Classification.” Physical Review Letters 113 (2014).

Tobias Stollenwerk, Bryan O’Gorman, Davide Venturelli, Salvatore Mandrà, Olga Rodionova, Hok K. Ng, Banavar Sridhar, Eleanor G. Rieffel and Rupak Biswas. “Quantum Annealing Applied to De-Conflicting Optimal Trajectories for Air Traffic Management” (2017).

Mike White and Igor Tulchinsky. “Is Quantum Computing Ready to Take a Quantum Leap?” (2017).

Thought Leadership articles are prepared by and are the property of WorldQuant, LLC, and are being made available for informational and educational purposes only. This article is not intended to relate to any specific investment strategy or product, nor does this article constitute investment advice or convey an offer to sell, or the solicitation of an offer to buy, any securities or other financial products. In addition, the information contained in any article is not intended to provide, and should not be relied upon for, investment, accounting, legal or tax advice. WorldQuant makes no warranties or representations, express or implied, regarding the accuracy or adequacy of any information, and you accept all risks in relying on such information. The views expressed herein are solely those of WorldQuant as of the date of this article and are subject to change without notice. No assurances can be given that any aims, assumptions, expectations and/or goals described in this article will be realized or that the activities described in the article did or will continue at all or in the same manner as they were conducted during the period covered by this article. WorldQuant does not undertake to advise you of any changes in the views expressed herein. WorldQuant and its affiliates are involved in a wide range of securities trading and investment activities, and may have a significant financial interest in one or more securities or financial products discussed in the articles.