The Quantum Computer – The Holy Grail of the Quantum Revolution 2.0

Every year, the amount of data we humans produce doubles. That means that in 2018 alone, as many giga-, tera-, peta-, and exabytes will accumulate worldwide as in the whole of human history up to 2017. Behind this growth lies the fact that data, its collection, and its transfer are no longer bound to stationary computers. Smartphones, smart homes, smart clothes, smart factories, smart cities … a great many “smart” things are now connected via the Internet, and they all produce ever more data of their own. The demands on the performance of computer chips are therefore growing exponentially as well. In fact, their computing power has doubled roughly every 18 months over the last 50 years. This growth in the number of components per unit area on integrated circuits follows a law formulated in 1965 by Gordon Moore, who later co-founded Intel.

But it looks as if “Moore’s Law” will soon lose its validity. That does not mean, however, that we have reached the end of the road in the further increase of the efficiency of information processing. There are other ways to build faster computers, and one of them promises machines that are millions, even billions of times more powerful: quantum computers. These are computers that are ultimately based on the bizarre properties of quantum theory itself. With their help, problems could be solved that are still far too complex for the “supercomputers” used today in physics, biology, weather research, and elsewhere. The development of quantum computers would be a technological revolution in its own right, one that would shape the 21st century as much as the development of digital circuits shaped the 20th.

A quantum computer works in an entirely different way from a conventional computer. Classical computers use “bits” as their smallest units of information; a bit has either the state 1 or the state 0 (that is, it can take exactly two values, hence the term “digital”). The so-called von Neumann architecture ensures that the calculation steps are processed sequentially, i.e. bit by bit. Although today’s computers contain components so small that quantum effects play a major role in them, this does not change the fact that their functionality is based entirely on the principles of classical physics.

In contrast, quantum computers are subject to a completely different information theory. Its simplest system is the so-called “quantum bit” (“qubit” for short). Qubits can take both states, 0 and 1, as well as all intermediate values: their state is a combination of the two, weighted with complex numbers. This is due to the ability of quantum states to exist in so-called “superpositions”, combinations of classically mutually exclusive states. This bizarre property of quantum particles was once the catalyst for heated discussions among the fathers of quantum physics. In addition, several quantum particles can be brought into so-called entangled states. It is as if the qubits were connected to each other by an invisible spring. Through a “spooky action at a distance” (a term originally coined by Albert Einstein with ironic intent), they are all in direct contact, so to speak, and each qubit “knows”, as it were, what the others are doing. In this way, entangled qubits are in a superposition of very many different states at the same time (casually speaking: they are in all these states at once), each of which is realized upon measurement with a certain probability and all of which (before any measurement) can be processed simultaneously with the aid of an appropriate algorithm. It is as if many chocolate factories started their production lines at the same time. The more qubits you have, the more states you can process in parallel. Unlike conventional computers, whose computing power increases linearly with the number of computational components, the computing power of a quantum computer increases exponentially with the number of qubits used. The performance of a quantum computer is thus not doubled when another 100 qubits are added to an existing 100, but already when a single qubit is added to those 100. With 10 additional qubits its performance increases roughly a thousandfold (2^10 = 1,024), with 20 new qubits the quantum computer is already about a million times faster, and with 50 new qubits about a million billion times. And with 100 new information carriers, where the performance of a classical computer would merely have doubled, the increase in the performance of a quantum computer can hardly be expressed in numbers anymore.
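A minimal numpy sketch makes this bookkeeping concrete (purely illustrative, not tied to any particular hardware): a qubit is a pair of complex amplitudes, n qubits require 2^n amplitudes, and every added qubit doubles that number, which reproduces the growth factors quoted above.

```python
import numpy as np

# A single qubit: two complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. This one is an equal superposition of 0 and 1.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)
print(np.abs(qubit) ** 2)   # -> [0.5 0.5]: each outcome with 50% probability

# n qubits need 2**n amplitudes; every added qubit doubles the state space.
for extra in [1, 10, 20, 50]:
    growth = 2 ** (100 + extra) / 2 ** 100
    print(f"100 qubits + {extra:2d} -> state space grows {growth:.3g}x")
```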

However, upon closer inspection, the massive parallelization with the help of entangled states is not quite comparable to chocolate factories working in parallel. Information stored and processed in entangled systems is very different from the information processed by ordinary digital computers. Quantum computers do not literally work in parallel; rather, they organize the information in such a way that it is distributed over many entangled components of the overall system. The following example aims to illustrate this: Imagine a book with 100 pages. For an ordinary 100-page classical book, every time you read a page you have captured another 1% of the book’s content, and after reading all the pages one by one you know everything that is in the book. In a quantum book, whose pages are entangled, things are different. Looking at the pages one at a time, one sees only random gibberish, and after reading all the pages one after the other, one still knows very little about the contents of the book. This is because the information in the quantum book is not printed on the individual pages, but is encoded almost exclusively in the correlations of the pages with each other. Anyone who wants to read the book must, so to speak, look at all the pages simultaneously.
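The smallest version of such a quantum book has just two pages: a pair of qubits in a so-called Bell state. A short numpy sketch (a standard textbook construction, not specific to any particular author’s method) shows that each qubit read on its own is pure noise, while the pair together is perfectly correlated; the information sits entirely in the correlations.

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2): two maximally entangled qubits.
psi = np.zeros(4, dtype=complex)
psi[0] = psi[3] = 1 / np.sqrt(2)

rho = np.outer(psi, psi.conj())   # density matrix of the pair
rho4 = rho.reshape(2, 2, 2, 2)    # indices: (a, b, a', b')

# "Reading one page": trace out qubit B, keeping only qubit A.
rho_A = np.trace(rho4, axis1=1, axis2=3)
print(rho_A.real)        # -> [[0.5 0.] [0. 0.5]]: maximally mixed, pure noise

# "Reading both pages at once": outcomes are perfectly correlated,
# only 00 and 11 ever occur, never 01 or 10.
print(np.abs(psi) ** 2)  # -> [0.5 0. 0. 0.5]
```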

It may seem strange that quantum computers were not realized long ago. After all, quantum theory was well established by the time the modern computer was created. Nevertheless, decades passed before physicists embraced the possibilities of quantum information processing. One reason is obvious: for a long time, neither physicists nor computer scientists knew how to deal with the phenomena of superposition and entanglement. But there is a second reason: in the 1940s, the American mathematician Claude Shannon founded classical information theory, which is based on the use of bits. His essay A Mathematical Theory of Communication is still considered the bible of the information age and is one of the most influential scientific works of the 20th century. The problem: Shannon had formulated his theory of information processing very abstractly and generally, and for a long time computer scientists took it for granted that the principle of bits applies to every possible form of information processing. Only much later did they realize that there exist information concepts beyond bits and classical physics – and that calculating with qubits requires a completely new theoretical foundation, one built explicitly around the superposition and entanglement of quantum states, whose characteristics do not fit into classical information theory. The two basic units – bits and qubits – are so different from each other that a qubit cannot be mapped onto classical bits. A corresponding new information theory was not created until the late 1990s, through the combined efforts of physicists and information theorists.

Five problems that today’s computers – no matter how powerful – cannot solve in any realistic time span show what possibilities arise with a quantum computer:

  1. Cryptography: Today’s common (asymmetric) encryption schemes (with a public and a private key, as in the well-known RSA method) rely on the fact that factorizing the product of two very large prime numbers is practically impossible: beyond a certain magnitude of the numbers, this task can no longer be solved by a classical computer in any realistic time. Nearly all common encryption methods depend on this. In 1994, the computer scientist Peter Shor developed an algorithm with which a quantum computer could factorize even the largest prime products used in encryption today within minutes (a toy version of the underlying idea is sketched after this list).
  2. Solution of complex optimization problems: The task of finding the optimal solution among very many variants is considered particularly tricky by mathematicians. The standard problem is that of the traveling salesman: the task is to choose the order in which to visit several places such that the entire route is as short as possible. With only 15 cities there are already over 43 billion possible routes, and with 18 cities over 177 trillion (the counting is spelled out after this list). Related problems occur in industrial logistics, in the design of microchips, and in the optimization of traffic flows. Even for a modest number of points, classical computers thus fail to compute optimal solutions. Quantum computers could solve such optimization problems in a comparatively short time.
  3. Significant applications could arise in the field of artificial intelligence: The deep neural networks used there lead to hard combinatorial optimization problems, which quantum computers could solve much faster and better than classical computers. Quantum computers could detect structures much faster even in very noisy data, and learn correspondingly quickly. Thus a new “mega buzzword” has emerged that combines two individual buzzwords, each of which already stimulates the imagination of many people on its own: quantum machine learning.
  4. Searching in large databases: When searching unsorted data sets, a classical computer must examine each data point individually, so the search time increases linearly with the number of data points; for large amounts of data, the number of computational steps required becomes impractical for a classical computer. In 1996, the Indian-American computer scientist Lov Grover published a quantum algorithm for which the number of required computational steps grows only with the square root of the number of data points. Instead of taking a thousand times longer for one billion data entries than for one million, with the Grover algorithm a quantum computer would take only a little over 30 times longer (see the short calculation after this list).
  5. Finding new chemical compounds: In the simulation of quantum systems, complex optimization problems come up again and again in which the best possible, i.e. energetically most favorable, configuration of the electrons in an atom or molecule must be found. Theoretical physicists and chemists have wrestled with such problems for decades with rather limited success. Quantum computers could represent and model the quantum behavior of the participating electrons directly, since they behave like a quantum system themselves, whereas classical computers must in most cases resort to crude simplifications (a toy version of the classical approach, and where it breaks down, is sketched below). If one could in turn predict and understand molecules and the details of chemical reactions much better than today, it would be conceivable to find new medicines at weekly intervals or to develop far better battery technologies within a month.
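To item 1: Shor’s insight was to reduce factoring to finding the period of modular exponentiation; that period finding is the step a quantum computer performs exponentially faster. A toy sketch of the reduction in Python, with the period found by classical brute force for the illustrative number 15:

```python
from math import gcd

def order(a: int, N: int) -> int:
    """Smallest r > 0 with a**r % N == 1, found by brute force here;
    this is the step Shor's quantum algorithm accelerates."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_via_period(N: int, a: int) -> tuple:
    assert gcd(a, N) == 1, "base must be coprime to N"
    r = order(a, N)
    assert r % 2 == 0, "odd period, try another base"
    return gcd(a ** (r // 2) - 1, N), gcd(a ** (r // 2) + 1, N)

print(factor_via_period(15, 7))   # -> (3, 5): the prime factors of 15
```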
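To item 2: the route counts quoted above are plain combinatorics. A round trip through n cities with a fixed starting city, where the direction of travel does not matter, can be ordered in (n-1)!/2 ways:

```python
from math import factorial

def tour_count(n_cities: int) -> int:
    # (n-1)! orderings from a fixed start, halved because each route
    # and its reverse have the same length.
    return factorial(n_cities - 1) // 2

print(f"{tour_count(15):,}")   # -> 43,589,145,600       (over 43 billion)
print(f"{tour_count(18):,}")   # -> 177,843,714,048,000  (over 177 trillion)
```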
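To item 4: the quoted speed-up of the Grover algorithm is a one-line calculation, comparing linear growth of the number of search steps with square-root growth:

```python
from math import sqrt

small, big = 10**6, 10**9   # one million vs. one billion data entries

# Classical unsorted search: steps grow linearly with the entries.
print(big / small)                        # -> 1000.0 (a thousand times longer)

# Grover search: steps grow with the square root of the entries.
print(round(sqrt(big) / sqrt(small), 1))  # -> 31.6 (a little over 30 times)
```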
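To item 5: on a classical computer, “energetically most favorable” means the lowest eigenvalue of the system’s Hamiltonian matrix. A deliberately tiny, hypothetical two-spin Hamiltonian (a stand-in, not any particular molecule) shows the approach, and also why it collapses for larger systems:

```python
import numpy as np

# Pauli matrices as building blocks for a toy two-spin Hamiltonian.
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
I = np.eye(2)

# Two interacting spins with a small transverse field (illustrative values).
H = np.kron(Z, Z) + 0.5 * (np.kron(X, I) + np.kron(I, X))

# The ground-state energy is the smallest eigenvalue.
print(np.linalg.eigh(H)[0][0])   # -> about -1.414

# The catch: for n particles H is a 2**n x 2**n matrix, so this classical
# approach breaks down quickly; a quantum computer holds such states natively.
```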

Some physicists even hope to be able to calculate any problem in nature with a quantum computer, from the behavior of black holes and the development of the very early universe to superconductivity and the collisions of high-energy elementary particles – all problems that are difficult to treat on classical computers, because complex quantum properties play a key role in them.

However, there are still huge problems to overcome on the path towards functional quantum computers. The most important of these: under the omnipresent influence of heat and radiation, entangled quantum states decay very quickly – often too fast to perform the desired operations without error. Physicists speak in this context of the “decoherence” of quantum states. Working with qubits is like writing on a water surface rather than on a piece of paper: while the paper can last for centuries, writing on water disappears within fractions of a second. Quantum engineers must therefore work at breathtaking speed.
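A crude model of what decoherence does to a single qubit: the off-diagonal entries of its density matrix, which carry the quantum coherence, decay exponentially. The coherence time of 100 microseconds assumed here is purely illustrative.

```python
import numpy as np

# A qubit in equal superposition; the off-diagonal entries of its
# density matrix carry the coherence that quantum computations rely on.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

T2 = 100e-6   # assumed coherence time: 100 microseconds (illustrative)

for t in [0.0, 50e-6, 100e-6, 500e-6]:
    decay = np.exp(-t / T2)       # simple exponential dephasing model
    coherence = abs(rho[0, 1]) * decay
    print(f"t = {t * 1e6:5.0f} us   remaining coherence = {coherence:.3f}")
# The superposition fades toward a classical mixture within fractions of
# a millisecond: the "writing on water".
```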

In order to overcome this hurdle, quantum engineers pursue a twofold strategy: on the one hand, they try to extend the lifetime of the qubits and reduce their susceptibility to errors; on the other hand, they develop algorithms that correct the errors that do occur. With the help of ultra-cold refrigerators, physicists are able to slow down decoherence, and for treating the errors that decoherence causes in individual qubits they are developing ever better methods (so-called quantum error correction) with which the reliability of quantum computers can be increased; the simplest version of this idea is sketched below.
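That simplest version, redundancy plus majority vote, is the classical ancestor of the three-qubit bit-flip code and can be simulated in a few lines (a classical caricature, not a full quantum code): for a raw error rate p, the logical error rate drops to roughly 3p².

```python
import random

def logical_error_rate(p: float, trials: int = 100_000) -> float:
    """Encode one bit as three copies, flip each copy independently with
    probability p, decode by majority vote, and count decoding failures."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(3))
        if flips >= 2:            # two or more flips fool the majority vote
            failures += 1
    return failures / trials

p = 0.01
print(f"raw error rate:      {p}")
print(f"after majority vote: ~{logical_error_rate(p):.5f}")  # ~3*p**2 = 0.0003
```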

However, these efforts still fall short of enabling reliably functioning quantum computers. Nevertheless, in recent weeks and months, firms like IBM and Google have announced quantum processors consisting of up to 50 qubits. At this size, quantum computers could – at least for some very specific computational problems – surpass the computing capacity of any modern (classical) supercomputer. Google calls this “quantum supremacy” and in 2017 announced that it would reach this state by the end of that same year. We have not heard of it yet, but it is worth reading the science section of the daily newspaper a bit more closely in the coming weeks and months.
