
Computers: the promises of the quantum dawn





For a long time a mere physicist’s idea, the quantum computer, which promises to revolutionize computing, is becoming an increasingly tangible reality. In a few years, the first machines capable of outperforming conventional computers are expected to appear. This article is part of the TOP5 most read on our site in 2019.

The most powerful supercomputers on the planet could soon be consigned to the prehistory of computing. Within a few years, according to the most optimistic, machines of a new kind, offering phenomenal computing power, will make their appearance: quantum computers. Imagined in the early 1980s by the physics Nobel laureate Richard Feynman, the concept of such a computer is now becoming more and more of a reality. “We are currently living in a pivotal era in which industrial players such as Google and IBM are taking up a subject that had until now remained the preserve of research laboratories, and this promises to take us past major technological milestones,” rejoices Tristan Meunier, of the Institut Néel. Eleni Diamanti, of the Laboratoire d’Informatique de Paris 6, agrees: “In the next few years, we will have quantum computers powerful enough to beat our traditional computers on certain problems.”

The power of superposition and entanglement

As its name suggests, a quantum computer takes advantage of the laws of quantum mechanics, the theory that describes physical phenomena at the atomic scale. These astonishing laws allow a particle, atom or molecule to be in several states at the same time – so-called superposed states. Thus, whereas in an ordinary computer information is encoded as bits that can only take one of two values, 0 or 1, depending on whether electric current passes through a transistor, quantum bits (or qubits) can take the values 0 and 1 simultaneously. What’s more, when two qubits interact, their physical states become “entangled”, so that the two systems can no longer be described independently – these are called entangled states.

Thanks to these two phenomena, superposition and entanglement, a quantum computer can theoretically access all the possible results of a calculation in a single step, whereas a conventional computer has to process the information sequentially, one result after the other. It is this massive parallelism that is at the heart of the power of the quantum computer.
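
To make these two notions more concrete, here is a minimal, purely illustrative numpy sketch (not taken from the researchers quoted): a Hadamard gate puts one qubit into an equal superposition of 0 and 1, and a CNOT gate then entangles it with a second qubit, producing a state whose two parts can no longer be described separately. The state of an n-qubit register is a vector of 2^n complex amplitudes, which is where the massive parallelism mentioned above comes from.

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a 2-component vector of complex amplitudes.
zero = np.array([1, 0], dtype=complex)   # |0>
one  = np.array([0, 1], dtype=complex)   # |1>

# Hadamard gate: puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate on two qubits: flips the second qubit when the first one is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start with both qubits in |0>, i.e. the 4-component vector |00>.
state = np.kron(zero, zero)

# Superpose the first qubit, then entangle it with the second.
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# Result: (|00> + |11>)/sqrt(2), a Bell state: measuring one qubit
# instantly fixes the value of the other -- they are entangled.
print(np.round(state.real, 3))           # [0.707 0.    0.    0.707]
```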

The computational speed of quantum algorithms

As early as the 1990s, researchers were proposing algorithms for such computers, and demonstrating mathematically that, implemented on these machines, they would indeed perform certain calculations at speeds beyond anything imaginable with a conventional computer. Thus, in 1994, the American mathematician Peter Shor, of MIT, presented an algorithm with which it would be possible to factorize any number – that is, to decompose it into a product of prime numbers – in record time.
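
Shor's algorithm is quantum only in its central step: finding the period of a modular exponentiation. The rest is classical number theory, as the illustrative Python sketch below suggests; here the period is found by brute force, which is precisely the part that becomes intractable for large numbers and that the quantum Fourier transform would perform exponentially faster.

```python
from math import gcd
from random import randrange

def shor_classical_sketch(N: int) -> tuple:
    """Factor an odd composite N using the number-theoretic skeleton
    of Shor's algorithm (illustrative only, fine for small N)."""

    def find_order(a: int, N: int) -> int:
        # Smallest r > 0 with a**r = 1 (mod N): the "period". Brute force
        # here; on a quantum computer this single step is done by the
        # quantum Fourier transform, hence the exponential speed-up.
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    while True:
        a = randrange(2, N)
        if gcd(a, N) > 1:                 # lucky guess: a shares a factor with N
            return gcd(a, N), N // gcd(a, N)
        r = find_order(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            p = gcd(pow(a, r // 2) - 1, N)
            if 1 < p < N:
                return p, N // p

print(shor_classical_sketch(21))          # (3, 7) or (7, 3)
```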



“A quantum computer can theoretically access all the possible results of a calculation in a single step, whereas a conventional computer has to process the information sequentially, one result after the other.”

Enough to break most of today’s cryptographic systems, from the encryption of our banking transactions to the codes used to exchange state secrets, which rest precisely on the explosion of the computing time needed to factor ever larger numbers. A problem of this kind, which would take a conventional computer several billion years, could thus be solved in just a few minutes by a quantum computer!

Similarly, in 1997, Lov Grover, from Bell Laboratories, demonstrated with his algorithm that a quantum computer could considerably increase the efficiency of classical algorithms used to search for information in a database.

For example, finding one element among ten thousand entries would require only about a hundred steps, compared with ten thousand for a traditional computer. The time saved in processing massive data sets would be considerable. It is easy to see, then, why companies like Google are interested in this new technology.
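
That “hundred steps” comes from the fact that Grover's algorithm needs on the order of √N queries instead of N, more precisely about (π/4)√N. A quick, purely illustrative back-of-the-envelope check in Python:

```python
import math

def grover_iterations(n_items: int) -> int:
    """Optimal number of Grover iterations: roughly (pi/4) * sqrt(N)."""
    return math.ceil(math.pi / 4 * math.sqrt(n_items))

for n in (10_000, 1_000_000, 10**9):
    print(f"{n:>13,} items: ~{grover_iterations(n):,} quantum steps "
          f"vs ~{n:,} classical steps")

# 10,000 items: ~79 quantum steps vs ~10,000 classical steps -- the
# article's "hundred steps"; the gap widens as the database grows.
```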

The accumulation of qubits

The next step was to develop the basic building blocks of these quantum computers, the famous qubits, capable of being in two states at once. Physicists all over the world quickly set to work. Numerous candidates were tested: atoms, ions, molecules, electrons, photons and superconducting circuits – already with great success in manipulating these qubits. In 2003, Rainer Blatt, of the University of Innsbruck in Austria, realized the first two-qubit logic gate using calcium ions, a key device for performing operations by coupling qubits together.

The finest computational feat to date was accomplished in 2012 by a team from the University of Bristol, England, which managed to factor 21 – that is, to show that this number decomposes into 3 times 7 – using a photonic device. Modest as the performance may be, it represents a proof of principle of Shor’s algorithm, whose power should become apparent for much larger numbers.

As should by now be clear, the advantage of quantum computing over its classical equivalent grows with the amount of information to be processed. In other words, “to be efficient and interesting, a quantum computer must have a large number of qubits. For factorization problems, for example, at the very least a thousand qubits will have to be coupled together,” explains Simon Perdrix, of the Laboratoire lorrain de recherche en informatique et ses applications.

The obstacle of decoherence

And that is the problem. “For a quantum computer to work, its qubits must retain their quantum properties for the duration of the computation. But because of interactions with the environment (magnetic fields, light, thermal agitation…), everything conspires to make a quantum system lose those properties – and all the more so the more qubits the system contains,” explains Sébastien Tanzilli, of the Nice Institute of Physics, who represents France on the Quantum Community Network, the committee in charge of steering the European Quantum Technologies Flagship initiative. This phenomenon, known as decoherence, is the main obstacle to the construction of these computers.




But far from being discouraged, physicists have, from the very beginning, tried to understand the process better and done everything they could to control it. As a result, since the first qubits were developed about twenty-five years ago, their coherence time has continued to increase and ever more qubits have been successfully entangled.

“In order for a quantum computer to work, its qubits must retain their quantum properties for the duration of the computation.”

“Progress in materials, in the design of the qubits and in their manipulation has made it possible to push back, slowly but surely, the limit on the number of qubits, and no one can say today where that limit lies,” says Tristan Meunier. Currently, the record for the number of entangled qubits stands at 20. Google did announce in 2018 that it had produced a quantum processor consisting of 72 qubits, but without demonstrating how many of them could actually be entangled.

Promising implementations

In this race for the number of qubits and the realization of the quantum computer, two systems are today neck and neck and offer the most promising prospects. The first: trapped ions. Developed in the early 1990s, these are atoms – most often calcium – from which one or more electrons have been removed, held in a vacuum and manipulated with lasers. They hold the record for coherence time, which can reach several minutes in some devices. On the other hand, they are slow to manipulate, which would slow down the calculations. Another disadvantage: “The trapping techniques are relatively complicated to set up, so it is hard to see how to scale up and reach a thousand qubits,” notes Sébastien Tanzilli. Some are already imagining solutions for getting there, but the challenge remains daunting.

“Since the development of the first qubits, about 25 years ago, their coherence time has not stopped increasing and we have managed to entangle an ever-increasing number of qubits.”

Second favorite: superconducting circuits. Appearing at the end of the 1990s, these are micrometer-sized electrical circuits with metal electrodes that become superconducting – that is, conducting electricity without resistance – at very low temperatures. Thanks to ultra-thin insulating barriers between these electrodes, called Josephson junctions, these circuits behave like artificial atoms whose quantum state can be manipulated. In terms of decoherence, superconducting qubits do less well than trapped ions, but they are faster to manipulate.

Another advantage, according to Patrice Bertet, of the Condensed State Physics Department at CEA Saclay, a laboratory that played a pioneering role in the development of these systems: “Their manufacturing technique is rather simple, which makes it possible to duplicate them easily and to envisage integrating a large number of them on the same chip.” This explains why these devices are now among the most popular with manufacturers; it is notably the option chosen by IBM and Google for their machines.

More recently, a third contender, an outsider, has joined the race: electron spins in silicon. The idea is to isolate electrons in a silicon matrix and use their spin – a kind of rotation of the particle on itself – as a quantum bit of information. Developed only five years ago, these qubits are still relatively “fragile”, and only two of them have been entangled to date. But by all accounts, they should soon reach the same level of performance as the two pioneering devices. Above all, “they are the only ones that can be integrated on a very large scale.

Indeed, their fabrication relies on exactly the same techniques – already perfectly mastered – as silicon-based micro- and nanoelectronics, which means they can be miniaturized to the extreme,” explains Tristan Meunier. Enthusiastic about the potential of this technology, the researcher is one of the leaders of the QuCube project, conducted in partnership with two other laboratories, whose goal is to develop a silicon-based processor with 100 qubits within six years.

 

So, which of these three candidates will come out ahead and lead to the realization of the first quantum computer? “Impossible to say: each device has advantages that the others do not. None of them can claim victory today,” says Tanzilli.

The indispensable error correction

One thing is certain: improving qubit performance will not be enough. To make the quantum computer a reality, it will also be necessary to correct the computational errors caused by decoherence. Mathematicians and computer scientists understood this very early on and developed quantum error-correcting codes, the equivalent of the error-correction algorithms used in our computers. It has even been shown that, in theory, if the error rate of a qubit is below a certain threshold, errors can be corrected faster than they occur. “The idea of error-correcting codes was a small revolution in the field. With it, even the most pessimistic began to believe in the possibility of a quantum computer,” says Tanzilli.




In principle, these error-correcting codes are simple. The idea is to use a group of several so-called “physical” qubits to encode the information of a single so-called “logical” qubit. By measuring properties of the physical qubits, one can tell – thanks to entanglement – whether the logical qubit has strayed from the desired state, and correct it immediately. In practice, however, their implementation is more complicated: it is estimated that 1,000, or even 10,000, physical qubits would be needed for each logical qubit usable for computation. In other words, the ideal quantum computer would need not a few thousand qubits but a few million! “The manipulation and control of such a large number of qubits is still largely out of reach,” warns Patrice Bertet. That does not prevent physicists from experimenting with these methods on very small numbers of qubits.
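
As a toy illustration of the principle (a deliberately simplified sketch, not one of the codes used by the teams quoted here), the three-qubit repetition code spreads one logical bit over three physical carriers; parity checks then locate a single flipped qubit without ever reading out the logical value itself. The little simulation below, which models only classical bit-flip errors, shows the logical error rate dropping from p to roughly 3p².

```python
import random

def encode(logical_bit: int) -> list:
    """One logical qubit is spread over three physical qubits (bit-flip code)."""
    return [logical_bit] * 3

def apply_noise(physical: list, flip_probability: float) -> list:
    """Decoherence modelled as independent bit flips on each physical qubit."""
    return [b ^ (random.random() < flip_probability) for b in physical]

def correct_and_decode(physical: list) -> int:
    """Parity checks locate a single flipped qubit without reading the
    logical value directly; the faulty qubit is then flipped back."""
    s1 = physical[0] ^ physical[1]        # parity of qubits 1 and 2
    s2 = physical[1] ^ physical[2]        # parity of qubits 2 and 3
    if s1 and not s2:
        physical[0] ^= 1
    elif s1 and s2:
        physical[1] ^= 1
    elif s2 and not s1:
        physical[2] ^= 1
    return physical[0]                    # decoded logical bit

# Compare the raw and corrected error rates over many trials.
p, trials = 0.05, 100_000
raw = sum(random.random() < p for _ in range(trials)) / trials
corrected = sum(
    correct_and_decode(apply_noise(encode(0), p)) for _ in range(trials)
) / trials
print(f"uncorrected error rate ~{raw:.3f}, corrected ~{corrected:.4f}")
# With p = 5%, the logical error rate falls to roughly 3 * p**2, i.e. under 1%.
```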




For her part, Eleni Diamanti acknowledges that a theoretical breakthrough will be needed to make these codes more efficient and therefore less demanding in qubits. “Only then will we have a quantum computer worthy of the name. Computer scientists, mathematicians and physicists are working on it hand in hand, and I am confident that they will one day overcome this problem,” she says.

Towards quantum micro-computers?

But not everyone wants to wait for the emergence of these universal quantum computers which, equipped with the best qubits and the most powerful error-correcting codes, would be capable of performing any complex calculation. “The current trend, which is generating a lot of research, is to identify which problems, with which algorithms, could be solved by intermediate machines containing fewer qubits and lacking an error-correction system,” notes Iordanis Kerenidis, of the Institut de recherche en informatique fondamentale and director of the Paris Centre for Quantum Computing.

The first step towards this goal will be to demonstrate quantum supremacy, i.e. to experimentally prove the advantage of quantum over classical for a given algorithm. In the opinion of specialists, this feat should be achieved within just five years, with the appearance of small quantum computers with 50 to 100 qubits. Researchers are preparing for this and have already identified a type of mathematical problem – a calculation of probability distributions – that will lend itself to such a demonstration. “Its resolution will certainly not be of practical use, but it will launch the use of quantum machines for problems worthy of interest,” says Kerenidis.
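
The problem alluded to is generally a sampling task (this mapping is an assumption; the article does not name the benchmark): a small random quantum circuit is applied to a register and the machine must draw samples from the resulting output distribution, something a classical computer can only do by handling all 2^n amplitudes. The illustrative sketch below uses a random state as a stand-in for a random circuit's output to show why the classical cost explodes with the number of qubits.

```python
import numpy as np

def random_output_distribution(n_qubits: int, seed: int = 0) -> np.ndarray:
    """Output probabilities of a random n-qubit state, standing in for the
    state produced by a random quantum circuit acting on |0...0>.

    Merely storing the state takes 2**n complex amplitudes, which is why
    reproducing such sampling classically becomes hopeless around 50 qubits.
    """
    rng = np.random.default_rng(seed)
    dim = 2 ** n_qubits
    amplitudes = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    amplitudes /= np.linalg.norm(amplitudes)   # a random normalized state
    return np.abs(amplitudes) ** 2             # Born-rule probabilities

probs = random_output_distribution(n_qubits=4)
rng = np.random.default_rng(1)
samples = rng.choice(len(probs), size=5, p=probs)
print([format(int(s), "04b") for s in samples])   # five sampled bitstrings
```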

From computation to quantum simulation

It is the fields of chemistry and materials science that should benefit first. The synthesis of new molecules and the development of materials with novel properties are expected to be greatly accelerated by machines of around 100 qubits. How? By using quantum computers not for calculation but for simulation. The idea, originally Richard Feynman’s, is to imitate complex physical systems (molecules, materials, etc.) using simpler artificial quantum systems: qubits. By varying at will parameters (the distance between atoms, the strength of interactions…) that cannot be adjusted in the real systems, one can model the dynamics of the latter and thus understand them better.
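
Feynman's idea can be illustrated with a deliberately tiny, purely classical example: the properties of a two-spin “material” are obtained by diagonalizing its Hamiltonian, and the coupling strength J can be dialled at will, which is exactly what a quantum simulator would do physically, but for system sizes whose 2^n by 2^n matrices no classical memory could hold.

```python
import numpy as np

# Pauli matrices: the building blocks of spin Hamiltonians.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def two_spin_hamiltonian(J: float, h: float) -> np.ndarray:
    """H = -J * Z1 Z2 - h * (X1 + X2): two spins with a tunable coupling J
    and a transverse field h -- a minimal toy 'material'."""
    ZZ = np.kron(Z, Z)
    X1 = np.kron(X, I2)
    X2 = np.kron(I2, X)
    return -J * ZZ - h * (X1 + X2)

# Vary the interaction strength -- impossible in a real crystal -- and
# watch the ground-state energy of the model respond.
for J in (0.0, 0.5, 1.0, 2.0):
    energies = np.linalg.eigvalsh(two_spin_hamiltonian(J, h=1.0))
    print(f"J = {J:.1f}  ->  ground-state energy {energies[0]:+.3f}")

# For n spins the Hamiltonian is a 2**n x 2**n matrix, which is why a
# quantum simulator (qubits standing in for spins) takes over beyond ~50 spins.
```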

“The advantage of simulation is that decoherence is finally no longer an enemy since the systems we simulate are themselves subject to this phenomenon. No need to have perfect quantum computers.”

Quantum simulation has already produced results, but with the increase in the number of qubits, it promises even more spectacular advances. “The advantage of simulation is that decoherence is no longer an enemy, since the systems being simulated are themselves subject to this phenomenon. There is therefore no need for perfect quantum computers,” emphasizes Simon Perdrix.

With computers of several hundred qubits, many other applications could then be developed.

First of all, optimization tasks of all kinds could be made much more efficient with the help of quantum computing: from road traffic management to the design of energy transmission networks to financial prediction, many sectors stand to benefit.

A revolution in machine learning?

Accelerating computing speed also promises major benefits for machine learning, a highly fashionable artificial-intelligence technique used to analyse and sort the information contained in very large digital databases. Here too, the applications will be numerous: improving search engines on the Internet, for example, or the quality of the information found there.

It is therefore no coincidence that the field of quantum algorithms has never been as active as it is today. Just one example: in 2017, Iordanis Kerenidis presented a machine-learning algorithm that, in theory, makes it possible to recommend films, books or meetings exponentially more efficiently than with current methods. No one can say when a real quantum computer will be born, or even whether it will actually come into being, but along the road to its realization, the prospects for ordinary users promise to be extremely attractive.


