The first quantum circuit lays the foundation for a computing revolution

This first atomic-scale quantum integrated circuit represents an important step toward quantum computing that is useful under real-world conditions.

Australian researchers have just announced the creation of what they describe as the “first quantum integrated circuit fabricated at the atomic scale”; they claim to have fitted all the elements needed for a computer of this type to operate onto a single standard-format chip.

And this circuit is even capable of functioning as a full-fledged quantum processor. It allowed the researchers to simulate the movement of electrons in a small molecule, polyacetylene, whose behavior is already perfectly well understood. They could therefore immediately check the consistency of the results, and by extension the reliability of the chip.
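Polyacetylene is the textbook example of the Su–Schrieffer–Heeger (SSH) tight-binding model, in which electrons hop along a chain of carbon atoms linked by alternating strong and weak bonds. As a purely illustrative sketch of what the chip was asked to reproduce (hypothetical hopping values, and an ordinary classical diagonalization rather than a quantum simulation), the model looks like this:

```python
import numpy as np

def ssh_hamiltonian(n_sites: int, t1: float, t2: float) -> np.ndarray:
    """Single-particle Hamiltonian of an open SSH chain.

    t1 and t2 are the hopping amplitudes across the alternating
    "double" and "single" carbon-carbon bonds (arbitrary units).
    """
    h = np.zeros((n_sites, n_sites))
    for i in range(n_sites - 1):
        t = t1 if i % 2 == 0 else t2  # bond strengths alternate along the chain
        h[i, i + 1] = h[i + 1, i] = -t
    return h

# A 10-site chain mirrors the 10 quantum dots of the experiment.
energies = np.linalg.eigvalsh(ssh_hamiltonian(10, t1=1.0, t2=0.5))
print(np.round(energies, 3))  # energy levels, symmetric about zero
```

Because the chain is bipartite, the energy levels come in ± pairs separated by a gap; it is signatures of this kind that the researchers could compare directly against the chip's output.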

And at the end of the test protocol, the result was clear: the circuit displayed stunning precision during the simulations. According to the researchers, this is enough to “definitively prove the validity of this technology for modeling quantum systems”.

This work, published in the prestigious journal Nature, is very exciting. It is a proof of concept that unquestionably brings us closer to the democratization of quantum computers, even if that day is still far off.

From a simple “transistor” to a real circuit

This circuit is the latest fruit of a long series of works that started in 2012. At the time, quantum computing was even more in its infancy than it is today, and these same researchers had just created the very first “quantum transistor”.

Transistors are small electronic components based on semiconductor materials – the ones whose shortage has set the tech industry ablaze for months. Very briefly, they function like tiny, purely electronic switches; they are therefore fundamental elements of all logic circuits, since they carry the famous “bits”.

It is therefore an eminently important technological foundation for all modern computing. The technology is now very well mastered, and manufacturers perform real feats when it comes to miniaturizing these components. But it is a different story when applying this concept to quantum computing, where everything plays out at the scale of the infinitely small.

Humans already master the production of microscopic components very well… © Umberto – Unsplash

An ultra-demanding manufacturing process

To build their chip, the researchers had to use a scanning tunneling microscope capable of discerning atomic-scale detail. They then had to perform the entire process in an ultra-high vacuum, because at this scale, even a single stray oxygen atom could be a problem.

These are severe constraints that are unfortunately impossible to circumvent if the final chip is to reach the desired level of precision. The process allowed the researchers to arrange a host of quantum dots (QDs) – a name that will certainly ring a bell for fans of display technology.

Concretely, these QDs are structures based on semiconductor materials, like the transistors of current computers – except they measure only a few nanometers, and it is this same technology that serves to produce pixels in some high-end screens. Arranged with extreme precision, they can behave like quantum transistors: just as standard transistors carry bits, these quantum dots can serve as carriers for qubits, the fundamental unit of quantum computing.

But at this scale, tolerances are virtually non-existent. The researchers had to determine the exact number of phosphorus atoms needed in each QD. They then had to determine the position of each dot and arrange them on the chip with a precision well below a nanometer and a margin of error close to zero.

If the dots are too big or too close together, the interactions between them become too strong, and it becomes impossible to control them individually. Conversely, if they are too small or too far apart, these interactions become unpredictable. In both cases, the functioning of the chip is impaired.

…but everything becomes more complicated when working on the scale of elementary particles. © Norbert Kowalczyk

The beginning of a real paradigm shift?

Unsurprisingly, the researchers needed many iterations to build their chip; in the end, they managed to place 10 QDs on it. That represents a lot of effort for a circuit which, despite its precision, remains modest: these 10 qubits are still far too few to be useful under real-world conditions.

But the interest of this work lies more in the method than in the final product. By opening the door to the production of real quantum chips, we can begin to glimpse the first practical and relatively “mainstream” applications (all things considered) of this technology.

Because for the moment, the practical interest of these machines is still very limited; they are mainly exploratory devices that are not really used to carry out concrete work. Moreover, quantum computers are currently reserved for institutions that have substantial technological and financial resources.

Eventually, chips of this kind may serve as a vector to break this exclusivity and democratize quantum computing. The obstacles are still numerous; for a start, it would be necessary to produce a circuit that is much more powerful and capable of operating at room temperature.

Technically, IBM’s Q System One is the first circuit-based quantum computer; but its chip was not yet self-sufficient, unlike the proof of concept of the Australian researchers. © IBM Research

Indeed, to work, current quantum computers must be kept at a temperature close to absolute zero. The challenge is therefore to find a way around this constraint; but at present, no one has found the slightest lead in that direction. And this is just one isolated example among a mountain of limitations that still hamper the development of quantum computing.

It is therefore not tomorrow that this technology will become the norm. But this is undeniably an important step in that direction. In traditional computing, the first transistor appeared in 1947, the first integrated circuit in 1958, and the first personal computers in the 1970s. If quantum computing follows a comparable trajectory (which is anything but guaranteed), the long-awaited computing revolution may well come within a few decades.

The text of the study is available here.
