Intel has taken a key step in its race against Google and IBM to build quantum computing systems. The tech giant has unveiled a superconducting quantum test chip with 49 qubits: enough, eventually, to enable quantum computations that begin to exceed the practical limits of modern classical computers.

Intel CEO Brian Krzanich announced the design and manufacture of the company's new 49-qubit superconducting quantum chip, dubbed Tangle Lake, at the CES 2018 consumer electronics show in Las Vegas. It is a milestone that Google and IBM researchers have also targeted, because it could usher in the moment of so-called "quantum supremacy," when quantum computers can outperform conventional ones.

But Michael Mayberry, vice president and general manager of Intel Labs, chose to describe quantum supremacy in different terms. "About 50 qubits is an interesting place from a scientific point of view, because you have reached the point where you cannot completely predict or simulate the quantum behavior of the chip," he says.
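A rough sense of why roughly 50 qubits is the threshold Mayberry describes: exactly simulating an n-qubit chip on a classical machine means storing 2^n complex amplitudes, and the memory required grows exponentially. The sketch below is an illustrative back-of-the-envelope calculation, not anything from Intel:

```
# Rough illustration (not from Intel): memory needed to store the full
# state vector of an n-qubit system, assuming 16 bytes per complex
# amplitude (two 64-bit floats).

def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    """Bytes required for a dense n-qubit state vector of 2**n amplitudes."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 49):
    print(f"{n} qubits: {state_vector_bytes(n) / 2**30:,.0f} GiB")

# 30 qubits:        16 GiB  (manageable on a laptop)
# 40 qubits:    16,384 GiB  (a large cluster)
# 49 qubits: 8,388,608 GiB  (~8 PiB -- beyond practical classical simulation)
```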

The new announcement puts Intel in good company in the quantum computing race. IBM researchers announced that they had built a 50-qubit prototype chip in November 2017. Similarly, Google had already spoken of its ambition to build a 49-qubit superconducting quantum chip before the end of last year.

There is still a long way to go before anyone realizes the commercial promise of quantum computing, which exploits quantum bits (qubits) that can represent more than one state of information at the same time. Intel's roadmap suggests that researchers could reach 1,000-qubit systems within 5 to 7 years. That sounds like a lot, until you realize that many experts think quantum computers will need at least a million qubits to become commercially useful.
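That "more than one state at a time" phrasing refers to superposition. In standard textbook notation (not something from Intel's announcement), an n-qubit register is described by 2^n complex amplitudes at once, whereas n classical bits hold exactly one n-bit value:

```
% A single qubit:  |psi> = a|0> + b|1>,  with |a|^2 + |b|^2 = 1.
% An n-qubit register is a superposition over all 2^n basis states:
\[
  |\Psi\rangle = \sum_{x \in \{0,1\}^n} c_x\,|x\rangle,
  \qquad \sum_{x} |c_x|^2 = 1 .
\]
% So a 49-qubit chip is described by 2^{49} amplitudes, while 49 classical
% bits encode just one of those 2^{49} values at any given moment.
```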

But practical quantum computing requires much more than ever-larger arrays of qubits. One important step is implementing "surface code" error correction, capable of detecting and correcting disturbances in the fragile quantum states of individual qubits. Another is working out how to map software algorithms onto quantum computing hardware. A third crucial problem is engineering the local electronics needed to control individual qubits and read out the results of quantum computations.

Photo: Intel

Intel's 7-qubit, 17-qubit, and 49-qubit test chips.


The 49-qubit Tangle Lake chip builds on the tech giant's previous work with 17-qubit arrays, which have the minimum number of qubits needed to perform surface code error correction. Intel has also developed packaging that shields the qubits from radio-frequency interference, and uses flip-chip technology that allows smaller, denser connections to get signals on and off the chip. "We are focusing on a system, not just a larger number of qubits," says Mayberry.
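The 17-qubit figure matches the smallest commonly cited "surface-17" layout for a distance-3 rotated surface code: d² data qubits plus d² − 1 measurement (ancilla) qubits. The arithmetic check below is my own illustration; the article does not detail Intel's exact layout:

```
# Qubit count for a distance-d rotated surface code patch:
# d*d data qubits plus d*d - 1 ancilla qubits that measure stabilizers.
# (Illustrative arithmetic only; not a description of Intel's chip design.)

def surface_code_qubits(d):
    data = d * d           # data qubits carrying the encoded state
    ancilla = d * d - 1    # measurement qubits for X and Z stabilizers
    return data + ancilla

print(surface_code_qubits(3))  # 17 -> the minimal error-correcting patch
print(surface_code_qubits(5))  # 49 -> the same formula at distance 5
```

Whether Tangle Lake's 49 qubits are actually arranged as a distance-5 patch is not stated in the article; the second count is shown only to illustrate how the formula scales.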
The superconducting qubit architecture relies on superconducting metal loops that require extremely cold operating temperatures of about 20 millikelvin (roughly -273 degrees C). A "stretch goal" for Intel is to raise these operating temperatures in future systems.
Overall, Intel has hedged its bets on the various possible paths that could lead to practical quantum computing. The technology giant has partnered with QuTech and many other smaller companies to build and test different hardware and software-hardware configurations for quantum computing. The company has also spread its quantum computing investments between the superconducting qubit architecture and a second architecture based on spin qubits in silicon.
Such spin qubits are generally smaller than superconducting qubits and could potentially be made in much the same way that Intel and other chipmakers manufacture conventional computer transistors. That translates into a big potential advantage when scaling to thousands or millions of qubits, although superconducting qubits are generally further along in development than spin qubits. On that front, Intel has already worked out how to make spin qubits using the processes behind its 300-mm silicon wafers.
In addition to quantum computing, Intel has made steady progress in developing neuromorphic computing, which aims to mimic how biological brains work. At CES 2018, Krzanich gave an update on Loihi, the company's neuromorphic research chip unveiled in October 2017. Such chips could provide a specialized hardware counterpart to the deep-learning algorithms that have dominated modern AI research.

Loihi's claim to fame involves combining deep learning and inference on the chip, supposedly performing calculations faster and with greater energy efficiency, says Mayberry. That could be a big deal, because deep-learning algorithms usually take a long time to train on new data sets and to draw new inferences from what they have learned.

Intel researchers recently tested the Loihi chip by training it on tasks such as recognizing a small set of objects within seconds. The company has not yet pushed the neuromorphic chip's capabilities to their limit, says Mayberry. Nevertheless, he predicts that neuromorphic computing products could reach the market within 2 to 4 years, provided customers can run their applications on the Loihi chip without requiring additional hardware modifications.

“Neither quantum computing nor neuromorphic computing will replace general purpose computing,” says Mayberry. “But they can improve it.”


