Intel has taken a key step in its race with Google and IBM to build quantum computing systems. The tech giant has unveiled a superconducting quantum test chip with 49 qubits: enough to eventually enable quantum computing that begins to exceed the practical limits of modern classical computers.
Intel CEO Brian Krzanich announced the design and fabrication of the new 49-qubit superconducting quantum chip, dubbed Tangle Lake, at the 2018 Consumer Electronics Show (CES) in Las Vegas. This is a milestone that Google and IBM researchers have also targeted, because it could usher in the moment of so-called “quantum supremacy,” when quantum computing can outperform conventional computing.
But Michael Mayberry, vice president and general manager of Intel Labs, chose to describe quantum supremacy in different terms. “Around 50 qubits is an interesting place from a scientific point of view, because you have reached the point where you cannot completely predict or simulate the quantum behavior of the chip,” he says.
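A rough back-of-the-envelope calculation (my own illustration, not Intel’s figures) shows why roughly 50 qubits marks that threshold: a classical simulator must store one complex amplitude per basis state, and the number of basis states doubles with every added qubit.

```python
# Memory needed to hold the full state vector of an n-qubit chip
# in a classical simulator, assuming 16 bytes per amplitude
# (one double-precision complex number). Illustrative only.

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """A register of n qubits spans 2**n basis states."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 49):
    pib = state_vector_bytes(n) / 2**50  # convert bytes to pebibytes
    print(f"{n} qubits -> {pib:.3f} PiB")
# 49 qubits -> 8.000 PiB
```

At 49 qubits the state vector alone needs about 8 pebibytes of memory, far beyond any single classical machine, which is the sense in which the chip’s behavior can no longer be fully simulated.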
This new announcement puts Intel in good company in the race for quantum computing. IBM researchers announced that they had built a 50-qubit prototype quantum chip in November 2017. Similarly, Google had already stated its ambition to achieve a 49-qubit superconducting quantum chip before the end of last year.
There is still a long way to go before anyone realizes the commercial promise of quantum computing, which exploits the idea of quantum bits (qubits) that can represent more than one state of information at the same time. The Intel roadmap suggests that researchers could reach 1,000-qubit systems within 5 to 7 years. That sounds like a lot, until you realize that many experts think quantum computers will need at least a million qubits to become commercially useful.
But practical quantum computing also requires much more than ever-larger arrays of qubits. One important step is implementing “surface code” error correction, capable of detecting and correcting disturbances in the fragile quantum states of individual qubits. Another is determining how to map software algorithms onto quantum computing hardware. A third crucial problem concerns engineering the local electronics needed to control individual qubits and read out the results of quantum computations.
In addition to quantum computing, Intel has made steady progress in developing neuromorphic computing, which aims to mimic the functioning of biological brains. At CES 2018, Krzanich also gave an update on Loihi, the company’s neuromorphic research chip unveiled in October 2017. Such chips could provide a specialized hardware counterpart for the deep learning algorithms that have dominated modern AI research.
Loihi’s claim to fame is combining deep learning training and inference on the same chip, which supposedly allows faster computation with greater energy efficiency, says Mayberry. That could be a big deal, because deep learning algorithms usually take a long time to train on new datasets before they can draw inferences from them.
Intel researchers recently tested the Loihi chip by training it, within seconds, on tasks such as recognizing a small set of objects. The company has not yet pushed the neuromorphic chip to its limits, says Mayberry. Nevertheless, he predicts that neuromorphic computing products could reach the market within 2 to 4 years, provided customers can run their applications on the Loihi chip without requiring additional hardware modifications.
“Neither quantum computing nor neuromorphic computing will replace general purpose computing,” says Mayberry. “But they can improve it.”