
Scientists solve critical challenge of error correction for building large scale fault tolerant quantum computers

"The development of a quantum computer is one of the outstanding technological challenges of the 21st century. A quantum computer is a machine that processes information according to the rules of quantum physics, which govern the behaviour of microscopic particles at the scale of atoms and smaller," said Dr Chris Ballance, a research fellow at Magdalen College, Oxford. "It turns out that this quantum-mechanical way of manipulating information gives quantum computers the ability to solve certain problems far more efficiently than any conceivable conventional computer.

"One such problem is related to breaking secure codes, while another is searching large data sets. Quantum computers are naturally well-suited to simulating other quantum systems, which may help, for example, our understanding of complex molecules relevant to chemistry and biology." Quantum computers also have many military applications, such as the efficient decoding of cryptographic codes like RSA; AI and pattern-recognition tasks such as discriminating between a missile and a decoy; and bioinformatics, such as the efficient analysis of new bioengineered threats using Markov Chain Monte Carlo (MCMC) methods.

Quantum systems, however, are naturally fragile: they constantly evolve in uncontrolled ways due to unwanted interactions with the environment, leading to errors in the computation. One of the main difficulties of quantum computation is that decoherence destroys the information contained in a superposition of states, making long computations impossible. Quantum error correction is used to protect quantum information from errors due to decoherence and other quantum noise.

Quantum error correction is essential if one is to achieve fault-tolerant quantum computation that can deal not only with noise on stored quantum information, but also with faulty quantum gates, faulty quantum preparation, and faulty measurements. Demonstrating error correction that actually works is the biggest remaining challenge for building a quantum computer.

Researchers have therefore been developing quantum error correction codes: the ideal code would correct any errors in quantum data while requiring measurement of only a few quantum bits, or qubits, at a time.

Physicists have applied the ability of machine learning algorithms to learn from experience to one of the biggest challenges currently facing quantum computing: quantum error correction, which is used to design noise-tolerant quantum computing protocols. In a new study, they have demonstrated that a type of neural network called a Boltzmann machine can be trained to model the errors in a quantum computing protocol and then devise and implement the best method for correcting the errors.

Quantum Error Correction

Classical error correction employs redundancy, for instance by storing the information multiple times and, if the copies are later found to disagree, taking a majority vote. Copying quantum information is not possible due to the no-cloning theorem, but it is possible to spread the information of one qubit onto a highly entangled state of several physical qubits, even hundreds or thousands of them.
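
To make the classical idea concrete, here is a minimal Python sketch (an illustration, not any particular system's code) of a three-copy repetition code: each bit is stored three times, sent through a noisy channel, and recovered by majority vote.

```python
# Minimal sketch of classical error correction by redundancy: store each bit
# three times and take a majority vote on readout.
import random

def encode(bit):
    """Repeat the bit three times."""
    return [bit, bit, bit]

def noisy_channel(copies, flip_prob=0.1):
    """Flip each stored copy independently with probability flip_prob."""
    return [c ^ 1 if random.random() < flip_prob else c for c in copies]

def decode(copies):
    """Majority vote over the three copies."""
    return 1 if sum(copies) >= 2 else 0

random.seed(0)
trials = [0, 1] * 5000
failures = sum(decode(noisy_channel(encode(b))) != b for b in trials)
print(f"logical error rate: {failures / len(trials):.4f}")  # well below the 10% physical rate
```

A logical error now requires at least two of the three copies to be corrupted, which is why the decoded error rate is far below the physical flip probability.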

Peter Shor first discovered this method of formulating a quantum error-correcting code by storing the information of one qubit onto a highly entangled state of nine qubits. Quantum error-correction codes use these additional qubits to uncover errors without resorting to copying the value of the original qubit. The basis of quantum error correction is measuring parity: the parity of two qubits is defined to be "0" if both have the same value and "1" if they have different values. Crucially, this parity can be determined without actually measuring the values of the individual qubits.
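
The numpy sketch below, a simplified illustration rather than Shor's full nine-qubit code, shows how the two parity checks of the three-qubit bit-flip code locate a flipped qubit from the syndrome alone, without revealing or disturbing the encoded superposition (the amplitudes alpha and beta are arbitrary example values).

```python
# Hedged sketch: parity (syndrome) extraction in the three-qubit bit-flip code,
# simulated with plain numpy state vectors.
import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Encode |psi> = alpha|0> + beta|1> as alpha|000> + beta|111>.
alpha, beta = 0.6, 0.8
state = np.zeros(8)
state[0b000], state[0b111] = alpha, beta

# An unwanted bit-flip error hits qubit 1 (the middle qubit).
state = kron(I2, X, I2) @ state

# Parity checks on qubit pairs (0,1) and (1,2): for a single bit flip the
# expectation values of Z(x)Z are exactly +1 (even) or -1 (odd), so the
# syndrome can be read off without touching the encoded amplitudes.
s01 = state @ kron(Z, Z, I2) @ state
s12 = state @ kron(I2, Z, Z) @ state
syndrome = (s01 < 0, s12 < 0)
flipped = {(False, False): None, (True, False): 0,
           (True, True): 1, (False, True): 2}[syndrome]
print("syndrome:", syndrome, "-> flipped qubit:", flipped)

# Undo the identified flip; the original superposition is recovered intact.
if flipped is not None:
    ops = [X if q == flipped else I2 for q in range(3)]
    state = kron(*ops) @ state
print("recovered amplitudes:", state[0b000], state[0b111])
```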

Scientists at Yale University showed that it is possible to track quantum errors in real time. The team used an ancilla, a more stable "reporter" atom, to detect errors in the system without actually disturbing any qubits. During the experiment, the researchers used a superconducting box containing the reporter atom and an unknown number of photons, cooled to about negative 459°F, a fraction of a degree above absolute zero. The ancilla reports only the photon parity, whether the number of photons in the box changed from even to odd or from odd to even, and not the exact numbers, according to the researchers.

Repetitive error correction

Researchers at the University of California, Santa Barbara (UCSB) and Google have demonstrated repetitive error correction in an integrated quantum device that consists of nine superconducting qubits. “Our nine-qubit system can protect itself from bit errors that unavoidably arise from noise and fluctuations from the environment in which the qubits are embedded,” explains team member Julian Kelly.

The researchers repetitively measured the parity between adjacent “data” qubits by making use of “measurement” qubits. “Each cycle, these measurement qubits interact with their surrounding data qubits using quantum logic gates and we can then measure them,” Kelly explains. “When an error occurs, the parity changes accordingly and the measurement qubit reports a different outcome. By tracking these outcomes, we can figure out when and where a bit error has occurred and correct for it.”
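
A classical-level sketch of such a cycle is shown below; it is an illustration of the bookkeeping only (ordinary bits and arbitrary error rates), not the team's nine-qubit device: measurement bits record the parity of neighbouring data bits every cycle, and a change in an outcome flags when and where a flip occurred.

```python
# Hedged classical-level sketch of a repetition-code cycle: "measurement" bits
# report the parity of neighbouring "data" bits each cycle, and a change in a
# reported parity flags when and where a bit flip happened.
import random

random.seed(1)
n_data, n_cycles, flip_prob = 5, 8, 0.05
data = [0] * n_data                      # logical all-zeros state at the bit level
prev_parity = [0] * (n_data - 1)         # last reported measurement outcomes

for cycle in range(n_cycles):
    # The environment flips each data bit with small probability.
    for i in range(n_data):
        if random.random() < flip_prob:
            data[i] ^= 1
    # Each measurement bit reports the parity of its two neighbouring data bits.
    parity = [data[i] ^ data[i + 1] for i in range(n_data - 1)]
    changed = [i for i in range(n_data - 1) if parity[i] != prev_parity[i]]
    if changed:
        print(f"cycle {cycle}: parity changes at checks {changed}")
    prev_parity = parity

# A decoder would use the recorded pattern of changes to infer which data bits
# flipped, and when, and then correct them.
print("final data bits:", data)
```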

The more qubits that are involved in the process, the more information is available to identify and correct for errors, explains team member Austin Fowler. “Errors can occur at any time and in all types of qubits: data qubits, measurement qubits, during gate operation and even during measurements. We found that a five-qubit device is robust to any type of bit error occurring anywhere during an algorithm, but a nine-qubit device is better because it is robust to any combination of two-bit errors.”

Surface Code Architecture

The Google and UCSB team eventually hopes to build a 2-D surface-code architecture based on a checkerboard arrangement of qubits, in which "white squares" represent the data qubits that perform operations and "black squares" represent measurement qubits that detect errors in neighboring qubits. The measurement qubits are entangled with neighboring data qubits and share information with them through a quantum connection.
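
As a rough illustration of the checkerboard idea (the 5-by-5 grid size and the indexing are arbitrary assumptions of this sketch), the snippet below assigns data and measurement roles on a grid and lists which data qubits a few of the measurement qubits would check.

```python
# Rough illustration of a checkerboard surface-code layout: data qubits on
# "white" squares, measurement qubits on "black" squares, each measurement
# qubit checking the parity of its neighbouring data qubits.
def checkerboard(rows, cols):
    return {(r, c): "data" if (r + c) % 2 == 0 else "measure"
            for r in range(rows) for c in range(cols)}

def neighbours(r, c, rows, cols):
    steps = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    return [(r + dr, c + dc) for dr, dc in steps
            if 0 <= r + dr < rows and 0 <= c + dc < cols]

rows, cols = 5, 5
layout = checkerboard(rows, cols)
for r in range(rows):
    print(" ".join("D" if layout[(r, c)] == "data" else "M" for c in range(cols)))

# Show a few measurement qubits and the data qubits they would check.
measure_sites = sorted(site for site, kind in layout.items() if kind == "measure")
for r, c in measure_sites[:3]:
    checked = [p for p in neighbours(r, c, rows, cols) if layout[p] == "data"]
    print(f"measurement qubit {(r, c)} checks data qubits {checked}")
```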

The surface-code architecture tolerates a lower accuracy of quantum logic operations: about 99 percent, compared with the 99.999 percent required by other quantum error-correction schemes. IBM researchers have also done pioneering work in making surface-code error correction work with superconducting qubits. One IBM group demonstrated a smaller three-qubit system capable of running surface code, although that system had a lower accuracy of 94 percent.

Currently, groups are modifying the material properties of their qubits, improving lithography techniques and improving pulse-shaping techniques to make qubit lifetimes longer. This should increase the fidelity of the qubits and make implementing a surface code less resource-intensive.

Researchers prevent quantum errors from occurring by continuously watching a quantum system

A team of scientists led by Tim Taminiau of QuTech, the quantum institute of TU Delft and TNO, has experimentally demonstrated that errors in quantum computations can be suppressed by repeated observations of the quantum bits, exploiting the so-called quantum Zeno effect. If an observable of a quantum state is measured, the system is projected into an eigenstate of that observable. For example, if a qubit in a superposition of '0' and '1' is observed, it is projected into either '0' or '1' and will remain frozen in that state under repeated further observations.
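
The freezing effect can be reproduced with a small numpy simulation; this is a generic textbook-style sketch, not the QuTech experiment, and the rotation angle, trial count, and seed are arbitrary choices. A qubit slowly driven from '0' toward '1' is projectively measured after each small step, and the more often it is measured, the more likely it is to remain in '0'.

```python
# Hedged sketch of the quantum Zeno effect: frequent projective measurements
# keep a slowly driven qubit "frozen" near |0>.
import numpy as np

rng = np.random.default_rng(0)
theta_total = np.pi / 2   # total rotation: takes |0> all the way to |1> if never measured

def survival_probability(n_measurements, trials=5000):
    """Fraction of trials still found in |0> when the rotation is split into
    n_measurements steps with a projective measurement after each step."""
    step = theta_total / n_measurements
    R = np.array([[np.cos(step), -np.sin(step)],
                  [np.sin(step),  np.cos(step)]])
    survived = 0
    for _ in range(trials):
        psi = np.array([1.0, 0.0])
        alive = True
        for _ in range(n_measurements):
            psi = R @ psi
            if rng.random() < psi[0] ** 2:    # measurement projects back onto |0>
                psi = np.array([1.0, 0.0])
            else:                             # projected onto |1>: the state escaped
                alive = False
                break
        survived += alive
    return survived / trials

for n in [1, 2, 5, 20, 100]:
    print(f"{n:3d} measurements: P(still |0>) ~ {survival_probability(n):.3f}")
```

With a single measurement at the end the qubit has fully rotated into '1', but splitting the same evolution into many observed steps keeps it in '0' with probability approaching one, which is the freezing described above.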

Joint observables

While just freezing a quantum state by projecting a single qubit does not allow for computations, new opportunities arise when observing joint properties of multi-qubit systems. The projection of joint properties of qubits can be explained with the following analogy: consider grouping three-dimensional objects based on their two-dimensional projection. Shapes can still transform within a subgroup (for example between a cube and a cylinder), but unwanted changes (for example to a sphere) are suppressed by the constant observations of the 2D projection. Similarly, the projection of joint observables in multi-qubit systems generates quantum subspaces. In this way, unwanted evolution between different subspaces can be blocked, while the complex quantum states within one subspace allow for quantum computations.
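
The numpy toy below illustrates the key point (it is not the diamond experiment, and the amplitudes and error strength are arbitrary): projecting the joint parity Z⊗Z of two qubits confines the state to the even-parity subspace while leaving a superposition within that subspace untouched, and it removes amplitude that an unwanted rotation tries to leak into the odd subspace.

```python
# Hedged sketch of projecting a joint observable: the two-qubit parity Z(x)Z
# defines an even subspace {|00>, |11>} in which superpositions survive, while
# leakage into the odd subspace {|01>, |10>} is blocked by the projection.
import numpy as np

Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
P_even = (np.eye(4) + np.kron(Z, Z)) / 2   # projector onto the even-parity subspace

alpha, beta = 1 / np.sqrt(3), np.sqrt(2 / 3)
psi = np.array([alpha, 0.0, 0.0, beta])    # alpha|00> + beta|11>

projected = P_even @ psi
print("superposition unchanged by the parity projection:", np.allclose(projected, psi))

# A small unwanted rotation tries to leak amplitude into the odd subspace;
# the parity projection removes that leaked component.
eps = 0.05
leak = np.cos(eps) * np.eye(4) - 1j * np.sin(eps) * np.kron(X, np.eye(2))
noisy = leak @ psi
frozen = P_even @ noisy
frozen /= np.linalg.norm(frozen)
print("overlap with original state after projection:", abs(np.vdot(frozen, psi)))
```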

Diamond

The QuTech scientists experimentally generated quantum Zeno subspaces in up to three nuclear spins in diamond. Joint observables on these nuclear spins are projected via a nearby electronic spin, generating protected quantum states in Zeno subspaces. The researchers show an enhancement in the time that quantum information is protected with an increasing number of projections, and they derive a scaling law that is independent of the number of spins. The presented work allows for the investigation of the interplay between frequent observations and various noise environments. Furthermore, the projection of joint observables is the basis of most quantum error correction protocols, which are essential for useful quantum computations.

New quantum error correction protocol corrects virtually all errors in quantum memory while requiring measurement of only a few qubits

The ideal quantum error correction code would correct any errors in quantum data, and it would require measurement of only a few quantum bits, or qubits, at a time. But until now, codes that could make do with limited measurements could correct only a limited number of errors — one roughly equal to the square root of the total number of qubits. So they could correct eight errors in a 64-qubit quantum computer, for instance, but not 10.

In a paper they’re presenting at the Association for Computing Machinery’s Symposium on Theory of Computing in June, researchers from MIT, Google, the University of Sydney, and Cornell University present a new code that can correct errors afflicting — almost — a specified fraction of a computer’s qubits, not just the square root of their number. And for reasonably sized quantum computers, that fraction can be arbitrarily large — although the larger it is, the more qubits the computer requires.

A quantum computation is a succession of states of quantum bits. The bits are in some state; then they’re modified, so that they assume another state; then they’re modified again; and so on. The final state represents the result of the computation.

In their paper, MIT's Aram Harrow and his colleagues assign each state of the computation its own bank of qubits; it is like turning the time dimension of the computation into a spatial dimension. Suppose that the state of qubit 8 at time 5 has implications for the states of both qubit 8 and qubit 11 at time 6. The researchers' protocol performs an agreement measurement on all three qubits, modifying the state of any qubit that is out of alignment with the other two.
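
As a loose classical analogy only (ordinary bits rather than qubits, with the qubit and time labels borrowed from the example above; this is not the researchers' actual protocol), an agreement check aligns three related bits to their majority value:

```python
# Toy classical analogy: each time step has its own bank of bits, and an
# agreement check over three related positions flips whichever bit disagrees
# with the other two.
banks = {
    5: {8: 1, 11: 0},   # bank of bits for time step 5
    6: {8: 1, 11: 0},   # bank for time step 6; suppose qubit 11 here should agree with qubit 8
}

def agreement_check(values):
    """Align three related bits to their majority value."""
    majority = 1 if sum(values.values()) >= 2 else 0
    return {key: majority for key in values}

related = {
    ("t5", 8): banks[5][8],
    ("t6", 8): banks[6][8],
    ("t6", 11): banks[6][11],
}
print("before agreement check:", related)
print("after agreement check: ", agreement_check(related))
```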

Since the measurement doesn’t reveal the state of any of the qubits, modification of a misaligned qubit could actually introduce an error where none existed previously. But that’s by design: The purpose of the protocol is to ensure that errors spread through the qubits in a lawful way. That way, measurements made on the final state of the qubits are guaranteed to reveal relationships between qubits without revealing their values. If an error is detected, the protocol can trace it back to its origin and correct it.

It may be possible to implement the researchers’ scheme without actually duplicating banks of qubits. But, Harrow says, some redundancy in the hardware will probably be necessary to make the scheme efficient. How much redundancy remains to be seen: Certainly, if each state of a computation required its own bank of qubits, the computer might become so complex as to offset the advantages of good error correction.

But, Harrow says, “Almost all of the sparse schemes started out with not very many logical qubits, and then people figured out how to get a lot more. Usually, it’s been easier to increase the number of logical qubits than to increase the distance — the number of errors you can correct. So we’re hoping that will be the case for ours, too.”

Stephen Bartlett, a physics professor at the University of Sydney who studies quantum computing, doesn’t find the additional qubits required by Harrow and his colleagues’ scheme particularly daunting. “It looks like a lot,” Bartlett says, “but compared with existing structures, it’s a massive reduction. So one of the highlights of this construction is that they actually got that down a lot.”

Machine learning tackles quantum error correction

The physicists, Giacomo Torlai and Roger G. Melko at the University of Waterloo and the Perimeter Institute for Theoretical Physics, have published a paper on the new machine learning algorithm in a recent issue of Physical Review Letters.

“The idea behind neural decoding is to circumvent the process of constructing a decoding algorithm for a specific code realization (given some approximations on the noise), and let a neural network learn how to perform the recovery directly from raw data, obtained by simple measurements on the code,” Torlai told Phys.org. “With the recent advances in quantum technologies and a wave of quantum devices becoming available in the near term, neural decoders will be able to accommodate the different architectures, as well as different noise sources.”

As the researchers explain, a Boltzmann machine is one of the simplest kinds of stochastic artificial neural networks, and it can be used to analyze a wide variety of data. Neural networks typically extract features and patterns from raw data, which in this case is a data set containing the possible errors that can afflict quantum states.

Once the new algorithm, which the physicists call a neural decoder, is trained on this data, it is able to construct an accurate model of the probability distribution of the errors. With this information, the neural decoder can generate the appropriate error chains that can then be used to recover the correct quantum states.
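
The sketch below is a generic restricted Boltzmann machine trained with one-step contrastive divergence on synthetic, correlated binary error patterns; it is an assumption-laden illustration of learning an error distribution and sampling from it, not the authors' neural decoder, and the layer sizes, learning rate, and training data are arbitrary choices.

```python
# Hedged, minimal restricted Boltzmann machine in numpy: learn the distribution
# of synthetic binary error patterns with one-step contrastive divergence, then
# draw samples from the trained model by Gibbs sampling.
import numpy as np

rng = np.random.default_rng(42)
n_visible, n_hidden, lr = 6, 8, 0.1

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Synthetic training data: error patterns where flips come in neighbouring
# pairs (a crude stand-in for spatially correlated physical errors).
def sample_errors(n):
    data = np.zeros((n, n_visible))
    starts = rng.integers(0, n_visible - 1, size=n)
    for row, s in zip(data, starts):
        row[s] = row[s + 1] = 1
    return data

W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b_v = np.zeros(n_visible)
b_h = np.zeros(n_hidden)

data = sample_errors(2000)
for epoch in range(200):
    v0 = data
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    p_v1 = sigmoid(h0 @ W.T + b_v)          # one Gibbs step (CD-1)
    v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)
    W += lr * (v0.T @ p_h0 - v1.T @ p_h1) / len(data)
    b_v += lr * (v0 - v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)

# Samples from the trained model should resemble the training patterns far
# more than uniform noise would.
v = (rng.random(n_visible) < 0.5).astype(float)
for _ in range(500):
    h = (rng.random(n_hidden) < sigmoid(v @ W + b_h)).astype(float)
    v = (rng.random(n_visible) < sigmoid(h @ W.T + b_v)).astype(float)
print("model sample of an error pattern:", v)
```

In the same spirit, the samples drawn from the trained model play the role of likely error chains that a recovery procedure could then apply.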

The researchers tested the neural decoder on quantum topological codes that are commonly used in quantum computing, and demonstrated that the algorithm is relatively simple to implement. Another advantage of the new algorithm is that it does not depend on the specific geometry, structure, or dimension of the data, which allows it to be generalized to a wide variety of problems.

In the future, the physicists plan to explore different ways to improve the algorithm’s performance, such as by stacking multiple Boltzmann machines on top of one another to build a network with a deeper structure. The researchers also plan to apply the neural decoder to more complex, realistic codes.

“So far, neural decoders have been tested on simple codes typically used for benchmarks,” Torlai said. “A first direction would be to perform error correction on codes for which an efficient decoder is yet to be found, for instance Low Density Parity Check codes. On the long term I believe neural decoding will play an important role when dealing with larger quantum systems (hundreds of qubits). The ability to compress high-dimensional objects into low-dimensional representations, from which stems the success of machine learning, will allow to faithfully capture the complex distribution relating the errors arising in the system with the measurements outcomes.”


Quantum computing advances with control of entanglement

Quantum effects at such tiny scales fall apart too easily to be practical for reliably powering computers. Now, a team of scientists in Japan may have overcome this obstacle. Using laser light, they have developed a precise, continuous control technology that is 60 times more successful than previous efforts at sustaining the lifetime of qubits, the units in which quantum computers encode information.

To meet the requirements of different applications, an important technological goal is to increase the number of available entangled qubits. In particular, the researchers have shown that they can continuously create a quantum behavior known as entanglement, entangling more than one million different physical systems, a world record that was limited in their investigation only by data storage space.

“We have demonstrated the deterministic generation and verification of a fully inseparable dual-rail continuous variable (CV) cluster state consisting of more than one million qumodes of light, by employing a time-domain multiplexing scheme. Compared to the previous work, the cluster state generator does not degrade during operation, owing to the continuous feedback of the optical system. We can in principle further increase the number of qumodes, but we stopped at around one million qumodes because of the data size for verification. Time domain multiplexing is one of the key technologies of CV information processing.”

“There is a problem of the lifetime of qubits for quantum information processing. We have solved the problem, and we can continue to do quantum information processing for any time period we want,” explained Akira Furusawa, of the Department of Applied Physics, School of Engineering at the University of Tokyo and lead researcher on the study. “The most difficult aspect of this achievement was continuous phase locking between squeezed light beams, but we have solved the problem.”

The report of their investigation appeared in the journal APL Photonics.

References and Resources also include

http://phys.org/news/2016-08-record-breaking-logic-gate-important-milestone.html

http://www.ibtimes.co.uk/quantum-computing-breakthrough-oxford-university-scientists-achieve-incredibly-accurate-logic-gate-1574965

http://phys.org/news/2016-09-quantum-advances-entanglement.html?utm_source=nwletter&utm_medium=email&utm_campaign=weekly-nwletter

http://phys.org/news/2016-10-quantum-errors.html

https://phys.org/news/2017-08-machine-tackles-quantum-error.html

http://news.mit.edu/2015/quantum-error-correction-0526

