
Researchers Overcoming Error Correction, the Critical Challenge for Building Large-Scale Fault-Tolerant Quantum Computers

In the grand tapestry of 21st-century technology, the development of a “quantum computer” stands out as one of the most challenging and revolutionary pursuits. Quantum computers derive their extraordinary power from the peculiar rules governing quantum bits, or qubits. Unlike their classical counterparts, which are confined to a binary state of 0 or 1, qubits revel in a unique state of superposition, holding values of 0 and 1 simultaneously. Furthermore, the entanglement of two qubits, despite their physical separation, adds an intriguing layer to the quantum realm.

The Quantum Advantage: Parallelism and Efficiency

These extraordinary properties pave the way for a game-changing method of calculation in quantum computers. The ability to consider multiple solutions to a problem simultaneously allows quantum interference to cancel paths leading to incorrect answers while amplifying those leading to the correct one. This parallelism enables quantum computers to swiftly converge on the correct solution without exhaustively exploring each possibility, an approach unimaginable for classical computers. This quantum advantage finds applications in various domains, including the military, cryptography, AI, pattern recognition, and bioinformatics.

Hurdles on the Quantum Odyssey

To fully realize the potential of quantum computers, existing prototypes must meet specific criteria. Firstly, they need to scale up, requiring a substantial increase in the number of qubits. Secondly, they must grapple with the inherent fragility of qubits, susceptible to errors induced by unwanted interactions with the environment. Factors such as electromagnetic fields, heat, or stray atoms contribute to errors that compromise the accuracy of quantum computations.

Quantum Decoherence and the Challenge of Error Correction

Unlike classical bits, qubits’ superposition of states is a double-edged sword. Quantum decoherence, where the information in a superposition of states collapses, poses a significant hurdle for sustained quantum computations.

One of the main difficulties of quantum computation is that decoherence destroys the information held in a superposition of states, making long computations impossible. If a single atom that represents a qubit gets jostled, the information the qubit was storing is lost. Additionally, each step of a calculation has a significant chance of introducing error. As a result, for complex calculations, “the output will be garbage,” says quantum physicist Barbara Terhal of the research center QuTech in Delft, Netherlands. A quantum computer’s susceptibility to errors, whether from external disturbances or internal interactions, demands innovative solutions for error correction.

Quantum Error Correction: A Pioneering Frontier

Researchers have been devising a variety of methods for error correction. The idea behind many of these schemes is to combine multiple error-prone physical qubits into one more reliable qubit. The inspiration comes from classical error correction, which relies on redundancy: store the information several times and, if the copies are later found to disagree, take a majority vote. In contrast to classical bits, however, quantum information cannot be copied, a restriction known as the no-cloning theorem, and it is not possible to diagnose qubit errors exactly without destroying the stored quantum information.
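The classical repeat-and-vote idea is easy to see in code. The following sketch (plain Python, with illustrative parameters) stores a bit three times, flips each copy independently with some probability, and decodes by majority vote:

```python
import random

def encode(bit, copies=3):
    # Classical redundancy: store the same bit several times.
    return [bit] * copies

def noisy_channel(codeword, flip_prob):
    # Each stored copy may flip independently.
    return [b ^ (random.random() < flip_prob) for b in codeword]

def majority_vote(codeword):
    # Decode to whichever value most of the copies agree on.
    return int(sum(codeword) > len(codeword) / 2)

random.seed(0)
trials = 10_000
p = 0.1  # per-copy flip probability (illustrative)
raw_errors = sum(random.random() < p for _ in range(trials))
decoded_errors = sum(
    majority_vote(noisy_channel(encode(1), p)) != 1 for _ in range(trials)
)
print(raw_errors / trials)      # ~0.1 without redundancy
print(decoded_errors / trials)  # ~0.028 with three copies: 3p^2(1-p) + p^3
```

With three copies, a wrong decode needs at least two simultaneous flips, so the error rate drops from p to roughly 3p², which is why redundancy pays off whenever p is small.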

Instead, quantum error correction schemes employ indirect measurements and redundancy in the form of logical qubits, spread across entangled physical qubits.

These schemes must detect and correct errors without directly measuring the qubits, since measurement collapses a qubit’s coexisting possibilities into definite realities: plain old 0s or 1s that can’t sustain quantum computations. So quantum error-correction schemes apply workarounds. Rather than making outright measurements of qubits to check for errors, scientists perform indirect measurements, which “measure what error occurred, but leave the actual information [that] you want to maintain untouched and unmeasured.” For example, scientists can check whether the values of two qubits agree with one another without measuring those values.

And rather than directly copying qubits, error-correction schemes store data in a redundant way, with information spread over multiple entangled qubits, collectively known as a logical qubit. When individual qubits are combined in this way, the collective becomes more powerful than the sum of its parts. Those logical qubits become the error-resistant qubits of the final computer. If your program requires 10 qubits to run, that means it needs 10 logical qubits — which could require a quantum computer with hundreds or even hundreds of thousands of the original, error-prone physical qubits. To run a really complex quantum computation, millions of physical qubits may be necessary.
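The smallest textbook example of a logical qubit is the three-qubit bit-flip code. The sketch below simulates only the code's classical parity logic, not full quantum states: two parity checks locate any single flipped qubit without ever reading out the encoded value itself.

```python
def syndrome(q):
    # Two parity checks on the codeword [q0, q1, q2]; each compares a
    # pair of qubits without revealing the encoded value itself.
    return (q[0] ^ q[1], q[1] ^ q[2])

# Which qubit (if any) to flip back for each syndrome pattern.
CORRECTION = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def correct(q):
    pos = CORRECTION[syndrome(q)]
    if pos is not None:
        q[pos] ^= 1
    return q

for logical in (0, 1):
    for flipped in (None, 0, 1, 2):
        q = [logical] * 3            # encode: repeat the logical bit
        if flipped is not None:
            q[flipped] ^= 1          # inject a single bit-flip error
        assert correct(q) == [logical] * 3
print("any single bit-flip error is corrected")
```

Note that the syndrome (1, 0), say, pins down "qubit 0 flipped" whether the encoded value was 0 or 1; the checks never learn which.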

Surface Code Architecture: A Quantum Beacon of Hope

One promising avenue in quantum error correction is the Surface Code architecture. Designed for superconducting quantum computers, this architecture organizes qubits in a 2D grid, with each qubit connected to its neighbors. The Surface Code architecture, particularly the XZZX variant, provides a scalable solution that can counteract quantum decoherence, making it a viable choice for future quantum experiments.

The surface code is ideal for superconducting quantum computers, like the ones being built by companies including Google and IBM. The code is designed for qubits arranged in a 2-D grid in which each qubit is directly connected to its neighbors, which, conveniently, is the way superconducting quantum computers are typically laid out.

Surface code requires that different qubits have different jobs. Some are data qubits, which store information, and others are helper qubits, called ancillas. Measurements of the ancillas allow for checking and correcting of errors without destroying the information stored in the data qubits. The data and ancilla qubits together make up one logical qubit with, hopefully, a lower error rate. The more data and ancilla qubits that make up each logical qubit, the more errors that can be detected and corrected.
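This division of labor can be pictured as a checkerboard. The layout below is a simplified sketch of the (unrotated) distance-d surface code grid, with data qubits and ancillas alternating; the rendering is illustrative, not a faithful hardware map.

```python
def surface_layout(d):
    # (2d-1) x (2d-1) grid of the unrotated distance-d surface code:
    # data qubits (D) and ancilla/helper qubits (A) alternate.
    size = 2 * d - 1
    return [
        ["D" if (r + c) % 2 == 0 else "A" for c in range(size)]
        for r in range(size)
    ]

layout = surface_layout(3)
for row in layout:
    print(" ".join(row))
n_data = sum(row.count("D") for row in layout)
n_ancilla = sum(row.count("A") for row in layout)
print(n_data, n_ancilla)  # 13 data qubits, 12 ancillas at distance 3
# A larger distance means more parity checks and more correctable errors,
# at the price of d*d + (d-1)*(d-1) data qubits per logical qubit.
```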

In 2015, Google researchers and colleagues performed a simplified version of the surface code, using nine qubits arranged in a line. That setup, reported in Nature, could correct a type of error called a bit-flip error, akin to a 0 going to a 1. A second type of error, a phase flip, is unique to quantum computers, and effectively inserts a negative sign into the mathematical expression describing the qubit’s state.
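The two error types act differently on a qubit's amplitudes. Representing a qubit as a pair of amplitudes is a minimal sketch (single qubit only, normalization assumed):

```python
# A single qubit as a pair of amplitudes (a, b) for states |0> and |1>.
state = (0.6, 0.8)   # an example superposition; 0.6^2 + 0.8^2 = 1

def bit_flip(s):
    a, b = s
    return (b, a)    # X error: exchanges |0> and |1>, like 0 -> 1

def phase_flip(s):
    a, b = s
    return (a, -b)   # Z error: inserts the minus sign on |1>

print(bit_flip(state))    # (0.8, 0.6)
print(phase_flip(state))  # (0.6, -0.8)
# A phase flip leaves the measurement probabilities a^2 and b^2
# unchanged, which is why it has no classical counterpart.
```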

Now, researchers are tackling both types of errors simultaneously. Andreas Wallraff, a physicist at ETH Zurich, and colleagues showed that they could detect bit- and phase-flip errors using a seven-qubit computer. They could not yet correct those errors, but they could pinpoint cases where errors occurred and would have ruined a calculation, the team reported in a paper published in Nature Physics. That’s an intermediate step toward fixing such errors.

The surface code also tolerates relatively inaccurate hardware: quantum logic operations need only about 99 percent accuracy, rather than the roughly 99.999 percent demanded by some other quantum error-correction schemes. IBM researchers have likewise done pioneering work in making surface-code error correction work with superconducting qubits. One IBM group demonstrated a smaller three-qubit system capable of running the surface code, although that system had a lower accuracy of 94 percent.


The Threshold Theorem: Overcoming Quantum Imperfections

The basis of quantum error correction is measuring parity. The parity is defined to be “0” if both qubits have the same value and “1” if they have different values. Crucially, it can be determined without actually measuring the values of both qubits. The error-correction method scientists choose must not introduce more errors than it corrects, and it must correct errors faster than they pop up.
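The claim that parity can be read without reading the qubits themselves can be checked with a tiny statevector simulation. The sketch below (pure Python; the qubit ordering and helper names are our own) copies the parity of two entangled data qubits onto an ancilla with two CNOT gates; the ancilla's outcome is then certain, while the data superposition survives.

```python
import math

N = 3  # qubit order: data0, data1, ancilla (last bit of the index)

def cnot(state, control, target):
    # Permute basis amplitudes: flip `target`'s bit wherever `control`'s is 1.
    out = state[:]
    for i in range(len(state)):
        if (i >> (N - 1 - control)) & 1:
            out[i ^ (1 << (N - 1 - target))] = state[i]
    return out

def prob_ancilla_1(state):
    # Probability that the ancilla (last qubit) would read 1.
    return sum(abs(a) ** 2 for i, a in enumerate(state) if i & 1)

# Even-parity case: data qubits in (|00> + |11>)/sqrt(2), ancilla |0>.
even = [0.0] * 8
even[0b000] = even[0b110] = 1 / math.sqrt(2)
even = cnot(cnot(even, 0, 2), 1, 2)

# Odd-parity case: data qubits in (|01> + |10>)/sqrt(2), ancilla |0>.
odd = [0.0] * 8
odd[0b010] = odd[0b100] = 1 / math.sqrt(2)
odd = cnot(cnot(odd, 0, 2), 1, 2)

print(prob_ancilla_1(even))  # 0.0: ancilla certainly reads parity 0
print(prob_ancilla_1(odd))   # ~1.0: ancilla certainly reads parity 1
# In both cases the data qubits remain in superposition: the parity was
# measured, but the individual qubit values were not.
```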

The computer scientists Dorit Aharonov and Michael Ben-Or (and other researchers working independently) proved in the late 1990s that quantum error-correcting codes could theoretically push error rates close to zero. The resulting threshold theorem, a cornerstone of quantum information theory, asserts that a quantum computer with a physical error rate below a certain threshold can, through quantum error correction, suppress the logical error rate to arbitrarily low levels. This result has ignited optimism about the feasibility of practical quantum computers.

Current estimates put the threshold for the surface code on the order of 1%, though estimates range widely and are difficult to calculate due to the exponential difficulty of simulating large quantum systems. At a 0.1% probability of a depolarizing error, the surface code would require approximately 1,000-10,000 physical qubits per logical data qubit, though more pathological error types could change this figure drastically.
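Those overhead figures follow from a standard scaling heuristic: the logical error rate falls roughly as (p/p_th) raised to (d+1)/2 for code distance d. The calculator below is a rough sketch; the prefactor of 0.1, the 1% threshold, and the rotated-surface-code qubit count of 2d^2 - 1 are illustrative assumptions, not measured values.

```python
import math

def required_distance(p_phys, p_target, p_threshold=0.01, prefactor=0.1):
    # Heuristic: logical error rate ~ prefactor * (p/p_th)^((d+1)/2).
    # Threshold and prefactor values are illustrative assumptions.
    halves = math.log(p_target / prefactor) / math.log(p_phys / p_threshold)
    d = 2 * math.ceil(halves - 1e-9) - 1  # smallest odd distance that suffices
    return max(3, d)

def physical_qubits(d):
    # Rotated surface code: d*d data qubits plus d*d - 1 ancillas.
    return 2 * d * d - 1

for p in (1e-3, 1e-4):
    d = required_distance(p, 1e-12)
    print(p, d, physical_qubits(d))
# At p = 0.1% this lands near 1,000 physical qubits per logical qubit,
# consistent with the range quoted above; at p = 0.01% the cost shrinks fast.
```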

Quantum Error Correction Advances

Researchers from MIT, Google, the University of Sydney, and Cornell University have introduced a groundbreaking quantum error correction code that can correct errors afflicting a specified fraction of a quantum computer’s qubits. Unlike previous codes limited by the square root of the total qubits, this code can address a larger fraction, making it applicable to reasonably sized quantum computers. The researchers treat each state of the quantum computation as a spatial dimension, assigning a bank of qubits to each state. Agreement measurements on these qubits modify their states to ensure lawful error propagation, aiding in error detection and correction without revealing specific qubit values.

While the protocol might require some redundancy in hardware for efficiency, the researchers believe that increasing the number of logical qubits is easier than increasing error-correction distances. Stephen Bartlett, a physics professor at the University of Sydney, considers the additional qubits required by the scheme a significant but manageable overhead compared with existing structures.

In May 2022, a team led by Thomas Monz achieved the first fault-tolerant implementation of a universal set of gates, crucial for programming all quantum algorithms. Using an ion trap quantum computer with 16 trapped atoms, they demonstrated computational operations on two logical quantum bits, including a CNOT gate and a T gate, essential for universality. This achievement marks progress in building practical quantum computers.
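The two gates named here matter because the T gate supplies the non-Clifford ingredient that, together with CNOT, completes a universal set. A small sketch of their matrices (pure Python, no quantum library assumed) confirms the textbook identities T^2 = S and T^4 = Z:

```python
import cmath

# T adds a pi/4 phase to |1>; together with CNOT and the other Clifford
# gates, it completes a universal gate set.
T = [[1, 0],
     [0, cmath.exp(1j * cmath.pi / 4)]]

CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]  # flips the target qubit when the control is |1>

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

S = matmul(T, T)       # T^2 = S, the phase gate (factor i on |1>)
Z = matmul(S, S)       # T^4 = Z, the phase-flip gate (factor -1 on |1>)
print(S[1][1])         # ~1j
print(Z[1][1])         # ~-1
# Z, and hence a phase flip, is just four T gates in a row.
```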

Real-Time Quantum Monitoring: A Quantum Leap

Recent research from Yale University showcases real-time quantum monitoring and feedback, revisiting long-held assumptions about quantum measurement. By continuously observing quantum systems, errors can be detected and reversed mid-flight, presenting a new frontier in quantum control and error prevention.

Recent Advances in Quantum Error Correction: Light at the End of the Tunnel?

Quantum computing’s potential to revolutionize various fields like materials science, drug discovery, and finance is undeniable. But its practical application hinges on overcoming a major roadblock: error correction. The fragile nature of qubits, prone to errors and noise, makes reliable computations a distant dream. However, recent advancements in quantum error correction are injecting a dose of optimism, suggesting the path towards fault-tolerant quantum computers may be getting brighter.

Recent Developments

1. Surface Codes Take Center Stage:

Surface codes have emerged as a leading contender for error correction due to their ability to correct errors without disturbing neighboring qubits. Recent breakthroughs include:

  • Google’s demonstration of surface-code error correction at scale: in 2023 the company reported that a distance-5 surface-code logical qubit, built from 49 physical qubits, slightly outperformed a distance-3 version. This represents a significant step towards large-scale fault-tolerant systems.
  • Development of new topological surface codes: These codes offer improved noise resilience and can be implemented with various qubit technologies.

2. Fault-Tolerant Architectures Gain Momentum:

Designing hardware that inherently minimizes error propagation is another crucial approach. Recent advances include:

  • Diamond-based quantum processors: These offer exceptional stability and long coherence times, making them ideal candidates for fault-tolerant architectures.
  • Topological quantum computers: These leverage the unique properties of exotic materials to achieve inherent error correction.

3. Software Solutions Take Flight:

Error correction algorithms and protocols are crucial for efficient and scalable correction. Recent developments include:

  • Machine learning-powered error correction: This utilizes AI to identify and correct errors in real time, potentially improving efficiency.

Machine learning has been applied to quantum error correction, showcasing the effectiveness of neural networks such as Boltzmann machines. Swedish researchers developed a neural decoder using deep reinforcement learning, achieving performance comparable to handcrafted algorithms. Q-CTRL, an Australian startup, focuses on reducing noise and errors in quantum computers through firmware design. Their approach aims to enhance the resilience of quantum computers and quantum sensors.

Physicists from the University of Waterloo and the Perimeter Institute for Theoretical Physics proposed a machine learning algorithm for quantum error correction. Utilizing a Boltzmann machine, they demonstrated its capability to model error probability distributions and generate error chains for quantum states recovery. The algorithm’s simplicity and generalizability make it a promising tool for larger quantum systems.
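The statistical core of such learned decoders can be imitated in a few lines: estimate, from noisy samples, the most likely error behind each observed syndrome. This toy stand-in uses the three-qubit bit-flip code and simple frequency counting rather than a neural network; the error rate and sample counts are illustrative.

```python
import random
from collections import Counter, defaultdict

def syndrome(err):
    # Parity checks of the 3-qubit bit-flip code applied to an error pattern.
    return (err[0] ^ err[1], err[1] ^ err[2])

def sample_error(p):
    # Each qubit flips independently with probability p.
    return tuple(int(random.random() < p) for _ in range(3))

random.seed(1)
p = 0.05

# "Training": count which error most often produced each syndrome; this is
# the statistical question a neural decoder learns to answer.
counts = defaultdict(Counter)
for _ in range(50_000):
    e = sample_error(p)
    counts[syndrome(e)][e] += 1
decoder = {s: max(c, key=c.get) for s, c in counts.items()}

# Evaluation: decoding succeeds when the guess matches the actual error.
ok = sum(decoder[syndrome(e)] == e
         for e in (sample_error(p) for _ in range(50_000)))
print(ok / 50_000)  # ~0.99, versus ~0.86 for applying no correction at all
```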

  • Hybrid classical-quantum error correction: This combines the strengths of classical and quantum computing for robust error detection and correction.

Harvard’s Breakthrough in Error Correction

A groundbreaking paper published in Nature unveils the potential of Harvard’s quantum computing platform in solving the persistent challenge of quantum error correction.

The foundation of the Harvard platform, developed over several years, lies in an array of laser-trapped rubidium atoms, each serving as a quantum bit or “qubit.” The innovation lies in the dynamic configuration of their “neutral atom array,” enabling the movement and connection of atoms, referred to as “entangling” in quantum physics, during computations. Two-qubit logic gates, entangling pairs of atoms, are crucial units of computing power.

Quantum algorithms involve numerous gate operations, each of which is susceptible to errors that can render the algorithm ineffective. The team reports an unprecedented near-flawless performance of their two-qubit entangling gates, achieving error rates below 0.5 percent. This level of operation quality positions their technology on par with leading quantum computing platforms, such as superconducting qubits and trapped-ion qubits.

Harvard’s approach boasts significant advantages over competitors due to its large system sizes, efficient qubit control, and the ability to dynamically reconfigure atom layouts. Simon Evered, a Harvard Griffin Graduate School of Arts and Sciences student in Lukin’s group and the paper’s first author, emphasizes the potential for large-scale, error-corrected devices based on neutral atoms. The low error rates pave the way for quantum error-corrected logical qubits with even lower errors than individual atoms, marking a significant stride towards scalable and reliable quantum computing.

Photonic Quantum Computers

In September 2021, researchers from DTU Fotonik created a large photonic quantum information processor on a microchip, demonstrating error-correction protocols with photonic quantum bits. The chip’s design allows it to protect itself from errors using entanglement, a crucial step toward scalable quantum computers. Efforts are underway to increase the efficiency of photon sources on the chip to enable the construction of larger-scale quantum photonic devices.

Chalmers University of Technology researchers developed a technique to control quantum states of light in a three-dimensional cavity, addressing a major challenge in quantum computing. The method allows the generation of various quantum states of light, including the elusive cubic phase state, by manipulating a superconducting cavity with electromagnetic pulses. The achievement signifies progress in achieving precise control over quantum states, crucial for the development of practical quantum computers.

4. Experimentation and Verification:

Testing and validating error correction techniques are crucial for progress. Recent achievements include:

  • Demonstration of error correction protocols on real quantum hardware: This validates the theoretical concepts in practice and paves the way for further scaling.
  • Development of benchmark datasets for error correction: These datasets will enable researchers to compare and improve the performance of different error correction techniques.

The Path Forward: Collaborative Innovation

Despite these exciting advancements, significant challenges lie ahead. Scaling up error correction protocols to millions of qubits, optimizing algorithms for efficiency, and minimizing noise sources are crucial hurdles to overcome. However, the relentless pursuit of researchers and the rapid pace of innovation in the field give us reason to believe that the dream of fault-tolerant quantum computers is within reach.

As quantum researchers delve deeper into error correction methodologies, collaboration between academic institutions, industry players, and quantum experts becomes paramount. The journey towards fault-tolerant quantum computation necessitates refining qubit technologies, optimizing error correction techniques, and developing advanced quantum algorithms tailored for large-scale quantum computers.

Conclusion: Navigating the Quantum Seas

The quest for large-scale, fault-tolerant quantum computers is an odyssey filled with challenges and breakthroughs. From the theoretical foundations of quantum error correction to the experimental realization of scalable architectures, researchers are charting unexplored territories in the quantum realm. As we stand on the brink of a quantum revolution, each innovation and discovery brings us closer to unlocking the full potential of quantum computing—a potential that could reshape the landscape of computation and problem-solving in the years to come.




About Rajesh Uppal
