We have been exploiting quantum phenomena since the 1950s, with the development of lasers, semiconductors, and the other technologies that led to modern computing. Those earlier quantum technologies may have used quantum phenomena, but developers could often describe their performance in terms of classical bits of information, said Dr. Marco Lanzagorta, a research physicist at the U.S. Naval Research Laboratory. Today, technologies have advanced to the point where we can use quantum phenomena not just to make devices but to store, process, and analyze a new type of information. This is the revolution of quantum information science.
Quantum computing and quantum information processing are the next revolutionary technologies expected to have immense impact. Quantum information technologies, such as quantum computers, quantum cryptography, radar, clocks, and other quantum systems, rely on the properties of quantum mechanics, which describes the behavior of matter at the subatomic scale. For example, by taking advantage of superposition and entanglement, scientists can use new algorithms on quantum computers to solve certain complex problems exponentially faster than even the most advanced traditional computers in operation today.
Quantum computers will be able to perform tasks too hard for even the most powerful conventional supercomputers and have a host of specific applications, from code-breaking and cyber security to medical diagnostics, big data analysis, and logistics. Quantum computers could also accelerate the discovery of new materials, chemicals, and drugs, leading to dramatic reductions in the current high costs and long lead times involved in developing new drugs.
Types of quantum computing systems
Not all quantum information technologies are the same. There are a few different approaches to creating qubits and using them to store, process, and output information. Those different approaches have varied strengths and limitations that make them suitable for different uses and influence their transition from the lab to the market.
Analogue quantum computers: Most associated with adiabatic quantum computers, quantum annealers, and direct quantum simulators, these types of quantum systems are some of the most developed systems to date. Because they are less capable of reducing noise, which impairs qubit quality, their functionality is currently limited to simpler and more specific use cases.
Noisy intermediate-scale quantum technology (NISQ): NISQ has been described as the next evolution in quantum computing. Although NISQ is unlikely to completely replace analogue quantum computers, NISQ systems are more capable of tolerating noise, meaning they may require fewer qubits before being commercially viable. While improvements against noise are a design feature of NISQ systems, noise will still impose limitations on these systems.
Fully error-corrected quantum computers: By using specially designed algorithms and additional qubits, these computers emulate a noiseless system. Because they require additional qubits to correct errors produced by noise, these systems are even more challenging to develop and may take longer to make commercially viable than analogue or NISQ systems. A fully error-corrected system would be able to solve a variety of complex problems and simulations.
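Real error-corrected quantum machines use sophisticated schemes such as the surface code, but the core idea of spending extra qubits to buy reliability can be illustrated with the simplest classical analogue: a three-bit repetition code. The sketch below is purely illustrative (classical bits and an assumed flip probability `p`, not actual quantum error correction):

```python
import random

def encode(bit):
    """Encode one logical bit redundantly across three physical bits."""
    return [bit, bit, bit]

def apply_noise(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit if at most one flip occurred."""
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
p = 0.05          # assumed per-bit error probability
trials = 100_000
raw_errors = sum(apply_noise([0], p)[0] for _ in range(trials))
coded_errors = sum(decode(apply_noise(encode(0), p)) for _ in range(trials))
print(raw_errors / trials)    # close to p
print(coded_errors / trials)  # close to 3*p**2, much smaller
```

The trade is exactly the one described above: three physical bits per logical bit (and error-corrected quantum computers need far higher overheads), in exchange for a logical error rate that falls from p to roughly 3p².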
Quantum Computing race
Many groups are at the threshold of producing scalable quantum computers. The competitors include IBM, Google, Microsoft, Intel, Amazon, IonQ, Quantum Circuits, Rigetti Computing, Honeywell, and, most recently, Chinese research groups. In October 2017, at the IEEE Industry Summit on the Future of Computing in Washington, D.C., IBM announced the development of a quantum computer capable of handling 50 qubits (quantum bits), at that time the largest and most powerful quantum computer.
Quantum supremacy is the idea that a sufficiently powerful quantum computer will be able to complete certain mathematical calculations that classical supercomputers cannot. The term was coined by John Preskill, a physicist at the California Institute of Technology in Pasadena, to denote a quantum computer that can do something beyond the ken of any classical computer, such as simulating molecular structures in chemistry and materials science, or tackling certain problems in cryptography or machine learning. Many say a quantum computer with about 50 qubits could achieve it. Proving it would be a big deal, because it could kick-start a market for devices that might one day crack previously unbreakable codes, boost AI, improve weather forecasts, or model molecular interactions and financial systems in exquisite detail.
Even before Google unveiled its unprecedented demonstration of quantum computational advantage over classical computing in late 2019, Chinese researchers had reported progress toward their own milestone demonstrations. Chinese physicists claimed to have broken a quantum computing world record by achieving quantum entanglement of 18 qubits, surpassing the previous record of 10. Physicist Jian-Wei Pan, at the University of Science and Technology of China, achieved a stable 18-qubit entangled state, a major step toward quantum processing. Pan was also the previous record holder, having achieved a 10-qubit entangled state in 2017.
Now in Nov 2020, the Chinese team has shown a different but no less exciting path forward for quantum computing by achieving the second known demonstration of quantum computational advantage—also known as quantum supremacy. Led by quantum physicist Jian-Wei Pan at the University of Science and Technology of China, the group built a large tabletop setup consisting of lasers as the light source and beam splitters to help create the individual photons, along with hundreds of prisms and dozens of mirrors to provide the randomized paths for the photons to travel. “One of the challenges with this great demonstration is it’s not programmable,” says Christian Weedbrook, founder and CEO of the Toronto-based quantum computing startup Xanadu. “In terms of applications you do need to have a full programmability, and this has no programmability.”
For its part, Weedbrook’s startup Xanadu has been developing a programmable version of photonic quantum computing based on integrated silicon photonics that could fit on a computer chip rather than a tabletop. Select clients can already access 8- and 12-qubit versions of Xanadu’s machines through a cloud service, and will soon also have access to 24-qubit machines. “We have a goal of demonstrating [fully programmable] quantum supremacy using our approach … sometime next year,” Weedbrook says.
Rigetti Computing is also planning to deploy a 128-qubit quantum computing system, challenging Google, IBM, and Intel for leadership in this emerging technology. According to the company, it is in a “unique position” to advance the technology. CEO Chad Rigetti writes: “Our 128-qubit chip is developed on a new form factor that lends itself to rapid scaling. Because our in-house design, fab, software, and applications teams work closely together, we’re able to iterate and deploy new systems quickly. Our custom control electronics are designed specifically for hybrid quantum-classical computers, and we have begun integrating a 3D signaling architecture that will allow for truly scalable quantum chips. Over the next year, we’ll put these pieces together to bring more power to researchers and developers.”
Microsoft has also doubled down on its quantum computing bet. “People are really building things,” says Christopher Monroe, a physicist at the University of Maryland in College Park who co-founded the start-up IonQ in 2015. “I’ve never seen anything like that. It’s no longer just research.”
Quantum Computer technology
A quantum computer’s basic building block is the quantum bit, or qubit. In a classical computer, a bit can store either a 0 or a 1. A qubit can store not only 0 or 1 but also an in-between state called a superposition—which can assume lots of different values. One analogy is that if information were color, then a classical bit could be either black or white. A qubit when it’s in superposition could be any color on the spectrum, and could also vary in brightness. The upshot is that a qubit can store and process a vast quantity of information compared with a bit—and capacity increases exponentially as you connect qubits together. Storing all the information in the 53 qubits on Google’s Sycamore chip would take about 72 petabytes (72 million gigabytes) of classical computer memory. It doesn’t take many more qubits before you’d need a classical computer the size of the planet.
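The memory claim can be checked directly: an n-qubit state vector has 2^n complex amplitudes. A minimal sketch (assuming single-precision complex amplitudes at 8 bytes each, which is what the 72-petabyte figure implies):

```python
def state_vector_bytes(n_qubits, bytes_per_amplitude=8):
    """Classical memory needed to hold a full n-qubit state vector:
    2**n complex amplitudes, here assumed single precision (8 bytes each)."""
    return (2 ** n_qubits) * bytes_per_amplitude

# Google's 53-qubit Sycamore chip:
print(state_vector_bytes(53) / 1e15)  # ≈ 72 petabytes
# Each additional qubit doubles the requirement:
print(state_vector_bytes(54) / 1e15)  # ≈ 144 petabytes
```

The doubling per qubit is the exponential growth referred to above: by roughly 80 qubits, the requirement exceeds the estimated information capacity of all storage ever manufactured.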
There have been two leading approaches to building general-purpose quantum computers. One approach, adopted by Google, IBM, Rigetti, and Quantum Circuits, involves encoding quantum states as oscillating currents in superconducting loops. Other companies, such as Honeywell and IonQ, have been developing alternative quantum computing architectures based on trapped ions, encoding qubits in single ions held by electric and magnetic fields in vacuum traps. And in Australia, Silicon Quantum Computing has been developing quantum computers based on spin-based silicon qubits.
In both Google’s and IBM’s quantum computers, the qubits themselves are controlled by microwave pulses. Tiny fabrication defects mean that no two qubits respond to pulses of exactly the same frequency. There are two solutions to this: vary the frequency of the pulses to find each qubit’s sweet spot, like jiggling a badly cut key in a lock until it opens; or use magnetic fields to “tune” each qubit to the right frequency. IBM uses the first method; Google uses the second. Each approach has pluses and minuses. Google’s tunable qubits work faster and more precisely, but they’re less stable and require more circuitry. IBM’s fixed-frequency qubits are more stable and simpler, but run more slowly.
“Right now I think both superconductors and ion traps have shown a lot of progress and demonstrated a large number of algorithms. The advantage of trapped ions is that every ion is the same. For these small chains [of ions in the trap] you do get this advantage of basically being able to achieve communication between any pair. In superconducting devices, typically, you are only able to talk to sort of neighbor qubits. So if you have an algorithm which requires a longer distance communication between qubits, there is some cost you have to pay to get the information from one to the other,” says Kenneth Brown of Duke University.
Brown, Kim, and colleague Christopher Monroe (University of Maryland) have written a nice paper on the topic, “Co-Designing a Scalable Quantum Computer with Trapped Atomic Ions.” Superconducting circuitry exploits the significant advantages of modern lithography and fabrication technologies: it can be integrated on a solid-state platform, and many qubits can simply be printed on a chip. However, superconducting qubits suffer from inhomogeneities and decoherence, as no two are the same, and their connectivity cannot be reconfigured without replacing the chip or modifying the wires connecting them within a very low temperature environment.
Trapped atomic ions, on the other hand, feature virtually identical qubits, and their wiring can be reconfigured by modifying externally applied electromagnetic fields. However, atomic qubit switching speeds are generally much slower than solid state devices, and the development of engineering infrastructure for trapped ion quantum computers and the mitigation of noise and decoherence from the applied control fields is just beginning.
A few companies have already been developing photonic quantum computers that may be closer to commercialization, even if they have not yet publicly achieved the Chinese team’s scale. The Palo Alto startup PsiQuantum has remained in stealth mode while raising at least $215 million from investors to develop their own version of a photonic quantum computer.
Still, the Chinese team’s demonstration has given researchers cautious but renewed optimism about photonic technology as a viable road for developing practical quantum computing—especially after many had assumed such large-scale photonics demonstrations would struggle to become a reality. “Photonics was always in the race, but I think that the press [coverage] and the dominance of ion trap and superconducting quantum computers was very strong,” says physicist Philip Walther of the University of Vienna. “So now photons are back.”
IBM’s 50-qubit general-purpose quantum computer based on superconducting circuits, introduced in 2017
In 2017, IBM established a landmark in computing by announcing a quantum computer that handles 50 quantum bits, or qubits. The system IBM has developed is still extremely finicky and challenging to use, as are those being built by others. In both the 50- and the 20-qubit systems, the quantum state is preserved for 90 microseconds—a record for the industry, but still an extremely short period of time. Other systems built so far have had limited capabilities and could perform only calculations that could also be done on a conventional supercomputer. A 50-qubit machine can do things that are extremely difficult to simulate without quantum technology.
Google’s New Quantum Processor claimed quantum supremacy in 2019
In 2019, researchers at Google’s quantum-computing laboratory in Santa Barbara, California, announced the first-ever demonstration of quantum advantage. They used their state-of-the-art Sycamore device, which has 53 quantum bits (qubits) made from superconducting circuits that are kept at ultracold temperatures. But some quantum researchers contested the claim, on the grounds that a better classical algorithm that would outperform the quantum one could exist. And researchers at IBM claimed that its classical supercomputers could in principle already run existing algorithms to do the same calculations in 2.5 days. To convincingly demonstrate quantum advantage, it should be unlikely that a significantly faster classical method could ever be found for the task being tested.
For its experiment, Google chose a benchmarking test called “random quantum circuit sampling.” It generates millions of random numbers, but with slight statistical biases that are a hallmark of the quantum algorithm. If Sycamore were a pocket calculator, it would be the equivalent of pressing buttons at random and checking that the display showed the expected results. Google simulated parts of this on its own massive server farms as well as on Summit, the world’s biggest supercomputer, at Oak Ridge National Laboratory. The researchers estimated that completing the whole job, which took Sycamore 200 seconds, would have taken Summit approximately 10,000 years.
IBM’s objection was that there are different ways to get a classical computer to simulate a quantum machine—and that the software you write, the way you chop up data and store it, and the hardware you use all make a big difference in how fast the simulation can run. IBM said Google assumed the simulation would need to be cut up into a lot of chunks, but Summit, with 280 petabytes of storage, is big enough to hold the complete state of Sycamore at once.
In March 2018, Google unveiled the world’s largest quantum computer processor to date. Dubbed Bristlecone, it is a 72-qubit gate-based superconducting system, surpassing the 50-qubit processor IBM had developed. The Mountain View company’s Research at Google team created the 72-qubit processor by scaling its previous 9-qubit system based on tiny superconducting circuits. It is estimated that a single 50-qubit quantum computer could outperform today’s most powerful mainframes. Researchers at D-Wave, IBM, MIT Lincoln Lab, and elsewhere have also developed high-quality superconducting qubits.
The goal of the Google Quantum AI lab is to build a quantum computer that can be used to solve real-world problems. As the team describes it: “Our strategy is to explore near-term applications using systems that are forward compatible with a large-scale universal error-corrected quantum computer. In order for a quantum processor to be able to run algorithms beyond the scope of classical simulations, it requires not only a large number of qubits. Crucially, the processor must also have low error rates on readout and logical operations, such as single- and two-qubit gates. The purpose of this gate-based superconducting system is to provide a testbed for research into system error rates and scalability of our qubit technology, as well as applications in quantum simulation, optimization, and machine learning.”
The guiding design principle for this device is to preserve the underlying physics of our previous 9-qubit linear array technology which demonstrated low error rates for readout (1%), single-qubit gates (0.1%) and most importantly two-qubit gates (0.6%) as our best result. This device uses the same scheme for coupling, control, and readout, but is scaled to a square array of 72 qubits. Operating a device such as Bristlecone at low system error requires harmony between a full stack of technology ranging from software and control electronics to the processor itself. Getting this right requires careful systems engineering over several iterations.
We chose a device of this size to be able to demonstrate quantum supremacy in the future, investigate first and second order error-correction using the surface code, and to facilitate quantum algorithm development on actual hardware.
If a quantum processor can be operated with a low enough error rate, it will be able to outperform a classical supercomputer on a well-defined computer science problem, an achievement known as quantum supremacy. Google has also developed a benchmarking tool to quantify a quantum processor’s capabilities. We can assign a single system error by applying random quantum circuits to the device and checking the sampled output distribution against a classical simulation.
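Google's published benchmarking method is linear cross-entropy benchmarking (XEB): sample bitstrings from the device, look up each bitstring's ideal probability in a classical simulation, and turn the average into a fidelity score. A toy sketch of that scoring step follows; the 3-qubit "ideal" distribution here is randomly generated stand-in data, not a real circuit simulation:

```python
import random

def linear_xeb_fidelity(ideal_probs, samples, n_qubits):
    """Linear cross-entropy benchmark: scores near 1 for a faithful device,
    near 0 for a device outputting uniform noise."""
    mean_p = sum(ideal_probs[s] for s in samples) / len(samples)
    return (2 ** n_qubits) * mean_p - 1

random.seed(1)
n = 3
# Stand-in for the ideal output distribution of a random 3-qubit circuit
# (random-circuit outputs follow a similarly skewed distribution).
weights = [random.expovariate(1.0) for _ in range(2 ** n)]
total = sum(weights)
ideal = [w / total for w in weights]

faithful = random.choices(range(2 ** n), weights=ideal, k=20_000)  # "good" device
noisy = random.choices(range(2 ** n), k=20_000)                    # uniform noise
print(linear_xeb_fidelity(ideal, faithful, n))  # clearly above the noise score
print(linear_xeb_fidelity(ideal, noisy, n))     # close to zero
```

The intuition: a faithful device lands disproportionately on the high-probability bitstrings, so the average looked-up probability exceeds the uniform value 1/2^n; pure noise does not, and the score collapses to zero.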
“Although no one has achieved this goal yet, we calculate quantum supremacy can be comfortably demonstrated with 49 qubits, a circuit depth exceeding 40, and a two-qubit error below 0.5%. We believe the experimental demonstration of a quantum processor outperforming a supercomputer would be a watershed moment for our field, and remains one of our key objectives.”
In Nov 2020, China claimed to have made the first definitive demonstration of ‘quantum advantage’
A team in China claims to have made the first definitive demonstration of ‘quantum advantage’ — exploiting the counter-intuitive workings of quantum mechanics to perform computations that would be prohibitively slow on classical computers. The researchers used beams of laser light to perform a computation that had been mathematically proven to be practically impossible on normal computers, achieving within a few minutes what would take half the age of Earth on the best existing supercomputers. In contrast to Google’s first demonstration of a quantum advantage, performed in 2019, their version is virtually unassailable by any classical computer. The results appeared in Science on 3 December.
“We have shown that we can use photons, the fundamental unit of light, to demonstrate quantum computational power well beyond the classical counterpart,” says Jian-Wei Pan at the University of Science and Technology of China in Hefei. He adds that the calculation that they carried out — called the boson-sampling problem — is not just a convenient vehicle for demonstrating quantum advantage, but has potential practical applications in graph theory, quantum chemistry and machine learning. “This is certainly a tour de force experiment, and an important milestone,” says physicist Ian Walmsley at Imperial College London.
The Hefei team, led by Pan and Chao-Yang Lu, chose a different problem for its demonstration, called boson sampling. It was devised in 2011 by two computer scientists, Scott Aaronson and Alex Arkhipov, then at the Massachusetts Institute of Technology in Cambridge. It entails calculating the probability distribution of many bosons — a category of fundamental particle that includes photons — whose quantum waves interfere with one another in a way that essentially randomizes the position of the particles. The probability of detecting a boson at a given position can be calculated from an equation in many unknowns (the permanent of a matrix describing the optical network).
But the calculation in this case is a ‘#P-hard problem’, which is even harder than the notoriously tricky NP-hard problems, for which the number of candidate solutions increases exponentially with the number of variables. For many tens of bosons, Aaronson and Arkhipov showed that there is no classical shortcut for the impossibly long calculation. A quantum computer, however, can sidestep the brute-force calculation by simulating the quantum process directly: letting bosons interfere and sampling the resulting distribution. To do this, Pan and colleagues used photons as their qubits, carrying out the task on a photonic quantum computer working at room temperature.
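The object at the heart of boson sampling is the matrix permanent: each photon-detection probability is proportional to |Perm(A)|² for a submatrix A of the interferometer's transfer matrix. Unlike the determinant, the permanent has no known polynomial-time algorithm; even the best exact method, Ryser's formula, takes exponential time. A small sketch:

```python
from itertools import combinations

def permanent(m):
    """Matrix permanent via Ryser's formula: O(2^n * n^2) time, which is
    still exponential -- the root of boson sampling's classical hardness."""
    n = len(m)
    total = 0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1
            for row in m:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** (n - r) * prod
    return total

# Like the determinant but with all plus signs: 1*4 + 2*3 = 10.
print(permanent([[1, 2], [3, 4]]))  # 10
```

The missing alternating signs are what break the determinant's fast Gaussian-elimination trick, which is why the permanent of even a 50-by-50 matrix is out of reach for exact classical computation.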
Starting from laser pulses, the researchers encoded the information in the spatial position and the polarization of particular photon states — the orientation of the photons’ electromagnetic fields. These states were then brought together to interfere with one another and generate the photon distribution that represents the output. The team used photodetectors capable of registering single photons to measure that distribution, which in effect encodes the calculations that are so hard to perform classically. In this way, Pan and colleagues could find solutions to the boson-sampling problem in 200 seconds. They estimate these would take 2.5 billion years to calculate on China’s TaihuLight supercomputer — a quantum advantage of around 10¹⁴.
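The quoted advantage factor follows directly from the two timings (taking a year as roughly 3.16 × 10⁷ seconds):

```python
SECONDS_PER_YEAR = 3.156e7
classical_seconds = 2.5e9 * SECONDS_PER_YEAR  # TaihuLight estimate: 2.5 billion years
quantum_seconds = 200                         # photonic device runtime
print(classical_seconds / quantum_seconds)    # ≈ 3.9e14, i.e. around 10^14
```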
“This is the first time that quantum advantage has been demonstrated using light or photonics,” says Christian Weedbrook, chief executive of quantum-computing startup Xanadu in Toronto, Canada, which is seeking to build practical quantum computers based on photonics. Walmsley says this claim of quantum advantage is convincing. “Because [the experiment] hews very closely to the original Aaronson–Arkhipov scheme, it is unlikely that a better classical algorithm can be found,” he says.
However, Weedbrook points out that as yet, and in contrast to Google’s Sycamore, the Chinese team’s photonic circuit is not programmable, so at this point “it cannot be used for solving practical problems”. But he adds that if the team is able to build an efficient enough programmable chip, several important computational problems could be solved. Among those are predicting how proteins dock to one another and how molecules vibrate, says Lu. Weedbrook notes that photonic quantum computing started later than the other approaches, but it could now “potentially leap-frog the rest”. At any rate, he adds, “It is only a matter of time before quantum computers will leave classical computers in the dust.”
Rigetti Computing is planning to deploy a 128-qubit quantum computing system
The startup Rigetti Computing is trying to build the hardware needed to power a quantum computer. The company aims to produce a prototype chip that is significantly more complex than those built by other groups working on fully programmable quantum computers. The following generation of chips should be able to accelerate some kinds of machine learning and run highly accurate chemistry simulations that might unlock new kinds of industrial processes, says Chad Rigetti, the startup’s founder and CEO. It is also working on software to make it easy for other companies to write code for its quantum hardware.
Rigetti says his company has worked out a qubit design that should be stable enough to scale up, and that can be made using conventional chip-manufacturing techniques. The startup is currently testing a three-qubit chip made using aluminum circuits on a silicon wafer, and the design due next year should have 40 qubits.
The quantum circuitry makes use of superconducting caps for connectivity, while maintaining a vacuum to provide electromagnetic isolation between elements. The idea behind the caps is to lengthen coherence time and limit the effects of cross-talk and environmental noise. To scale this to a 128-qubit computer, the company has come up with a 16-qubit form factor comprising two interconnected 8-qubit rings. Rigetti announced it is working on a 128-qubit chip, which it expects to have completed and available on QCS by August 2019.
Rigetti has also provided a quantum computing development environment, known as Forest, to help developers build applications. The software includes a Quantum Virtual Machine that can simulate a quantum processor with up to 30 qubits. To date, Forest developers have run more than 65 million experiments on the platform, which counts researchers from Oak Ridge, Argonne, and Los Alamos national labs as among its user base.
In conjunction with the development of the 128-qubit hardware, the company will also be looking for ways to enhance the application-software side, initially focusing on quantum simulation, optimization, and machine learning. While this work is going on, Chad Rigetti is trying to keep the big picture in mind. “Quantum advantage comes from creating a solution that is faster, cheaper, or better quality,” he writes. “It’s an open question as to which industry will achieve the first commercially useful applications. But even a small performance improvement over classical machines can add tremendous value for researchers and businesses around the world.”
Scientists at University of Sussex have ‘tamed’ some disruptive environmental effects on quantum computers
A team of scientists, led by Professor Winfried Hensinger at the University of Sussex, has made a major breakthrough concerning one of the biggest problems facing quantum computing: how to reduce the disruptive effects of environmental “noise” on the highly sensitive function of a large-scale quantum computer.
In the real-world, technological developments need to operate in imperfect conditions; what can be successfully tested in a highly controlled laboratory may fail when presented with realistic environmental factors, such as the fluctuations in voltage from an electronic component or stray electromagnetic fields emitted by everyday electronic equipment.
The University of Sussex’s Ion Quantum Technology Group have managed to dramatically reduce the effects of such environmental “noise” affecting trapped ion quantum computers, reporting their findings in an article published on 1 November 2018, in the prestigious journal Physical Review Letters. It means the team is one step closer to building a large-scale quantum computer with the capability to solve challenging real-world problems.
Small-scale quantum computers currently in existence contain only a handful of quantum bits, or qubits – the components of quantum computers that store information and can exist in multiple states at once. As such, current quantum computers are small enough to be operated in a highly controlled environment inside a specialized laboratory. However, such machines do not have the processing power required to solve complex problems, because of their limited number of qubits.
When built, large-scale quantum computers will be able to solve certain problems that would take even the fastest super computers billions of years to calculate. In order to create a quantum computer that can solve such problems, scientists will need to increase the number of qubits, which in turn will increase the size of the quantum computer. The problem is that the more qubits that are added, the more difficult it becomes to isolate the computer from any realistic “noise” that would disrupt the computing processes.
Hensinger’s team of University of Sussex physicists has made a quantum computing breakthrough capable of mitigating some of these problems. They collaborated with theoretical physicist Dr Florian Mintert and colleagues from Imperial College London, who proposed a theory of how one might solve this problem by manipulating the quantum effects at work inside a quantum computer. The theory allows – by making use of the strange properties of quantum physics – the execution of quantum computations in such a way that changes in the machine’s initial operational parameters do not lead to a substantial change in the end result of the computation. This in turn helps to insulate the quantum computer from the effects of environmental ‘noise’.
Dr Sebastian Weidt, senior scientist in the Sussex Ion Quantum Technology Group, explains the significance: “Realising this technique may have a profound impact on the ability to develop commercial ion trap quantum computers beyond use in an academic laboratory.” The Sussex team went to work to see whether they could actually implement this theory. They used complicated radio-frequency and microwave signals capable of manipulating the quantum effects inherent in individual charged atoms (ions), to demonstrate this in practical experiments. Their implementation is based on microwave technology, such as that present in mobile phones. Following months of intensive work in the laboratory, the Sussex scientists have managed to make this new method a reality, experimentally demonstrating its capabilities to substantially reduce the effect of “noise” on a trapped ion quantum computer.
Prof Hensinger, Head of the Ion Quantum Technology Group at the University of Sussex – which last year unveiled the first blueprint for a large-scale quantum computer – says: “With this advance we have made another practical step towards constructing quantum computers that can host millions of qubits. Such machines are capable of solving certain problems that even the fastest supercomputer may take billions of years to calculate and be of great benefit to humanity; they may be able to help us create new pharmaceuticals; find new cures for diseases, such as Dementia; create powerful tools for the financial sector; be of benefit to agriculture, through more efficient fertilizer production, among many other applications. We are only starting to understand the tremendous potential of these machines.”
Hensinger’s group is now utilising this new technique as they put the final touches to a powerful quantum computer prototype that is currently in their laboratory at the University of Sussex. Hensinger says: “It’s now time to translate academic achievements into the construction of practical machines. We’re in a fantastic position to do this at Sussex and my team is working round the clock to make large-scale quantum computing a future reality.”
Five-qubit fully programmable quantum computer based on trapped ions
Scientists have created the first programmable and reprogrammable quantum computer, according to a new study. The technology could usher in a much-anticipated era of quantum computing, which researchers say could help scientists run complex simulations and produce rapid solutions to tricky calculations.
“Until now, there hasn’t been any quantum-computing platform that had the capability to program new algorithms into their system. They’re usually each tailored to attack a particular algorithm,” said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park. Now, Debnath and his colleagues have developed the first fully programmable and reprogrammable quantum computer.
The new device is made of five qubits, each of them an ion, or electrically charged particle: five ytterbium ions lined up and trapped in an electromagnetic field. The electronic state of each ion can be controlled by zapping it with a laser, which allows each ion to store a bit of quantum information.
The scientists can use lasers to manipulate these ions, infusing them with precise amounts of energy and influencing their interactions with each other. Because they are charged, the ions exert forces on one another, causing them to vibrate at frequencies that can be precisely controlled and manipulated. These vibrations are quantum in nature and allow the ions to become entangled; in this way, the quantum bits they hold can interact.
By controlling these interactions, physicists can carry out quantum logic operations. And quantum algorithms are simply a series of these logic operations one after the other. In this way, the researchers can program and reprogram the quantum computer with a variety of algorithms.
The researchers tested their device on three algorithms that quantum computers, as prior work showed, could execute quickly. One, the so-called Deutsch-Jozsa algorithm, is typically used only for tests of quantum-computing capabilities. Another, the Bernstein-Vazirani algorithm, can also be used to probe for errors in quantum computing. The last, the quantum Fourier transform algorithm, is an element in quantum-computing encryption-breaking applications. The Deutsch-Jozsa and Bernstein-Vazirani algorithms successfully ran 95 and 90 percent of the time, respectively. The quantum Fourier transform algorithm, which the researchers said is among the most complicated quantum calculations, had a 70 percent success rate, they said.
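The first of those algorithms is simple enough to simulate classically for small n. Deutsch-Jozsa decides, with a single oracle query, whether a function promised to be either constant or balanced is one or the other; the circuit's answer reduces to the sign sum below. A minimal classical simulation (the example oracles are made up for illustration):

```python
import itertools

def deutsch_jozsa(f, n):
    """Classical simulation of the Deutsch-Jozsa decision for an n-bit
    oracle f promised to be constant or balanced. After H^n, oracle, H^n,
    the amplitude of |0...0> is (1/2^n) * sum_x (-1)^f(x): magnitude 1 if
    f is constant, exactly 0 if f is balanced."""
    amp0 = sum((-1) ** f(x) for x in itertools.product((0, 1), repeat=n)) / 2 ** n
    return "constant" if abs(amp0) > 0.5 else "balanced"

n = 3
constant_f = lambda x: 0             # constant oracle
balanced_f = lambda x: x[0]          # balanced: 1 on exactly half the inputs
print(deutsch_jozsa(constant_f, n))  # constant
print(deutsch_jozsa(balanced_f, n))  # balanced
```

The classical simulation evaluates f on all 2^n inputs; the quantum computer gets the same verdict from one oracle call, which is why this algorithm is a standard capability test for small quantum devices.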
In the future, the researchers will test more algorithms on their device, Debnath said. “We’d like this system to serve as a test bed for examining the challenges of multiqubit operations, and find ways to make them better,” Debnath told Live Science. The team claims it can go much further. In particular, they say that their module is scalable—that several five-qubit modules can be connected together to form a much more powerful quantum computer.