Quantum Random Number Generator for IoT security and impenetrable encryption of military communications

Random numbers are important in many fields of scientific research and real-life applications, such as fundamental physics research, computer science and the lottery industry. They also serve as the foundation of many security applications, including encryption, authentication, signing, key wrapping and other cryptographic operations. Weaknesses in random number generators can be exploited by hackers to steal or guess keys.

“While the rolling of dice has been essential to games of chance throughout the ages, the importance of random numbers has never been more apparent. Aside from its application in generating random numbers for reliable lotteries and gaming platforms, a truly random number generator will provide impenetrable encryption for communications – be they military transmissions, secure banking, or online purchasing – that underpin the modern connected world,” noted Dr. Sussman.

Most computer systems use a software random number generator (RNG), even though it is less secure, because a dedicated hardware RNG is costly and bulky. Although software RNGs may be adequate in some applications, they fail to provide a sufficient security barrier wherever true randomness is required.

A hardware random number generator is a device that generates random numbers from a physical process rather than from a computer program. Hardware random number generators, also called true random number generators (TRNGs), produce sequences of numbers that are not predictable, and therefore provide the greatest security when used to encrypt data.

Cryptography systems tend to rely on pseudorandom number generators (PRNGs), which use deterministic algorithms to produce output sequences that are nearly random. These are used to create the “keys” that allow individuals to encrypt and decrypt sensitive information such as passwords and bank details.

The security of cryptographic protocols relies on the randomness of the keys, i.e. on how difficult it is for an adversary to guess the key used to encrypt data. Although pseudorandom sequences pass statistical tests for randomness, anyone who knows the algorithm and the value used to initialize it, called the “seed”, can predict the output. Because the sequence of numbers produced by a PRNG is predictable, data encrypted with pseudorandom numbers is potentially vulnerable to cryptanalysis.
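
The seed problem is easy to demonstrate. The minimal sketch below uses Python’s non-cryptographic `random` module (a Mersenne Twister) to show that anyone who learns the seed reproduces the entire output stream:

```python
import random

# A PRNG is fully determined by its seed: an adversary who learns the
# seed can reproduce every "random" value the generator will ever emit.
def keystream(seed, n):
    rng = random.Random(seed)            # Mersenne Twister: NOT crypto-safe
    return [rng.getrandbits(8) for _ in range(n)]

victim   = keystream(seed=1234, n=8)     # e.g. bytes used to derive a key
attacker = keystream(seed=1234, n=8)     # same seed -> identical sequence

print(victim == attacker)  # True: the "random" key is fully predictable
```

Real systems therefore seed cryptographically secure generators from hardware entropy, which is exactly the gap the quantum devices described below aim to fill.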

That is why physicists prefer to use quantum processes to generate random numbers. These processes are believed to be random in principle, and fundamentally so, which matters because it means no underlying physical mechanism can introduce predictability.

The Tohoku University research group of Professor Keiichi Edamatsu and Postdoctoral fellow Naofumi Abe has demonstrated dynamically and statically unpolarized single-photon generation using diamond. This result is expected to play a crucial role in hardware random number generation using single photons (quantum dice or quantum coin toss), quantum cryptography and the testing of fundamental problems in quantum mechanics.

In their paper, published in Scientific Reports, the authors present the first demonstration that single-photon emission from a specially oriented compound defect (a nitrogen vacancy center) in diamond is dynamically and statically unpolarized with intrinsic randomness.


True Random Number Generator (TRNG)

True random number generators make measurements on physical systems that are inherently random, such as shot noise (a quantum mechanical noise source in electronic circuits) or nuclear decay detected by a Geiger counter attached to a PC. Atmospheric turbulence is also thought to be entirely random, so measuring the turbulent effect of the atmosphere on a laser beam is another way of producing random numbers, albeit a slow one that can easily be biased by environmental factors.

However, existing measurement techniques tend to be either very expensive or too slow to be of practical use. Securing your mobile phone, for example, needs a generation rate of about 1 kbit/s.

EYL has developed a micro quantum random bit generator that extracts unpredictable randomness from radioactive isotopes embedded in a 5 mm device, instead of using optical methods, which are commercially available but prohibitively expensive and bulky. Since the alpha particles emitted inside the device arrive at completely random times, the company says it can obtain perfect randomness from a natural phenomenon.

Using this technology, EYL provides not only USB, PCIe and server-type quantum random number generators for security systems, but also complementary hardware and software applications. The company describes it as the world’s first patented technology of its kind, deployable in security systems for diverse purposes at affordable prices.

QuintessenceLabs harnesses diode ‘flaw’ for new quantum number generator

Quantum cybersecurity firm QuintessenceLabs (QLabs) has announced a full-entropy quantum random number generator (QRNG) that leverages a “flaw” in diodes. QLabs said the flaw, a quantum property known as quantum tunnelling, is a phenomenon in which a particle travels across a barrier that, according to classical mechanics, it should not be able to cross.

“Quantum tunnelling results in random fluctuations in the current flowing through the tunnel diode, since there is no way to determine beforehand how many charge carriers will ‘tunnel’ through at any instant,” the company explained. For the latest release of its quantum random number generator, qStream, QLabs has developed a way to measure and digitally process these fluctuations to generate “full-entropy” random numbers at a rate of 1 Gbps.

QLabs launched the “first generation” of its qStream device in 2015, using lasers as the source of its quantum random numbers before switching to quantum tunnelling. The company said tunnel diodes can generate full-entropy random numbers at the same rate as the first generation, but without the need for a laser and photodetector, resulting in what QLabs describes as a “more compact and cost-effective” product that cuts the size of the QRNG hardware to a quarter while delivering the same quality and speed.

SKT Develops World’s First Ultra-Small Quantum Random Number Generator

Current QRNGs are difficult to mass-produce because of their high price, and their size makes them unsuitable for smartphones and IoT products. SK Telecom has succeeded in making a QRNG the same size as a fingernail, and has also lowered its price. The company plans to release an IoT product equipped with chips that optimize performance and stability within the year, and to expand the application of QRNGs across the entire IoT field, including self-driving cars and smart meters. If it succeeds in commercializing small chips that can be mass-produced, it will be recorded as the first ever mass-producible QRNG.

“We have implemented the QRNG (Quantum Random Number Generator), which is one of the major technologies of quantum information communication, and are planning to produce prototypes in March at the earliest,” said a representative for SK Telecom. QRNG is predicted to have a huge impact across the ICT (Information and Communication Technology) industries, as it is better at preventing hacking than current encryption systems.

SK Telecom invested about $2.13 million (2.5 billion KRW) into IDQ (ID Quantique) that holds major patents for QRNG and has acquired rights to use IDQ’s patents exclusively.


QUANTIS QRNG – delivering true randomness with quantum random number generation

One of the most popular methods is to send a stream of photons through a beam splitter, which transmits or reflects each one with 50 percent probability. Simply recording whether each photon is reflected or transmitted produces a random sequence of 0s and 1s.
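
The bit-generation logic of this scheme can be sketched in a few lines. Note this is a classical illustration only: Python’s `secrets` module stands in for the genuinely quantum transmit/reflect outcome at the beam splitter:

```python
import secrets

# Toy model of the beam-splitter QRNG: each photon is transmitted (1)
# or reflected (0) with probability 1/2, and the detector record is
# itself the random bitstream. secrets.randbits(1) stands in for the
# quantum measurement outcome.
def photon_bits(n_photons):
    return [secrets.randbits(1) for _ in range(n_photons)]

print(''.join(str(b) for b in photon_bits(16)))  # e.g. 0110100111010001
```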

That is exactly how Quantis, one of the first commercially available quantum random number generators, works. Quantis produces random numbers at a bit rate of up to 16 Mbps, the limit set by how fast single-photon detectors can count.

Recently, physicists have begun to exploit a new technique based on the way photons are generated inside lasers. There are two mechanisms. The first is stimulated emission, a predictable process producing photons that all have the same phase. The second is spontaneous emission, an entirely random quantum process. These spontaneously emitted photons are usually treated as noise, and are in any case swamped when the laser is operating at full tilt.

World’s Fastest Quantum Random Number Generator Unveiled in China

The spontaneous emission is dominant when the laser operates at its threshold level, before stimulated emission really takes hold. If it is possible to measure these photons, then it may be possible to exploit their random nature.

You-Qi and co have done exactly that. They have created a highly sensitive interferometer that converts fluctuations in the phase of photons into intensity changes. That matters because intensity changes can easily be measured with conventional photodetectors, which work at much higher rates than single-photon detectors.

That has allowed the team to measure these random changes and digitize them at a rate of 80 Gbps. The data stream then has to be cleaned up in various ways to remove any biases introduced by the measurement process, but even after this, the team can still produce random numbers at 68 Gbps.
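
The article does not detail the post-processing, but a classic example of such a clean-up step is von Neumann debiasing, shown below purely as an illustration of why the usable rate (68 Gbps) ends up below the raw digitization rate (80 Gbps):

```python
def von_neumann_extract(raw_bits):
    """Distill unbiased bits from independent but possibly biased bits.

    Pairs (0,1) -> 0 and (1,0) -> 1 are kept; (0,0) and (1,1) are
    discarded. If the input bits are independent, the output is exactly
    unbiased -- at the cost of throughput, since many pairs are dropped.
    """
    out = []
    for a, b in zip(raw_bits[::2], raw_bits[1::2]):
        if a != b:
            out.append(a)
    return out

raw = [1, 1, 0, 1, 1, 0, 0, 0, 1, 0]
print(von_neumann_extract(raw))  # [0, 1, 1]
```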

“Our demonstration shows that high-speed quantum random number generators are ready for practical usage,” say You-Qi and co. “Our quantum random number generator could be a practical approach for some specific applications, such as QKD systems with a clock rate of over 10 GHz.”



Quantum random-number generator from a mobile phone

Bruno Sanguinetti and colleagues Anthony Martin, Hugo Zbinden and Nicolas Gisin have used the eight-megapixel camera of a Nokia N9 smartphone to create a device that can deliver random numbers at 1.25 Gbit/s.

The group, at the University of Geneva in Switzerland, has created a quantum random number generator (QRNG) from low-cost electronic components, including a mobile-phone camera. Their device could deliver strong cryptography and secure credit-card transactions using a mobile phone alone.

The system exploits the fact that the camera is so sensitive that it can count the number of photons impinging on each of its individual pixels. The light is supplied by a conventional LED, in which electrons and holes combine to create photons. This is a quantum mechanical process, so the number of photons produced in a fixed period of time is not fixed but random.

The camera and LED are adjusted so that each pixel detects about 400 photons in a short exposure time. The photon numbers of all the camera pixels are combined in an “extractor” algorithm that outputs a sequence of random numbers. In the Swiss experiment, the camera was used to create a 1.25 Gbit/s stream of random numbers.
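
A simplified sketch of that pipeline is shown below. The Poisson draw models shot noise around the ~400-photon mean, and taking each count’s least-significant bit is a deliberately crude stand-in for the stronger extractor the Geneva team actually used:

```python
import numpy as np

MEAN_PHOTONS = 400                  # photons per pixel per exposure
rng = np.random.default_rng()       # simulation stand-in for the camera

def capture_frame(n_pixels):
    # Photon arrival is Poissonian: the shot noise around the mean is
    # the quantum entropy source being harvested.
    return rng.poisson(MEAN_PHOTONS, n_pixels)

def extract_bits(counts):
    # Crude extractor: keep the least-significant bit of each count.
    # (The real device applies a stronger extractor to the raw counts.)
    return counts & 1

counts = capture_frame(8)
print(extract_bits(counts))         # e.g. [1 0 0 1 1 1 0 0]
```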

One worry about any random-number generator is that the numbers could be influenced in a predictable way by non-quantum (classical) effects in the system. This could lead to a measurement bias, for example, which could favour certain numbers over others. If a potential eavesdropper knows everything about the generator, they could in principle predict the classical component of its output. This would make it easier to crack the system.

However, when such biases are factored in, the team reckons that a user would have to generate a mind-boggling 10^118 random numbers before noticing a deviation from a perfectly random sequence.

Sanguinetti told physicsworld.com that all of the components of his team’s QRNG could be integrated on a chip that would cost a few dollars and could be easily integrated in portable electronic devices, including mobile phones. “If there is a quantum technology that everyone will soon have, this is it,” he says.

Laing also believes that the technology could be used in quantum cryptography systems, which in principle are unbreakable: “A QRNG can also be a key component for quantum key distribution protocols, where the communicating parties must be careful to choose their measurements in a genuinely random way.”


Researchers from the University of Geneva have developed a self-testing quantum method for generating random numbers

Powerful quantum random number generators are today available commercially. However, one limitation of existing devices is that it is impossible for the user to independently verify that the numbers generated are in fact genuinely random and not, for example, composed of digits of π. The user must trust the device (and so its manufacturer) to function correctly, even after years of use. So, it makes sense to ask if current systems could be improved from this point of view.

“We wanted to create a device which can be continuously tested to ensure it functions correctly at all times and thus guarantee that the random numbers generated are reliable” says Nicolas Brunner. To achieve this, the UNIGE physicists have developed a “self-testing” quantum random number generator, which allows the user to verify in real time that the apparatus performs optimally and delivers unbiased random numbers.

“The generator should solve a task for which we have calibrated it. If the task is solved correctly, the output numbers are guaranteed to be random. If the apparatus does not find the correct solution, randomness is not guaranteed and the user should recalibrate the device. This avoids the risk of using numbers with little (or no) randomness, for example to generate passwords, which hackers could then crack,” Professor Hugo Zbinden enthusiastically points out.

Indeed, the new generator makes it possible to measure precisely the quality of the output random numbers. Perfectly random numbers can then be distilled and used for security applications, such as generating passwords that are safe against hacking.




Intel and Australian teams realising vision of “Silicon core of Quantum Revolution after Microelectronics” with Government investment

A working quantum computer has the potential to transform the information economy and create the industries of the future, solving in hours or minutes problems that would take conventional computers – even supercomputers – centuries, and tackling problems that are otherwise intractable. Applications include software design, machine learning, scheduling and logistical planning, financial analysis, stock market modelling, software and hardware verification, climate modelling, rapid drug design and testing, and early disease detection and prevention.


Quantum bits, or qubits, are the basic building blocks of quantum computers, just as bits are of conventional computers. To give an example, Google’s quantum computer has reached 49 qubits, while IBM’s now has 50.


Researchers around the world have been exploring a range of physical systems to act as qubits, including nuclear spins in silicon, ions trapped and isolated by electromagnetic fields, photons trapped in microwave cavities, electron spins in quantum dots, and superconducting loops and Josephson junctions, among others. Google’s Quantum AI Lab has experimented with qubits based on superconducting metal circuits.


Spin qubits closely resemble the semiconductor electronics and transistors we know today. They deliver their quantum power by leveraging the spin of a single electron on a silicon device and controlling its movement with tiny microwave pulses. Electron spins in silicon quantum dots are attractive systems for quantum computing owing to their long coherence times and the promise of rapidly scaling up the number of dots in a system using semiconductor fabrication techniques.



Historically, silicon qubits have been shunned for two reasons: it is difficult to control qubits manufactured in silicon, and it has never been clear whether silicon qubits could scale as well as other approaches. Moreover, as seen in small-scale demonstrations of quantum computers using other types of qubit, combining these elements leads to challenges related to qubit crosstalk, state leakage, calibration and control hardware.


In a paper published in Nature, researchers at Delft University of Technology in the Netherlands and the University of Wisconsin–Madison say they were able to program a two-qubit machine based on spin qubits to execute a couple of algorithms that are typically employed to test the effectiveness of quantum machines, including one that could be used for searching a database. “Here we overcome these challenges by using carefully designed control techniques to demonstrate a programmable two-qubit quantum processor in a silicon device that can perform the Deutsch–Josza algorithm and the Grover search algorithm—canonical examples of quantum algorithms that outperform their classical analogues.”



Thomas Watson, one of the researchers, says the team’s advance was based on things such as finding better ways to calibrate the “gates” in the machine, or the basic quantum circuits. He thinks that silicon-based systems could ultimately allow qubits to be packed more densely together than other approaches. The closer qubits are to one another, the easier it is to get them to influence neighbors, which boosts machines’ computational power.


Intel Bets It Can Turn Everyday Silicon into Quantum Computing’s Wonder Material

Intel’s group has reported that it can now layer the ultra-pure silicon needed for a quantum computer onto the standard wafers used in chip factories. A quantum computer would need thousands or millions of qubits to be broadly useful, and getting to hundreds of thousands of qubits will require incredible engineering reliability, the hallmark of the semiconductor industry, according to Andrew Dzurak, who works on silicon qubits at the University of New South Wales in Australia. Another reason to work on silicon qubits is that they should be more reliable than their superconducting equivalents.


Intel’s silicon qubits represent data in a quantum property called the “spin” of a single electron trapped inside a modified version of the transistors in its existing commercial chips. “The hope is that if we make the best transistors, then with a few material and design changes we can make the best qubits,” says Clarke.


Spin qubits, in comparison to their superconducting counterparts, offer a few advantages in addressing these challenges.

They’re small and strong: spin qubits are much smaller in physical size, and their coherence time is expected to be longer – an advantage as researchers aim to scale the system to the millions of qubits that will be required for a commercial system. Superconducting qubits are quite large and operate in systems the size of 55-gallon drums, which makes it hard to scale the design up to the millions of qubits needed for a truly useful commercial system.


They can function at higher temperatures: silicon spin qubits can operate at higher temperatures than superconducting qubits (1 kelvin as opposed to 20 millikelvin). This could drastically reduce the complexity of the system required to operate the chips by allowing control electronics to be integrated much closer to the processor. Intel and academic research partner QuTech are exploring higher-temperature operation of spin qubits, with interesting results up to 1 K, or 50 times warmer than superconducting qubits. The team plans to share the results at the American Physical Society (APS) meeting in March.


Intel manufacturing know-how: The design of the spin qubit processors highly resembles the traditional silicon transistor technologies. While there are key scientific and engineering challenges remaining to scale this technology, Intel has the equipment and infrastructure from decades of fabricating transistors at scale.


Intel has invented a spin qubit fabrication flow on its 300 mm process technology, using isotopically pure wafers sourced specifically for the production of spin-qubit test chips. The wafers are fabricated in the same facility as Intel’s advanced transistor technologies, and Intel is now testing the first of them. Within a couple of months, Intel expects to be producing many wafers per week, each with thousands of small qubit arrays.


The new process that helps Intel experiment with silicon qubits on standard chip wafers, developed with the materials companies Urenco and Air Liquide, should help speed up its research, says Andrew Dzurak, who works on silicon qubits at the University of New South Wales in Australia. Companies developing superconducting qubits also make them using existing chip fabrication methods. But the resulting devices are larger than transistors, and there is no template for how to manufacture and package them up in large numbers, says Dzurak.


Australian Teams

Australia’s first quantum computing company, Silicon Quantum Computing Pty Ltd, has been launched to advance the development and commercialisation of the University of New South Wales (UNSW Sydney)’s world-leading quantum computing technology. It will drive the development and commercialisation of a 10-qubit quantum integrated circuit prototype in silicon by 2022 as the forerunner to a silicon based quantum computer.


CQC2T is home to an exceptionally strong team of silicon quantum computing researchers, the only group in the world that can make atomically precise devices in silicon. Led by UNSW Scientia Professor Michelle Simmons, the Centre’s teams have produced the longest-coherence-time qubits in the solid state, the ability to optically address single dopant atoms in silicon, the lowest-noise silicon devices and the first two-qubit gate in silicon.


Two teams from Australia are experimenting with two different types of qubits. One team uses a natural atom of phosphorus, which provides two quantum bits – the electron spin and the nuclear spin – and on which they have achieved 99.99% accuracy, i.e. one error in every 10,000 operations.


The second team is working with an artificial atom, harnessing silicon to build a quantum processor with the advantage of being compatible with the microelectronics of existing computers. The spin of an electron or a nucleus in a semiconductor naturally implements the unit of quantum information – the qubit – while providing a technological link to the established electronics industry. However, naturally occurring silicon contains only about 92% 28Si, with other isotopes including 29Si at about 4.7%, a dominant source of decoherence. For silicon, then, coherence time can be drastically improved through isotopic enrichment of the spin-zero nuclear species 28Si.


So far, the UNSW team has demonstrated a system with quantum bits, or qubits, only in a single atom. Useful computations will require linking qubits in multiple atoms. But the team’s silicon qubits hold their quantum state nearly a million times longer than do systems made from superconducting circuits, a leading alternative, UNSW physicist Guilherme Tosi told participants at the event. This helps the silicon qubits to perform operations with one-sixth of the errors of superconducting circuits. A second group from the UNSW has a less robust silicon design that has already demonstrated calculations that link up two qubits, a building block that paves the way for creating more-complex devices.


Systems will need to be scaled up to a large number of qubits to execute nontrivial quantum algorithms so that these quantum devices can simulate quantum systems efficiently, crack modern encryption codes, search through huge databases, as well as solve a wide range of optimization problems.


Australian engineers have created a new ultra stable quantum bit

Researchers at the University of New South Wales in Australia have designed a new type of quantum bit (qubit), which they say will enable large-scale quantum computing at a lower cost. The new ‘flip-flop qubits’ are able to communicate over distances of more than 150 nm, which research leader Andrea Morello said might actually leave room to “cram other things between qubits.” “What the team have invented is a new way to define a ‘spin qubit’ that uses both the electron and the nucleus of the atom. Crucially, this new qubit can be controlled using electric signals, instead of magnetic ones,” said Prof Morello.


Morello and his team proposed a method of using both the electron and the nucleus of a single phosphorus atom to create a qubit inside a layer of silicon. ‘Pulling’ the electron away from the nucleus would extend the electric field that the qubits use for entanglement. As well as leaving more space, the new chip design would also remove the need for atoms to be placed very precisely.


Even more important, though, is the fact that the new chips could be produced using existing manufacturing technology, which opens up the possibility of mass production. Morello said that this “makes the building of a quantum computer much more feasible.” “This new idea allows us to fabricate multi-qubit processes with current technology,” says Guilherme Tosi, the lead scientist.


An Australian engineering team from the University of New South Wales (UNSW) had earlier reported creating a new quantum bit that remains in a stable superposition for 10 times longer than previously achieved, dramatically expanding the time during which calculations could be performed in a future silicon quantum computer. “We have created a new quantum bit where the spin of a single electron is merged together with a strong electromagnetic field,” said Arne Laucht, a Research Fellow at the School of Electrical Engineering & Telecommunications at UNSW and lead author of the paper. “This quantum bit is more versatile and more long-lived than the electron alone, and will allow us to build more reliable quantum computers.”


“The greatest hurdle in using quantum objects for computing is to preserve their delicate superpositions long enough to allow us to perform useful calculations,” said Andrea Morello, leader of the research team and a Program Manager in the Centre for Quantum Computation & Communication Technology (CQC2T) at UNSW.


“Our decade-long research program had already established the most long-lived quantum bit in the solid state, by encoding quantum information in the spin of a single phosphorus atom inside a silicon chip, placed in a static magnetic field,” said Andrea Morello. The results are striking: since the electromagnetic field steadily oscillates at a very high frequency, any noise or disturbance at a different frequency results in a zero net effect. The researchers achieved an improvement by a factor of 10 in the time span during which a quantum superposition can be preserved.


Specifically, they measured a dephasing time of T2* = 2.4 milliseconds – a result 10-fold better than the standard qubit, allowing many more operations to be performed within the time span during which the delicate quantum information is safely preserved. “This new ‘dressed qubit’ can be controlled in a variety of ways that would be impractical with an ‘undressed qubit’,” added Morello. “For example, it can be controlled by simply modulating the frequency of the microwave field, just like in an FM radio. The ‘undressed qubit’ instead requires turning the amplitude of the control fields on and off, like an AM radio.”


“In some sense, this is why the dressed qubit is more immune to noise: the quantum information is controlled by the frequency, which is rock-solid, whereas the amplitude can be more easily affected by external noise.” Since the device is built on standard silicon technology, this result paves the way to powerful, reliable quantum processors based on the same fabrication processes already used for today’s computers. What Laucht and colleagues did was push this further: “We have now implemented a new way to encode the information: we have subjected the atom to a very strong, continuously oscillating electromagnetic field at microwave frequencies, and thus we have ‘redefined’ the quantum bit as the orientation of the spin with respect to the microwave field.”



Quantum computer in Silicon

Scientists and engineers from the Australian Research Council Centre of Excellence for Quantum Computation and Communication Technology (CQC2T), are developing a scalable quantum computer in silicon. They found that a single atom of phosphorus could be used to tightly hold an electron, which also carries a “spin” (like a tiny magnet) that could be used as a quantum bit.

The idea of silicon quantum computing was first proposed in 1998 by Bruce Kane, a physicist at the University of Maryland, in College Park. Quantum computers based on familiar silicon could theoretically be manufactured in conjunction with the conventional semiconductor techniques found in today’s computer industry. A silicon approach to quantum computing also offers the advantage of strong stability and high coherence times for qubits. (High coherence times mean the qubits can continue holding their information for long enough to complete calculations.) Kane proposed using the quantum characteristic of spin in the nucleus of the phosphorus donor atom as the qubit.

Morello and Dzurak were among the physicists impressed by Kane’s proposal, but they chose to investigate electron spins instead, because electron spins in silicon have very long coherence times—that is, it takes a relatively long time for such a qubit to lose its information.

So far, the UNSW team has demonstrated a system with quantum bits, or qubits, only in a single atom. Australian teams demonstrated mastery of single qubits based on electron spin in 2012 and control of nuclear spin qubits in 2013. Useful computations will require linking qubits in multiple atoms.

But the team’s silicon qubits hold their quantum state nearly a million times longer than do systems made from superconducting circuits, a leading alternative, UNSW physicist Guilherme Tosi told Nature. This helps the silicon qubits to perform operations with one-sixth of the errors of superconducting circuits.

The scheme showcased at the innovation forum by Tosi and fellow physicist Vivien Schmitt uses qubits that are the spins of the electrons and nuclei in phosphorus atoms embedded in a silicon lattice, and are controlled using a special system of electric fields. Because the spins respond only to very specific, tuneable frequencies, they are robust to electrical noise. That allows the qubits to keep their quantum states for one minute and to operate perfectly 99.9% of the time, said Tosi.

Moreover, the electrically controlled qubits can communicate with each other at larger distances than can the qubits in other silicon designs. That bodes well for scaling up because the qubits can be far enough apart to allow room for control and read-out instruments to be placed between them. The atoms also do not need to be placed precisely, so they would fit with existing microprocessor-fabrication techniques, added Tosi.

Researchers at the University of New South Wales (UNSW) built a two-qubit logic gate in silicon

A second group from the UNSW, led by physicist Andrew Dzurak, uses as its qubits the spins of electrons in a set-up based on modified electrical transistors. Although the qubits are less robust than those in the Morello design, Dzurak’s team demonstrated two-qubit calculations last October.

Researchers reported in the journal Nature that they had built, for the first time, a two-qubit logic gate containing two entangled qubits based on the spins of trapped electrons in silicon, thereby clearing a key hurdle to making silicon-based quantum computer processors a reality.

Such trapped electrons can be integrated with existing CMOS technology, to create quantum computer chips that could store thousands, even millions of qubits on a single silicon processor chip. UNSW scientists have patented a design for a full-scale quantum chip that would hold millions of silicon qubits.

The UNSW researchers then simulated a fundamental gate, the controlled-NOT (CNOT) operation, with their two-qubit logic gate. Depending on the state of the control qubit, the CNOT gate flips the target qubit’s “up” spin into a “down” spin, and the other way around. The CNOT is a fundamental gate that can be combined to form complex quantum computations, just as the NAND gate is a fundamental gate in conventional computers.
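The CNOT’s action can be made concrete with a tiny matrix sketch (an illustration of the standard gate definition, not the UNSW device): it is a 4x4 permutation matrix that flips the target qubit exactly when the control qubit is 1.

```python
# CNOT as a 4x4 permutation matrix acting on the basis |control, target>:
# |00>, |01>, |10>, |11>. It flips the target only when the control is 1.
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

def apply(gate, state):
    """Multiply a 4x4 gate matrix into a 4-amplitude state vector."""
    return [sum(gate[r][c] * state[c] for c in range(4)) for r in range(4)]

ket10 = [0, 0, 1, 0]            # |10>: control=1, target=0
print(apply(CNOT, ket10))       # -> [0, 0, 0, 1], i.e. |11>
ket01 = [0, 1, 0, 0]            # |01>: control=0, so nothing changes
print(apply(CNOT, ket01))       # -> [0, 1, 0, 0]
```

Applied to a control qubit in superposition, the same matrix produces the entangled two-qubit states whose anticorrelations the experiment measures.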

“For the creation of the two-qubit gate the researchers modified the design of a CMOS transistor. Two gates are placed next to each other on an insulating layer of silicon dioxide that separates them from a layer of almost pure silicon-28 isotope,” writes Alexander Hellemans in IEEE Spectrum.

Controlling the voltage of the gates allows the trapping of a single electron in the region under the gate. The quantum states of both electrons can be controlled by gigahertz-frequency pulses transmitted by the “electron spin resonance” (ESR) line, in combination with a 1.4 Tesla magnetic field.

The ESR line allows the spin state for each electron to be set independently for “one-qubit” operations. Voltage pulses entangle the two qubits, allowing them to operate as a CNOT gate; changing the spin of one electron results in changing the spin of the other electron.

The authors write in Nature, “Here we present a two-qubit logic gate, which uses single spins in isotopically enriched silicon and is realized by performing single- and two-qubit operations in a quantum dot system using the exchange interaction, as envisaged in the Loss–DiVincenzo proposal.

“We realize CNOT gates via controlled-phase operations combined with single-qubit operations. Direct gate-voltage control provides single-qubit addressability, together with a switchable exchange interaction that is used in the two-qubit controlled-phase gate. By independently reading out both qubits, we measure clear anticorrelations in the two-spin probabilities of the CNOT gate.”

Australian scientists design a full-scale architecture for a quantum computer in silicon

“Our Australian team has developed the world’s best qubits in silicon,” says University of Melbourne Professor Lloyd Hollenberg, Deputy Director of the CQC2T who led the work with colleague Dr Charles Hill. “However, to scale up to a full operational quantum computer we need more than just many of these qubits – we need to be able to control and arrange them in such a way that we can correct errors quantum mechanically.”

Australian scientists have designed a 3D silicon chip architecture based on single-atom quantum bits, clearing one of the final hurdles to scaling up to an operational quantum computer with many thousands of qubits. Researchers detailed an architecture that sandwiches a 2D layer of nuclear-spin qubits between an upper and lower layer of control lines. This triple-layer architecture enables a smaller number of control lines to activate and control many qubits at the same time.

By applying voltages to a sub-set of these wires, multiple qubits can be controlled in parallel, performing a series of operations using far fewer controls. Importantly, with their design, they can perform the 2D surface code error correction protocols in which any computational errors that creep into the calculation can be corrected faster than they occur.

“This architecture gives us the dense packing and parallel operation essential for scaling up the size of the quantum processor,” says Scientia Professor Sven Rogge, Head of the UNSW School of Physics. “Ultimately, the structure is scalable to millions of qubits, required for a full-scale quantum processor.”

In theory, the new architecture could pack about 25 million physical qubits within an array that is 150 µm by 150 µm. But those millions of qubits would require just 10,000 control lines. By comparison, an architecture that tried to control each individual qubit would have required over 1,000 times more control lines.
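The control-line arithmetic checks out under a simple shared-addressing model (a back-of-envelope sketch, not the published layout): a crossbar scheme addresses an N x N grid of qubits with 2N row and column lines rather than one line per qubit.

```python
import math

n_qubits = 25_000_000                     # qubits in the quoted array
side = math.isqrt(n_qubits)               # a 5000 x 5000 grid
crossbar_lines = 2 * side                 # shared row + column lines
per_qubit_lines = n_qubits                # naive one-line-per-qubit wiring
print(crossbar_lines)                     # 10000, matching the quoted figure
print(per_qubit_lines // crossbar_lines)  # 2500x fewer lines, i.e. "over 1000x"
```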

“We have demonstrated we can build devices in silicon at the atomic-scale and have been working towards a full-scale architecture where we can perform error correction protocols – providing a practical system that can be scaled up to larger numbers of qubits,” says UNSW Scientia Professor Michelle Simmons, study co-author and Director of the CQC2T.

If the team can pull off this low error rate in a larger system, it would be “quite amazing”, said Hartmut Neven, director of engineering at Google and a member of the panel. But he cautioned that in terms of performance, the system is far behind others. The team is aiming for ten qubits in five years, but both Google and IBM are already approaching this with superconducting systems. And in five years, Google plans to have ramped up to hundreds of qubits.



NASA, Google and DOD exploring the massive potential of new D-Wave’s 2000 qubit quantum processor

Quantum computing and quantum information processing are the next revolutionary technologies, expected to have immense impact. Quantum computers will be able to perform tasks too hard for even the most powerful conventional supercomputer and have a host of specific applications, from code-breaking and cyber security to medical diagnostics, big data analysis and logistics. Quantum computers could accelerate the discovery of new materials, chemicals and drugs. They could dramatically reduce the current high costs and long lead times involved in developing new drugs.


Canadian firm D-Wave has released a new quantum computer that can handle some 2,000 quantum bits (qubits), roughly double the usable number found in the processor of the existing D-Wave 2X system, and is capable of solving certain problems 1,000 times faster than its predecessor. D-Wave is the only company selling a quantum computer; it sold its first system in 2011. “We’ve been on a trajectory, which has been doubling the number of qubits pretty much every year,” said Colin Williams, director of business development and strategic partnerships at D-Wave. D-Wave’s quantum computers are already being used by Los Alamos National Laboratory, Google, NASA, and Lockheed Martin. D-Wave’s goal is to upgrade all those systems.


The current D-Wave 2X™ quantum computing system features a 1000+ qubit quantum processor and numerous design improvements that resulted in larger problem sizes, faster performance and higher precision. “Breaking the 1000 qubit barrier marks the culmination of years of research and development by our scientists, engineers and manufacturing team,” said D-Wave CEO Vern Brownell. “It is a critical step toward bringing the promise of quantum computing to bear on some of the most challenging technical, commercial, scientific, and national defense problems that organizations face.”


Some applications for D-Wave’s quantum computer include machine learning, financial simulations, and coding optimization. For example, the quantum computers could be used to build classifiers for better speech recognition or labeling of images, said D-Wave CEO Vern Brownell. Algorithms play a big role in making D-Wave’s quantum computers effective. “Our belief is that machine learning is the killer app for quantum computing,” Brownell said.


D-Wave is working on a 5000-qubit system now and will likely install it with a customer in less than two years. The company is also working to broaden connectivity on its chips.


D-Wave Two is not a universal quantum computer. It has been designed specifically to perform a process called “quantum annealing”, a technique for finding the global minimum of a complicated mathematical expression, and is hence expected to be capable of solving optimization and sorting problems exponentially faster than a classical computer.


The success of classical heuristic search algorithms often depends on the balance between global search for good regions of the solution space (exploration) and local search that refines known good solutions (exploitation). While local refinement of known solutions is not available to the canonical forward quantum annealing algorithm, D-Wave has recently developed a reverse annealing feature that makes this possible by annealing backward from a specified state, then forward to a new state.


Reverse annealing allows users to start systematic searches for the best answer based upon their understanding of the solution space. It circumvents limitations on the number of qubits or the entanglement time by starting new searches in different locations. This enables the use of quantum annealing for the refinement of classical states via local search, making it possible to use quantum annealing as a component in more sophisticated hybrid algorithms.
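The idea is easiest to see in a classical analogy (a sketch of the exploration/exploitation trade-off, not D-Wave’s actual API or physics): start from a known candidate, raise the “temperature” to partially re-randomize it, then cool again, keeping the best state seen along the way.

```python
import math
import random

random.seed(42)

J = [1.0, 1.0, -1.0, 1.0, -1.0]        # couplings of a toy 6-spin Ising chain

def energy(s, J):
    # E = -sum_i J_i * s_i * s_{i+1}, with spins s_i in {-1, +1}
    return -sum(j * s[i] * s[i + 1] for i, j in enumerate(J))

def reverse_anneal(s0, J, t_mid=2.0, sweeps=2000):
    """Anneal 'backward' (heat up) from a known state, then forward (cool)."""
    s, best = list(s0), list(s0)
    for k in range(sweeps):
        frac = k / sweeps
        T = t_mid * (1 - abs(2 * frac - 1)) + 0.01   # low -> t_mid -> low
        i = random.randrange(len(s))
        before = energy(s, J)
        s[i] = -s[i]                                  # propose one spin flip
        dE = energy(s, J) - before
        if dE > 0 and random.random() >= math.exp(-dE / T):
            s[i] = -s[i]                              # reject the uphill move
        if energy(s, J) < energy(best, J):
            best = list(s)
    return best

start = [1, 1, 1, 1, 1, 1]               # a known but suboptimal candidate
refined = reverse_anneal(start, J)
assert energy(refined, J) <= energy(start, J)   # refinement never loses ground
```

Because the search restarts from the supplied state rather than from scratch, it behaves as a local-refinement component that a larger hybrid algorithm can call repeatedly.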


D-Wave has modifications to its architecture that could convert the annealing system into a general-purpose quantum computing system. This will require more control over the qubits. The system would still be annealing, but it would be programmable, would allow multiple quantum programs at the same time, and would allow for repeatable answer runs, said Marcos de López de Prado.



D-Wave Systems Inc. has also taken a strategic initiative to build and foster a quantum software development ecosystem by releasing an open-source quantum software tool. The new tool, qbsolv, enables developers to build higher-level tools and applications that leverage the quantum computing power of systems provided by D-Wave, without the need to understand the complex physics of quantum computers. The promise of qbsolv and quantum acceleration is to enable faster solutions to larger and more complex problems.


D-Wave, NASA and DOD explore massive potential of Quantum Computers

One of the toughest problems in mathematics is the traveling salesperson problem, which asks for the shortest route through a list of cities. The traveling salesperson problem is also pervasive: practically any time you want to make a complex process more efficient, you need to do this kind of combinatorial optimization. Logistics businesses need to solve a version of it every time they plan a route. Semiconductor manufacturers encounter similar issues when they design and manufacture chips.
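A brute-force sketch (with made-up city coordinates) shows why the problem explodes: with n cities there are (n-1)! orderings to check once the start is fixed, so exhaustive search stops being feasible almost immediately.

```python
import itertools
import math

# Made-up coordinates: four cities at the corners of a 4 x 3 rectangle.
cities = {"A": (0, 0), "B": (0, 3), "C": (4, 0), "D": (4, 3)}

def tour_length(order):
    pts = [cities[c] for c in order] + [cities[order[0]]]  # return to start
    return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))

names = list(cities)
# Fix the starting city and try every ordering of the rest: (n-1)! tours.
best = min(itertools.permutations(names[1:]),
           key=lambda rest: tour_length((names[0],) + rest))
best_tour = (names[0],) + best
print(best_tour, tour_length(best_tour))   # the 14-unit perimeter route wins
```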


“D-Wave has begun to work with investment managers on the related problem of designing portfolios. In order to generate the maximum returns for a given risk profile, a fund manager needs to not only choose among the thousands of available securities, but also minimize transaction costs by achieving the most optimal portfolio in the minimum number of trades,” writes Greg Satell in Forbes.


In each case, D-Wave’s quantum systems allow us to swallow complexity whole, rather than using shortcuts that reduce efficiency. Jeremy Hilton, Senior Vice President, Systems, at D-Wave says “Complex processes are all around us. By using quantum computing to operate them more effectively, we can make just about everything we do run more smoothly.”


“Scientists at Harvard have found that quantum computers will allow us to map proteins much as we do genes today. D-Wave has also formed a partnership with DNA-SEQ to use its quantum computers to explore how to analyze entire genomes to create more effective therapies,” writes Greg Satell.


Current machine learning algorithms generate misclassification errors because, with the limited capacity of conventional computers, data is lost in the training process. D-Wave is working with a number of partners, such as NASA, to help train artificial intelligence systems to reflect human thought processes far more completely than is possible with conventional computers, which will help to minimize mistakes.


Scientists at Google, NASA and USRA have been using it to explore the potential for quantum computing and its applicability to a broad range of complex problems such as web search, speech recognition, planning and scheduling, air-traffic management and robotic missions to other planets.


Computing giants believe quantum computers could make their artificial-intelligence software much more powerful and unlock scientific leaps in areas like materials science, according to MIT Technology Review. NASA hopes quantum computers could help schedule rocket launches and simulate future missions and spacecraft.


NASA’s QuAIL team aims to demonstrate that quantum computing and quantum algorithms may someday dramatically improve the agency’s ability to solve difficult optimization problems for missions in aeronautics, Earth and space sciences, and space exploration.


“Quantum computers enable us to use the laws of physics to solve intractable mathematical problems,” said Marcos de López de Prado, Senior Managing Director at Guggenheim Partners and a Research Fellow at Lawrence Berkeley National Laboratory’s Computational Research Division. “This is the beginning of a new era, and it will change the job of the mathematician and computer scientist in the years to come.”


The Space and Naval Warfare Systems Center Pacific in San Diego is working with one of the few quantum computers in existence to assess its applicability to military computing problems. “Some of those problems would be cooperative communication and ad hoc networks, time division multiple access message scheduling, or algorithms for data storage and energy data retrieval with underwater autonomous robots—optimization-type problems,” said Dr. Joanna Ptasinski, an electronics engineer at SPAWAR. SPAWAR provides the Navy and other military branches with essential capabilities in the areas of command and control, communications, computers, intelligence, surveillance, and reconnaissance.




D-Wave performs forward and reverse Quantum Annealing

There are two major approaches to quantum computing systems – annealing and the gate model. Gate-model systems actually require longer coherence times, more qubit control, and so on than D-Wave’s machines do. D-Wave forces the quantum states into a digital state with each annealing cycle, some 5,000–10,000 times per second.


Currently IBM, Google and Rigetti have created or are creating roughly 50-qubit gate systems with no error correction. They are not universal quantum computing systems but approximate gate-model systems. Whether these systems will be good at one or two applications is unknown so far, said Bo Ewald, D-Wave Systems’ President of International Business, who previously worked at Cray, Silicon Graphics and Los Alamos National Laboratory.


D-Wave Two is not a universal quantum computer. It has been designed specifically to perform a process called “quantum annealing”, a technique for finding the global minimum of a complicated mathematical expression, and is hence expected to be capable of solving optimization and sorting problems exponentially faster than a classical computer. It is especially useful for problems that can be expressed as an energy landscape in which the solution is the lowest point, such as minimizing error in a voice recognition system, controlling risk in a financial portfolio, or reducing energy loss in an electrical grid. In comparison, a universal quantum computer is one that, in theory, can perform any computation exponentially faster than a classical computer.


Annealing relies on “quantum tunnelling” and lets an initially simple system evolve very slowly towards the desired result. This involves encoding a problem into the states of some quantum bits (qubits) that have specifically assigned interactions. These interactions are traditionally classical, in that they are either on or off. The qubits are then put in a superposition of states and the system gradually evolves – ensuring that all the qubits always remain in the lowest energy “valley” or ground state – until the global minimum of the function is found.


D-Wave’s quantum computer runs a quantum annealing algorithm to find the lowest points, corresponding to optimal or near optimal solutions, in a virtual “energy landscape.” While there are different ways in which users can submit problems to the system, at the level of the machine instruction of the quantum processor the system solves a Quadratic Unconstrained Binary Optimization Problem (QUBO), where binary variables are mapped to qubits and correlations between variables are mapped to couplings between qubits.
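A QUBO instance is compact enough to state in a few lines. The sketch below (with made-up coefficients, solved by brute force rather than by annealing) shows the object the quantum processor minimizes: a quadratic energy over binary variables.

```python
import itertools

# Hypothetical 3-variable QUBO. Diagonal entries are linear biases,
# off-diagonal entries are couplings between pairs of binary variables.
Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,
     (0, 1): 2.0, (1, 2): 2.0}

def qubo_energy(x, Q):
    """Energy of assignment x (a 0/1 tuple) under QUBO matrix Q."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Brute force stands in for the annealer on this tiny instance.
best = min(itertools.product([0, 1], repeat=3), key=lambda x: qubo_energy(x, Q))
print(best, qubo_energy(best, Q))   # (1, 0, 1) with energy -2.0
```

On the hardware, each binary variable maps to a qubit and each coupling coefficient to a physical coupler between qubits; the annealer searches for the same minimum physically rather than by enumeration.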


D-Wave does not have error correction, and neither will the current generation of quantum gate systems. D-Wave gets around the lack of error correction by solving problems repeatedly: 1,000 runs of a problem generate a distribution of solutions, and those solutions are easily checked to see which is best.


Research on quantum error correction suggests that 2000 qubits would need on the order of 1 million error-correcting qubits, and another paper put the figure at 6.3 million error-correcting qubits for 1000 qubits. The vast number of qubits needed for an error-corrected system is why no error-correcting scheme has been implemented for any quantum gate system.


“There has been a fair amount of discussion in both the media and the scientific community about the impact of environmental noise (heat, vibration, magnetism etc) and the need for quantum error correction in quantum computing, and what that means for the D-Wave technology. One of the most attractive characteristics of quantum annealing systems such as the D-Wave system, is that they are more robust against decoherence from certain types of environmental noise than other quantum systems, such as those built on a gate model, would be,” according to Ewald.



D-Wave performance

In 2014, a group of researchers from ETH Zurich, Google, Microsoft, the University of Southern California and the University of California at Santa Barbara subjected the D-Wave machine to 1,000 randomly chosen cost-function problems, as well as to a classical annealer. They found that, overall, the D-Wave did not exhibit quantum speedup on the set of problems used. D-Wave asserts, however, that the group, led by ETH physicist Matthias Troyer, chose an inappropriate set of problems for this test, and that the D-Wave machine would distinguish itself if subjected to harder problems.


Unlike “conventional” quantum computers – which are kept in a fragile quantum state throughout the calculation – quantum annealing involves a transition from a quantum to a classical system, resulting in a very short coherence time, on the order of a few nanoseconds, while the total time to perform one annealing run is 20 microseconds. Since the qubits are coherent for only a fraction of the total time, they could not show a quantum speedup.


D-Wave construction

The D-Wave quantum processor is built from a lattice of tiny loops of the metal niobium, each of which is one quantum bit, or qubit. When niobium is cooled down below 9.2 Kelvin it becomes a superconductor and starts to exhibit quantum mechanical effects. By circulating current either clockwise or counter-clockwise, the superconducting qubit emits a magnetic field pointing downward or upward, encoding a logical 1 or 0. During quantum annealing, current flows clockwise and counter-clockwise simultaneously.


The annealing is performed by adjusting the coupling between the rings. The coupling between the bits represents an energy, so the sum of all these couplings can be measured; the smaller the sum, the better the solution. The D-Wave chip optimizes all of its bits at the same time as the couplings between them sweep in an analog fashion between pre-programmed beginning and end points.


Every additional qubit doubles the search space of the processor. At 1000 qubits, the new processor considers 2^1000 possibilities simultaneously, a search space which dwarfs the 2^512 possibilities available to the 512-qubit D-Wave Two. In fact, the new search space contains far more possibilities than there are particles in the observable universe.
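The size claims are easy to verify directly: 2^1000 is a 302-digit number, vastly larger than the roughly 10^80 particles usually estimated for the observable universe.

```python
space_1000 = 2 ** 1000          # search space of the 1000-qubit processor
space_512 = 2 ** 512            # search space of the 512-qubit D-Wave Two
particles = 10 ** 80            # common order-of-magnitude particle estimate

print(len(str(space_1000)))     # 302 decimal digits, i.e. about 10^301
print(space_1000 > particles)   # True
print(space_1000 // space_512)  # the gap between the two machines is 2^488
```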


D-Wave quantum processor breaks 1000 qubit barrier to Address Larger and More Complex Problems

The laws of quantum mechanics including “superposition” of states, along with the quantum effects of entanglement and quantum tunneling, enable quantum computers to consider and manipulate many combinations of bits simultaneously.


At 1000+ qubits, the D-Wave 2X quantum processor evaluates all 2^1000 possible solutions simultaneously as it converges on optimal or near optimal solutions, more possibilities than there are particles in the observable universe.


The D-Wave 2X demonstrates a factor of up to 15x gains over highly specialized classical solvers in nearly all classes of problems examined. Measuring only the native computation time of the D-Wave 2X quantum processor shows performance advantages of up to 600x over these same solvers, according to D-Wave.


“For the high-performance computing industry, the promise of quantum computing is very exciting. It offers the potential to solve important problems that either can’t be solved today or would take an unreasonable amount of time to solve,” said Earl Joseph, IDC program vice president for HPC.


Beyond the much larger number of qubits, other significant innovations include:


  • Lower Operating Temperature: While the previous generation processor ran at a temperature close to absolute zero, the new processor runs 40% colder. The lower operating temperature enhances the importance of quantum effects, which increases the ability to discriminate the best result from a collection of good candidates.


  • Reduced Noise: Through a combination of improved design, architectural enhancements and materials changes, noise levels have been reduced by 50% in comparison to the previous generation. The lower noise environment enhances problem-solving performance while boosting reliability and stability.


  • Increased Control Circuitry Precision: In testing to date, the increased control-circuitry precision, coupled with the noise reduction, has improved precision by up to 40%. Accomplishing both while also improving manufacturing yield is a significant achievement.


  • Advanced Fabrication: The new processors comprise over 128,000 Josephson junctions (tunnel junctions with superconducting electrodes) in a 6-metal layer planar process with 0.25μm features, believed to be the most complex superconductor integrated circuits ever built.


  • New Modes of Use: The new technology expands the boundaries of ways to exploit quantum resources. In addition to performing discrete optimization like its predecessor, firmware and software upgrades will make it easier to use the system for sampling applications.


The physical footprint of the system is approximately 10′ x 7′ x 10′ (L x W x H). It houses a sophisticated cryogenic refrigeration system, shielding and I/O systems that support a single thumbnail-sized quantum processor. Most of the physical volume of the current system is due to the size of the refrigeration system and the need for easy service access.


In order for the quantum effects to play a role in computation, the quantum processor must operate in an extremely isolated environment. The refrigerator and many layers of shielding create an internal environment with a temperature close to absolute zero that is isolated from external magnetic fields, vibration, and external RF signals of any form.

Google implements digitized adiabatic quantum computing

While adiabatic quantum computing works fairly well for some problems, it is not the one-size-fits-all quantum computation technique that many scientists hope to create. Adiabatic quantum computing suffers from errors and noise because the process does not allow for error correction to take place during a computation. This becomes a major problem when the system is scaled up and errors accumulate.


D-Wave’s approach is analogue quantum computation, which uses continuous dynamics to reach the optimal solution of the problem. These dynamics can be slow, as in adiabatic quantum computing based on so-called quantum annealing.


In contrast, Digital quantum computation splits up the problem to be resolved in terms of quantum logic gates in a way similar to that of a conventional computer. A key component of many of the universal quantum computing architectures being developed is that their digital logic gates can be made fault-tolerant and error correction can take place while a calculation is being processed.


Google researchers have implemented digitized adiabatic quantum computing, which combines the generality of the adiabatic algorithm with the universality of the digital approach, using a superconducting circuit with nine qubits. In this pioneering experiment, superconducting quantum bits were used to digitize an analogue quantum computation, in a way similar to what is done with communication signals in conventional technologies.


This strategy should make optimization problems universally solvable; such problems arise in fields as general as finance, as well as in the design of new materials and products for the pharmaceutical industry.


Google combines two approaches to quantum computing

“Another problem with adiabatic quantum computing is the classical nature of the interactions – which puts a low limit on the number of other qubits that a qubit can interact with. For a quick computation, you would ideally want multiple interactions simultaneously taking place between all of the qubits. But in a complex computation, it would be nearly impossible to accurately keep track of these interactions. However, reducing the connectivity has a major impact on the system’s computational abilities.”


“A complementary approach is digital quantum computing, which enables the construction of arbitrary interactions and is compatible with error correction but uses quantum circuit algorithms that are problem-specific.”


John Martinis, Rami Barends and colleagues at Google’s research laboratories in Santa Barbara, California, together with physicists at the University of California, Santa Barbara and the University of the Basque Country in Bilbao, have combined the advantages of both approaches by implementing digitized adiabatic quantum computing in a superconducting system. By digitizing an adiabatic quantum computation, the team has a greater degree of control over the interactions between qubits, and they can also correct for errors while a computation is executed.


In the current work, the Google researchers adapted their previously built superconducting nine-qubit chip, in which interactions are controlled by connected logic gates that encode a problem. The spin system is formed by a superconducting circuit with nine qubits. The team simulated a row of spin-coupled magnetic atoms in a chain, where neighbouring atoms can be aligned (ferromagnetic) or anti-aligned (anti-ferromagnetic).


The qubits are the cross-shaped structures, patterned out of an aluminium layer on top of a sapphire substrate and arranged in a linear chain. Each qubit is capacitively coupled to its nearest neighbours, and can be individually controlled and measured. The researchers address each qubit individually via current pulses tuned to its inherent resonant frequency, which can be varied between 4 GHz and 5.5 GHz. Crucially, by tuning the frequencies of the qubits, the researchers can implement a tunable controlled-phase entangling gate, which together with the single-qubit gates forms their digitized approach.
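The digitization idea can be illustrated numerically (a toy two-spin simulation of our own, not the paper's nine-qubit experiment): Trotterize an adiabatic sweep of a transverse-field Ising model into alternating single-qubit rotations and a diagonal two-qubit phase gate, mirroring the pairing of single-qubit gates and a controlled-phase gate described above.

```python
import cmath
import math

# Toy digitized adiabatic sweep for two spins: H(s) = -(1-s)(X1+X2) - s*Z1Z2,
# split per step into single-qubit X rotations plus a diagonal ZZ phase gate.
steps, dt = 400, 0.05
state = [0.5, 0.5, 0.5, 0.5]          # |++>, the ground state at s = 0

def rx(state, q, a):
    """Apply exp(+i*a*X) on qubit q of a 2-qubit state vector."""
    c, s = math.cos(a), 1j * math.sin(a)
    out = state[:]
    for idx in range(4):
        if not idx & (1 << q):
            j = idx | (1 << q)
            out[idx] = c * state[idx] + s * state[j]
            out[j] = c * state[j] + s * state[idx]
    return out

for k in range(steps):
    s_k = (k + 0.5) / steps               # anneal parameter sweeping 0 -> 1
    state = rx(state, 0, dt * (1 - s_k))  # transverse-field sub-step
    state = rx(state, 1, dt * (1 - s_k))
    for idx in range(4):                  # exp(+i*dt*s_k*Z1Z2), diagonal gate
        zz = 1 if bin(idx).count("1") % 2 == 0 else -1
        state[idx] *= cmath.exp(1j * dt * s_k * zz)

probs = [abs(amp) ** 2 for amp in state]
# A slow enough digitized sweep ends in the ferromagnetic ground space.
assert probs[0] + probs[3] > 0.8          # weight on |00> and |11>
```

Making the sweep faster or the Trotter step coarser degrades the final ground-space weight; absorbing exactly this kind of error is what the digital approach's compatibility with error correction is for.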


“In our architecture we can steer this frequency, much like you would tune a radio to a broadcast,” says Barends. He explains that they can tune the frequency of one qubit to that of another. “By moving qubit frequencies to or away from each other, interactions can be turned on or off. The exchange of quantum information resembles a relay race, where the baton can be handed down when the runners meet,” he adds. The team can even tune a qubit so that it is simultaneously in a superposition of being aligned and anti-aligned.


Barends tells physicsworld.com that “as a demonstration, we have implemented non-stoquastic interactions, something that is not possible with present-day analogue systems. This is important because problems that involve interacting electrons, like in quantum chemistry, are non-stoquastic.”


“This demonstration of digitized quantum adiabatic computing in the solid state opens a path to solving complex problems, and we hope it will motivate further research into the efficient synthesis of adiabatic algorithms, on small-scale systems with noise as well as future large-scale quantum computers with error correction,” write the authors.



Quantum Machine Learning to usher new era in AI by exponential speedup of internet search, fraud detection, and brain mapping

Quantum computing and quantum information processing are expected to have immense impact, performing tasks too hard for even the most powerful conventional supercomputer, with a host of specific applications from code-breaking and cyber security to medical diagnostics, big data analysis and logistics.

Quantum computers run on a subatomic level using quantum bits (or qubits) that can represent a 0 and a 1 at the same time. A processor using qubits could theoretically solve a small set of specialized problems exponentially more quickly than a traditional computer.

Machine Learning (ML) is a subfield of Artificial Intelligence which attempts to endow computers with the capacity to learn from data, so that explicit programming is not necessary to perform a task. ML algorithms allow computers to extract information and infer patterns from recorded data, so computers can learn from previous examples to make good predictions about new ones. ML has now become a pervasive technology, underlying many modern applications including internet search, fraud detection, gaming, face detection, image tagging, brain mapping, check processing and computer server health-monitoring.

Quantum machine learning is a new subfield within the field of quantum information that combines the speed of quantum computing with the ability to learn and adapt, as offered by machine learning. Quantum machine learning algorithms aim to use the advantages of quantum computation to improve classical methods of machine learning, for example by developing efficient implementations of expensive classical algorithms on a quantum computer.

Researchers from Caltech and the University of Southern California (USC) report the first application of quantum computing to machine learning. By employing quantum-compatible machine learning techniques, they developed a method of extracting a rare Higgs boson signal from copious noise. The Higgs is the particle predicted to imbue elementary particles with mass, discovered at the Large Hadron Collider in 2012. The quantum program seeks patterns within a data set to distinguish meaningful data from junk, and the new quantum machine learning method performs well even with small data sets, unlike its standard counterparts.

Quantum computer learns to ‘see’ trees

Scientists have trained a quantum computer to recognize trees, a step towards using such computers for complicated machine learning problems like pattern recognition and computer vision. The study also lays the groundwork for better climate forecasting, says team member Ramakrishna Nemani, an earth scientist at NASA’s Advanced Supercomputing Division in Mountain View, California.

By poring over NASA’s satellite imagery, quantum processors could take a machine learning approach to uncover new patterns in how weather moves across the world over the course of weeks, months, or even years, he says. “Say you’re living in India—you might get an advance notice of a cyclone 6 months ahead of time because we see a pattern of weather in northern Canada.”

In the new study, physicist Edward Boyda of St. Mary’s College of California in Moraga and colleagues fed hundreds of NASA satellite images of California into the D-Wave 2X processor, which contains 1152 qubits. The researchers asked the computer to consider dozens of features—hue, saturation, even light reflectance—to determine whether clumps of pixels were trees as opposed to roads, buildings, or rivers. They then told the computer whether its classifications were right or wrong so that the computer could learn from its mistakes, tweaking the formula it uses to determine whether something is a tree.
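The training loop described above — classify, check against the right answer, tweak the formula on each mistake — can be sketched classically. The snippet below is a minimal perceptron-style analogue with synthetic feature values (hue, saturation, reflectance) invented for illustration; it is not the D-Wave team's actual method.

```python
# Perceptron-style sketch of the "learn from mistakes" loop: each pixel clump
# is a feature vector (hue, saturation, reflectance), and the weights are
# nudged whenever the classifier gets a label wrong. All data is synthetic.

def train(samples, labels, epochs=20, lr=0.1):
    w = [0.0] * len(samples[0])  # one weight per feature
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred                       # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err                        # "tweak the formula" on mistakes
    return w, b

def classify(x, w, b):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Made-up data: trees here have high hue/saturation, low reflectance.
trees     = [(0.8, 0.7, 0.2), (0.9, 0.6, 0.3), (0.7, 0.8, 0.1)]
non_trees = [(0.2, 0.3, 0.9), (0.1, 0.2, 0.8), (0.3, 0.1, 0.7)]
X = trees + non_trees
y = [1, 1, 1, 0, 0, 0]

w, b = train(X, y)
print([classify(x, w, b) for x in X])  # → [1, 1, 1, 0, 0, 0]
```

After training, the learned weights separate the two synthetic groups; the quantum approach replaces this error-correction loop with optimization on the annealer's qubits.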

After it was trained, the D-Wave was 90% accurate in recognizing trees in aerial photographs of Mill Valley, California, the team reports in PLOS ONE. “Classification is a tricky problem; there are short trees, tall trees, trees next to each other, next to buildings—all sorts of combinations,” says Nemani.

Rigetti Demonstrates Unsupervised Machine Learning Using Its 19-Qubit Processor

Researchers at Rigetti Computing, a company based in Berkeley, California, used one of its prototype quantum chips—a superconducting device housed within an elaborate super-chilled setup—to run what’s known as a clustering algorithm. Clustering is a machine-learning technique used to organize data into similar groups. Rigetti is also making the new quantum computer—which can handle 19 quantum bits, or qubits—available through its cloud computing platform, called Forest.

The company’s scientists published a paper about the demonstration called “Unsupervised Machine Learning on a Hybrid Quantum Computer.” The abstract lays out the problem space and their approach:

Machine learning techniques have led to broad adoption of a statistical model of computing. The statistical distributions natively available on quantum processors are a superset of those available classically. Harnessing this attribute has the potential to accelerate or otherwise improve machine learning relative to purely classical performance.

A key challenge toward that goal is learning to hybridize classical computing resources and traditional learning techniques with the emerging capabilities of general purpose quantum processors. The scientists demonstrated such hybridization by training a 19-qubit gate model processor to solve a clustering problem, a foundational challenge in unsupervised learning. “We use the quantum approximate optimization algorithm in conjunction with a gradient-free Bayesian optimization to train the quantum machine. This quantum/classical hybrid algorithm shows robustness to realistic noise, and we find evidence that classical optimization can be used to train around both coherent and incoherent imperfections.”
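To make the clustering task concrete, here is a purely classical k-means sketch of the same problem class: grouping unlabeled points by similarity. The Rigetti team's actual method maps the problem onto QAOA on their 19-qubit chip; the data points here are arbitrary examples.

```python
# Classical k-means, shown only to illustrate the clustering task the Rigetti
# team mapped onto quantum hardware (their method uses QAOA, not k-means).
import random

def kmeans(points, k, iters=50, seed=0):
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest centre
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                  + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        # move each centre to the mean of its cluster
        centers = [(sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
                   if c else centers[i] for i, c in enumerate(clusters)]
    return centers, clusters

points = [(0.1, 0.2), (0.0, 0.1), (0.2, 0.0),   # one group near the origin
          (2.0, 2.1), (2.2, 1.9), (1.9, 2.0)]   # another group near (2, 2)
centers, clusters = kmeans(points, k=2)
print(sorted(len(c) for c in clusters))  # → [3, 3]
```

The hybrid quantum/classical approach replaces this inner assignment/update loop with a parameterized quantum circuit trained by a classical Bayesian optimizer.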


Quantum Machine Learning algorithms

A new family of quantum algorithms has recently come along to challenge the relatively narrow view of what a quantum computer would be useful for. Not only do these new algorithms promise exponential speedups over classical algorithms, but they do so for eminently practical problems, involving machine learning, clustering, classification, and finding patterns in huge amounts of data, writes Scott Aaronson. “The algorithm at the center of the ‘quantum machine learning’ mini-revolution is called HHL, after my colleagues Aram Harrow, Avinatan Hassidim, and Seth Lloyd, who invented it in 2008.”

“HHL attacks one of the most basic problems in all of science: namely, solving a system of linear equations. Given an n × n real matrix A and a vector b, the goal of HHL is to (approximately) solve the system Ax = b for x, and to do so in an amount of time that scales only logarithmically with n, the number of equations and unknowns. Classically, this goal seems hopeless, since n² steps would be needed even to examine all the entries of A, and n steps would be needed even to write down the solution vector x. By contrast, by exploiting the exponential character of the wave function, HHL promises to solve a system of n equations in only about log n steps.”
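The classical baseline makes the contrast concrete. The sketch below is plain Gaussian elimination (not HHL, which requires quantum hardware): it solves Ax = b in about n³ arithmetic steps, while even reading A costs n² steps — exactly the scaling HHL aims to sidestep, under its usual caveats (sparse, well-conditioned A, and a solution delivered as a quantum state rather than explicit numbers).

```python
# Gaussian elimination with partial pivoting: the O(n^3) classical route to
# solving Ax = b that HHL-type algorithms aim to beat in scaling.

def solve(A, b):
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix [A | b]
    for col in range(n):
        # partial pivoting for numerical stability
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * p for a, p in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Example system: 2x + y = 3,  x + 3y = 5
A = [[2.0, 1.0], [1.0, 3.0]]
b = [3.0, 5.0]
x = solve(A, b)
print([round(v, 6) for v in x])  # → [0.8, 1.4]
```

Each of the n elimination passes touches an n × (n+1) submatrix, which is where the cubic cost comes from.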

In the years since HHL, quantum algorithms achieving “exponential speedups over classical algorithms” have been proposed for other major application areas, including k-means clustering, support vector machines, data fitting, and even computing certain properties of Google PageRank vectors.

Quantum algorithm can analyse huge matrices, enabling AI to think faster

The first quantum linear system algorithm was proposed in 2009 by a different group of researchers. That algorithm kick-started research into quantum forms of machine learning, or artificial intelligence.

A linear system algorithm works on a large matrix of data. For example, a trader might be trying to predict the future price of goods. The matrix may capture historical data about price movements over time and data about features that could be influencing these prices, such as currency exchange rates. The algorithm calculates how strongly each feature is correlated with another by ‘inverting’ the matrix. This information can then be used to extrapolate into the future.
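A minimal classical version of this workflow, with entirely made-up numbers: build a small matrix from historical data, "invert" it (here by solving a 2×2 system via the normal equations), and extrapolate a future price. This is an illustration of the problem class, not the researchers' algorithm.

```python
# Fit price ~ w0 + w1 * fx_rate by the normal equations, which require
# solving (inverting) a small matrix built from historical data.
# All numbers are invented for illustration.

def solve2(A, b):
    # Cramer's rule for a 2x2 system
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det]

fx    = [1.0, 1.1, 1.2, 1.3]        # feature: exchange rate over time
price = [10.0, 11.0, 12.0, 13.0]    # target: historical prices

# Normal equations (X^T X) w = X^T y with rows X_i = [1, fx_i]
n = len(fx)
XtX = [[n, sum(fx)], [sum(fx), sum(f * f for f in fx)]]
Xty = [sum(price), sum(f * p for f, p in zip(fx, price))]
w0, w1 = solve2(XtX, Xty)

print(round(w0 + w1 * 1.4, 2))  # extrapolated price at fx = 1.4 → 14.0
```

Real applications involve matrices with thousands of features rather than two, which is where the matrix-inversion cost dominates.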

“There is a lot of computation involved in analysing the matrix. When it gets beyond say 10,000 by 10,000 entries, it becomes hard for classical computers,” explains Zhao. This is because the number of computational steps goes up rapidly with the number of elements in the matrix: every doubling of the matrix size increases the length of the calculation eight-fold.

The 2009 algorithm could cope better with bigger matrices, but only if the data in them is what’s known as ‘sparse’. In these cases, there are limited relationships among the elements, which is often not true of real-world data.

Zhao, Prakash and Wossnig present a new algorithm that is faster than both the classical and the previous quantum versions, without restrictions on the kind of data it works for. The algorithm was developed by Anupam Prakash of the Centre for Quantum Technologies, National University of Singapore, together with Leonard Wossnig, then at ETH Zurich and the University of Oxford, and Zhao, a PhD student at the Singapore University of Technology and Design.

As a rough guide, for a 10,000-by-10,000 matrix, the classical algorithm would take on the order of a trillion computational steps, the first quantum algorithm some tens of thousands of steps, and the new quantum algorithm just hundreds of steps. The algorithm relies on a technique known as quantum singular value estimation.
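These orders of magnitude can be checked directly. The arithmetic below is illustrative only; the real costs also depend on sparsity, condition number and target precision.

```python
import math

n = 10_000
classical = n ** 3                 # dense classical solve: ~n^3 steps
print(f"{classical:.0e}")          # → 1e+12, "on the order of a trillion"
print((2 * n) ** 3 // n ** 3)      # → 8: doubling the matrix size costs 8x,
                                   #   as quoted earlier in the article
print(round(math.log2(n), 1))      # → 13.3: the log n scaling that
                                   #   HHL-type algorithms aim for
```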

There have been a few proof-of-principle demonstrations of the earlier quantum linear system algorithm on small-scale quantum computers. Zhao and his colleagues hope to work with an experimental group to run a proof-of-principle demonstration of their algorithm, too. They also want to do a full analysis of the effort required to implement the algorithm, checking what overhead costs there may be.

To show a real quantum advantage over the classical algorithms will need bigger quantum computers. Zhao estimates that “We’re maybe looking at three to five years in the future when we can actually use the hardware built by the experimentalists to do meaningful quantum computation with application in artificial intelligence.”


Quantum machine learning over infinite dimensions

Researchers have recently generalized quantum machine learning to the more complex, but still remarkably practical, infinite-dimensional systems. They present the critical subroutines of quantum machine learning algorithms for an all-photonic continuous-variable quantum computer that achieve an exponential speedup compared to their classical counterparts. They also map out an experimental implementation which can be used as a blueprint for future photonic demonstrations.

Physicists have developed a quantum machine learning algorithm that can handle infinite dimensions—that is, it works with continuous variables (which have an infinite number of possible values on a closed interval) instead of the typically used discrete variables (which have only a finite number of values).

The researchers, Hoi-Kwan Lau et al., have published a paper on generalizing quantum machine learning to infinite dimensions in Physical Review Letters. As the physicists explain, quantum machine learning is a new subfield within the field of quantum information that combines the speed of quantum computing with the ability to learn and adapt, as offered by machine learning.

One of the biggest advantages of having a quantum machine learning algorithm for continuous variables is that it can theoretically operate much faster than classical algorithms. Since many science and engineering models involve continuous variables, applying quantum machine learning to these problems could potentially have far-reaching applications.

Although the results of the study are purely theoretical, the physicists expect that the new algorithm for continuous variables could be experimentally implemented using currently available technology. The implementation could be done in several ways, such as by using optical systems, spin systems, or trapped atoms.


Global race for Unhackable Quantum Satellite based Communication networks among countries like China, Japan, United States, Canada, and EU

China’s quantum satellite has produced its first successful result. In a paper published in Science, researchers from the Chinese Academy of Sciences announced the satellite had successfully distributed entangled photons between three different terrestrial base stations, separated by as much as 1,200 kilometers on the ground. The result is the longest entanglement ever demonstrated, and the first that spanned between the Earth and space. Researchers say the system “opens up a new avenue to both practical quantum communications and fundamental quantum optics experiments at distances previously inaccessible on the ground.”

China launched the world’s first quantum communications satellite, officially known as Quantum Experiments at Space Scale, or QUESS. The launch took place at 17:40 UTC on Monday, 16 August 2016, from the Jiuquan Satellite Launch Centre in the Gobi Desert, with a Long March 2D rocket sending the 620-kilogram (1,367-pound) satellite to a 600-kilometer (373-mile) orbit at an inclination of 97.79 degrees. “In its two-year mission, QUESS is designed to establish ‘hack-proof’ quantum communications by transmitting uncrackable keys from space to the ground,” Xinhua news agency said. China plans to put additional satellites into orbit, hopes to complete a QKD system linking Asia and Europe by 2020, and aims eventually to have a worldwide quantum network.

“The newly-launched satellite marks a transition in China’s role – from a follower in classic information technology development to one of the leaders guiding future achievements,” Pan Jianwei, the project’s chief scientist, told the agency. Quantum communications holds “enormous prospects” in the field of defense, it added.

In November 2015, at the 18th Party Congress’ 5th Plenum, Xi Jinping included quantum communications in his list of major science and technology projects prioritized for major breakthroughs by 2030, given their importance from the perspective of China’s long-term strategic requirements.

Many other countries, including the United States, Canada, Japan, and some EU members, are racing to develop quantum communication networks, as such networks are virtually un-hackable, and researchers from these countries are closely watching China’s tests. Researchers at the National Institute of Information and Communications Technology (NICT) in Japan have demonstrated satellite-based “unhackable” Quantum Key Distribution, or QKD, with results recently published in the journal Nature Photonics.

The biggest challenge, said Alexander Ling, principal investigator at the Centre for Quantum Technologies in Singapore, is orienting the satellite with pinpoint accuracy to a location on Earth where it can send and receive data without being affected by disturbances in Earth’s atmosphere. “You’re trying to send a beam of light from a satellite that’s 500 kilometres (310 miles) above you,” Ling said.

For more information on China’s quantum satellite: http://idstch.com/home5/international-defence-security-and-technology/technology/quantum/china-leading-the-global-race-of-satellite-based-quantum-communications-for-its-military/

Satellite based Quantum key cryptography

Quantum cryptography is considered to be effectively unbreakable. A unique aspect of quantum cryptography is that Heisenberg’s uncertainty principle ensures that if Eve attempts to intercept and measure Alice’s quantum transmissions to Bob, her activities must produce an irreversible change in the quantum states that are retransmitted to Bob. These changes will introduce an anomalously high error rate in the transmissions between Alice and Bob, allowing them to detect the attempted eavesdropping.
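This eavesdropper-detection mechanism can be illustrated with a toy simulation of BB84 (a standard QKD protocol, named here for illustration; the text above does not specify one). An intercept-resend attacker who measures each photon in a randomly chosen basis disturbs roughly half of them, raising the sifted-key error rate to about 25% — the anomalously high error rate Alice and Bob look for.

```python
# Toy BB84 intercept-resend simulation. Without Eve, sifted bits agree
# perfectly; with Eve measuring and resending in random bases, ~25% of the
# sifted bits disagree, revealing the eavesdropping attempt.
import random

def bb84_error_rate(n, eavesdrop, seed=1):
    rng = random.Random(seed)
    errors = sifted = 0
    for _ in range(n):
        bit = rng.randint(0, 1)
        a_basis = rng.randint(0, 1)       # 0 = rectilinear, 1 = diagonal
        basis, value = a_basis, bit       # photon state leaving Alice
        if eavesdrop:
            e_basis = rng.randint(0, 1)
            if e_basis != basis:          # wrong basis: random outcome
                value = rng.randint(0, 1)
            basis = e_basis               # Eve resends in her own basis
        b_basis = rng.randint(0, 1)
        result = value if b_basis == basis else rng.randint(0, 1)
        if b_basis == a_basis:            # kept only after basis sifting
            sifted += 1
            errors += result != bit
    return errors / sifted

print(bb84_error_rate(20000, eavesdrop=False))           # → 0.0
print(round(bb84_error_rate(20000, eavesdrop=True), 2))  # ≈ 0.25
```

Half the time Eve guesses the right basis and goes unnoticed; the other half she randomizes Bob's result, producing errors in one quarter of the sifted key overall.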

Quantum key distribution (QKD) establishes highly secure keys between distant parties by using single photons to transmit each bit of the key. Photons are ideal for propagating over long-distances in free-space and are thus best suited for quantum communication experiments between space and ground. The unit of quantum information is the “qubit” (a bit of information “stamped” in a quantum physical property, for instance the polarization of a photon).

QKD thus solves the long-standing problem of securely transporting cryptographic keys between distant locations. “Even if the keys were transmitted across hostile territory, their integrity could be unambiguously verified upon receipt,” say Thomas Jennewein, Brendon Higgins and Eric Choi in SPIE.

Fiber optic based QKD systems are commercially available today; however, they are point-to-point links limited to a few hundred kilometers because of current optical fiber and photon detector technology. One way to overcome this limitation is to bring quantum communication into space. An international team led by the Austrian physicist Anton Zeilinger has successfully transmitted quantum states between the two Canary Islands of La Palma and Tenerife, over a distance of 143 km. The previous record, set by researchers in China, was 97 km. The process, called quantum teleportation, transfers the quantum state of one of two entangled photons to the other, even though they may be widely separated.

Japanese Researchers demonstrate Satellite based “unhackable” Quantum Key Distribution, or QKD

Researchers at the National Institute of Information and Communications Technology (NICT) in Japan have demonstrated satellite-based “unhackable” Quantum Key Distribution, or QKD, with results recently published in the journal Nature Photonics.

“The main advantage [of QKD] is the unconditional security,” said team leader Alberto Carrasco-Casado. “When quantum computers are developed, the security of conventional communications will be compromised, since current cryptography is based only on computational complexity. The development of practical quantum computers is only a matter of time, which has made quantum communication a hot topic in the last few years, and the tendency is foreseen to increase in the future,” he added.

To demonstrate this secured, high-capacity transmission of data between an Earth-based station and a satellite in low-Earth orbit (LEO), Carrasco-Casado’s team used the quantum-communication transmitter, called SOTA (Small Optical TrAnsponder), on board the microsatellite SOCRATES (Space Optical Communications Research Advanced Technology Satellite) that was launched by the Japan Aerospace Exploration Agency (JAXA) in 2014.

Weighing only 13 lbs. (6 kilograms), SOTA is the smallest quantum communications transmitter ever tested. Orbiting above Earth at 372 miles (600 kilometers), SOCRATES was traveling at over 15,000 mph (7 kilometers per second) when SOTA would establish contact with a 1-meter telescope located in Tokyo’s Koganei city. The received signal was then guided to a quantum receiver to decode the information using a QKD protocol, the researchers wrote in their study.

SOTA encoded the individual photons with 1 bit of data — either a “1” or a “0” — achieved by switching the photons between two polarized states, a method known as a “single-photon regime.” SOTA then beamed laser pulses at a rate of 10 million bits per second. On reaching the ground station, the laser signal was extremely weak (the researchers say that, on average, only 0.1 photons were received per pulse), but the quantum receiver was still able to detect the signal and decode the information over a low level of noise.
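A back-of-the-envelope photon budget follows from the figures quoted above. The Poisson assumption for photon arrivals is ours, made for illustration; the paper's quoted numbers are only the pulse rate and the mean photons per pulse.

```python
# Rough link budget from the quoted SOTA figures.
import math

pulse_rate = 10e6       # 10 million pulses (bits) per second
mean_photons = 0.1      # average photons received per pulse at the ground
print(f"{pulse_rate * mean_photons:.0e}")      # → 1e+06 photons/s at receiver

# Assuming Poissonian photon statistics, the chance that a given pulse
# delivers at least one photon to the detector:
print(round(1 - math.exp(-mean_photons), 3))   # → 0.095
```

So even though roughly one pulse in ten carries a detectable photon, the high pulse rate still leaves on the order of a million photons per second to build the key from, before detector and protocol losses.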

In order to realize quantum communication and quantum cryptography with such a weak signal, a key step is to accurately time-stamp the signals so that they are clearly recognized in the quantum receiver. Therefore, it is necessary to accurately synchronize the signals between SOCRATES and the OGS to detect the transmitted bits without errors. It is also necessary to carry out polarization-axis matching, because the reference frames change due to the relative motion between the satellite and the ground station. Only Japan and China have been able to demonstrate these technologies in space, but China did it by using a 600-kg-class satellite, while Japan did it by using a 50-kg-class satellite.

Since the satellite moves at a fast speed relative to the OGS (about 7 km/s), the wavelength of the laser signal shifts due to the Doppler effect to a shorter wavelength when approaching the OGS, and to a longer wavelength when moving away from the OGS. Because of this Doppler effect, it is necessary to carry out an accurate time synchronization to be able to correctly detect the long sequences of bits without errors. In the China quantum-communication experiment, this synchronization was realized by using a dedicated laser transmitting a synchronization signal. By contrast, NICT was able to carry out this synchronization by using the quantum signal itself. A special synchronization sequence of about 32,000-bits was used in the quantum-communication signal for this purpose, and the quantum receiver was able to perform not only the quantum communication, but also the synchronization and the polarization-axis matching directly, by using only the weak quantum signal. In this experiment, NICT succeeded in demonstrating for the first time that quantum-communication technology can be implemented in small satellites.
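The magnitude of this Doppler shift is easy to estimate from the quoted 7 km/s relative speed. The 1550 nm wavelength below is an assumed example typical of optical communications, not a quoted SOCRATES parameter.

```python
# First-order Doppler shift: d(lambda)/lambda = v/c.
v = 7e3                       # relative speed satellite <-> ground, m/s
c = 3e8                       # speed of light, m/s (rounded)
wavelength_nm = 1550.0        # assumed example wavelength

shift_nm = wavelength_nm * v / c
print(round(shift_nm * 1000, 1))   # → 36.2 picometers of wavelength shift
```

A shift of tens of picometers is tiny in absolute terms, but the corresponding timing drift over a long bit sequence is what forces the accurate synchronization described above.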

The technology developed in this project demonstrated that satellite quantum communication can be implemented by using low-cost lightweight microsatellites. Therefore, it is expected that many research institutes and companies, which are interested in this technology, will accelerate the practical application of quantum communication from space. In addition, since it was proved that long-distance communication is possible with very-low electric power, this will open up a path to speed up deep-space optical communication with exploration spacecraft.

In the future, NICT plans to further increase the transmission speed and improve the precision of the tracking technology, to maximize the secure key delivery from space to ground by using quantum cryptography, enabling a truly secure global communication network whose confidentiality is currently threatened by the upcoming development of quantum computers.

UK and Singapore’s Quantum satellite device tests technology for global quantum network

Researchers from the National University of Singapore (NUS) and the University of Strathclyde, UK, have become the first to test in orbit technology for satellite-based quantum network nodes.

They have put a compact device carrying components used in quantum communication and computing into orbit. Their device, dubbed SPEQS, creates and measures pairs of light particles, called photons. Results from space show that SPEQS is making pairs of photons with correlated properties – an indicator of performance.

Team leader Alexander Ling, an Assistant Professor at the Centre for Quantum Technologies (CQT) at NUS said, “This is the first time anyone has tested this kind of quantum technology in space.” The team had to be inventive to redesign a delicate, table-top quantum setup to be small and robust enough to fly inside a nanosatellite only the size of a shoebox. The whole satellite weighs just 1.65 kilograms.

The group’s first device is a technology pathfinder. It takes photons from a Blu-ray laser and splits them into two, then measures the pair’s properties, all on board the satellite. To do this, it contains a laser diode, crystals, mirrors and photon detectors carefully aligned inside an aluminum block. This sits on top of a 10 centimetre by 10 centimetre printed circuit board packed with control electronics.

Further testing and refinement may lead to a way to use entangled photons beamed from satellites to connect points on opposite sides of the planet. A fleet of nanosatellites carrying sources of entangled photons would be used to enable private encryption keys between any two points on Earth.

Even with the success of the more recent mission, a global network is still a few milestones away. The team’s roadmap calls for a series of launches, with the next space-bound SPEQS slated to produce entangled photons. SPEQS stands for Small Photon-Entangling Quantum System.

With later satellites, the researchers will try sending entangled photons to Earth and to other satellites. The team are working with standard “CubeSat” nanosatellites, which can get relatively cheap rides into space as rocket ballast. Ultimately, completing a global network would mean having a fleet of satellites in orbit and an array of ground stations.

In the meantime, quantum satellites could also carry out fundamental experiments – for example, testing entanglement over distances bigger than Earth-bound scientists can manage. “We are reaching the limits of how precisely we can test quantum theory on Earth,” said co-author Dr Daniel Oi at the University of Strathclyde.


Canada’s University of Waterloo Institute for Quantum Computing (IQC) carried out airborne demonstration of satellite quantum key distribution

In a study published in Quantum Science and Technology, researchers from the University of Waterloo have shown that it’s possible to transmit quantum information from a ground station to a moving aircraft. “Here, we demonstrate QKD from a ground transmitter to a receiver prototype mounted on an airplane in flight. We have specifically designed our receiver prototype to consist of many components that are compatible with the environment and resource constraints of a satellite. Coupled with our relocatable ground station system, optical links with distances of 3–10 km were maintained and quantum signals transmitted while traversing angular rates similar to those observed of low-Earth-orbit satellites.”

The system was tested using an aircraft that mimicked how high or low a satellite might appear in the sky. The aircraft, with the name Twin Otter, carried out 14 passes over the facility at varying distances. Only half of the passes were successful in establishing a quantum link, and in six out of those seven passes, the team was successful in extracting the quantum key.

“This is an extremely important step that finally demonstrates our technology is viable,” team leader Professor Thomas Jennewein added. “We achieved optical links at similar angular rates to those of low-Earth-orbit satellites, and for some passes of the aircraft over the ground station, links were established within 10 seconds of position data transmission. We saw link times of a few minutes and received quantum bit error rates typically between three and five percent, generating secure keys up to 868 kb in length.”

Under similar conditions, the uplink configuration has a lower key generation rate than the downlink, owing to atmospheric turbulence affecting the beam path earlier in the propagation. Importantly, an uplink also possesses a number of advantages over a downlink, including relative simplicity of the satellite design, not requiring high-rate true random number generators, relaxed requirements on data processing and storage (only the photon reception events need be considered, which are many orders of magnitude fewer than the source events), and the flexibility of being able to incorporate and explore various different quantum source types with the same receiver apparatus (which would have major associated costs were the source located on the satellite, as for downlink).

Recently, China launched a quantum science satellite that aims to perform many quantum experiments with optical links between space and ground. However, its exact capabilities are unverified, as no details or results have been published at this time.

Earlier a team led by Professor Thomas Jennewein at the University of Waterloo’s Institute for Quantum Computing (IQC) completed a successful laboratory demonstration of a form, fit and function prototype of a Quantum Key Distribution Receiver (QKDR) suitable for airborne experiments and ultimately Earth orbiting satellite missions.

The team designed and built the QKDR under a $600,000 contract from the Canadian Space Agency (CSA). The prototype QKDR needed to accommodate the payload constraints of a microsatellite-class mission. That included using only 10 W of power and weighing less than 12 kg.

Through radiation testing at TRIUMF located at the University of British Columbia, it was shown that with adequate shielding and cooling the QKDR detector devices can survive and operate in the space radiation environment for at least one year and possibly up to 10 years. The team also defined a credible path-to-flight for all key technologies including the miniaturized integrated optics, detectors and data processing electronics for the satellite payload.

Utilising orbiting satellites, therefore, has potential to allow the establishment of global QKD networks, with ‘quantum’ satellites acting as intermediaries. Such satellites could operate as untrusted nodes linking two ground stations simultaneously, or trusted nodes connecting any two ground stations on Earth at different times.


ISRO Collaborating With Research Institute to Develop Secure Quantum Communications in Space

Raman Research Institute (RRI) in Bengaluru has joined hands with the Indian Space Research Organisation (ISRO) to develop the quantum technologies that ISRO’s satellites would need to establish a secure quantum communication network.

Under the memorandum of understanding signed recently between RRI and the ISRO Satellite Centre (ISAC), also in Bengaluru, the latter will fund the Quantum Information and Computing (QuiC) laboratory at RRI for developing the quantum technology tools.

“This is India’s first step towards quantum communications between ground and satellites,” said Urbasi Sinha, who heads the QuiC laboratory and has been pioneering fundamental quantum experiments “using single and entangled photons”.


QEYSSat (Quantum EncrYption and Science Satellite) microsatellite mission

Researchers from University of Waterloo have proposed microsatellite mission called QEYSSat (Quantum EncrYption and Science Satellite) through a series of conceptual and technical studies funded primarily by the Canadian Space Agency (CSA). QEYSSat’s mission objectives are to demonstrate the generation of encryption keys through the creation of quantum links between ground and space, and also to conduct fundamental science tests of long-distance quantum entanglement (the intriguing phenomenon in which the joint quantum state of, for example, two particles cannot be factored into a product of individual particle states).

“The quantum signals for QEYSSat will be generated in photon sources located on the ground. An optical transmitter on the ground will point the beam of photons toward the satellite. QKD can be carried out via such quantum uplinks, along with ordinary classical communication with the satellite,” said the researchers. An important aspect of this mission concept is to keep the complex source technologies on the ground and ensure that the satellite is simple and cost-effective. This approach also allows the quantum link to be implemented using various different types of quantum sources, including entangled photons and weak coherent pulses.

“Placing the quantum receiver in space, however, poses some technical challenges of its own. In particular, the expected link losses will be higher for the uplink than they would be for a downlink because atmospheric turbulence perturbs the photons at the start of their journey up to the satellite. In addition, the dark counts of single-photon detectors will rise due to radiation exposure in orbit,” write the researchers in SPIE.

The current platform for the QEYSSat mission proposal is based on a microsatellite, to be located in a low Earth orbit at an altitude of about 600 km. The payload would have an optical receiver with a 40 cm aperture as the main optics.

“The QEYSSat payload will include the capability to analyze and detect single optical photons with high efficiency and accuracy. Each arriving photon will be analyzed in a polarization analyzer and detected in single-photon detectors. Onboard data acquisition will register all detection events and record their time-stamps to subnanosecond precision, for processing later on the ground.”

To show the viability of this mission concept, researchers have conducted several theoretical and experimental studies, including a comprehensive link performance analysis, as well as QKD experiments over high transmission losses and over a rapidly fluctuating channel.

Typical QKD experiments operate with 20–30 dB of losses, but for a satellite link the losses are expected to be about 40 dB or more. Researchers studied how to implement a QKD protocol in the case of such high transmission losses and operated a system successfully with losses up to nearly 60 dB.
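In linear terms these losses are severe; converting decibels to channel transmittance (loss_dB = -10 * log10(T)) shows why operating at 60 dB is such a milestone.

```python
# Converting the quoted channel losses into transmission probabilities.
for loss_db in (30, 40, 60):
    t = 10 ** (-loss_db / 10)
    print(f"{loss_db} dB -> transmittance {t:.0e}")
```

At 60 dB, only about one photon in a million survives the channel, so the protocol must extract a secure key from an extremely sparse stream of detections.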

“Fluctuations caused by turbulence will be particularly important when sending quantum signals to a satellite. We showed that quantum communication using single photons is still possible even when the channel transmission is strongly fluctuating, down to complete drop outs of the signal. We also showed that in the extreme case of very high transmission losses, we can improve the signal-to-noise ratio and keep performing QKD by applying a threshold filter to the data in post-processing,” write the researchers in SPIE.

European scientists have proposed a quantum communications experiment that could be sent to the international space station.


Single photon detector (SPD) critical technology for quantum computers and communications, and submarine detection

Light is widely used for communications, carrying phone conversations and video signals through fiber-optic cables around the world in pulses composed of many photons. Light is also used in optical wireless communication, a form of free-space communication with a laser at the source and a detector at the destination. Both military and civilian users have started planning laser communication systems ranging from terrestrial short-range systems, to high-data-rate aircraft and satellite communications, unmanned aerial vehicles (UAVs) and high-altitude platforms (HAPs), to near-space communications relaying high data rates from the Moon, and deep-space communications from Mars.

The detectors which detect these signals are among the most critical elements determining the performance of a wide range of civilian and military systems. These include systems such as light or laser detection and ranging (LIDAR or LADAR), photography, astronomy, quantum information processing, advanced metrology, quantum optics, medical imaging, microscopy, quantum and classical optical communications including underwater blue-green communications, and environmental sensing.

As the state of the art in these fields has advanced, so have the performance requirements of the constituent detectors. A single photon is the indivisible minimum energy unit of light, and detectors capable of single-photon detection are therefore the ultimate tools for weak-light detection. Single-photon detectors have found application in research fields such as quantum information, quantum optics, optical communication, and deep-space communications.

There has been a concerted effort to advance single-photon detection technologies to achieve higher efficiency, lower noise, higher speed and timing resolution, and to improve other properties such as photon-number resolution, imaging, and sensitivity to lower-energy photons. A high-bandwidth, high-sensitivity, compact and readily available photon-counting detector is a key technology for many future scientific developments and improved DoD application capabilities, according to DARPA.

Engineers have shown that a widely used method of detecting single photons can also count the presence of at least four photons at a time. The researchers say this discovery will unlock new capabilities in physics labs working in quantum information science around the world, while providing easier paths to developing quantum-based technologies.

Detector technologies

Depending on the wavelength regime of interest, different technologies have been utilized, such as silicon avalanche photodiodes (APDs) for visible wavelengths, photomultiplier tubes, or InGaAs-based APDs for the telecommunication range. In recent years, superconducting nanowire single-photon detectors (SNSPDs) have been shown to be promising alternatives, particularly when they are integrated directly onto waveguides and into photonic circuits. Apart from these, there are also some new technologies like hybrid photodetectors, visible light photon counters, frequency up-conversion, quantum dots & defects and carbon nanotubes.

Semiconductor Single-Photon Avalanche Photodiodes (SPAD)

SPADs are currently the mainstream solution for single-photon detection in practical applications. A SPAD is operated in Geiger mode, in which biasing above the breakdown voltage results in a self-sustaining avalanche in response to the absorption of just a single photon. This electron cascade and multiplication effect significantly amplifies the response and allows easy measurement of the response pulses.

In the visible range, the best known and most widely used are Si avalanche photodiodes (APDs). Detecting single infrared (IR) photons remains a major technological challenge because IR photons carry significantly less energy than visible photons, making it difficult to engineer an efficient electron cascade. The most successful Si APDs have their sensitivity restricted by the bandgap, while APDs based on narrow-gap semiconductors exhibit unacceptably large dark counts.

The best quantum efficiency (QE) reported for InGaAs APDs is 16% at 1.2 µm, but their large (0.5 ns) jitter and high dark-count rates make them unattractive for several important applications, including practical quantum communication systems.

The typical InGaAs/InP single-photon detector has a separate absorption and multiplication (SAM) structure, in which a low-bandgap material (InGaAs) absorbs NIR photons and a compatible high-bandgap material (InP) provides avalanche multiplication through a high electric field.

Some tasks require free-running operation of the detector because the arrival time of the photons is unknown or spread over a long time slot (tens of microseconds). Free-running operation of InGaAs/InP detectors is challenging because of afterpulsing: owing to carrier-trapping phenomena, spurious dark detections can occur shortly after previous photon detections.

To minimize afterpulsing, the avalanche current must be reduced, since this lowers the probability that a trap gets filled in the first place. An appropriate circuit, referred to as quenching electronics, is needed to rapidly suppress the avalanche by lowering the reverse bias and to restore the SPAD to its armed state for the next incoming photon. Rapid quenching also reduces afterpulsing, so the quenching electronics plays a key role in a SPAD system. The afterpulsing effects in InGaAs APDs make them ill-suited for applications requiring high duty cycle and high-rate detection.
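The trade-off the quenching electronics manages can be illustrated with a toy model; the trap lifetime and afterpulse probability below are assumed values chosen for illustration, not measured device data:

```python
import math

def afterpulse_probability(hold_off_ns, trap_lifetime_ns=1_000.0, p0=0.3):
    """Illustrative model (assumed values): the chance that a filled trap
    releases a carrier and triggers an afterpulse decays exponentially
    with the hold-off time enforced by the quenching circuit."""
    return p0 * math.exp(-hold_off_ns / trap_lifetime_ns)

def max_count_rate_hz(hold_off_ns):
    """A detector held off for t_ho after every avalanche can fire at most
    1 / t_ho times per second."""
    return 1e9 / hold_off_ns

# The trade-off behind gated operation: a long hold-off suppresses
# afterpulsing but caps the achievable detection rate
fast = (afterpulse_probability(100), max_count_rate_hz(100))        # noisier, 10 Mcps
slow = (afterpulse_probability(10_000), max_count_rate_hz(10_000))  # cleaner, 100 kcps
```

This is why afterpulsing directly limits duty cycle: the only cheap mitigation is to keep the detector off longer after each avalanche.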

Usually, InGaAs APDs are operated in gated mode, in which a periodic short-duration bias, synchronized to the input photon timing, is applied. In gated mode, however, InGaAs APDs cannot detect photons arriving at random times.

The InGaAs APDs can be operated at temperatures accessible via thermoelectric cooling, making them ideal for applications requiring compact photon-counting solutions.

NIST Patents Single-Photon Detector for Potential Encryption and Sensing Apps

Individual photons of light can now be detected far more efficiently using a device patented by a team including the National Institute of Standards and Technology (NIST), whose scientists have overcome longstanding limitations of one of the most commonly used types of single-photon detector. Their invention could allow higher transmission rates for encrypted electronic information and improved detection of greenhouse gases in the atmosphere.

Semiconductor single-photon avalanche photodiodes (SPADs) based on indium gallium arsenide are widely used in quantum cryptography research because they can detect photons at the particular wavelengths (colors of light) that travel through fiber. Unfortunately, when the detector receives a photon and outputs a signal, an echo of electronic noise is sometimes induced within the detector. Traditionally, to reduce the chance of this happening, the detector must be disabled for some time after each detection, limiting how often it can detect photons.

The team, which also includes scientists at the California Institute of Technology and the University of Maryland, has patented a method to detect photons that arrive whether the gate is open or closed. The NIST team had previously developed a highly sensitive way to read tiny signals from the detector, based on electronic interferometry, the combining of waves such that they cancel each other out.

The approach allows readout of tiny signals even when the voltage pulses that open the gate are large, and the team found that these large pulses allow the detector to be operated in a new way. The pulses turn on the detector during the gate as usual. But in between gate openings the pulses turn the detector off so well that signals produced by absorbing a photon can linger for a while in the device. Then the next time the gate opens, these lingering signals can be amplified and read out.

The added ability to detect photons that arrive when the gate is closed increases the detector’s efficiency, an improvement that would be particularly beneficial in applications in which photons could arrive at any moment, such as atmospheric scanning and topographic mapping.

The new detector can count individual photons at a very high maximum rate—several hundred million per second—and at higher than normal efficiency, while maintaining low noise. Its efficiency is at least 50 percent for photons in the near infrared, the standard wavelength range used in telecommunications. Commercial detectors operate with only 20 to 30 percent efficiency.

Superconducting single-photon detectors

Superconducting SPDs include superconducting nanowire single-photon detectors (SNSPDs), transition-edge sensors and superconducting tunnel junctions.

Superconducting nanowire single-photon detector (SNSPD) has emerged as the fastest single-photon detector (SPD) for photon counting. The SNSPD consists of a thin (≈ 5 nm) and narrow (≈ 100 nm) superconducting nanowire. The nanowire is cooled well below its superconducting critical temperature and biased with a DC current that is close to but less than the superconducting critical current of the nanowire.

The absorption of a single photon in the superconducting nanowire creates a hotspot, and the local current density rises as the hotspot expands. Once the current density reaches the critical value, the nanowire switches from the superconducting state to the normal, resistive state. This transition generates a voltage pulse that signals a single-photon detection.
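The electrical side of this detection cycle is often described with a simple lumped-element model; the sketch below uses assumed, illustrative component values (kinetic inductance, hotspot resistance, 50-ohm load), not figures from any particular device:

```python
def snspd_pulse_times(l_kinetic_nh=400.0, r_normal_ohm=1_000.0, r_load_ohm=50.0):
    """Lumped-element sketch of the SNSPD readout: the hotspot resistance
    R_n diverts the bias current into the load with time constant
    L_k / (R_n + R_load); once the wire cools and superconductivity is
    restored, the current recovers with time constant L_k / R_load."""
    l_henry = l_kinetic_nh * 1e-9
    tau_rise_s = l_henry / (r_normal_ohm + r_load_ohm)
    tau_fall_s = l_henry / r_load_ohm
    return tau_rise_s, tau_fall_s

def pulse_amplitude_mv(bias_ua=10.0, r_load_ohm=50.0):
    """Peak voltage across the load is at most I_bias * R_load."""
    return bias_ua * 1e-6 * r_load_ohm * 1e3

rise, fall = snspd_pulse_times()  # sub-nanosecond rise, few-nanosecond recovery
```

The asymmetry between the two time constants explains the characteristic fast-rise, slow-tail shape of an SNSPD pulse, and the recovery time constant is one factor behind the detector's high count-rate capability.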

The primary advantages of SNSPDs are low dark count rate, high photon count rate and very accurate time resolution. The detection efficiency was low (at the level of a few percent) for early generation devices, but recently, this parameter has been significantly improved through the efforts of the SNSPD community.

SNSPDs tend to be expensive because they need very low operating temperatures, while photomultiplier tubes have low detection efficiency and are also costly. SNSPDs have a wide spectral range, from the visible to the mid-infrared, far beyond that of the Si single-photon avalanche photodiode (SPAD), and the SNSPD is superior to the InGaAs SPAD in terms of signal-to-noise ratio.


Cooling technology challenges

Most SNSPDs are made of niobium nitride (NbN), which offers a relatively high superconducting critical temperature (≈ 10 K) and a very fast cooling time (<100 picoseconds). NbN devices have demonstrated device detection efficiencies as high as 67% at 1064 nm wavelength with count rates in the hundreds of MHz. NbN devices have also demonstrated jitter – the uncertainty in the photon arrival time – of less than 50 picoseconds, as well as very low rates of dark counts, i.e. the occurrence of voltage pulses in the absence of a detected photon.

These detectors operate at the boiling point of liquid helium (4.2 K), a temperature that can be reached by immersing the device in liquid helium (He) or mounting it in a cryogenic probe station. Liquid He is expensive, hazardous, and demands trained personnel for correct use. This technique is satisfactory for testing superconducting devices in a low-temperature physics laboratory; however, if the ultimate goal is a working device for users in other scientific fields or in military applications, alternative cooling methods must be sought.

Operating SNSPDs in a closed-cycle refrigerator offers a solution to this problem. The circulating fluid is high pressure, high purity He gas which is enclosed inside the refrigerator allowing continuous operation and eliminating repeated cryogenic handling.

The requirement for very low temperatures has so far limited SNSPDs to ground-based applications. For example, in NASA's Lunar Laser Communication Demonstration, G-M cryocooler-based SNSPD systems were adopted at the ground station, while semiconductor single-photon detectors, which need no complicated cryocoolers, were used on the satellite.

Researchers from the Chinese Academy of Sciences (CAS) have developed a hybrid cryocooler compatible with space applications, incorporating a two-stage high-frequency pulse-tube (PT) cryocooler and a 4He Joule–Thomson (JT) cooler.

“To make a practical SNSPD system for space applications, we chose a superconducting NbTiN ultrathin film, which can operate sufficiently well above 2 K, to fabricate the SNSPDs, instead of using WSi, which usually requires sub-1-K temperatures. The hybrid cryocooler successfully cooled an NbTiN SNSPD down to a minimum temperature of 2.8 K. The NbTiN SNSPD showed a maximum SDE of over 50% at a wavelength of 1550 nm and a SDE of 47% at a DCR of 100 Hz. Therefore, these results experimentally demonstrate the feasibility of space applications for this SNSPD system,” write the authors.


Single-photon detector can count to four

Researchers at Duke University, the Ohio State University and industry partner Quantum Opus have discovered a new method of using a photon detector called a superconducting nanowire single-photon detector (SNSPD). In the new setup, the researchers pay special attention to the specific shape of the initial spike in the electrical signal, and show that they can extract enough detail to correctly count at least four photons traveling together in a packet.

“Here, we report multi-photon detection using a conventional single-pixel SNSPD, where photon-number resolution arises from a time- and photon-number-dependent resistance R_hs of the nanowire during an optical wavepacket detection event. The different resistances give rise to different rise times of the generated electrical signal, which can be measured using a low-noise read-out circuit.”
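A toy version of this rise-time readout, with assumed component values, shows why the rise time encodes the photon number:

```python
def rise_time_ns(n_photons, l_kinetic_nh=400.0, r_hotspot_ohm=500.0, r_load_ohm=50.0):
    """Toy model of the readout quoted above (component values are
    illustrative assumptions): n photons absorbed within one optical
    wavepacket create n hotspots, so the total resistance scales with n
    and the electrical rise time L_k / (n*R_hs + R_load) shrinks with it.
    With L in nH and R in ohms, L/R comes out directly in nanoseconds."""
    return l_kinetic_nh / (n_photons * r_hotspot_ohm + r_load_ohm)

# Distinct, monotonically decreasing rise times are what let a low-noise
# discriminator count one, two, three or four photons from a single pulse
rise_times = [rise_time_ns(n) for n in (1, 2, 3, 4)]
```

As the photon number grows, the rise times crowd together, which is consistent with the reported limit of resolving about four photons.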

“Photon-number-resolution is very useful for a lot of quantum information/communication and quantum optics experiments, but it’s not an easy task,” said Clinton Cahall, an electrical engineering doctoral student at Duke and first author of the paper. “None of the commercial options are based on superconductors, which provide the best performance. And while other laboratories have built superconducting detectors with this ability, they’re rare and lack the ease of our setup as well as its sensitivity in important areas such as counting speed or timing resolution.”


Chinese Superconducting Nanowire Single-Photon Detector Sets Efficiency Record

Researchers have demonstrated the fabrication and operation of a superconducting nanowire single-photon detector (SNSPD) with detection efficiency that they believe is the highest on record. The photodetector is made of polycrystalline NbN with system detection efficiency of 90.2 percent for 1550-nm-wavelength photons at 2.1 K. In experiments, the system detection efficiency saturated at 92.1 percent when the temperature was lowered to 1.8 K. The research team believes that such results could pave the way for the practical application of SNSPD for quantum information and other high-end applications.

For their SNSPD device, researchers from the Shanghai Institute of Microsystem and Information Technology and the Chinese Academy of Sciences used an integrated distributed Bragg reflector (DBR) cavity offering near unity reflection at the interface while performing systematic optimization of the NbN nanowire’s meandered geometry. This approach enabled researchers to simultaneously achieve the stringent requirements for coupling, absorption and intrinsic quantum efficiency.

The device exhibited timing jitter down to 79 picoseconds (ps), almost half that of previously reported WSi SNSPDs, promising additional advantages in applications requiring high timing precision. Extensive efforts have been made to develop SNSPDs based on NbN, targeted at operating temperatures above 2 K, which are accessible with a compact, user-friendly cryocooler. Achieving a detection efficiency of more than 90 percent has required the simultaneous optimization of many different factors, including near-perfect optical coupling, near-perfect absorption and near-unity intrinsic quantum efficiency.
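The factorization behind this optimization effort can be written down directly; the per-factor numbers below are assumptions chosen only to illustrate how demanding a >90 percent overall efficiency is:

```python
def system_detection_efficiency(eta_coupling, eta_absorption, eta_intrinsic):
    """System detection efficiency is the product of the optical coupling
    efficiency, the absorption in the nanowire, and the intrinsic
    (registering) efficiency, so every factor must be near unity."""
    return eta_coupling * eta_absorption * eta_intrinsic

# Illustrative figures: even with 99% / 97% / 97% per-factor performance,
# the overall SDE barely clears 90%
sde = system_detection_efficiency(0.99, 0.97, 0.97)
```

A single factor dropping to 90 percent would pull the whole product below the record figures quoted above, which is why coupling, absorption and intrinsic efficiency had to be optimized simultaneously.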

The device has been applied in quantum information frontier experiments at the University of Science and Technology of China.


Graphene single photon detectors

Current detectors are efficient at detecting incoming photons with relatively high energies, but their sensitivity drops drastically for low-frequency, low-energy photons. In recent years, graphene has been shown to be an exceptionally efficient photodetector across a wide range of the electromagnetic spectrum, enabling new types of applications in this field.

Thus, in a recent paper published in the journal Physical Review Applied, and highlighted in APS Physics, ICFO researcher and group leader Prof. Dmitri Efetov, in collaboration with researchers from Harvard University, MIT, Raytheon BBN Technologies and Pohang University of Science and Technology, have proposed the use of graphene-based Josephson junctions (GJJs) to detect single photons in a wide electromagnetic spectrum, ranging from the visible down to the low end of radio frequencies, in the gigahertz range.

In their study, the scientists envisioned a sheet of graphene placed between two superconducting layers. The Josephson junction thus created allows a supercurrent to flow across the graphene when it is cooled to 25 mK. Under these conditions the heat capacity of the graphene is so low that a single photon hitting the graphene layer heats the electron bath enough to make the supercurrent resistive, giving rise to an easily detectable voltage spike across the device. They also found that this effect would occur almost instantaneously, enabling the ultrafast conversion of absorbed light into electrical signals and allowing a rapid reset and readout.

The results of the study suggest that rapid progress can be expected in integrating graphene and other 2-D materials with conventional electronics platforms, such as CMOS chips, and show a promising path towards single-photon-resolving imaging arrays, quantum information processing applications of optical and microwave photons, and other applications that would benefit from the quantum-limited detection of low-energy photons.


DARPA’s Fundamental Limits of Photon Detection—or Detect—program

Current photon detectors, including semiconductor, superconductor, and biological detectors, have various strengths and weaknesses as measured against eight technical metrics: timing jitter, dark count, maximum rate, bandwidth, efficiency, photon-number resolution, operating temperature, and array size. No single detector currently excels at all eight characteristics simultaneously. The fully quantum model developed and tested in Detect will help determine the potential for creating such a device.

“The goal of the Detect program is to determine how precisely we can spot individual photons and whether we can maximize key characteristics of photon detectors simultaneously in a single system,” said Prem Kumar, DARPA program manager. “This is a fundamental research effort, but answers to these questions could radically change light detection as we know it and vastly improve the many tools and avenues of discovery that today rely on light detection.”

“We want to know whether the basic physics of photon detection allows us, at least theoretically, to have all of the attributes we want simultaneously, or whether there are inherent tradeoffs,” Kumar said. “And if tradeoffs are necessary, what combination of these attributes can I maximize at the same time?”

Photons in the visible range fill at the minimum a cubic micron of space, which might seem to make them easy to distinguish and to count. The difficulty arises when light interacts with matter. A cubic micron of conventional photon-detection material has more than a trillion atoms, and the incoming light will interact with many of those atoms simultaneously. That cloud of atoms has to be modeled quantum mechanically to conclude with precision that a photon was actually there. And modeling at that massive scale hasn’t been possible—until recently.

“For decades we saw few significant advances in photon detection theory, but recent progress in the field of quantum information science has allowed us to model very large and complicated systems,” Kumar said. Advances in nano-science have also been critical, he added. “Nano-fabrication techniques have come a long way. Now not only can we model, but we can fabricate devices to test those models.”

The Fundamental Limits of Photon Detection (Detect) Program will establish the first-principles limits of photon detector performance by developing new models of photon detection in a variety of technology platforms, and by testing those models in proof-of-concept experiments.


DARPA SBIR to improve upon nanowire single-photon detector performance

DARPA issued an SBIR solicitation in 2014 to further improve upon the current state of the art in nanowire single-photon detector performance while advancing the supporting technologies needed for a compact, turn-key commercial system.

New results in superconducting nanowire devices have shown that high detection rates, low dark-count rates (DCRs), and high efficiency are all possible simultaneously with operating temperatures between 1 and 4 K.

Despite these results, further performance improvements are needed. For example, detection efficiency (DE) above 90% and bandwidth (BW) approaching 1 GHz has yet to be achieved simultaneously. In addition, innovations leading to a reduction in the system footprint and improved operability will provide better accessibility of such technologies to the relevant scientific and engineering communities.

The final system should provide multiple (>2) independent single-pixel detectors with performance superior to all current commercially available options (DE > 90%, BW ~1 GHz, DCR < 1 Hz) in a ~5U 19-inch rack-mount package. To achieve these goals, work under this SBIR may include efforts to increase fabrication yields through new materials or fabrication techniques, new device designs to improve bandwidth and sensitivity, and efforts to reduce system SWaP through compact, application-specific cooling systems, electronics, and packaging.

The detectors developed under this SBIR will have applications for the DoD which include secure communications and active stand-off imaging systems. The improved availability and SWaP will allow the use of these detectors in all relevant government labs and open the door to new fieldable systems. For example, low power, portable optical communication links exceeding RF system bandwidths by 10-100x may be possible using the technology developed under this SBIR.

Photomultiplier (PMT) Tubes

A PMT consists of a photocathode and a series of dynodes in an evacuated glass enclosure. When a photon of sufficient energy strikes the photocathode, it ejects a photoelectron due to the photoelectric effect. The photocathode material is usually a mixture of alkali metals, which make the PMT sensitive to photons throughout the visible region of the electromagnetic spectrum. The photocathode is at a high negative voltage, typically -500 to -1500 volts.

The photoelectron is accelerated towards a series of additional electrodes called dynodes, each maintained at a successively less negative potential. Additional electrons are generated at each dynode. This cascading effect creates 10⁵ to 10⁷ electrons for each photoelectron ejected from the photocathode; the amplification depends on the number of dynodes and the accelerating voltage. The amplified electrical signal is collected at an anode at ground potential, where it can be measured.
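The dynode-chain arithmetic is simple enough to sketch; the dynode count and secondary-emission ratio below are plausible illustrative values, not data for any particular tube:

```python
def pmt_gain(n_dynodes, delta):
    """Each dynode multiplies the electron count by the secondary-emission
    ratio delta, so the overall gain of the chain is delta ** n_dynodes."""
    return delta ** n_dynodes

# Illustrative values: 10 dynodes with delta = 4 give about a million
# electrons per photoelectron, inside the 1e5-1e7 range quoted above
gain = pmt_gain(10, 4)
```

The exponential dependence on the dynode count is why modest changes in the accelerating voltage, which sets delta, shift the gain so strongly.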

PMTs can have large active areas, but they suffer from low efficiency (~10%), high jitter (~150 ps) and high dark-count rates. They are fragile, bulky, sensitive to magnetic fields, require very high operating voltages, and are not conducive to large-format detector arrays. Moreover, their sensitivity in the SWIR spectral band is poor. Although the PMT still plays an important role in some applications, as with many vacuum-tube devices this 80-year-old technology is gradually being replaced by newer solid-state devices.



World’s first Quantum satellite launched by China will enable it to build global unhackable ground and space network infrastructure

China launched the world’s first quantum communications satellite, officially known as Quantum Experiments at Space Scale (QUESS). The launch took place on 16 August 2016 from the Jiuquan Satellite Launch Centre in the Gobi Desert, with a Long March 2D rocket sending the 620-kilogram (1,367-pound) satellite into a 600-kilometer (373-mile) orbit at an inclination of 97.79 degrees. “In its two-year mission, QUESS is designed to establish ‘hack-proof’ quantum communications by transmitting uncrackable keys from space to the ground,” the Xinhua news agency said. These quantum keys can be used to encrypt messages sent between cities thousands of miles apart. China plans to put additional satellites into orbit, hopes to complete a QKD system linking Asia and Europe by 2020, and aims eventually for a worldwide quantum network.

China’s quantum satellite has started producing successful results. In a paper published in Science, researchers from the Chinese Academy of Sciences announced that the satellite had successfully distributed entangled photons between three different terrestrial base stations, separated by as much as 1,200 kilometers on the ground. The result is the longest entanglement ever demonstrated, and the first to span Earth and space. The researchers say the system “opens up a new avenue to both practical quantum communications and fundamental quantum optics experiments at distances previously inaccessible on the ground.” Then, for good measure, on Sept. 29, 2017, they held a 75-minute videoconference between researchers in the two cities, also encrypted via quantum key.

“The newly-launched satellite marks a transition in China’s role – from a follower in classic information technology development to one of the leaders guiding future achievements,” Pan Jianwei, the project’s chief scientist, told the agency. Quantum communications holds “enormous prospects” in the field of defense, it added.

In November 2015, at the 18th Party Congress’ 5th Plenum, Xi Jinping included quantum communications in his list of major science and technology projects prioritized for major breakthroughs by 2030, given their importance to China’s long-term strategic requirements.

Many other countries, including the United States, Canada, Japan, and some EU member states, are racing to develop quantum communication networks, since such networks are virtually unhackable. Researchers from these countries are closely watching China’s tests.

Satellite-based quantum key cryptography

Quantum cryptography is considered unbreakable and impossible to hack. A unique aspect of quantum cryptography is that Heisenberg’s uncertainty principle ensures that if Eve attempts to intercept and measure Alice’s quantum transmissions to Bob, her activity must produce an irreversible change in the quantum states that are retransmitted to Bob. These changes introduce an anomalously high error rate in the transmissions between Alice and Bob, allowing them to detect the attempted eavesdropping.
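A small Monte Carlo sketch (the standard textbook BB84 analysis, not taken from any specific experiment) shows how intercept-resend eavesdropping betrays itself through the error rate:

```python
import random

def intercept_resend_qber(n_photons=100_000, seed=0):
    """Monte Carlo sketch of why eavesdropping is detectable: Eve measures
    every photon in a randomly chosen basis and resends her result. In the
    sifted key (slots where Alice's and Bob's bases agree) this forces an
    error rate of about 25%, which Alice and Bob expose by publicly
    comparing a sample of their key bits."""
    rng = random.Random(seed)
    errors = sifted = 0
    for _ in range(n_photons):
        bit = rng.randint(0, 1)       # Alice's key bit
        a_basis = rng.randint(0, 1)   # Alice's preparation basis
        e_basis = rng.randint(0, 1)   # Eve's measurement basis
        # Eve reads the bit correctly only if her basis matches Alice's
        e_bit = bit if e_basis == a_basis else rng.randint(0, 1)
        b_basis = rng.randint(0, 1)   # Bob's measurement basis
        # Bob measures the photon Eve resent, prepared in Eve's basis
        b_bit = e_bit if b_basis == e_basis else rng.randint(0, 1)
        if b_basis == a_basis:        # sifting keeps matching-basis slots
            sifted += 1
            errors += int(b_bit != bit)
    return errors / sifted

qber = intercept_resend_qber()  # close to 0.25; a clean channel sits far lower
```

A quantum bit error rate near 25 percent is dramatically above what losses and detector noise produce on an undisturbed channel, so Alice and Bob simply abort and discard the key.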

Quantum key distribution (QKD) establishes highly secure keys between distant parties by using single photons to transmit each bit of the key. Photons are ideal for propagation over long distances in free space and are thus best suited to quantum communication experiments between space and ground. The unit of quantum information is the “qubit”, a bit of information “stamped” onto a quantum physical property, for instance the polarization of a photon.

QKD thus solves the long-standing problem of securely transporting cryptographic keys between distant locations. “Even if the keys were transmitted across hostile territory, their integrity could be unambiguously verified upon receipt,” say Thomas Jennewein, Brendon Higgins and Eric Choi in SPIE.

Fiber-optic QKD systems are commercially available today, but they are point-to-point links limited to a few hundred kilometers by current optical fiber and photon-detector technology. One way to overcome this limitation is to bring quantum communication into space. An international team led by the Austrian physicist Anton Zeilinger successfully transmitted quantum states between the two Canary Islands of La Palma and Tenerife, over a distance of 143 km; the previous record, set by researchers in China, was 97 km. The process used, called quantum teleportation, transfers the state of one of two entangled photons to the other, even when they are widely separated.

The biggest challenge, said Alexander Ling, principal investigator at the Centre for Quantum Technologies in Singapore, is orienting the satellite with pinpoint accuracy toward a location on Earth where it can send and receive data without being affected by disturbances in Earth’s atmosphere. “You’re trying to send a beam of light from a satellite that’s 500 kilometres (310 miles) above you,” Ling said.

Record-breaking accomplishments

In the spacecraft’s first record-breaking accomplishment, reported June 16 in Science, the satellite used onboard lasers to beam down pairs of entangled particles, to two cities in China, where the particles were captured by telescopes (SN: 8/5/17, p. 14). The quantum link remained intact over a separation of 1,200 kilometers between the two cities — about 10 times farther than ever before. The feat revealed that the strange laws of quantum mechanics, despite their small-scale foundations, still apply over incredibly large distances.

Next, scientists tackled quantum teleportation, a process that transmits the properties of one particle to another particle (SN Online: 7/7/17). Micius teleported photons’ quantum properties 1,400 kilometers from the ground to space — farther than ever before, scientists reported September 7 in Nature. Despite its sci-fi name, teleportation won’t be able to beam Captain Kirk up to the Enterprise. Instead, it might be useful for linking up future quantum computers, making the machines more powerful.


Quantum-encrypted video chat

Pan’s team has demonstrated a more important and practical capability: the establishment of a secure quantum communication channel between distant parties, the technology that made the quantum-encrypted video chat possible. By employing the satellite as a photon emitter and relay, team members in Graz, Austria, and Xinglong, China, developed and shared a 100-kilobyte key that they used to securely exchange photos and hold a video conference.

Micius initiated the link between Graz and Xinglong through a combination of quantum and classical signals, following a version of the BB84 protocol devised by Charles Bennett and Gilles Brassard. As the satellite passed over a station, it emitted photons that were each prepared in a random polarization state; the station performed one of two polarization measurements on each received photon. Using the measurement types and results for the exchanged photons, the satellite and station established a unique key. Once Micius developed a key with both stations, it performed a logic operation (specifically, an exclusive OR) on the two strings of bits and sent the results via a classical radio channel to one of the stations. The station with that extra information compared those bits with its own key to determine its counterpart’s key.
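The key-relay step described above reduces to XOR arithmetic; the 8-bit keys below are illustrative stand-ins for the real ~100-kilobyte keys:

```python
def xor_bits(a, b):
    """Bitwise XOR of two equal-length keys given as bit lists."""
    return [x ^ y for x, y in zip(a, b)]

def satellite_broadcast(key_graz, key_xinglong):
    """The satellite announces K_graz XOR K_xinglong over a classical
    radio channel; the broadcast alone reveals neither key."""
    return xor_bits(key_graz, key_xinglong)

def recover_counterpart_key(own_key, broadcast):
    """A station XORs its own key with the broadcast:
    K_own XOR (K_graz XOR K_xinglong) yields the other station's key."""
    return xor_bits(own_key, broadcast)

# Illustrative 8-bit keys standing in for the keys established with Micius
k_graz = [1, 0, 1, 1, 0, 0, 1, 0]
k_xinglong = [0, 1, 1, 0, 1, 0, 0, 1]
announcement = satellite_broadcast(k_graz, k_xinglong)
```

With the announcement in hand, the station at one end computes its counterpart's key and the two ground stations share a common secret, while an eavesdropper on the classical channel learns nothing about either key from the XOR alone.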

To test the secure connection, the Austrian and Chinese researchers exchanged 5-kilobyte JPEG images that were indecipherable without the shared quantum key. They then used most of the remaining bits of the key to secure a 75-minute intercontinental video conference.

In a paper published in the Nov. 17 Physical Review Letters, the researchers performed another type of quantum key distribution, using entangled particles to exchange keys between the ground and the satellite.


China’s Quantum Satellite Mission Objectives

The major goal is to test the possibilities of relaying quantum “keys” carried by photons, or light particles, over 500 to 1,200 kilometers from a satellite to ground stations to create a new kind of information transmission network that cannot be hacked without detection. The satellite will enable secure communications between Beijing and Urumqi, Xinhua said. Other missions include quantum teleportation and quantum entanglement, both for the first time in space.

“Initial tests on the satellite have reached a transmission rate that will allow us to finish these experiments within several weeks, so we will have time to add new experiments,” Pan said. He said the plans include more complex quantum tests between Micius and five ground stations across China this year, and then cross-continental quantum communication experiments to establish links with ground stations in Austria, Italy and Canada in 2018.

Pan Jianwei, the project’s chief scientist, also said that the 2,000-km quantum communication main network between Beijing and Shanghai will be fully operational in the second half of this year. The network would be used by the central government, military and critical business institutions like banks. Government agencies and banks in cities along the route can use it first.

“There are many bottlenecks in the information security. The Edward Snowden case has told us that the information in the transmission networks are exposed to risks of being monitored and being attacked by hackers,” Pan said. In 2012, Pan’s group built the world’s first metropolitan area quantum network in Hefei, linking 46 nodes to allow real-time voice communications, text messages and file transfers. The quantum satellite is part of the country’s Strategic Priority Program on Space Science that started in 2011 and planned to launch four satellites by the end of the year.

The 620 kg QUESS satellite would seek breakthroughs in cryptography and test laws of quantum mechanics, such as teleportation and quantum entanglement, on a global scale. The experimental satellite contains a quantum key communicator, quantum entanglement emitter, entanglement source, processing unit and a laser communicator.

The aim of the new experiment, conducted by a team led by physicist Pan Jian-Wei from the University of Science and Technology of China in Hefei, is: “To see if we can establish quantum key distribution [the encoding and sharing of a secret cryptographic key using the quantum properties of photons] between a ground station in Beijing and the satellite, and between the satellite and Vienna. Then we can see whether it is possible to establish a quantum key between Beijing and Vienna, using the satellite as a relay.”

“The second step will be to perform long-distance entanglement distribution, over about 1,000 kilometres. We have technology on the satellite that can produce pairs of entangled photons. We beam one photon of an entangled pair to a station in Delingha, Tibet, and the other to a station in Lijiang or Nanshan. The distance between the two ground stations is about 1,200 kilometres,” Pan said. Previous tests were done on the order of 100 kilometres.

“In principle, quantum entanglement can exist for any distance. But we want to see if there is some physical limit… we hope to build some sort of macroscopic system in which we can show that the quantum phenomena can still exist,” Pan told Nature, in describing the theoretical premises for the experiment.

This could potentially facilitate super-fast, long-range communications, as well as lead to the creation of unbreakable quantum communication networks.

China has collaborated with the Austrian Academy of Sciences to provide the optical receivers at a ground station in Vienna, while three more stations have also been planned across Austria. Eventually, the Chinese team is planning to launch about 10 additional satellites, which would fly in formation to allow for coverage across more areas of the globe.

Military Capability

“China is completely capable of making full use of quantum communications in a regional war,” China’s leading quantum-communications scientist, Pan Jianwei, said. “The direction of development in the future calls for using relay satellites to realize quantum communications and control that covers the entire army.”

Matthew Luce, a researcher with Defense Group Inc.’s Center for Intelligence Research and Analysis, thinks “A functional satellite-based quantum communication system would give the Chinese military the ability to operate further afield without fear of message interception.”

Militaries have become dependent on satellites, which provide intelligence on adversaries’ activities by capturing high-resolution images, radar and communication signals, and which provide wide-area, real-time communications for battlefield troops and for command and control. However, satellites are vulnerable to jamming, cyber-attacks and other anti-satellite (ASAT) weapons. China is also developing technologies such as electronic warfare, directed-energy weapons (DEW) and other ASAT weapons that can disrupt its adversaries’ satellites. By developing satellite-based quantum cryptography, China would be able to gain information superiority over other countries: it could collect, process and disseminate an uninterrupted flow of information while exploiting or denying its adversaries’ ability to do the same.

Although the Chinese government has not revealed the project’s budget, scientists told state media that the construction cost would be ¥100m (£10.17m) for every 10,000 users, according to the South China Morning Post.

Global Satellite Quantum Network

China is also the first country to release a detailed schedule for putting this technology to large-scale use. The communications satellite is a first step toward building a quantum communications network in the sky. China hopes to complete an Asia-Europe intercontinental quantum key distribution link in 2020 and build a global quantum communication network by 2030.

The team’s future plans also include making use of China’s future space station, Tiangong, which is expected to be completed by the end of the decade, to conduct “upgraded” quantum experiments. “We will have a quantum experiment on the space station, and it will make our studies easier because we can from time to time upgrade our experiment (unlike on the quantum satellite).”

Quantum Communication between Earth and Moon

In the future, Pan also hopes to create a signal transmitting system that could facilitate communication between the Earth and the Moon. “In the future, we also want to see if it is possible to distribute entanglement between Earth and the Moon. We hope to use the [China’s Moon program] to send a quantum satellite to one of the gravitationally-stable points in the Earth-Moon system,” he told the weekly.

“I think China has an obligation not just to do something for ourselves — many other countries have been to the Moon, have done manned spaceflight — but to explore something unknown,” Pan said. The scientist also predicted that the world will soon enter a quantum era, with a revolution in quantum physics taking the world by storm and leading to the creation of super-fast quantum computers and large quantum communication networks, China’s People’s Daily reported.





Scientists solve critical challenge of error correction for building large scale fault tolerant quantum computers

“The development of a ‘quantum computer’ is one of the outstanding technological challenges of the 21st century. A quantum computer is a machine that processes information according to the rules of quantum physics, which govern the behaviour of microscopic particles at the scale of atoms and smaller,” said Dr Chris Ballance, a research fellow at Magdalen College, Oxford. “It turns out that this quantum-mechanical way of manipulating information gives quantum computers the ability to solve certain problems far more efficiently than any conceivable conventional computer.”

“One such problem is related to breaking secure codes, while another is searching large data sets. Quantum computers are naturally well suited to simulating other quantum systems, which may help, for example, our understanding of complex molecules relevant to chemistry and biology.” Quantum computers also have many military applications, such as efficient decoding of cryptographic codes like RSA; AI and pattern-recognition tasks like discriminating between a missile and a decoy; and bioinformatics tasks like efficient analysis of a new bioengineered threat using Markov Chain Monte Carlo (MCMC) methods.

In order to reach their full potential, today’s quantum computer prototypes have to meet specific criteria: first, they have to be made bigger, meaning they need to consist of a considerably higher number of quantum bits; second, they have to be capable of correcting errors. Quantum systems are naturally fragile: they constantly evolve in uncontrolled ways due to unwanted interactions with the environment, leading to errors in the computation.

One of the main difficulties of quantum computation is that decoherence destroys the information in a superposition of states contained in a quantum computer, thus making long computations impossible. Quantum error correction is used to protect quantum information from errors due to decoherence and other quantum noise.

Quantum error correction is essential if one is to achieve fault-tolerant quantum computation that can deal not only with noise on stored quantum information, but also with faulty quantum gates, faulty quantum preparation, and faulty measurements. Demonstrating error correction that actually works is the biggest remaining challenge for building a quantum computer.

A study led by physicists at Swansea University in Wales, carried out by an international team of researchers and published in the journal Physical Review X shows that ion-trap technologies available today are suitable for building large-scale quantum computers. The scientists introduce trapped-ion quantum error correction protocols that detect and correct processing errors.

Researchers have been developing quantum error correction codes that would correct any errors in quantum data while requiring measurement of only a few quantum bits, or qubits, at a time.

Physicists have applied the ability of machine learning algorithms to learn from experience to one of the biggest challenges currently facing quantum computing: quantum error correction, which is used to design noise-tolerant quantum computing protocols. In a new study, they have demonstrated that a type of neural network called a Boltzmann machine can be trained to model the errors in a quantum computing protocol and then devise and implement the best method for correcting the errors.

Quantum Error Correction

Classical error correction employs redundancy, for instance by storing the information multiple times and, if the copies are later found to disagree, taking a majority vote. Copying quantum information in this way is not possible due to the no-cloning theorem, but it is possible to spread the information of one qubit onto a highly entangled state of several physical qubits, even hundreds or thousands of them.

Peter Shor first discovered this method of formulating a quantum error correcting code, by storing the information of one qubit onto a highly entangled state of nine qubits. Quantum error-correction codes take advantage of these extra qubits to uncover errors without ever reading out the value of the original qubit. The basis of quantum error correction is measuring parity: the parity of a pair of qubits is defined to be “0” if both have the same value and “1” if they have different values. Crucially, the parity can be determined without actually measuring the values of the individual qubits.
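The parity idea can be illustrated with a purely classical toy model of the three-qubit bit-flip code. Note the simplification: only bit-flip errors are tracked, so each “qubit” is modeled as a classical bit; a real quantum code such as Shor’s nine-qubit code also protects against phase errors, which this sketch omits.

```python
# Toy classical simulation of parity-based error detection for the
# three-qubit bit-flip code (bit-flip errors only).

def syndromes(bits):
    """Parity of qubit pairs (0,1) and (1,2): 0 = same, 1 = different."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Flip the single qubit singled out by the two parity checks."""
    s01, s12 = syndromes(bits)
    if s01 and not s12:
        bits[0] ^= 1      # only the first parity tripped -> qubit 0 flipped
    elif s01 and s12:
        bits[1] ^= 1      # both parities tripped -> middle qubit flipped
    elif s12 and not s01:
        bits[2] ^= 1      # only the second parity tripped -> qubit 2 flipped
    return bits

# Encode logical 0 as 000, flip one qubit, and recover it from parities
# alone: at no point is the value of any individual qubit needed.
encoded = [0, 0, 0]
encoded[1] ^= 1           # a single bit-flip error
assert correct(encoded) == [0, 0, 0]
```

The same parity pattern corrects a logical 1 encoded as 111, which is why measuring parities leaks nothing about the encoded value itself.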

Scientists at Yale University showed that it is possible to track quantum errors in real time. The team used an ancilla, a more stable “reporter” atom, that detected errors in the system without actually disturbing any qubits. During the experiment, the researchers used a superconducting box containing the reporter atom and an unknown number of photons, cooled to about negative 459 °F, a fraction of a degree above absolute zero. The ancilla reports only the photon parity, that is, whether the photon number in the box changed from even to odd or from odd to even, and not the exact number.

Trapping ions in a maze

Scientists have developed comparable schemes for quantum computers, where quantum information is encoded in several entangled physical quantum bits. “Here we exploit quantum mechanical properties for error detection and correction,” explains Markus Müller from Swansea University, Wales. “If we can keep the noise below a certain threshold, we will be able to build quantum computers that can perform quantum computations of arbitrary complexity by increasing the number of entangled quantum bits accordingly.”

Markus Müller and his colleague Alejandro Bermudez Carballo explain that in order to achieve this goal, the capabilities of the technological platforms have to be optimally exploited. “For beneficial error correction we need quantum circuits that are stable and work reliably under realistic conditions even if additional errors occur during the error correction,” explains Bermudez. They introduced new variants of fault-tolerant protocols and investigated how these can be implemented with currently available operations on quantum computers.

The researchers found that a new generation of segmented ion traps offers ideal conditions for the process: Ions can be shuttled quickly across different segments of the trap array. Precisely timed processes allow parallel operations in different storage and processing regions. By using two different types of ions in a trap, scientists may use one type as carriers of the data qubits while the other one may be used for error measurement, noise suppression and cooling.

Building on the experimental experience of research groups in Innsbruck, Mainz, Zurich and Sydney, the researchers defined criteria that will allow scientists to determine whether quantum error correction is beneficial. Using this information, they can guide the development of future ion-trap quantum computers, with the goal of realizing in the near future a logical quantum bit that, owing to error correction, exceeds the properties of a purely physical quantum bit.

Simon Benjamin’s research group at the University of Oxford showed through complex numerical simulations of the new error correction protocols how the hardware of next generation ion-trap quantum computers has to be built to be able to process information fault-tolerantly. “Our numerical results clearly show that state-of-the-art ion-trap technologies are well suited to serve as platforms for constructing large-scale fault-tolerant quantum computers,” explains Benjamin.

Repetitive error correction

Researchers at the University of California, Santa Barbara (UCSB) and Google have demonstrated repetitive error correction in an integrated quantum device that consists of nine superconducting qubits. “Our nine-qubit system can protect itself from bit errors that unavoidably arise from noise and fluctuations from the environment in which the qubits are embedded,” explains team member Julian Kelly.

The researchers repetitively measured the parity between adjacent “data” qubits by making use of “measurement” qubits. “Each cycle, these measurement qubits interact with their surrounding data qubits using quantum logic gates and we can then measure them,” Kelly explains. “When an error occurs, the parity changes accordingly and the measurement qubit reports a different outcome. By tracking these outcomes, we can figure out when and where a bit error has occurred and correct for it.”
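The cycle-by-cycle bookkeeping Kelly describes can be sketched with a classical toy model. This is a hedged illustration, not the UCSB/Google device model: only bit-flip errors on data qubits are simulated, and the function and schedule names are invented.

```python
# Toy model of repetitive parity measurement: each cycle, adjacent data
# qubits report their parity; a change in the reported pattern flags
# when and where a bit flip occurred.

def run_cycles(data, error_schedule, n_cycles=4):
    """data: list of bits; error_schedule: {cycle: index of bit to flip}.
    Returns the cycle-by-cycle parity reports of adjacent data qubits."""
    reports = []
    for cycle in range(n_cycles):
        if cycle in error_schedule:
            data[error_schedule[cycle]] ^= 1          # injected error
        reports.append(tuple(data[i] ^ data[i + 1]    # adjacent parities
                             for i in range(len(data) - 1)))
    return reports

reports = run_cycles([0, 0, 0], {2: 1})  # flip the middle qubit in cycle 2
# The pattern is unchanged until cycle 2, then both checks touching
# qubit 1 flip, locating the error in both time and space.
assert reports == [(0, 0), (0, 0), (1, 1), (1, 1)]
```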

The more qubits that are involved in the process, the more information is available to identify and correct for errors, explains team member Austin Fowler. “Errors can occur at any time and in all types of qubits: data qubits, measurement qubits, during gate operation and even during measurements. We found that a five-qubit device is robust to any type of bit error occurring anywhere during an algorithm, but a nine-qubit device is better because it is robust to any combination of two-bit errors.”

Surface Code Architecture

The Google and UCSB team eventually hopes to build a 2D surface code architecture based on a checkerboard arrangement of qubits, in which “white squares” would represent the data qubits that perform operations and “black squares” would represent measurement qubits that can detect errors in neighboring qubits. The measurement qubits are entangled with neighboring data qubits and share information through a quantum connection.

The surface code architecture tolerates a lower accuracy of quantum logic operations: 99 percent, instead of the 99.999 percent demanded by other quantum error-correction schemes. IBM researchers have also done pioneering work in making surface-code error correction work with superconducting qubits. One IBM group demonstrated a smaller three-qubit system capable of running the surface code, although that system had a lower accuracy of 94 percent.

Currently, groups are modifying the material properties of their qubits, improving lithography techniques and improving pulse-shaping techniques to make qubit lifetimes longer. This should increase the fidelity of the qubits and make implementing a surface code less resource-intensive.

Researchers prevent quantum errors from occurring by continuously watching a quantum system

A team of scientists led by Tim Taminiau of QuTech, the quantum institute of TU Delft and TNO, has experimentally demonstrated that errors in quantum computations can be suppressed by repeated observations of quantum bits, through the so-called quantum Zeno effect. If an observable of a quantum state is measured, the system is projected into an eigenstate of that observable. For example, if a qubit in a superposition of ‘0’ and ‘1’ is observed, the qubit is projected into either ‘0’ or ‘1’ and will remain frozen in that state under repeated further observations.
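The freezing effect can be made quantitative with a textbook single-qubit idealization (not the multi-spin diamond experiment itself): if a drive rotates a qubit by a total Bloch-sphere angle theta while it is projectively measured N times at equal intervals, the probability of finding it still in |0> every time is cos(theta/2N) squared, raised to the Nth power, which approaches 1 as N grows.

```python
# Quantum Zeno effect, single-qubit textbook case: frequent projective
# measurement suppresses the rotation away from |0>.
import math

def stay_probability(theta, n_measurements):
    """Probability of finding |0> in all n equally spaced measurements,
    while a drive rotates the qubit by total Bloch angle theta."""
    per_step = math.cos(theta / (2 * n_measurements)) ** 2
    return per_step ** n_measurements

# With a single final measurement, a pi/2 rotation leaves only a 50%
# chance of |0>; frequent observation freezes the state near certainty.
for n in (1, 10, 100, 1000):
    print(n, stay_probability(math.pi / 2, n))
```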

Joint observables

While just freezing a quantum state by projecting a single qubit does not allow for computations, new opportunities arise when observing joint properties of multi-qubit systems. The projection of joint properties of qubits can be explained with the following analogy: consider grouping three-dimensional objects based on their two-dimensional projection. Shapes can still transform within a subgroup (for example between a cube and a cylinder), but unwanted changes (for example to a sphere) are suppressed by the constant observations of the 2D projection. Similarly, the projection of joint observables in multi-qubit systems generates quantum subspaces. In this way, unwanted evolution between different subspaces can be blocked, while the complex quantum states within one subspace allow for quantum computations.


The QuTech scientists experimentally generated quantum Zeno subspaces in up to three nuclear spins in diamond. Joint observables on these nuclear spins are projected via a nearby electronic spin, generating protected quantum states in Zeno subspaces. The researchers show an enhancement in the time that quantum information is protected with increasing number of projections and derive a scaling law that is independent of the number of spins. The presented work allows for the investigation of the interplay of frequent observations and various noise environments. Furthermore, the projection of joint observables is the basis of most quantum error correction protocols, which are essential for useful quantum computations.

New quantum error correction protocol corrects virtually all errors in quantum memory while requiring only limited measurement of quantum states

The ideal quantum error correction code would correct any errors in quantum data, and it would require measurement of only a few quantum bits, or qubits, at a time. But until now, codes that could make do with limited measurements could correct only a limited number of errors — one roughly equal to the square root of the total number of qubits. So they could correct eight errors in a 64-qubit quantum computer, for instance, but not 10.
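The square-root limitation is easy to check numerically: doubling the number of correctable errors under a sqrt(n) bound requires quadrupling the number of qubits.

```python
# Square-root scaling of correctable errors in earlier limited-measurement
# codes: roughly sqrt(n) errors for n qubits.
import math

def correctable(n_qubits):
    return int(math.sqrt(n_qubits))

assert correctable(64) == 8     # 8 errors on a 64-qubit machine, not 10
assert correctable(256) == 16   # 4x the qubits for 2x the error budget
```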

In a paper they’re presenting at the Association for Computing Machinery’s Symposium on Theory of Computing in June, researchers from MIT, Google, the University of Sydney, and Cornell University present a new code that can correct errors afflicting — almost — a specified fraction of a computer’s qubits, not just the square root of their number. And for reasonably sized quantum computers, that fraction can be arbitrarily large — although the larger it is, the more qubits the computer requires.

A quantum computation is a succession of states of quantum bits. The bits are in some state; then they’re modified, so that they assume another state; then they’re modified again; and so on. The final state represents the result of the computation.

In their paper, MIT physicist Aram Harrow and his colleagues assign each state of the computation its own bank of qubits; it’s like turning the time dimension of the computation into a spatial dimension. Suppose that the state of qubit 8 at time 5 has implications for the states of both qubit 8 and qubit 11 at time 6. The researchers’ protocol performs one of those agreement measurements on all three qubits, modifying the state of any qubit that’s out of alignment with the other two.

Since the measurement doesn’t reveal the state of any of the qubits, modification of a misaligned qubit could actually introduce an error where none existed previously. But that’s by design: The purpose of the protocol is to ensure that errors spread through the qubits in a lawful way. That way, measurements made on the final state of the qubits are guaranteed to reveal relationships between qubits without revealing their values. If an error is detected, the protocol can trace it back to its origin and correct it.

It may be possible to implement the researchers’ scheme without actually duplicating banks of qubits. But, Harrow says, some redundancy in the hardware will probably be necessary to make the scheme efficient. How much redundancy remains to be seen: Certainly, if each state of a computation required its own bank of qubits, the computer might become so complex as to offset the advantages of good error correction.

But, Harrow says, “Almost all of the sparse schemes started out with not very many logical qubits, and then people figured out how to get a lot more. Usually, it’s been easier to increase the number of logical qubits than to increase the distance — the number of errors you can correct. So we’re hoping that will be the case for ours, too.”

Stephen Bartlett, a physics professor at the University of Sydney who studies quantum computing, doesn’t find the additional qubits required by Harrow and his colleagues’ scheme particularly daunting. “It looks like a lot,” Bartlett says, “but compared with existing structures, it’s a massive reduction. So one of the highlights of this construction is that they actually got that down a lot.”

Machine learning tackles quantum error correction

The physicists, Giacomo Torlai and Roger G. Melko at the University of Waterloo and the Perimeter Institute for Theoretical Physics, have published a paper on the new machine learning algorithm in a recent issue of Physical Review Letters.

“The idea behind neural decoding is to circumvent the process of constructing a decoding algorithm for a specific code realization (given some approximations on the noise), and let a neural network learn how to perform the recovery directly from raw data, obtained by simple measurements on the code,” Torlai told Phys.org. “With the recent advances in quantum technologies and a wave of quantum devices becoming available in the near term, neural decoders will be able to accommodate the different architectures, as well as different noise sources.”

As the researchers explain, a Boltzmann machine is one of the simplest kinds of stochastic artificial neural networks, and it can be used to analyze a wide variety of data. Neural networks typically extract features and patterns from raw data, which in this case is a data set containing the possible errors that can afflict quantum states.

Once the new algorithm, which the physicists call a neural decoder, is trained on this data, it is able to construct an accurate model of the probability distribution of the errors. With this information, the neural decoder can generate the appropriate error chains that can then be used to recover the correct quantum states.
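As a deliberately oversimplified stand-in for this idea, the following sketch learns the most likely error for each observed syndrome purely by counting over simulated samples from a three-qubit repetition code. The noise model, the code and all names here are illustrative; the paper’s actual decoder is a Boltzmann machine applied to topological codes, not this frequency table.

```python
# Toy "learned decoder": estimate the most probable error per syndrome
# from raw (error, syndrome) samples, mimicking the idea of learning
# recovery directly from measurement data.
import random
from collections import Counter, defaultdict

random.seed(0)
P_FLIP = 0.1  # independent bit-flip probability per qubit (assumed)

def sample_error():
    return tuple(1 if random.random() < P_FLIP else 0 for _ in range(3))

def syndrome(err):
    # Parities of adjacent qubit pairs, as in the repetition code.
    return (err[0] ^ err[1], err[1] ^ err[2])

# "Training": tally which error most often produced each syndrome.
counts = defaultdict(Counter)
for _ in range(20000):
    e = sample_error()
    counts[syndrome(e)][e] += 1
decoder = {s: c.most_common(1)[0][0] for s, c in counts.items()}

# The learned table reproduces minimum-weight decoding for this code.
assert decoder[(1, 1)] == (0, 1, 0)   # both checks fire -> middle qubit
assert decoder[(0, 0)] == (0, 0, 0)   # empty syndrome -> likely no error
```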

The researchers tested the neural decoder on quantum topological codes that are commonly used in quantum computing, and demonstrated that the algorithm is relatively simple to implement. Another advantage of the new algorithm is that it does not depend on the specific geometry, structure, or dimension of the data, which allows it to be generalized to a wide variety of problems.

In the future, the physicists plan to explore different ways to improve the algorithm’s performance, such as by stacking multiple Boltzmann machines on top of one another to build a network with a deeper structure. The researchers also plan to apply the neural decoder to more complex, realistic codes.

“So far, neural decoders have been tested on simple codes typically used for benchmarks,” Torlai said. “A first direction would be to perform error correction on codes for which an efficient decoder is yet to be found, for instance Low Density Parity Check codes. On the long term I believe neural decoding will play an important role when dealing with larger quantum systems (hundreds of qubits). The ability to compress high-dimensional objects into low-dimensional representations, from which stems the success of machine learning, will allow to faithfully capture the complex distribution relating the errors arising in the system with the measurements outcomes.”










New technology breakthroughs enabling global ultra-secure Quantum networks for financial, government and Military

Quantum key distribution (QKD) establishes highly secure keys between distant parties by using single photons to transmit each bit of the key. A unique aspect of quantum cryptography is that Heisenberg’s uncertainty principle ensures that any attempt to intercept and measure the quantum transmissions will introduce an anomalously high error rate in the transmissions between Alice and Bob, allowing them to detect the attempted eavesdropping. QKD is suitable for use in any key distribution application with high security requirements, including financial transactions, electoral communications, law enforcement, government, and military applications.
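That anomalous error rate can be seen in a toy intercept-resend simulation of BB84. This is an illustrative model, not a protocol implementation: bases are coded as 0/1, and an eavesdropper who measures in the wrong basis randomizes the bit, which is the essential physical consequence of the uncertainty principle here.

```python
# Intercept-resend attack on BB84: Eve measures every photon in a random
# basis and resends it, corrupting ~25% of the sifted (same-basis) key.
import random

random.seed(1)
N = 20000
errors = matched = 0
for _ in range(N):
    bit, a_basis = random.randint(0, 1), random.randint(0, 1)  # Alice
    # Eve measures in a random basis; a wrong basis randomizes the bit.
    e_basis = random.randint(0, 1)
    e_bit = bit if e_basis == a_basis else random.randint(0, 1)
    # Bob measures Eve's resent photon in his own random basis.
    b_basis = random.randint(0, 1)
    b_bit = e_bit if b_basis == e_basis else random.randint(0, 1)
    if b_basis == a_basis:          # only same-basis rounds are kept
        matched += 1
        errors += (b_bit != bit)

qber = errors / matched
assert 0.22 < qber < 0.28           # theory predicts a 25% error rate
```

Without Eve, every kept round would agree; a measured error rate near 25% is the unmistakable signature that tells Alice and Bob the channel was tapped.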


Currently, most quantum communication links are direct point-to-point links through telecom optical fibers, ultimately limited to about 300-500 km by losses in the fiber. Long-distance fibre-optic communications exploit the low loss of silica fibres in the 1.3 μm and 1.55 μm wavelength bands. In optical fibers, the wavelength of 1550 nm is very convenient, as it experiences the lowest absorption losses of the whole spectrum and can be detected with indium gallium arsenide avalanche photodiodes (InGaAs APDs) in the single-photon-counting regime.
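The distance limit follows directly from the exponential nature of fiber loss. Assuming a typical attenuation of about 0.2 dB/km at 1550 nm (a representative figure, not a quoted spec):

```python
# Photon survival probability through optical fiber: loss in dB grows
# linearly with length, so transmission falls off exponentially.
def transmission(length_km, loss_db_per_km=0.2):
    return 10 ** (-loss_db_per_km * length_km / 10)

print(transmission(100))   # 0.01: one photon in a hundred survives 100 km
print(transmission(500))   # 1e-10: one in ten billion survives 500 km
```

At 500 km, even a 1 GHz single-photon source would deliver on the order of one photon every ten seconds before detector and protocol losses, which is why longer links need trusted relays, quantum repeaters or satellites.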


The next important milestone is the development of large-scale QKD networks, extending QKD from the point-to-point configuration to multi-user, large-scale scenarios. China has also operationalised the 2,000-km quantum communication main network between Beijing and Shanghai, using trusted relay nodes along the route.


Another way to overcome distance limitation is by bringing quantum communication into space. An international team led by the Austrian physicist Anton Zeilinger has successfully transmitted quantum states between the two Canary Islands of La Palma and Tenerife, over a distance of 143 km.


Free-space QKD channels are based on free-space laser communication and have several advantages over optical fiber. Firstly, the atmosphere is an almost non-birefringent medium, which guarantees the preservation of photon polarization. Secondly, there is relatively low absorption loss in the atmosphere at certain wavelengths, which enables a longer communication range. For free-space quantum channels, the transmission wavelength is usually chosen around 780 nm, which corresponds to the quantum-efficiency peak of silicon avalanche photodiodes (Si APDs).


Recently, China launched a quantum science satellite and is performing many quantum experiments over optical links between space and ground. In one of these experiments, the Micius satellite used quantum key distribution to secure a video chat between a ground station near Vienna and one near Beijing.


However, establishing global QKD networks will require combining satellite links with fiber-optic and free-space links, all dissimilar in wavelength.


“Interfacing fundamentally different quantum systems is key to building future hybrid quantum networks. Such heterogeneous networks offer capabilities superior to those of their homogeneous counterparts, as they merge the individual advantages of disparate quantum nodes in a single network architecture. However, few investigations of optical hybrid interconnections have been carried out, owing to fundamental and technological challenges such as wavelength and bandwidth matching of the interfacing photons,” the authors write in Nature.

The ICFO researchers have solved the challenge of reliably transferring quantum states between different quantum nodes via single photons. A single photon needs to interact strongly, and in a noise-free environment, with heterogeneous nodes or matter systems that generally function at different wavelengths and bandwidths. As Nicolas Maring states, “it’s like having nodes speaking in two different languages. In order for them to communicate, it is necessary to convert the single photon’s properties so it can efficiently transfer all the information between these different nodes.”


Quantum internet goes hybrid

In a recent study published in Nature, ICFO researchers led by ICREA Prof. Hugues de Riedmatten report an elementary “hybrid” quantum network link and demonstrate photonic quantum communication between two distinct quantum nodes placed in different laboratories, using a single photon as information carrier.



In their study, the ICFO researchers used two very distinct quantum nodes: the emitting node was a laser-cooled cloud of Rubidium atoms and the receiving node a crystal doped with Praseodymium ions. From the cold gas, they generated a quantum bit (qubit) encoded in a single photon with a very-narrow bandwidth and a wavelength of 780 nm. They then converted the photon to the wavelength of 1552 nm to demonstrate that this network could be completely compatible with the current telecom C-band range.


Subsequently, they sent it through an optical fiber from one lab to the other. Once in the second lab, the photon’s wavelength was converted to 606 nm in order to interact correctly and transfer the quantum state to the receiving doped crystal node. Upon interaction with the crystal, the photonic qubit was stored in the crystal for approximately 2.5 microseconds and retrieved with very high fidelity.


For this experiment the researchers used a photon encoding technique called time-bin encoding, which is very well suited to communicating qubits and preventing interference. According to the researchers, the results open up the prospect of optically connecting quantum nodes with different capabilities and represent an important step towards the realization of large-scale hybrid quantum networks.







IBM, Rigetti, MIT, Microsoft & others taking big strides in building large scale programmable quantum computers

Quantum computing and quantum information processing are the next revolutionary technologies and are expected to have immense impact. Quantum computers will be able to perform tasks too hard for even the most powerful conventional supercomputer and have a host of specific applications, from code-breaking and cyber security to medical diagnostics, big data analysis and logistics. Quantum computers could accelerate the discovery of new materials, chemicals and drugs. They could dramatically reduce the current high costs and long lead times involved in developing new drugs.

Many groups are at the threshold of producing scalable quantum computers. The leader of Google’s quantum computing lab has predicted that it can build chips with about 100 reliable qubits within a couple of years. The group has built a nine-qubit machine based on tiny superconducting circuits and hopes to scale up to 49 within a year, an important threshold.

Researchers at D-Wave, IBM, MIT Lincoln Lab, and elsewhere have also developed superconducting qubits of high quality.

At the IEEE Industry Summit on the Future of Computing in Washington D.C. in October 2017, IBM announced the development of a quantum computer capable of handling 50 qubits (quantum bits), so far the largest and most powerful quantum computer ever built. At about 50 qubits, many say, a quantum computer could achieve “quantum supremacy,” a term coined by John Preskill, a physicist at the California Institute of Technology in Pasadena, to denote a quantum computer that can do something beyond the ken of a classical computer, such as simulate molecular structures in chemistry and materials science, or tackle certain problems in cryptography or machine learning.
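The roughly-50-qubit threshold reflects classical memory, not speed: a classical simulator must store one complex amplitude per basis state, and that count doubles with every qubit added. A back-of-the-envelope sketch (the 16 bytes per double-precision complex amplitude is our assumption):

```python
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to hold a full n-qubit state vector: an n-qubit
    pure state has 2**n complex amplitudes, each taking 16 bytes at
    double precision (two 8-byte floats)."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (9, 19, 49):
    gib = statevector_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.6f} GiB")
```

At 49 qubits the state vector alone needs 2^53 bytes, about 8 pebibytes, beyond any single classical machine, which is why around 50 qubits is widely quoted as the supremacy point.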

Rigetti is also making the new quantum computer—which can handle 19 quantum bits, or qubits—available through its cloud computing platform, called Forest.

Now Microsoft has also doubled down on its quantum computing bet. “People are really building things,” says Christopher Monroe, a physicist at the University of Maryland in College Park who co-founded the start-up IonQ in 2015. “I’ve never seen anything like that. It’s no longer just research.”

Canadian firm D-Wave has released a new quantum computer able to handle some 2,000 quantum bits (qubits), roughly double the usable number found in the processor of the existing D-Wave 2X system, and capable of solving certain problems 1,000 times faster than its predecessor.

However, the D-Wave machine is not a universal quantum computer: it has been designed specifically to perform a process called “quantum annealing”, a technique for finding the global minimum of complicated mathematical expressions, and is hence expected to be capable of solving certain optimization and sorting problems exponentially faster than a classical computer.
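D-Wave's hardware cannot be reproduced on a laptop, but the goal it targets, finding a global minimum without getting stuck in local ones, is the same one classical simulated annealing pursues. A minimal classical sketch of that idea (the cost function, cooling schedule and parameters here are illustrative choices, not anything from D-Wave):

```python
import math
import random

def simulated_annealing(f, x0, steps=20000, t0=2.0, seed=1):
    """Accept uphill moves with a probability that shrinks as the
    'temperature' cools, so the search can hop out of local minima
    early on and then settle into the deepest basin it has found."""
    rng = random.Random(seed)
    x = best = x0
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9      # linear cooling schedule
        cand = x + rng.uniform(-0.5, 0.5)    # small random move
        delta = f(cand) - f(x)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
            if f(x) < f(best):
                best = x
    return best

# Cost with shallow local minima and a global minimum at x = 0.
cost = lambda x: 0.1 * x ** 2 - math.cos(3 * x)
print(simulated_annealing(cost, x0=3.0))
```

A quantum annealer pursues the same minimum but, instead of hopping over energy barriers thermally, can tunnel through them, which is the source of the hoped-for speedup.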

Quantum Computer Approaches

There have been two leading approaches to building a general-purpose quantum computer. One approach, adopted by Google, IBM, Rigetti and Quantum Circuits, involves encoding quantum states as oscillating currents in superconducting loops. The other, pursued by IonQ and several major academic labs, is to encode qubits in single ions held by electric and magnetic fields in vacuum traps.

Recently, researchers at the University of Maryland pitted two quantum computers against each other in an epic virtual battle. The challenge was to run the same algorithms on both machines and determine which quantum computer was the more effective. Both are state-of-the-art five-qubit quantum computers built on entirely different platforms.

“Recently, two architectures, superconducting transmon qubits and trapped ions, have reached a new level of maturity. They have become fully programmable multi-qubit machines that provide the user with the flexibility to implement arbitrary quantum circuits from a high-level interface. This makes it possible for the first time to test quantum computers irrespective of their particular physical implementation,” write N. M. Linke and others.

Despite the differences, the researchers discovered a way to program the computers that is blind to the underlying hardware. The results showed that one computer was more reliable, while the other could carry out operations faster.

“For a long time, the devices were so immature that you couldn’t really put two five-qubit gadgets next to each other and perform this kind of comparison,” says Simon Benjamin, a physicist at the University of Oxford in the United Kingdom, who is not affiliated with the study. “It’s a sign that this technology is maturing.” In addition, the results suggest that co-designing particular quantum applications with the hardware itself will be paramount in successfully using quantum computers in the future.


Quantum computing making big strides

The startup Rigetti Computing is trying to build the hardware needed to power a quantum computer. The company aims to produce a prototype chip by the end of 2017 that is significantly more complex than those built by other groups working on fully programmable quantum computers. The following generation of chips should be able to accelerate some kinds of machine learning and run highly accurate chemistry simulations that might unlock new kinds of industrial processes, says Chad Rigetti, the startup’s founder and CEO. It is also working on software to make it easy for other companies to write code for its quantum hardware.

Rigetti says his company has worked out a qubit design that should be stable enough to scale up, and that can be made using conventional chip-manufacturing techniques. The startup is currently testing a three-qubit chip made using aluminum circuits on a silicon wafer, and the design due next year should have 40 qubits.


IBM’s five-qubit general-purpose quantum computer based on superconducting “artificial atoms”

In analogy to atoms found in nature, the man-made superconducting circuits in the IBM quantum computer can be thought of as “artificial atoms”. They are transmon qubits: superconducting islands connected by Josephson junctions and shunt capacitors, which provide superpositions of charge states that are insensitive to charge fluctuations.

The device used here has qubit frequencies between 5 and 5.4 GHz. The qubits are connected to each other and to the classical control system by microwave resonators. State preparation and readout, as well as single- and two-qubit gates, are achieved by applying tailored microwave signals to this network and measuring the response. Qubits are resolved in the frequency domain during addressing and readout. The publicly accessible system runs autonomously, requiring no human intervention over many weeks.
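“Resolved in the frequency domain” means each qubit responds only at its own transition frequency, so a control tone addresses a qubit simply by matching it. A toy illustration (the specific frequencies and the linewidth value are made up for the example; only the 5 to 5.4 GHz band comes from the text above):

```python
def addressed_qubits(tone_ghz, qubit_freqs_ghz, linewidth_ghz=0.02):
    """Frequency-domain addressing: a microwave tone reaches only the
    qubits whose transition frequency matches it within a linewidth."""
    return [name for name, f in qubit_freqs_ghz.items()
            if abs(f - tone_ghz) <= linewidth_ghz]

# Five qubits spread across the 5-5.4 GHz band mentioned above.
qubits = {"Q0": 5.00, "Q1": 5.10, "Q2": 5.20, "Q3": 5.30, "Q4": 5.40}
print(addressed_qubits(5.20, qubits))   # a 5.20 GHz tone reaches only Q2
```

Spacing the qubits far apart in frequency relative to their linewidth is what lets one shared microwave network address them individually.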

“This is a very exciting time,” says Daniel Lidar, director of the Center for Quantum Information Science and Technology at the University of Southern California. “This is not incremental; we’re really starting to see various groups working with superconducting qubits taking big strides forward.” However, it is still far from clear when useful, large-scale quantum chips might be made, says Lidar. A serious attempt at building one remains an expensive undertaking, he says.

MIT Just Unveiled A Technique to Mass Produce Quantum Computers

Researchers from MIT, Harvard University, and Sandia National Laboratories have found a way to make the creation of qubits simpler and more precise, unveiling a method that uses atomic-scale defects in diamond to build quantum computers in a way that could, one day, allow them to be mass produced.

For this process, defects are the key: they are precisely placed to function as qubits and hold information. Previous processes were difficult, complex, and not precise enough, whereas this new method creates targeted defects in a much simpler manner. Experimentally, the defects created were, on average, within 50 nanometers of their ideal locations.

The significance of this cannot be overstated. “The dream scenario in quantum information processing is to make an optical circuit to shuttle photonic qubits and then position a quantum memory wherever you need it,” says Dirk Englund, an associate professor of electrical engineering and computer science, in an interview with MIT. “We’re almost there with this. These emitters are almost perfect.”

One of the main remaining hurdles is how these computers will read the qubits. But these diamond defects aim to solve that problem because they naturally emit light, and since the light particles emitted can retain superposition, they could help to transmit information.

The research goes on to detail how the completion of these diamond materials better allowed for the amplification of the qubit information. By the end, the researchers found that the light emitted was approximately 80-90 percent as bright as possible.


Five qubit fully programmable computer based on ions

Scientists have created the first programmable and reprogrammable quantum computer, according to a new study. The technology could usher in a much-anticipated era of quantum computing, which researchers say could help scientists run complex simulations and produce rapid solutions to tricky calculations.

“Until now, there hasn’t been any quantum-computing platform that had the capability to program new algorithms into their system. They’re usually each tailored to attack a particular algorithm,” said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

Now, Debnath and his colleagues have developed the first fully programmable and reprogrammable quantum computer. The new device is made of five qubits, each an ion, or electrically charged particle, trapped in an electromagnetic field.

It consists of five ytterbium ions lined up and trapped in an electromagnetic field. The electronic state of each ion can be controlled by zapping it with a laser. This allows each ion to store a bit of quantum information.

The scientists can use lasers to manipulate these ions — five ytterbium atoms — infusing them with precise amounts of energy and influencing their interactions with each other. Because they are charged, the ions exert a force on each other, and this causes them to vibrate at frequencies that can be precisely controlled and manipulated. These vibrations are quantum in nature and allow the ions to become entangled. In this way, the quantum bits they hold can interact.

By controlling these interactions, physicists can carry out quantum logic operations. And quantum algorithms are simply a series of these logic operations one after the other. In this way, the researchers can program and reprogram the quantum computer with a variety of algorithms.
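The claim that quantum algorithms are simply a series of logic operations can be made concrete with a tiny state-vector simulation: each gate is a unitary matrix, and running a program is just applying the matrices in order. This is a generic textbook example (a Hadamard followed by a CNOT, producing an entangled Bell state), not the team's trapped-ion gate set:

```python
import math

# Each quantum gate is a unitary matrix; a quantum algorithm is just a
# sequence of such matrices applied to the machine's state vector.
def apply(gate, state):
    """Multiply a gate matrix into the state vector."""
    return [sum(gate[r][c] * state[c] for c in range(len(state)))
            for r in range(len(gate))]

h = 1 / math.sqrt(2)
H_on_q0 = [[h, 0,  h, 0],      # Hadamard on qubit 0 (identity on qubit 1),
           [0, h,  0, h],      # written directly as a 4x4 matrix
           [h, 0, -h, 0],
           [0, h,  0, -h]]
CNOT = [[1, 0, 0, 0],          # flips qubit 1 when qubit 0 is |1>
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = [1.0, 0.0, 0.0, 0.0]               # two qubits in |00>
for gate in (H_on_q0, CNOT):               # the "program": H, then CNOT
    state = apply(gate, state)

print([round(a, 3) for a in state])        # Bell state: [0.707, 0, 0, 0.707]
```

Reprogramming the machine just means choosing a different sequence of gates; the hardware operations stay the same.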

The researchers tested their device on three algorithms that quantum computers, as prior work showed, could execute quickly. One, the so-called Deutsch-Jozsa algorithm, is typically used only for tests of quantum-computing capabilities. Another, the Bernstein-Vazirani algorithm, can also be used to probe for errors in quantum computing. The last, the quantum Fourier transform algorithm, is an element in quantum-computing encryption-breaking applications.
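The Bernstein-Vazirani problem makes the quantum advantage easy to state: recover a hidden bit-string s from an oracle computing f(x) = s·x mod 2. Classically this takes one oracle query per bit, as the sketch below shows; the quantum algorithm recovers all of s in a single query. (This is an illustrative classical version, not the quantum circuit the team ran.)

```python
def make_oracle(secret):
    """Black box computing f(x) = s . x mod 2 for a hidden bit list s."""
    def f(x):
        acc = 0
        for i, s_i in enumerate(secret):
            acc ^= s_i & ((x >> i) & 1)
        return acc
    return f

def classical_bernstein_vazirani(oracle, n):
    """Recover the hidden string classically: query each unit vector
    e_i to read off bit s_i, so n oracle calls in total.  The quantum
    version finds the whole string with a single call."""
    return [oracle(1 << i) for i in range(n)]

secret = [1, 0, 1, 1, 0]
print(classical_bernstein_vazirani(make_oracle(secret), len(secret)))
# -> [1, 0, 1, 1, 0]
```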

The Deutsch-Jozsa and Bernstein-Vazirani algorithms successfully ran 95 and 90 percent of the time, respectively. The quantum Fourier transform algorithm, which the researchers said is among the most complicated quantum calculations, had a 70 percent success rate, they said.

In the future, the researchers will test more algorithms on their device, Debnath said. “We’d like this system to serve as a test bed for examining the challenges of multiqubit operations, and find ways to make them better,” Debnath told Live Science.

The team claims it can go much further. In particular, they say that their module is scalable—that several five-qubit modules can be connected together to form a much more powerful quantum computer.

Japanese boffins try ‘token passing’ to scale quantum calculations

A Japanese team has published what it believes is a solution to the problem of scale. Quantum gates are complex creatures with many more components than their classical equivalents, so instead of trying to cram enough gates into a small space to perform calculations, the University of Tokyo proposal is to send photons around in a ring, re-using one gate to act on different photons in turn.

If need be, the light pulses can travel around the loop indefinitely, according to professor Akira Furusawa and assistant professor Shuntaro Takeda, who came up with the scheme, without losing the quantum information they carry.

Because of this, the pair make a fairly bold claim: “This approach potentially enables scalable, universal, and fault tolerant quantum computing, which is hard to achieve by either qubit or CV [continuous variable – El Reg] scheme alone.”
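The loop idea can be caricatured classically: pulses circulate in a ring, and one reprogrammable gate at a fixed position acts on whichever pulse passes it, so a single physical gate executes a whole multi-step program. A purely classical toy (the integer values and operations stand in for optical states; nothing here simulates the quantum optics):

```python
from collections import deque

def loop_processor(pulses, program):
    """Toy model of the single-gate loop: pulses circulate in a ring
    and one reusable gate acts on whichever pulse is rotated to the
    gate position; `program` lists (pulse_index, operation) steps."""
    ring = deque(pulses)
    for target, gate in program:
        ring.rotate(-target)       # circulate until the target pulse
        ring[0] = gate(ring[0])    # ...sits at the single gate
        ring.rotate(target)        # restore ordering for clarity
    return list(ring)

# One physical gate executes a whole multi-step program.
result = loop_processor([1, 2, 3], [(0, lambda v: v + 10),
                                    (2, lambda v: v * 2),
                                    (0, lambda v: v - 1)])
print(result)   # -> [10, 2, 6]
```

The scaling appeal is visible even in the toy: the gate count no longer grows with the number of pulses, only the program length does.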

The paper, published in Physical Review Letters and also available at arXiv (PDF), also notes that the scheme is compatible with existing quantum error-correction techniques. In a media release (here) Professor Furusawa says his team is working on automating the error-correction process. The release adds that his previous optical quantum computing system occupied 6.3 m², required 500 mirrors and lenses, and could only handle a single pulse at a time.

Furusawa’s paper notes that the gate sequence is electrically programmed, making it fast, and it notes that “all the basic building blocks of our architecture are already available”, meaning the work should be replicable. Furusawa said in the canned statement: “We’ll start work to develop the hardware, now that we’ve resolved all problems except how to make a scheme that automatically corrects a calculation error.”



Construction of large scale practical quantum computers radically simplified

Quantum computing on a small scale using trapped ions (charged atoms) can be carried out by aligning individual laser beams onto individual ions, with each ion forming a quantum bit. A large-scale quantum computer, however, would need billions of quantum bits and would therefore require billions of precisely aligned lasers, one for each ion.

Instead, scientists at Sussex have invented a simple method where voltages are applied to a quantum computer microchip (without having to align laser beams) – to the same effect. Professor Winfried Hensinger and his team also succeeded in demonstrating the core building block of this new method with an impressively low error rate at their quantum computing facility at Sussex.

“We’ve reduced the difficulty of building a quantum computer to the equivalent of building a classical computer. In a classical computer, you have transistors and they apply voltage to execute a classical logic gate,” Winfried Hensinger, professor of quantum technologies at the University of Sussex’s Ion Quantum Technology Group, told IBTimes UK.

“We use microwave radiation, bathe the entire quantum computer in microwaves, then we have local magnetic field gradients within the actual processing zones, and by applying a voltage, we shift the position of the ion so it either interacts with the global microwaves or not.”

Professor Hensinger said: “This development is a game changer for quantum computing making it accessible for industrial and government use. We will construct a large-scale quantum computer at Sussex making full use of this exciting new technology.”


Microsoft building quantum computer based on topological qubits

Researchers at Microsoft are working on an entirely new topological quantum computer, which uses exotic materials to limit errors. Microsoft’s new hires include Leo Kouwenhoven, a professor at the Delft University of Technology in the Netherlands; Charles Marcus, a professor at the University of Copenhagen; Matthias Troyer, a professor at ETH Zurich; and David Reilly, a professor at the University of Sydney in Australia.

Microsoft’s approach to building a quantum computer is based on a type of qubit – or unit of quantum information – called a topological qubit. The Microsoft team believes that topological qubits are better able to withstand challenges such as heat or electrical noise, allowing them to remain in a quantum state longer. That, in turn, makes them much more practical and effective. “A topological design is less impacted by changes in its environment,” Holmdahl said.

At the same time as Microsoft is working to build a quantum computer, it is also creating the software that could run on it. The goal is to have a system that can begin to efficiently solve complex problems from day one. “Similar to classical high-performance computing, we need not just hardware but also optimized software,” Troyer said.

To the team, that makes sense: The two systems can work together to solve certain problems, and the research from each can help the other side. “A quantum computer is much more than the qubits,” Reilly said. “It includes all of the classical hardware systems, interfaces and connections to the outside world.”

For more information on topological materials: http://idstch.com/home5/international-defence-security-and-technology/technology/materials/topological-materials-promising-ultra-low-power-faster-computers-microsofts-topological-quantum-computers/


References and Resources also include: