
Quantum Machine Learning to usher in a new era in AI through exponential speedups in internet search, fraud detection, and brain mapping

Quantum computing and quantum information processing are expected to have an immense impact by performing tasks too hard for even the most powerful conventional supercomputers, and to have a host of specific applications, from code-breaking and cyber security to medical diagnostics, big data analysis and logistics.

Quantum computers run at the subatomic level using quantum bits (qubits), which can represent a 0 and a 1 at the same time. For a small set of specialized problems, a processor using qubits could theoretically find solutions exponentially faster than a traditional computer.

Machine Learning (ML) is a subfield of Artificial Intelligence that aims to endow computers with the capacity to learn from data, so that explicit programming is not necessary to perform a task. ML algorithms allow computers to extract information and infer patterns from recorded data, learning from previous examples to make good predictions about new ones. ML has become a pervasive technology, underlying many modern applications including internet search, fraud detection, gaming, face detection, image tagging, brain mapping, check processing and computer server health-monitoring. Quantum machine learning algorithms aim to use the advantages of quantum computation to improve classical machine learning methods, for example by developing efficient quantum implementations of expensive classical algorithms.
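To make the learn-from-examples idea concrete, here is a minimal, purely illustrative sketch using scikit-learn: a classifier is fitted to labeled examples and then asked to label inputs it has not seen, with no task-specific rules programmed by hand. The feature values and labels below are invented for illustration and have nothing to do with the applications named above.

```python
# Minimal illustration of "learning from examples" with scikit-learn.
# Toy, invented data: two numeric features per example, binary labels.
from sklearn.linear_model import LogisticRegression

X_train = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]  # past examples
y_train = [0, 0, 1, 1]                                       # known answers

model = LogisticRegression().fit(X_train, y_train)           # learn from data
print(model.predict([[0.15, 0.15], [0.85, 0.85]]))           # label new cases
```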

Researchers from Caltech and the University of Southern California (USC) report the first application of quantum computing to machine learning. By employing quantum-compatible machine learning techniques, they developed a method of extracting a rare Higgs boson signal from copious background noise. The Higgs is the particle predicted to imbue elementary particles with mass; it was discovered at the Large Hadron Collider in 2012. The quantum program seeks patterns within a data set to tell meaningful data from junk. The new quantum machine learning method is found to perform well even with small data sets, unlike its standard counterparts.

Quantum Computing Startup Demonstrates Unsupervised Machine Learning Using Its 19-Qubit Processor

Researchers at Rigetti Computing, a company based in Berkeley, California, used one of its prototype quantum chips—a superconducting device housed within an elaborate super-chilled setup—to run what’s known as a clustering algorithm. Clustering is a machine-learning technique used to organize data into similar groups. Rigetti is also making the new quantum computer—which can handle 19 quantum bits, or qubits—available through its cloud computing platform, called Forest.
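For reference, the sketch below shows the classical version of the task Rigetti tackled: grouping unlabeled points into similar clusters. It uses scikit-learn's KMeans on synthetic data; this is a plain classical illustration, not Rigetti's quantum algorithm or its Forest platform.

```python
# Classical clustering illustration (not Rigetti's quantum algorithm):
# group unlabeled 2-D points into k similar clusters with k-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic data: two blobs of points around different centers
points = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2)),
    rng.normal(loc=[3.0, 3.0], scale=0.5, size=(50, 2)),
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print("cluster labels:", kmeans.labels_[:10])
print("cluster centers:", kmeans.cluster_centers_)
```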

The company’s scientists published a paper about the demonstration called “Unsupervised Machine Learning on a Hybrid Quantum Computer.” The abstract lays out the problem space and their approach:

Machine learning techniques have led to broad adoption of a statistical model of computing. The statistical distributions natively available on quantum processors are a superset of those available classically. Harnessing this attribute has the potential to accelerate or otherwise improve machine learning relative to purely classical performance. A key challenge toward that goal is learning to hybridize classical computing resources and traditional learning techniques with the emerging capabilities of general purpose quantum processors. Here, we demonstrate such hybridization by training a 19-qubit gate model processor to solve a clustering problem, a foundational challenge in unsupervised learning. We use the quantum approximate optimization algorithm in conjunction with a gradient-free Bayesian optimization to train the quantum machine. This quantum/classical hybrid algorithm shows robustness to realistic noise, and we find evidence that classical optimization can be used to train around both coherent and incoherent imperfections.
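The hybrid structure the abstract describes is an outer classical optimization loop wrapped around a quantum evaluation step. The sketch below is only a schematic stand-in under stated assumptions: the quantum cost evaluation is stubbed out with a toy function, and a gradient-free Nelder-Mead optimizer substitutes for the Bayesian optimization Rigetti actually used with QAOA on hardware.

```python
# Hedged sketch of a hybrid quantum/classical training loop: a classical,
# gradient-free optimizer proposes circuit parameters, a quantum processor
# (stubbed out here) evaluates the clustering cost, and the loop iterates.
# Nelder-Mead stands in for Bayesian optimization; the cost function is a toy.
import numpy as np
from scipy.optimize import minimize

def quantum_clustering_cost(params):
    """Placeholder for running a parameterized QAOA circuit on a QPU and
    estimating the clustering cost from repeated measurements."""
    gamma, beta = params
    # Toy smooth landscape standing in for the measured expectation value.
    return np.sin(gamma) * np.cos(beta) + 0.1 * (gamma ** 2 + beta ** 2)

initial_params = np.array([0.5, 0.5])
result = minimize(quantum_clustering_cost, initial_params, method="Nelder-Mead")
print("optimized (gamma, beta):", result.x)
print("estimated cost:", result.fun)
```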


Quantum computer learns to ‘see’ trees

Scientists have trained a quantum computer to recognize trees, a step towards using such computers for complicated machine learning problems like pattern recognition and computer vision. Team member Ramakrishna Nemani, an earth scientist at NASA’s Advanced Supercomputing Division in Mountain View, California, says the study lays the groundwork for better climate forecasting.

By poring over NASA’s satellite imagery, quantum processors could take a machine learning approach to uncover new patterns in how weather moves across the world over the course of weeks, months, or even years, he says. “Say you’re living in India—you might get an advance notice of a cyclone 6 months ahead of time because we see a pattern of weather in northern Canada.”

In the new study, physicist Edward Boyda of St. Mary’s College of California in Moraga and colleagues fed hundreds of NASA satellite images of California into the D-Wave 2X processor, which contains 1152 qubits. The researchers asked the computer to consider dozens of features—hue, saturation, even light reflectance—to determine whether clumps of pixels were trees as opposed to roads, buildings, or rivers. They then told the computer whether its classifications were right or wrong so that the computer could learn from its mistakes, tweaking the formula it uses to determine whether something is a tree.
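As a classical stand-in for this supervised step, the sketch below learns from labeled pixel-clump features and adjusts its decision rule over repeated passes. The feature values and labels are invented for illustration, and the incremental linear classifier here is not the quantum annealing (boosting-style) formulation the study ran on the D-Wave 2X.

```python
# Classical stand-in for the tree-classification step: learn from labeled
# pixel-clump features (hue, saturation, reflectance) and refine the decision
# rule over repeated passes. Data and weights below are invented.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)
n = 200
# Hypothetical features per pixel clump: [hue, saturation, reflectance]
X = rng.uniform(0.0, 1.0, size=(n, 3))
# Hypothetical labels: 1 = tree, 0 = not a tree (roads, buildings, rivers)
y = (0.6 * X[:, 0] + 0.3 * X[:, 1] - 0.2 * X[:, 2] > 0.35).astype(int)

clf = SGDClassifier(random_state=0)
# Incremental training: show examples, check answers, adjust the formula.
for epoch in range(5):
    clf.partial_fit(X, y, classes=[0, 1])
print("training accuracy:", clf.score(X, y))
```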

After it was trained, the D-Wave was 90% accurate in recognizing trees in aerial photographs of Mill Valley, California, the team reports in PLOS ONE. “Classification is a tricky problem; there are short trees, tall trees, trees next to each other, next to buildings—all sorts of combinations,” says Nemani.


Quantum Machine Learning

Recently, however, a new family of quantum algorithms has come along to challenge the relatively narrow view of what a quantum computer would be useful for. Not only do these new algorithms promise exponential speedups over classical algorithms, but they do so for eminently practical problems, involving machine learning, clustering, classification, and finding patterns in huge amounts of data, writes Scott Aaronson. “The algorithm at the center of the “quantum machine learning” mini-revolution is called HHL, after my colleagues Aram Harrow, Avinatan Hassidim, and Seth Lloyd, who invented it in 2008.”

“HHL attacks one of the most basic problems in all of science: namely, solving a system of linear equations. Given an n × n real matrix A and a vector b, the goal of HHL is to (approximately) solve the system Ax = b for x, and to do so in an amount of time that scales only logarithmically with n, the number of equations and unknowns. Classically, this goal seems hopeless, since n² steps would be needed even to examine all the entries of A, and n steps would be needed even to write down the solution vector x. By contrast, by exploiting the exponential character of the wave function, HHL promises to solve a system of n equations in only about log n steps.”
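For contrast with HHL’s claimed log n scaling, the classical baseline is easy to state: a dense solve of Ax = b reads all n² entries of A and typically costs on the order of n³ operations. A minimal sketch of that baseline, with arbitrary example data:

```python
# Classical baseline for the problem HHL targets: solve A x = b directly.
# A dense solve reads all n^2 entries of A and runs in roughly O(n^3) time,
# versus HHL's claimed ~log(n) scaling (with caveats on state preparation
# and readout that Aaronson discusses elsewhere).
import numpy as np

n = 4
rng = np.random.default_rng(42)
A = rng.normal(size=(n, n))   # example n x n real matrix
b = rng.normal(size=n)        # example right-hand side

x = np.linalg.solve(A, b)     # classical O(n^3) solution
print("x =", x)
print("residual:", np.linalg.norm(A @ x - b))
```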

In the years since HHL, quantum algorithms achieving “exponential speedups over classical algorithms” have been proposed for other major application areas, including k-means clustering, support vector machines, data fitting, and even computing certain properties of Google PageRank vectors.

Quantum machine learning over infinite dimensions

Researchers have recently generalized quantum machine learning to more complex, but still remarkably practical, infinite-dimensional systems. They present the critical subroutines of quantum machine learning algorithms for an all-photonic continuous-variable quantum computer that achieve an exponential speedup compared to their classical counterparts, and they map out an experimental implementation that can serve as a blueprint for future photonic demonstrations.

Physicists have developed a quantum machine learning algorithm that can handle infinite dimensions—that is, it works with continuous variables (which have an infinite number of possible values on a closed interval) instead of the typically used discrete variables (which have only a finite number of values).
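As a point of comparison (standard textbook notation, not taken from the paper itself), a qubit state is a superposition over a finite, discrete basis, while a continuous-variable mode is a superposition over a continuum of quadrature values:

```latex
% Discrete-variable (qubit) state: a superposition over two basis states.
\[ \lvert \psi \rangle_{\mathrm{qubit}} = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
   \qquad |\alpha|^2 + |\beta|^2 = 1 \]
% Continuous-variable (photonic mode) state: a superposition over a continuum
% of quadrature eigenstates |x>, described by a wavefunction psi(x).
\[ \lvert \psi \rangle_{\mathrm{CV}} = \int_{-\infty}^{\infty} \psi(x)\, \lvert x \rangle \, dx,
   \qquad \int_{-\infty}^{\infty} |\psi(x)|^2 \, dx = 1 \]
```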

The researchers, Hoi-Kwan Lau et al., have published a paper on generalizing quantum machine learning to infinite dimensions in Physical Review Letters. As the physicists explain, quantum machine learning is a new subfield within the field of quantum information that combines the speed of quantum computing with the ability to learn and adapt, as offered by machine learning.

One of the biggest advantages of having a quantum machine learning algorithm for continuous variables is that it can theoretically operate much faster than classical algorithms. Since many science and engineering models involve continuous variables, applying quantum machine learning to these problems could potentially have far-reaching applications.

Although the results of the study are purely theoretical, the physicists expect that the new algorithm for continuous variables could be experimentally implemented using currently available technology. The implementation could be done in several ways, such as by using optical systems, spin systems, or trapped atoms.

