
New programming languages for quantum computers that take full advantage of the computers’ capabilities

Quantum computing and quantum information processing are the next revolutionary technologies, expected to have immense impact. Quantum information technologies, such as quantum computers, cryptography, radars, clocks, and other quantum systems, rely on the properties of quantum mechanics, which describes the behavior of matter at the subatomic scale. For example, by taking advantage of superposition and entanglement in quantum computers, scientists can use new algorithms to solve certain complex problems exponentially faster than even the most advanced traditional computers in operation today.

 

Quantum computers will be able to perform tasks too hard for even the most powerful conventional supercomputers, and they have a host of specific applications, from code-breaking and cyber security to medical diagnostics, big data analysis and logistics. Quantum computers could also accelerate the discovery of new materials, chemicals and drugs, dramatically reducing the high costs and long lead times currently involved in developing new drugs.

 

Researchers at D-Wave, IBM, MIT, Lincoln Laboratory, and elsewhere are racing to develop quantum computers with ever larger numbers of high-quality superconducting qubits. IBM is making its prototype quantum computer available over the cloud so that it can be used to start testing quantum code, though the limited number of qubits means that it is still of little use beyond computing research.

 

However, making full use of quantum computers for the above applications requires the development of quantum software, programming languages and algorithms. Quantum programming is the process of assembling sequences of instructions, called quantum programs, that are capable of running on a quantum computer.

 

There are different levels of programming: from assembly languages (also known as quantum machine instruction languages), which give specific instructions to the computer, to higher-level languages, in which quantum algorithms are already programmed at a low level and the user only has to supply the specific parameters of the problem.

 

Quantum coding may be among the most complex tasks ever undertaken, because the quirks of quantum computing create limitations that don’t exist in classical programming languages. One example: quantum programs cannot contain loops that repeat a sequence of instructions; they have to run straight through to completion.

 

To begin with, there are some scientific challenges that are unique to quantum technology. For example, the very nature of quantum mechanics makes it impossible to “clone” or duplicate qubits, the quantum equivalent of classical computer bits. This makes many common programming techniques that rely on copying the value of a variable impossible to use with quantum technology. For similar reasons, it is impossible to read the same qubit twice: measurement disturbs the state, so the original superposition cannot be recovered. While this can be a great advantage for secure communications, where you want to generate unforgeable cryptographic keys, it creates tremendous difficulties in computing, as it complicates the techniques needed to test or “debug” a program before running it.
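
To make this concrete, here is a minimal sketch of a single-qubit measurement, using plain numpy rather than any real quantum toolkit; the state and function names are illustrative only. Measuring collapses the superposition, so the original amplitudes are gone after the first read.

import numpy as np

rng = np.random.default_rng(seed=1)

# A qubit in superposition: complex amplitudes for |0> and |1>.
state = np.array([3/5, 4/5], dtype=complex)   # probabilities 0.36 and 0.64

def measure(state):
    """Sample an outcome with the Born rule, then collapse the state."""
    p0 = abs(state[0])**2
    outcome = 0 if rng.random() < p0 else 1
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1.0          # all amplitude now sits on the observed basis state
    return outcome, collapsed

outcome1, state = measure(state)
outcome2, state = measure(state)      # "reading the qubit again" just re-reads the collapsed state
print(outcome1, outcome2)             # the second result always repeats the first;
                                      # the original 0.36/0.64 split is unrecoverable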

 

Programming languages help express quantum algorithms using high-level constructs. The most recent comes from Microsoft, which has unveiled Q# (pronounced “Q sharp”) along with associated tools to help developers use it to create software. It joins a growing list of high-level quantum programming languages such as QCL and Quipper.

 

The University of Melbourne has launched an online quantum computer simulator and programming environment aimed at making students and industry ‘quantum ready’. Named Quantum User Interface or QUI, the web-based program lets users click and drag logic elements that operate on quantum bits (known as qubits) to create a quantum program. A remote cluster of computers at the university runs the program on a simulated quantum computer and sends back the results in real time.

 

“As the international race to develop quantum technology accelerates, there’s an increasingly urgent need to train the next generation of ‘quantum’ programmers, software developers and engineers,” explained the University of Melbourne’s Professor Lloyd Hollenberg, who is also deputy director of the Australian Research Council Centre of Excellence for Quantum Computation and Communication Technology (CQC2T). “The real challenge is to get people up to speed in quantum information processing with a minimal knowledge base in quantum mechanics, and to dispel common misconceptions about how quantum computers work,” he added.

 

One of the key features of the program is its ability to display visualisations of the quantum computer’s state at every stage in the program. “Anyone who tries to represent or visualise the state of the quantum computer comes up against this wicked problem – how to visualise this thing which exists in a multi-dimensional space of real and imaginary numbers, let alone the weird effects of quantum superposition and entanglement,” Hollenberg said.

First proof of quantum computer advantage

Robert König, professor for the theory of complex quantum systems at the Technical University of Munich (TUM), in collaboration with David Gosset from the Institute for Quantum Computing at the University of Waterloo and Sergey Bravyi from IBM, has now placed a cornerstone in this promising field.

 

König and his colleagues have now conclusively demonstrated the advantage of quantum computers. To this end, they developed a quantum circuit that can solve a specific, difficult algebraic problem. The new circuit has a simple structure: it performs only a fixed number of operations on each qubit. Such a circuit is referred to as having constant depth. In their work, the researchers prove that the problem at hand cannot be solved using classical constant-depth circuits. They furthermore answer the question of why the quantum algorithm beats any comparable classical circuit: the quantum algorithm exploits the non-locality of quantum physics.

 

Prior to this work, the advantage of quantum computers had been neither proven nor experimentally demonstrated—notwithstanding that evidence pointed in this direction. One example is Shor’s quantum algorithm, which efficiently solves the problem of prime factorization. However, it is merely a complexity-theoretic conjecture that this problem cannot be efficiently solved without quantum computers. It is also conceivable that the right approach has simply not yet been found for classical computers.

 

Robert König considers the new results primarily as a contribution to complexity theory. “Our result shows that quantum information processing really does provide benefits—without having to rely on unproven complexity-theoretic conjectures,” he says. Beyond this, the work provides new milestones on the road to quantum computers. Because of its simple structure, the new quantum circuit is a candidate for a near-term experimental realization of quantum algorithms.

 

Quantum computer coding

On the most basic hardware level, quantum computers differ from classical computers because they are not binary: rather than working with bits that are in one of two states, quantum processors work with “qubits” that can be in a superposition of both states simultaneously. Some of the physical systems that exhibit superposition clearly enough to function as qubits include photons with two directions of polarization, atomic nuclei with two spin orientations, and superconducting loops with clockwise and counterclockwise electric currents.
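
As a rough illustration (plain numpy, no quantum hardware involved), a qubit can be simulated as a pair of complex amplitudes, and a Hadamard gate turns a definite 0 into an equal superposition:

import numpy as np

bit = 0                                      # a classical bit: exactly 0 or 1

ket0 = np.array([1, 0], dtype=complex)       # a qubit in the definite state |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
superposed = H @ ket0                        # equal superposition of |0> and |1>
print(np.abs(superposed)**2)                 # [0.5 0.5]: equal odds of reading 0 or 1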

 

First, regardless of the coding language used, all math executed on a quantum computer must be “reversible.” What this means is that all outputs have to contain enough information that the entire process of creating them can be run backwards to regenerate the input: reversible methods must be used for all computations, erasures must be avoided, and enough information must be preserved to retrace your steps. A related principle, the no-cloning theorem, forbids copying a qubit. Nor can you arbitrarily set or reset qubits in the middle of a computation; attempting to do so would destroy the quantum superposition.
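
A small numpy sketch of this reversibility: quantum gates are unitary matrices, so applying each gate’s conjugate transpose in reverse order exactly undoes the computation. The particular gates chosen here are arbitrary examples.

import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
T = np.diag([1, np.exp(1j * np.pi / 4)])                      # T (phase) gate

state = np.array([1, 0], dtype=complex)           # start in |0>
forward = T @ (H @ state)                         # run the computation forward
recovered = H.conj().T @ (T.conj().T @ forward)   # run it backward, gate by gate

print(np.allclose(recovered, state))              # True: nothing was erased along the way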

 

A quantum program must therefore have a stovepipe architecture: reversible quantum logic gates are lined up in sequence, and information flows straight through them from one end to the other. Of particular importance, the program structure can have no loops, where control is transferred backward to an earlier point so that a sequence of instructions is traversed multiple times.
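
The stovepipe structure can be sketched as a fixed, loop-free list of unitaries applied in order; this is plain numpy, and the two-gate circuit is just an example:

import numpy as np

I = np.eye(2, dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# The circuit is an ordered list of 4x4 unitaries -- no branches, no loops.
pipeline = [np.kron(H, I),   # Hadamard on qubit 0
            CNOT]            # controlled-NOT, qubit 0 controlling qubit 1

state = np.array([1, 0, 0, 0], dtype=complex)   # two qubits in |00>
for gate in pipeline:        # a classical loop over a *fixed* gate list,
    state = gate @ state     # not a control-flow loop inside the quantum program

print(np.round(state, 3))    # (|00> + |11>)/sqrt(2): a Bell state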

 

The protocol for solving a problem with a quantum computer is often described like this: Prepare a set of qubits in a suitable initial state, apply a specified series of operations, then measure the final state of the qubits. If all goes well, the measurement will yield the answer to the problem.
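
In a toy numpy simulation, that protocol might look like the following, with the final measurement replaced by Born-rule sampling; the circuit, shot count and seed are arbitrary choices for illustration:

import numpy as np

rng = np.random.default_rng(seed=7)

# 1. Prepare: two qubits in the initial state |00>.
state = np.array([1, 0, 0, 0], dtype=complex)

# 2. Apply a specified series of operations (H then CNOT, as in the sketch above).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
state = CNOT @ (np.kron(H, np.eye(2)) @ state)

# 3. Measure: sample bit patterns 00..11 with the state's probabilities.
probs = np.abs(state)**2
samples = rng.choice(4, size=1000, p=probs)
counts = np.bincount(samples, minlength=4)
print(dict(zip(["00", "01", "10", "11"], counts)))   # roughly 500 each of 00 and 11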

 

Quantum Computing Model

A quantum computer is actually a hybrid machine consisting of a quantum device (hardware) and a classical computer, which sends instructions to the hardware and receives and processes the results (software). This model of quantum computation treats the quantum computer as a coprocessor, similar to the way GPUs, FPGAs, and other adjunct processors are used. The primary control logic runs classical code on a classical “host” computer. When appropriate and necessary, the host program can invoke a sub-program that runs on the adjunct processor. When the sub-program completes, the host program gets access to the sub-program’s results.

In this model, there are three levels of computation (a minimal sketch of the pattern follows the list):

  • Classical computation that reads input data, sets up the quantum computation, triggers the quantum computation, processes the results of the computation, and presents the results to the user.
  • Quantum computation that happens directly in the quantum device and implements a quantum algorithm.
  • Classical computation that is required by the quantum algorithm during its execution.
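
Here is a toy rendering of that division of labor, with a trivial one-qubit simulator standing in for the quantum device; the names quantum_subprogram and host_program are invented for illustration:

import numpy as np

def quantum_subprogram(num_shots):
    """Stand-in for the sub-program on the quantum device: prepare a
    superposition and measure it num_shots times (level 2)."""
    rng = np.random.default_rng()
    state = np.array([1, 1], dtype=complex) / np.sqrt(2)
    probs = np.abs(state)**2
    return rng.choice(2, size=num_shots, p=probs)

def host_program():
    shots = 100                              # level 1: classical setup of the computation
    results = quantum_subprogram(shots)      # trigger the quantum computation
    zeros = int(np.sum(results == 0))        # levels 1/3: classical post-processing
    print(f"measured 0 in {zeros}/{shots} shots")

host_program()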

Quantum languages sit in this classical software layer. Some of them are proper programming languages, just as the classical ones are. Others are in fact libraries built on some well-known language (Python, C++, MATLAB, …) that help the user program quantum algorithms. In this article we will use the term “quantum language” for both cases interchangeably.

 

Quantum languages have been around since before the emergence of real quantum devices; they have been used to simulate quantum algorithms on classical computers. With the construction of real quantum computers, each research group and company has developed its own language for its machines. This has motivated the creation of still more languages that try to combine elements of the existing ones and that aim to be usable on any backend: we call them Universal Quantum Languages.

 

One such universal quantum language is ProjectQ, which can be used to program IBM devices and, through its CirqProjectQ extension, Google’s. Another is XACC, a full-stack library for programming IBM, Rigetti and D-Wave quantum computers.

 

Quantum algorithms

Quantum algorithms exploit a form of parallelism that is unique to quantum systems. In effect, a collection of n qubits can explore all of its 2^n possible configurations at once; a classical system might have to look at the 2^n bit patterns one at a time.
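
A quick way to see the scale of this: a simulated state vector needs one complex amplitude per bit pattern, so its length doubles with every added qubit.

for n in (1, 2, 10, 20, 30):
    print(f"{n:2d} qubits -> {2**n:>13,} basis configurations")
# A classical brute-force search would examine those 2**n patterns one at a time.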

 

Among those algorithms, the best known is a procedure for finding the factors of integers, devised in 1994 by Peter W. Shor, now at MIT. When factoring an n-digit number, the fastest known classical algorithms take an amount of time that grows faster than any polynomial in n; Shor’s algorithm works in time proportional to n^3. For large enough n, the quantum algorithm is far faster.
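
For a feel for the gap, the toy comparison below pits a sub-exponential classical cost curve (shaped like the number field sieve, with an arbitrary constant of our choosing) against Shor’s roughly cubic scaling; the absolute numbers are meaningless, only the growth rates matter.

import math

for n in (10, 50, 100, 300):
    # classical: number-field-sieve-like shape, exp(c * n^(1/3) * (ln n)^(2/3)); c is illustrative
    classical = math.exp(1.9 * n ** (1/3) * math.log(n) ** (2/3))
    quantum = n ** 3                         # Shor's roughly cubic scaling
    print(f"n={n:3d} digits: classical ~ {classical:.1e} steps, quantum ~ {quantum:.1e} steps")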

 

QCL

QCL, or Quantum Computation Language, is the invention of Bernhard Ömer of the Vienna University of Technology. Ömer’s interpreter for the language (http://www.itp.tuwien.ac.at/~oemer/qcl.html) includes an emulator that runs quantum programs on classical hardware.

Of course the emulator can’t provide the speedup of quantum parallelism; on the other hand, it offers the programmer some helpful facilities—such as commands for inspecting the internal state of qubits—that are impossible with real quantum hardware.

QCL borrows the syntax of languages such as C and Java, which are sometimes described as “imperative” languages because they rely on direct commands to set and reset the values of variables. As noted, such commands are generally forbidden within a quantum computation, and so major parts of a QCL program run only on classical hardware.

 

Quipper

Quipper is intended for the same kinds of programming tasks as QCL, but it has a different structure and appearance. The language is implemented as an extension of the programming language Haskell (named for the logician Haskell B. Curry), which adopts a functional rather than imperative mode of expression. That is, the language is modeled on the semantics of mathematical functions, which map inputs to outputs but have no side effects on the state of other variables. A functional style of programming seems more closely attuned to the constraints of quantum computing, although Haskell does not enforce the quantum rule that a variable can be assigned a value only once.

The Quipper system is a compiler rather than an interpreter; it translates a complete program all in one go rather than executing statements one by one. The output of the compiler consists of quantum circuits: networks of interconnected, reversible logic gates.

 

Microsoft’s Q# quantum programming language out now in preview

Recently Microsoft launched a preview version of a new programming language for quantum computing called Q#. The industry giant also launched a quantum simulator that developers can use to test and debug their quantum algorithms.

Given that quantum computers are still rare, Microsoft has built an as-yet-unnamed quantum simulator to run those quantum programs. The local version, released as part of the preview, can support programs using up to 32 quantum bits (qubits), using some 32GB of RAM. Microsoft is also offering an Azure version of the simulator, scaling up to 40 qubits.
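
The memory figures are easy to sanity-check: a full state vector for n qubits stores 2^n complex amplitudes, so assuming 8 bytes per amplitude (single-precision complex numbers — our assumption, not a published detail of Microsoft’s simulator), 32 qubits come to about 32 GB, and each additional qubit doubles the requirement.

bytes_per_amplitude = 8                       # assumed complex64: two 4-byte float components
for n in (30, 32, 36, 40):
    gib = 2**n * bytes_per_amplitude / 2**30  # full state-vector size in GiB
    print(f"{n} qubits -> {gib:,.0f} GiB")
# 32 qubits land at ~32 GiB, while 40 qubits already need terabytes --
# which is why the larger simulations are offered as a cloud service.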

Q# (Q-sharp) is a domain-specific programming language used for expressing quantum algorithms. It is to be used for writing sub-programs that execute on an adjunct quantum processor, under the control of a classical host program and computer.

Q# provides a small set of primitive types, along with two ways (arrays and tuples) for creating new, structured types. It supports a basic procedural model for writing programs, with loops and if/then statements. The top-level constructs in Q# are user-defined types, operations, and functions.

 

Microsoft is also developing its LIQUi|> initiative (pronounced “liquid”), in part to translate appropriate algorithms from well-known coding languages into quantum machine code for execution on a quantum computer. Quantum computation is a relatively new field that introduces a new paradigm in computation; there are many different technologies for running quantum algorithms, and the ultimate programming language should satisfy the needs of all possible users.

 

Quantum instruction sets

Quantum instruction sets are used to turn higher level algorithms into physical instructions that can be executed on quantum processors. Sometimes these instructions are specific to a given hardware platform, e.g. ion traps or superconducting qubits.

 

Quil

Quil is an instruction set architecture for quantum computing that first introduced a shared quantum/classical memory model. It was introduced by Robert Smith, Michael Curtis, and William Zeng in “A Practical Quantum Instruction Set Architecture”. Many quantum algorithms (including quantum teleportation, quantum error correction, simulation, and optimization algorithms) require a shared memory architecture.
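
To show what the shared memory model looks like in practice, here is a short Bell-state program in Quil, held as a plain Python string for illustration: DECLARE reserves classical readout memory, and MEASURE writes qubit results into it.

# A Bell-state circuit in Quil, as plain text (illustrative only).
bell_quil = """\
DECLARE ro BIT[2]      # classical memory: two readout bits
H 0                    # put qubit 0 into superposition
CNOT 0 1               # entangle qubit 1 with qubit 0
MEASURE 0 ro[0]        # store qubit 0's result in classical bit ro[0]
MEASURE 1 ro[1]        # store qubit 1's result in classical bit ro[1]
"""
print(bell_quil)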

Rigetti has released a public beta environment, Forest, so developers can get to grips with how these systems work. This beta is built on Quil, a programming language for quantum and classical computers, meaning programmers can leverage the best parts of both.

OpenQASM

OpenQASM is the intermediate representation introduced by IBM for use with their Quantum Experience.
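
For comparison, the same Bell-state circuit written in OpenQASM 2.0, again held as a plain string for illustration:

# A Bell-state circuit in OpenQASM 2.0, as plain text (illustrative only).
bell_qasm = """\
OPENQASM 2.0;
include "qelib1.inc";   // standard gate library
qreg q[2];              // two quantum bits
creg c[2];              // two classical bits for the results
h q[0];                 // superposition on q[0]
cx q[0],q[1];           // entangle q[1] with q[0]
measure q -> c;         // read both qubits into classical register c
"""
print(bell_qasm)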

 

Quantum software consortium

In 2017, the ten-year Dutch research project Quantum Software started with a Gravitation grant from the Dutch Ministry of Education, Culture and Science. Top scientists at three Dutch universities are working on software and systems for quantum computers. Researchers at the Leiden Institute of Advanced Computer Science (LIACS) are developing new algorithms to make those quantum computers work.

‘Every group in the consortium contributes its own expertise. LIACS is developing the new quantum algorithms and systems,’ said Aske Plaat, the scientific director of LIACS, the computer science institute of Leiden University.

Two areas of expertise at LIACS are the theory of computer science and machine learning. In the latter, algorithms make predictions based on the data they have collected themselves.

The essence of quantum computing is that it works with quantum bits, which can be zero and one at the same time; classical computer bits are always zero or one. With classical computers, we can solve quite complicated problems quite fast, but with qubits we can solve some problems much faster, and some solutions are even impossible to find efficiently without a quantum computer. At LIACS, we are focusing on algorithms for so-called combinatorial machine learning. These algorithms are at the basis of applications in big data.

References and resources also include:

https://www.universiteitleiden.nl/en/research/research-projects/science/algorithms-for-quantum-software

https://docs.microsoft.com/en-us/quantum/quantum-qr-intro?view=qsharp-preview

https://medium.com/@quantum_wa/quantum-computing-languages-landscape-1bc6dedb2a35

https://www.computerworld.com.au/article/648432/melbourne-uni-launches-quantum-simulator-environment-qui/
