By
Mingsheng Ying, University of Technology Sydney,
Runyao Duan, University of Technology Sydney,
Yuan Feng, University of Technology Sydney,
Zhengfeng Ji, Perimeter Institute for Theoretical Physics
Edited by
Simon Gay, University of Glasgow, Ian Mackie, Imperial College London
This chapter presents a systematic exposition of predicate transformer semantics for quantum programs. It is divided into two parts: The first part reviews the state transformer (forward) semantics of quantum programs according to Selinger's suggestion of representing quantum programs by superoperators and elucidates D'Hondt-Panangaden's theory of quantum weakest preconditions in detail. In the second part, we develop a quite complete predicate transformer semantics of quantum programs based on Birkhoff-von Neumann quantum logic by considering only quantum predicates expressed by projection operators. In particular, the universal conjunctivity and termination law of quantum programs are proved, and Hoare's induction rule is established in the quantum setting.
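The duality between the two semantics described in this abstract can be illustrated concretely. For a program given by Kraus operators {A_k}, the forward (state transformer) semantics is E(ρ) = Σ_k A_k ρ A_k†, and the D'Hondt-Panangaden weakest precondition of a quantum predicate M is wp(E)(M) = Σ_k A_k† M A_k; the two are linked by the identity tr(M·E(ρ)) = tr(wp(E)(M)·ρ). A minimal NumPy sketch of this duality (the amplitude-damping channel and the particular state and predicate are chosen purely for illustration):

```python
import numpy as np

# Kraus operators of the amplitude-damping channel (decay probability p).
p = 0.3
A0 = np.array([[1, 0], [0, np.sqrt(1 - p)]])
A1 = np.array([[0, np.sqrt(p)], [0, 0]])
kraus = [A0, A1]

def forward(rho):
    """State transformer: E(rho) = sum_k A_k rho A_k^dagger."""
    return sum(A @ rho @ A.conj().T for A in kraus)

def wp(M):
    """Predicate transformer: wp(E)(M) = sum_k A_k^dagger M A_k."""
    return sum(A.conj().T @ M @ A for A in kraus)

# A state and a quantum predicate (a Hermitian operator between 0 and I).
rho = np.array([[0.5, 0.5], [0.5, 0.5]])   # the pure state |+><+|
M = np.array([[1, 0], [0, 0]])             # projector onto |0>

# The defining duality: tr(M . E(rho)) == tr(wp(E)(M) . rho).
lhs = np.trace(M @ forward(rho))
rhs = np.trace(wp(M) @ rho)
assert np.isclose(lhs, rhs)
```

The backward reading computes, for each predicate, the weakest condition on the input state guaranteeing that the output satisfies the predicate with at least the same probability, exactly as in classical predicate transformer semantics.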
8.1 Introduction
In the mid-1990s Shor and Grover discovered, respectively, the famous quantum factoring and searching algorithms. Their discoveries indicated that, in principle, quantum computers offer a way to accomplish certain computational tasks much more efficiently than classical computers, and thus stimulated intensive investigation of quantum computation. Since then a substantial effort has been made to develop the theory of quantum computation, to find new quantum algorithms, and to develop the physical techniques needed to build functional quantum computers, including in particular fault-tolerance techniques.
Currently, quantum algorithms are expressed mainly at the very low level of quantum circuits. In the history of classical computation, however, it was realized long ago that programming languages allow us to think about the problem we intend to solve in a high-level, conceptual way, rather than in terms of the details of implementation.
Reasoning about quantum systems has gained prominence due to their potential applications in information processing, security, distributed systems, and randomized algorithms. This has attracted research in formal reasoning about quantum states, programs, and processes. On the other hand, temporal logics have proved successful in the verification of classical distributed systems and security protocols. In this chapter we extend exogenous quantum propositional logic with temporal modalities, considering both linear and branching time. We provide weakly complete Hilbert calculi for the proposed quantum temporal logics and study their SAT and model-checking problems.
10.1 Introduction
Reasoning about quantum systems has gained prominence due to their potential applications in information processing, security, distributed systems and randomized algorithms. This has attracted research in formal reasoning about quantum states (see for instance van der Meyden and Patra 2003b, a; Mateus and Sernadas 2006; Chadha et al. 2009; Caleiro et al. 2006) and quantum programs (cf. Knill 1996; Sanders and Zuliani 2000; Abramsky and Coecke 2004; D'Hondt and Panangaden 2004; Altenkirch and Grattage 2005; Selinger and Valiron 2005; Baltag and Smets 2006; Baltazar et al. 2007; Chadha et al. 2006, 2007; Baltazar et al. 2008). On the other hand, formal methods have proved to be successful in design and verification of classical distributed systems and security protocols (e.g., Clarke and Wing 1996; Meadows 2003). Herein, we present branching and linear temporal logics for reasoning about evolution of quantum systems composed of a fixed finite set of qubits.
We describe model-checking techniques for protocols arising in quantum information theory and quantum cryptography. We discuss the theory and implementation of a practical model checker, QMC, for quantum protocols. In our framework, we assume that the quantum operations performed in a protocol are restricted to those within the stabilizer formalism; while this particular set of operations is not universal for quantum computation, it allows us to develop models of several useful protocols as well as of systems involving both classical and quantum information processing. We discuss the modeling language of QMC, the logic used for verification, and the verification algorithms that have been implemented in the tool. We demonstrate our techniques with applications to a number of case studies, including quantum teleportation and the BB84 quantum coin-flipping protocol.
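As an illustration of why the stabilizer restriction makes automated checking feasible, stabilizer states admit a compact symbolic representation: each generator is a pair of bit vectors recording its X and Z components, and Clifford gates act by simple bit updates rather than by matrix multiplication on exponentially large vectors. The toy sketch below is not QMC's actual data structure, and sign bits are omitted because this particular example never produces a negative phase; it prepares a Bell state and reads off its stabilizers:

```python
# Toy stabilizer bookkeeping: a generator is a pair of bit vectors (x, z),
# where x[q] = 1 means an X on qubit q and z[q] = 1 a Z (both set means Y).

def hadamard(gen, q):
    x, z = gen
    x[q], z[q] = z[q], x[q]      # H exchanges X and Z on qubit q

def cnot(gen, c, t):
    x, z = gen
    x[t] ^= x[c]                 # X propagates from control to target
    z[c] ^= z[t]                 # Z propagates from target to control

def show(gen):
    x, z = gen
    return ''.join('IXZY'[xi + 2 * zi] for xi, zi in zip(x, z))

# |00> is stabilized by Z I and I Z.
gens = [([0, 0], [1, 0]), ([0, 0], [0, 1])]

# Prepare a Bell state: H on qubit 0, then CNOT from qubit 0 to qubit 1.
for g in gens:
    hadamard(g, 0)
    cnot(g, 0, 1)

print([show(g) for g in gens])   # -> ['XX', 'ZZ']
```

Because each gate costs only a few bit operations per generator, the reachable state space of a stabilizer protocol can be explored exhaustively, which is exactly what makes model checking in this fragment practical.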
11.1 Introduction
Model-checking is a method of computer-aided verification used widely in both academia and industry for the analysis of software and communication systems. The method essentially consists of three phases: (1) modeling the system of interest at a suitable level of abstraction, usually by describing it in a formal specification language; (2) specifying the properties that the system must satisfy – these properties are typically expressed with reference to the system model, using a logic designed specially for this purpose (e.g., linear or branching time temporal logic); and (3) supplying the model and the properties to a model-checking tool, which automatically constructs the state space of the model and checks whether or not the given properties are satisfied (and if not, usually providing a counterexample).
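For an invariant property, phase (3) amounts to a reachability search over the model's state space that reports a counterexample path on failure. A minimal sketch of the three phases (the wrap-around counter model and its property are invented purely for illustration):

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Explore all reachable states; return (True, None) if the invariant
    holds everywhere, else (False, trace) with a path to a bad state."""
    parent = {initial: None}
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        if not invariant(s):
            trace = []               # rebuild the counterexample path
            while s is not None:
                trace.append(s)
                s = parent[s]
            return False, trace[::-1]
        for t in successors(s):
            if t not in parent:      # each state is visited once
                parent[t] = s
                queue.append(t)
    return True, None

# Phase 1: model -- a counter that wraps around at 4 (states 0..3).
succ = lambda s: [(s + 1) % 4]
# Phase 2: property -- "the counter never reaches 3" (deliberately false).
ok, trace = check_invariant(0, succ, lambda s: s != 3)
# Phase 3: the checker refutes the property and exhibits a counterexample.
print(ok, trace)   # -> False [0, 1, 2, 3]
```

Real model checkers add a specification language, temporal-logic properties beyond invariants, and aggressive state-space reduction, but the skeleton is this search.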
The idea of quantum computation, in the algorithmic sense, originated from the suggestion by Feynman (1982) that a computer based on the principles of quantum mechanics might be capable of efficiently simulating quantum systems of interest to physicists; such simulation seems to be very difficult with classical computers. Feynman's suggestion was followed up by Deutsch (1985), who introduced the notion of the quantum Turing machine and investigated the possible computational power of physically realizable computers. He showed that a specific problem, now known as Deutsch's problem, can be solved more efficiently by a quantum algorithm than by a classical algorithm. Several years later, Shor (1994) discovered efficient quantum algorithms for two important practical problems – integer factorization and the “discrete logarithm” problem – and shortly afterwards, Grover (1996) discovered an efficient quantum algorithm for unstructured searching. Since then, quantum algorithmics and quantum complexity theory have developed into substantial and active research fields.
Meanwhile, the principles of quantum mechanics were being used as the foundation for a new approach to cryptography. Bennett and Brassard (1984) defined a protocol for key distribution whose security is guaranteed by the laws of quantum theory. Their system built on earlier work by Wiesner (1983), which remained unpublished until several years after its conception. We regard quantum cryptography as an aspect of quantum computation, in particular distributed quantum computation; alternatively, both quantum algorithmics and quantum cryptography can be viewed as branches of quantum information processing.
In this chapter, we present applications of abstract interpretation techniques in quantum computing. Quantum computing is now a well-established domain of computer science, and recent developments in semantic techniques show the vitality of this rapidly growing area. On the other hand, abstract interpretation has proved to be a powerful theory (of “classical” computer science) for comparing more or less precise semantics of the same programming language. In a more general picture, abstract interpretation can be seen as a framework for comparing the precision of several representations of the same dynamic system evolution. In this chapter, we point out that abstract interpretation can be fruitfully used in quantum computing: (i) for establishing a hierarchy of quantum semantics and (ii) for analyzing entanglement evolution.
6.1 Introduction
The field of quantum information processing is growing rapidly. This development is based on the exploitation of nonclassical phenomena such as superposition and entanglement, which gave rise to new protocols (cryptographic protocols with unconditional security: Bennett 1992; Bennett and Brassard 1984; teleportation: Bennett et al. 1993) and new algorithms (polynomial-time factorization: Shor 1994; search algorithms: Grover 1996).
Some recent developments in quantum information processing are based on the use of high-level methods, adapting and extending the methods successfully used in classical computation (Abramsky and Coecke 2004; Gay and Nagarajan 2005, 2006; Gay et al. 2008; Kashefi 2003; Lalire and Jorrand 2004; Selinger 2004c, d).
Measurement-based quantum computation (MBQC) has emerged as a new approach to quantum computation in which the notion of measurement is the main driving force of computation. This is in contrast with the more traditional circuit model, which takes unitary operations as fundamental. Among measurement-based quantum computation methods, the recently introduced one-way quantum computer stands out as basic and fundamental. The key idea is to start from an entangled state and then use measurements and one-qubit unitaries, which may depend on the outcomes of earlier measurements, to guide the computation. The main point is that one never has to perform unitaries on more than one qubit at a time after the initial preparation of an entangled state. The “programs” that one writes in this model are traditionally called “patterns.”
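A minimal instance of the one-way model is the standard pattern implementing the one-qubit gate J(α) = H·diag(1, e^{iα}): entangle the input with an auxiliary |+⟩ qubit via controlled-Z, measure the first qubit in an α-dependent basis, and apply an X correction conditioned on the outcome. The NumPy sketch below checks that both measurement branches yield J(α)|ψ⟩; conventions follow the measurement-calculus literature, and the particular input state and angle are arbitrary:

```python
import numpy as np

alpha = 0.7
psi = np.array([0.6, 0.8j])              # arbitrary one-qubit input state

plus = np.array([1, 1]) / np.sqrt(2)
state = np.kron(psi, plus)               # qubit 1 = input, qubit 2 = |+>
CZ = np.diag([1, 1, 1, -1])
state = CZ @ state                       # the only two-qubit operation

# Measurement basis on qubit 1, parameterized by the angle alpha.
b0 = np.array([1,  np.exp(-1j * alpha)]) / np.sqrt(2)
b1 = np.array([1, -np.exp(-1j * alpha)]) / np.sqrt(2)

X = np.array([[0, 1], [1, 0]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
J = H @ np.diag([1, np.exp(1j * alpha)])  # the gate being implemented
expected = J @ psi

for s, b in enumerate([b0, b1]):
    # Project qubit 1 onto the outcome-s basis vector; qubit 2 remains.
    out = np.kron(b.conj(), np.eye(2)) @ state
    out /= np.linalg.norm(out)
    if s == 1:
        out = X @ out                    # outcome-dependent correction
    overlap = abs(np.vdot(expected, out))
    assert np.isclose(overlap, 1.0)      # equal up to a global phase
```

Note that only one-qubit operations occur after the initial CZ entangling step, which is precisely the defining feature of the one-way model described above.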
In this chapter, we develop a rigorous mathematical model underlying measurement-based quantum computation. We give syntax, operational semantics, denotational semantics, and an algebra of programs derived from the denotational semantics. We also present a rewrite theory and prove a general standardization theorem that allows all programs to be put in a semantically equivalent standard form. Standardization has far-reaching consequences: a new physical architecture based on performing all the entanglement in the beginning, parallelization by exposing the dependency structure of measurements, and expressiveness theorems.
We use our general measurement calculus not just to formalize the one-way model but also several other measurement-based models, e.g., Teleportation, Phase, and Pauli models, and present compositional embeddings of them into and from the one-way model.
It is well known that the “quantum logic” approach to the foundations of quantum mechanics is based on the subspace ordering of projectors on a Hilbert space. In this paper, we show that this is a special case of an ordering on partial isometries, introduced by Halmos and McLaughlin. Partial isometries have a natural physical interpretation; however, they are notoriously not closed under composition. In order to take a categorical approach, we demonstrate that the Halmos-McLaughlin partial ordering, together with tools from both categorical logic and inverse categories, allows us to form a category of partial isometries.
This category can reasonably be considered a “categorification” of quantum logic – we therefore compare this category with Abramsky and Coecke's “compact closed categories” approach to foundations and with the “monoidal closed categories” view of categorical logic. This comparison illustrates a fundamental incompatibility between these two distinct approaches to the foundations of quantum mechanics.
9.1 Introduction
As early as 1936, von Neumann and Birkhoff proposed treating projectors on Hilbert space as propositions about quantum systems (Birkhoff and von Neumann 1936), by direct analogy with classical order-theoretic approaches to logic. Boolean lattices arise as the Lindenbaum-Tarski algebras of propositional logics, and as the set of all projectors on a Hilbert space also forms an orthocomplemented lattice, the operations meet, join, and complement were analogously interpreted as the logical connectives conjunction, disjunction, and negation.
Just as conventional functional programs may be understood as proofs in an intuitionistic logic, so quantum processes can also be viewed as proofs in a suitable logic. We describe such a logic, the logic of compact closed categories and biproducts, presented both as a sequent calculus and as a system of proof-nets. This logic captures much of the necessary structure needed to represent quantum processes under classical control, while remaining agnostic to the fine details. We demonstrate how to represent quantum processes as proof-nets and show that the dynamic behavior of a quantum process is captured by the cut-elimination procedure for the logic. We show that the cut-elimination procedure is strongly normalizing: that is, every legal way of simplifying a proof-net leads to the same, unique, normal form. Finally, taking some initial set of operations as nonlogical axioms, we show that the resulting category of proof-nets is a representation of the free compact closed category with biproducts generated by those operations.
3.1 Introduction
3.1.1 Logic, Processes, and Categories
Birkhoff and von Neumann initiated the logical study of quantum mechanics in their well-known 1936 paper. They constructed a logic by assigning a proposition letter to each observable property of a given quantum system and studied negations, conjunctions, and disjunctions of these properties. The resulting lattice is nondistributive, and so the heart of what is called “quantum logic” is the study of various kinds of nondistributive lattices.
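The nondistributivity shows up already on C². Taking P, Q, R to be the projectors onto |0⟩, |1⟩, and |+⟩, with meet interpreted as projection onto the intersection of ranges and join as projection onto their span, one finds P ∧ (Q ∨ R) = P while (P ∧ Q) ∨ (P ∧ R) = 0. A NumPy sketch of this standard counterexample:

```python
import numpy as np

def proj(v):
    """Rank-one projector onto the line spanned by v."""
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

P = proj(np.array([1.0, 0.0]))       # proposition "state is |0>"
Q = proj(np.array([0.0, 1.0]))       # proposition "state is |1>"
R = proj(np.array([1.0, 1.0]))       # proposition "state is |+>"

def join(A, B):
    # Projector onto range(A) + range(B), via an orthonormal basis (SVD).
    u, s, _ = np.linalg.svd(np.hstack([A, B]))
    basis = u[:, s > 1e-9]
    return basis @ basis.conj().T

def meet(A, B):
    # De Morgan dual in the orthocomplemented lattice of projectors.
    I = np.eye(A.shape[0])
    return I - join(I - A, I - B)

# Q join R spans all of C^2, so P meet (Q join R) = P ...
lhs = meet(P, join(Q, R))
# ... but P meet Q = P meet R = 0, so the other side collapses to 0.
rhs = join(meet(P, Q), meet(P, R))
assert np.allclose(lhs, P)
assert np.allclose(rhs, np.zeros((2, 2)))
```

The distributive law thus fails for quantum propositions, and this failure is what forces quantum logic away from Boolean algebra.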
The no-cloning theorem is a basic limitative result for quantum mechanics, with particular significance for quantum information. It says that there is no unitary operation that makes perfect copies of an unknown (pure) quantum state. We re-examine this foundational result from the perspective of the categorical formulation of quantum mechanics recently introduced by the author and Bob Coecke. We formulate and prove a novel version of the result, as an incompatibility between having a “natural” copying operation and the structural features required for modeling quantum entanglement coexisting in the same category. This formulation is strikingly similar to a well-known limitative result in categorical logic, Joyal's lemma, which shows that a “Boolean cartesian closed category” trivializes and hence provides a major roadblock to the computational interpretation of classical logic. This shows a heretofore unsuspected connection between limitative results in proof theory and no-go theorems in quantum mechanics. The argument is of a robust, topological character and uses the graphical calculus for monoidal categories to advantage.
1.1 Introduction
The no-cloning theorem (Dieks 1982; Wootters and Zurek 1982) is a basic limitative result for quantum mechanics, with particular significance for quantum information. It says that there is no unitary operation that makes perfect copies of an unknown (pure) quantum state. A stronger form of this result is the no-broadcasting theorem (Barnum et al. 1996), which applies to mixed states. There is also a no-deleting theorem (Pati and Braunstein 2000).
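The linearity argument behind the theorem fits in a few lines; the following is the standard textbook derivation, not the categorical reformulation developed in this chapter.

```latex
% Suppose a single unitary U copied every pure state:
%   U(|\psi\rangle \otimes |0\rangle) = |\psi\rangle \otimes |\psi\rangle.
% Applying this to |0\rangle and |1\rangle and using linearity on
% |{+}\rangle = (|0\rangle + |1\rangle)/\sqrt{2} gives
\begin{align*}
  U\bigl(|{+}\rangle \otimes |0\rangle\bigr)
    &= \tfrac{1}{\sqrt{2}}\Bigl(U\bigl(|0\rangle \otimes |0\rangle\bigr)
        + U\bigl(|1\rangle \otimes |0\rangle\bigr)\Bigr)
     = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr),
  \intertext{whereas cloning $|{+}\rangle$ would require}
  U\bigl(|{+}\rangle \otimes |0\rangle\bigr)
    &= |{+}\rangle \otimes |{+}\rangle
     = \tfrac{1}{2}\bigl(|00\rangle + |01\rangle + |10\rangle + |11\rangle\bigr).
\end{align*}
% The two right-hand sides differ, so no such U exists.
```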
The quantum IO monad is a purely functional interface to quantum programming implemented as a Haskell library. At the same time it provides a constructive semantics of quantum programming. The QIO monad separates reversible (i.e., unitary) and irreversible (i.e., probabilistic) computations and provides a reversible let operation (ulet), allowing us to use ancillas (auxiliary qubits) in a modular fashion. QIO programs can be simulated either by calculating a probability distribution or by embedding them into the IO monad using the random number generator. As an example we present a complete implementation of Shor's algorithm.
5.1 Introduction
We present an interface from a pure functional programming language, Haskell, to quantum programming: the quantum IO monad, and use it to implement Shor's factorization algorithm. The implementation of the QIO monad provides a constructive semantics for quantum programming, i.e., a functional program that can also be understood as a mathematical model of quantum computing. Actually, the Haskell QIO library is only a first approximation of such a semantics; we would like to move to a more expressive language that is also logically sound. Here we are thinking of a language such as Agda (Norell 2007), which is based on Martin-Löf's type theory. We have already investigated this approach of functional specifications of effects in a classical context (Swierstra and Altenkirch 2007, 2008; Swierstra 2008). At the same time the QIO monad provides a high-level interface to a hypothetical quantum computer.
We discuss the design of a typed lambda calculus for quantum computation. After a brief discussion of the role of higher-order functions in quantum information theory, we define the quantum lambda calculus and its operational semantics. Safety invariants, such as the no-cloning property, are enforced by a static type system that is based on intuitionistic linear logic. We also describe a type inference algorithm and a categorical semantics.
4.1 Introduction
The lambda calculus, developed in the 1930s by Church and Curry, is a formalism for expressing higher-order functions. In a nutshell, a higher-order function is a function that inputs or outputs a “black box,” which is itself a (possibly higher-order) function. Higher-order functions are a computationally powerful tool. Indeed, the pure untyped lambda calculus has the same computational power as Turing machines (Turing 1937). At the same time, higher-order functions are a useful abstraction for programmers. They form the basis of functional programming languages such as LISP (McCarthy 1960), Scheme (Sussman and Steele 1975), ML (Milner 1978), and Haskell (Hudak et al. 2007).
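A one-line illustration of the “black box” idea (given here in Python rather than a functional language, purely for concreteness):

```python
# A higher-order function: `twice` takes a function as input and
# returns a new function as output.
def twice(f):
    return lambda x: f(f(x))

inc = lambda n: n + 1
add2 = twice(inc)        # a function built from another function
print(add2(5))           # -> 7
```

Here `twice` neither knows nor cares what its argument does internally; it treats `f` as an opaque black box, which is exactly the abstraction the lambda calculus formalizes.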
In this chapter, we discuss how to combine higher-order functions with quantum computation. We believe that this is an interesting question for a number of reasons. First, the combination of higher-order functions with quantum phenomena raises the prospect of entangled functions. Certain well-known quantum phenomena can be naturally described in terms of entangled functions, and we give some examples of this in Section 4.2.
In recent work, several researchers including the authors have developed a categorical formalization of quantum mechanics in terms of symmetric monoidal dagger categories. In this framework, classical data turned out to be represented by an algebraic structure, that of special commutative dagger Frobenius algebras. This structure captures the distinct capabilities that apply to classical data – that they can be copied and deleted. In the present paper, we provide categorical semantics and diagrammatic representations of deterministic, nondeterministic, and probabilistic operations over classical data represented in this way.
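The copying capability of classical data can be written down concretely. In finite-dimensional Hilbert spaces, the comultiplication of the Frobenius algebra induced by a basis is δ|i⟩ = |ii⟩; it is “special” (δ†δ = id) and commutative, and it copies basis states but not superpositions. A NumPy check of these three facts on C² (illustrative only, not the categorical development of the chapter):

```python
import numpy as np

# The basis-copying comultiplication delta|i> = |ii> on C^2, as a 4x2 matrix.
e = np.eye(2)
delta = sum(np.outer(np.kron(e[i], e[i]), e[i]) for i in range(2))

# Specialness: delta^dagger . delta = identity.
assert np.allclose(delta.conj().T @ delta, np.eye(2))

# Commutativity: swapping the two output copies does nothing.
SWAP = np.eye(4)[[0, 2, 1, 3]]
assert np.allclose(SWAP @ delta, delta)

# But delta copies only basis states, not superpositions:
plus = np.array([1, 1]) / np.sqrt(2)
assert not np.allclose(delta @ plus, np.kron(plus, plus))
```

The last check is the algebraic face of no-cloning: the copying operation singles out a classical basis, which is why this structure can represent classical, but not quantum, data.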
Moreover, a combination of some fundamental categorical constructions (the Kleisli construction of the category of free algebras and the Grothendieck construction of the total category of an indexed category) with the specific categorical presentations of pure and mixed quantum states provides a resource-sensitive categorical account of classical control of quantum data and of classical data resulting from quantum measurements, as well as of the classical data processing that may happen in between measurements and controls. Along the way we also discover some apparently novel quantum typing structures.
One of the salient features of categorical quantum mechanics is still its graphical calculus, which allows succinct presentations of diverse quantum protocols. The elements of an abstract stochastic calculus are beginning to emerge from it, pointing toward convenient refinements of resource-sensitive logics that are hoped to capture the probabilistic content and limited observability of quantum data.