Quantum


2016

(Modular) Effect Algebras are Equivalent to (Frobenius) Antispecial Algebras

Effect algebras are one of the generalizations of Boolean algebras proposed in the quest for a quantum logic. Frobenius algebras are a tool of categorical quantum mechanics, used to present various families of observables in abstract, often nonstandard frameworks. Both effect algebras and Frobenius algebras capture their respective fragments of quantum mechanics by elegant and succinct axioms; and both come with their conceptual mysteries. A particularly elegant and mysterious constraint, imposed on Frobenius algebras to characterize a class of tripartite entangled states, is the antispecial law. A particularly contentious issue on the quantum logic side is the modularity law, proposed by von Neumann to mitigate the failure of distributivity of quantum logical connectives. We show that, if quantum logic and categorical quantum mechanics are formalized in the same framework, then the antispecial law of categorical quantum mechanics corresponds to the natural requirement of effect algebras that the units are each other's unique complements; and that the modularity law corresponds to the Frobenius condition. These correspondences lead to the equivalence announced in the title. Aligning the two formalisms, at the very least, sheds new light on the concepts that are more clearly displayed on one side than on the other (such as orthogonality). Beyond that, it may also open up new approaches to deep and important problems of quantum mechanics (such as the classification of complementary observables).
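
For reference, here is one standard way to state the laws in play; the paper works with string diagrams, and its normalisation of the antispecial law (in particular the scalar) may differ from the rough form given here.

    Effect algebra:  a partial commutative monoid (E, \oplus, 0) with 1, such that
                     \forall a\ \exists!\, a^{\perp}.\ a \oplus a^{\perp} = 1
                     and a \oplus 1 \text{ defined} \Rightarrow a = 0
    Modularity:      a \le c \;\Rightarrow\; a \vee (b \wedge c) = (a \vee b) \wedge c
    Frobenius:       (\mu \otimes \mathrm{id}) \circ (\mathrm{id} \otimes \delta)
                       \;=\; \delta \circ \mu \;=\;
                     (\mathrm{id} \otimes \mu) \circ (\delta \otimes \mathrm{id})
    Special:         \mu \circ \delta = \mathrm{id}
    Antispecial:     \mu \circ \delta = \eta \circ \varepsilon   (up to an invertible scalar)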


2010

Relating toy models of quantum computation: comprehension, complementarity and dagger mix autonomous categories

Toy models have been used to separate important features of quantum computation from the rich background of the standard Hilbert space model. Category theory, on the other hand, is a general tool to separate components of mathematical structures, and analyze one layer at a time. It seems natural to combine the two approaches, and several authors have already pursued this idea. We explore the categorical comprehension construction as a tool for adding features to toy models. We use it to comprehend quantum propositions and probabilities within the basic model of finite-dimensional Hilbert spaces. We also analyze complementary quantum observables over the category of sets and relations. This leads into the realm of test spaces, a well-studied model. We present one of many possible extensions of this model, enabled by the comprehension construction. Conspicuously, all models obtained in this way carry the same categorical structure, extending the familiar dagger compact framework with the complementation operations. We call the obtained structure dagger mix autonomous structure, because it extends mix autonomous categories, popular in computer science, much as dagger compact structure extends compact categories. Dagger mix autonomous categories seem to arise quite naturally in quantum computation, as soon as complementarity is viewed as a part of the global structure.
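
As background only (the standard notion from linear logic, not a definition taken from the paper): a mix category is a *-autonomous category in which a map between the two units induces natural comparison maps from the tensor to the par,

    \mathsf{mix} \colon \bot \to I
    \qquad \leadsto \qquad
    \mathsf{mx}_{A,B} \colon A \otimes B \to A \parr B

(with \parr denoting the multiplicative disjunction). The dagger mix autonomous structure above adds a dagger to this setting, much as dagger compact categories add one to compact categories.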


2009

Geometry of abstraction in quantum computation

Quantum algorithms are sequences of abstract operations, performed on non-existent computers. They are in obvious need of categorical semantics. We present some steps in this direction, following earlier contributions of Abramsky, Coecke and Selinger. In particular, we analyze function abstraction in quantum computation, which turns out to characterize its classical interfaces.

Some quantum algorithms provide feasible solutions of important hard problems, such as factoring and discrete log (which are the building blocks of modern cryptography). It is of great practical interest to precisely characterize the computational resources needed to execute such quantum algorithms. There are many ideas about how to build a quantum computer. Can we prove some necessary conditions? Categorical semantics help with such questions. We show how to implement an important family of quantum algorithms using just abelian groups and relations.

Classical and quantum structuralism 

with Bob Coecke and Eric Oliver Paquette

In recent work, symmetric dagger-monoidal categories have emerged as a convenient categorical formalization of quantum mechanics. The objects represent physical systems, the morphisms physical operations, whereas the tensors describe composite systems. Classical data turn out to correspond to Frobenius algebras with some additional properties. They express the distinguishing capabilities of classical data: in contrast with quantum data, classical data can be copied and deleted. The algebraic approach thus shifts the paradigm of "quantization" of a classical theory to "classicization" of a quantum theory. Remarkably, the simple dagger-monoidal framework suffices not only for this conceptual shift, but even allows us to distinguish the deterministic classical operations (i.e. functions) from the nondeterministic classical operations (i.e. relations), and the probabilistic classical operations (stochastic maps). Moreover, a combination of some basic categorical constructions (due to Kleisli, resp. Grothendieck) with the categorical presentations of quantum states provides a resource-sensitive account of various quantum-classical interactions: of classical control of quantum data, of classical data arising from quantum measurements, as well as of the classical data processing in between controls and measurements. A salient feature here is the graphical calculus for categorical quantum mechanics, which allows a purely diagrammatic representation of classical-quantum interaction.
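
A minimal sketch of the copy/delete reading, with conventions that may differ slightly from the paper's: the classical structure on an object X includes a comonoid

    \delta \colon X \to X \otimes X, \qquad \varepsilon \colon X \to I,

and the classical points x \colon I \to X are exactly the ones that this comonoid copies and deletes:

    \delta \circ x = x \otimes x, \qquad \varepsilon \circ x = \mathrm{id}_I.

General quantum states satisfy neither equation, which is one way to read the no-cloning and no-deleting theorems in this framework.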

 


2008

On quantum statistics in data analysis

Extended abstract

Originally, quantum probability theory was developed to analyze statistical phenomena in quantum systems, where classical probability theory does not apply, because the lattice of measurable sets is not necessarily distributive. On the other hand, it is well known that the lattices of concepts that arise in data analysis are in general also non-distributive, albeit for completely different reasons. In his recent book, van Rijsbergen (2004) argues that many of the logical tools developed for quantum systems are also suitable for applications in information retrieval. I explore the mathematical support for this idea in an abstract vector space model, covering several forms of data analysis (information retrieval, data mining, collaborative filtering, formal concept analysis…), and roughly based on an idea from categorical quantum mechanics (Abramsky & Coecke 2004; Coecke & Pavlovic 2007). It turns out that quantum (i.e., noncommutative) probability distributions arise already in this rudimentary mathematical framework. We show that a Bell-type inequality (Bell 1964) must be satisfied by the standard similarity measures, if they are used for preference predictions. The fact that already a very general, abstract version of the vector space model yields simple counterexamples to such inequalities seems to be an indicator of a genuine need for quantum statistics in data analysis.
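
For orientation, the inequality in question is of the familiar Bell/CHSH type; in the CHSH form, for \pm 1-valued observables with pairwise correlations E(a, b),

    |E(a, b) + E(a, b') + E(a', b) - E(a', b')| \;\le\; 2.

The version established in the paper is adapted to similarity measures used for preference prediction, so its exact shape differs; the point is that the abstract vector space model already produces violations of such constraints.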

 

A new description of orthogonal bases

with Bob Coecke and Jamie Vicary

We show that an orthogonal basis for a finite-dimensional Hilbert space can be equivalently characterised as a commutative dagger-Frobenius monoid in the category FHilb, which has finite-dimensional Hilbert spaces as objects and continuous linear maps as morphisms, and tensor product for the monoidal structure. The basis is normalised exactly when the corresponding commutative dagger-Frobenius monoid is special. Hence orthogonal and orthonormal bases can be axiomatised in terms of composition of operations and tensor product only, without any explicit reference to the underlying vector spaces. This axiomatisation moreover admits an operational interpretation, as the comultiplication copies the basis vectors and the counit uniformly deletes them. That is, we rely on the ability to clone and delete classical data, in contrast with quantum data, in order to capture basis vectors. For this reason our result has important implications for categorical quantum mechanics.
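
Concretely, and up to normalisation conventions that may differ from the paper's: for an orthogonal basis \{|i\rangle\} of a finite-dimensional Hilbert space H, the comultiplication and counit act on basis vectors by

    \delta(|i\rangle) = |i\rangle \otimes |i\rangle, \qquad \varepsilon(|i\rangle) = 1,

with \mu = \delta^{\dagger} and \eta = \varepsilon^{\dagger}. A short computation gives \mu \circ \delta\,(|i\rangle) = \langle i | i \rangle\, |i\rangle, so the monoid is special precisely when every basis vector has norm 1, as stated above.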

 

Quantum and classical structures in nondeterministic computation

In categorical quantum mechanics, classical structures characterize the classical interfaces of quantum resources on the one hand, while on the other hand they give rise to some quantum phenomena. In the standard Hilbert space model of quantum theories, classical structures over a space correspond to its orthonormal bases. In the present paper, we show that classical structures in the category of relations correspond to biproducts of abelian groups. Although relations are, of course, not an interesting model of quantum computation, this result has some interesting computational interpretations. If relations are viewed as denotations of nondeterministic programs, it uncovers a wide variety of nonstandard quantum structures in this familiar area of classical computation. Ironically, it also opens up a version of what in philosophy of quantum mechanics would be called an ontic-epistemic gap, as it provides no direct interface to these nonstandard quantum structures.
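
The correspondence is easy to check by hand on a small example. Below is a minimal Python sketch (the names and encodings are mine, not the paper's): it realises the cyclic group Z_3 as a classical structure in Rel, with relations encoded as sets of pairs, and verifies the Frobenius and specialness laws.

    from itertools import product

    n = 3
    G = range(n)

    # Multiplication mu : G x G -> G is the graph of addition mod n;
    # the unit eta : I -> G picks out the neutral element, where I = {'*'}.
    mu  = {((a, b), (a + b) % n) for a, b in product(G, G)}
    eta = {('*', 0)}

    def dagger(r):
        """Relational converse: the dagger of Rel."""
        return {(y, x) for x, y in r}

    delta = dagger(mu)   # comultiplication G -> G x G
    eps   = dagger(eta)  # counit G -> I

    def compose(r, s):
        """Relational composition: first r, then s."""
        return {(x, z) for x, y1 in r for y2, z in s if y1 == y2}

    identity = {(a, a) for a in G}

    # Both sides of the Frobenius law, written out on flattened tuples:
    #   (id x mu) . (delta x id)  =  delta . mu  =  (mu x id) . (id x delta)
    delta_x_id = {((a, c), (x, y, c)) for (a, (x, y)) in delta for c in G}
    id_x_mu    = {((x, y, c), (x, m)) for x in G for ((y, c), m) in mu}
    id_x_delta = {((a, c), (a, x, y)) for a in G for (c, (x, y)) in delta}
    mu_x_id    = {((x, y, c), (m, c)) for ((x, y), m) in mu for c in G}

    frobenius = (compose(delta_x_id, id_x_mu)
                 == compose(mu, delta)
                 == compose(id_x_delta, mu_x_id))
    special   = compose(delta, mu) == identity   # mu . delta = id

    print("Frobenius law:", frobenius)   # True
    print("Specialness:  ", special)     # True

The same check goes through for any finite abelian group, in line with the result quoted above; relations really are treated here as nondeterministic programs, composed by relational composition.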


2007

Quantum measurements without sums

with Bob Coecke

Sums play a prominent role in the formalisms of quantum mechanics, be it for mixing and superposing states, or for composing state spaces. Surprisingly, a conceptual analysis of quantum measurement seems to suggest that quantum mechanics can be done without direct sums, expressed entirely in terms of the tensor product. The corresponding axioms define classical spaces as objects that allow copying and deleting data. Indeed, the information exchange between the quantum and the classical worlds is essentially determined by their distinct capabilities to copy and delete data. The sums turn out to be an implicit implementation of these capabilities. Realizing them through explicit axioms not only dispenses with the unnecessary structural baggage, but also allows a simple and intuitive graphical calculus. In category-theoretic terms, classical data types are dagger-compact Frobenius algebras, and quantum spectra underlying quantum measurements are Eilenberg-Moore coalgebras induced by these Frobenius algebras.
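
For orientation, the coalgebras mentioned at the end can be spelled out as follows (a sketch of the standard Eilenberg-Moore laws; the paper's conventions may differ): a classical structure (X, \delta, \varepsilon) induces the comonad X \otimes (-), and an Eilenberg-Moore coalgebra for it is a morphism m \colon A \to X \otimes A satisfying

    (\varepsilon \otimes \mathrm{id}_A) \circ m = \mathrm{id}_A,
    \qquad
    (\delta \otimes \mathrm{id}_A) \circ m = (\mathrm{id}_X \otimes m) \circ m.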