2024
Language processing in humans and computers
Machine-learned language models have transformed everyday life: they steer us when we study, drive, and manage money. They have the potential to transform our civilization. But they hallucinate. Their realities are virtual. This note provides a high-level overview of language models and outlines a low-level model of learning machines. It turns out that, after they become capable of recognizing hallucinations and dreaming safely, as humans tend to be, the language-learning machines proceed to generate broader systems of false beliefs and self-confirming theories, as humans tend to do.
2023
From Gödel’s Incompleteness Theorem to the completeness of bot religions
D.P. and Temra Pavlovic
Hilbert and Ackermann asked for a method to consistently extend incomplete theories to complete theories. Gödel essentially proved that any theory capable of encoding its own statements and their proofs contains statements that are true but not provable. Hilbert did not accept that Gödel’s construction answered his question, and in his late writings and lectures, Gödel agreed that it did not, since theories can be completed incrementally, by adding axioms to prove ever more true statements, as science normally does, with completeness as the vanishing point. This pragmatic view of validity is familiar not only to scientists who conjecture and test hypotheses but also to real estate agents and other dealers, who conjure claims, albeit invalid, as necessary to close a deal, confident that they will be able to conjure other claims, albeit invalid, sufficient to make the first claims valid. We study the underlying logical process and describe the trajectories leading to testable but unfalsifiable theories to which bots and other automated learners are likely to converge.
2020
Retracing some paths in categorical semantics: From process-propositions-as-types to categorified real numbers and monoidal computers
The logical parallelism of propositional connectives and type constructors extends beyond the static realm of predicates, to the dynamic realm of processes. Understanding the logical parallelism of process propositions and dynamic types was one of the central problems of the semantics of computation, albeit not always clear or explicit. It sprang into clarity through the early work of Samson Abramsky, where the central ideas of denotational semantics and process calculus were brought together and analyzed by categorical tools, e.g. in the structure of interaction categories. While some logical structures borne of dynamics of computation immediately started to emerge, others had to wait, be it because the underlying logical principles (mainly those arising from coinduction) were not yet sufficiently well-understood, or simply because the research community was more interested in other semantical tasks. Looking back, it seems that the process logic uncovered by those early semantical efforts might still be starting to emerge and that the vast field of results that have been obtained in the meantime might be a valley on a tip of an iceberg.
In the present paper, I try to provide a logical overview of the gamut of interaction categories and to distinguish those that model computation from those that capture processes in general. The main coinductive constructions turn out to be of this latter kind, as illustrated towards the end of the paper by a compact category of all real numbers as processes, computable and uncomputable, with polarized bisimulations as morphisms. The addition of the reals arises as the biproduct, real vector spaces are the enriched bicompletions, and linear algebra arises from the enriched Kan extensions. At the final step, I sketch a structure that characterizes the computable fragment of categorical semantics.
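The paper's construction is categorical, but the underlying picture can be sketched concretely (a toy warm-up with invented names, not the paper's development): a real in [-1,1] presented as an infinite stream of signed binary digits, i.e. as a process that emits ever better approximations.

    data Digit = N | Z | P                  -- signed binary digits -1, 0, +1

    dval :: Digit -> Rational
    dval N = -1
    dval Z = 0
    dval P = 1

    -- the first n digits of a stream approximate its value within 2^(-n)
    approx :: Int -> [Digit] -> Rational
    approx n ds = sum [ dval d / 2 ^ i | (d, i) <- zip (take n ds) [1 ..] ]

    third :: [Digit]                        -- 1/3 = 0.010101... in binary
    third = Z : P : third

    -- the representation is redundant: P,N,N,N,... and Z,Z,Z,... both denote 0,
    -- which is one reason processes must be compared by a suitable bisimulation
    -- rather than digitwise
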
Logic of fusion
The starting point of this work is the observation that the Curry-Howard isomorphism, relating types and propositions, programs and proofs, composition and cut, extends to the correspondence of program fusion and cut-elimination. This simple idea suggests logical interpretations of some of the basic methods of generic and transformational programming. In the present paper, we provide a logical analysis of the general form of build fusion, also known as deforestation, over the inductive and the coinductive datatypes, regular or nested. The analysis is based on a novel logical interpretation of parametricity in terms of the paranatural transformations, introduced in the paper.
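For readers who know build fusion from GHC, here is a minimal sketch of the inductive list case, using the standard foldr/build rule (the coinductive and nested cases analyzed in the paper generalize this pattern; the helper names are mine):

    {-# LANGUAGE RankNTypes #-}

    -- a producer abstracted over the list constructors
    build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
    build g = g (:) []

    -- the fusion (cut-elimination) law: foldr k z (build g) = g k z,
    -- so the intermediate list is never constructed
    upto :: Int -> [Int]
    upto n = build (\cons nil ->
                 let go i = if i > n then nil else cons i (go (i + 1))
                 in go 1)

    total :: Int
    total = foldr (+) 0 (upto 100)          -- fuses to a plain counting loop
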
2019
Abstract Extensionality: On the properties of incomplete abstract interpretations
with Roberto Bruni, Roberto Giacobazzi, Roberta Gori, Isabel Garcia-Contreras
In this paper we generalise the notion of extensional (functional) equivalence of programs to abstract equivalences induced by abstract interpretations. The standard notion of extensional equivalence is recovered as the special case, induced by the concrete interpretation. Some properties of the extensional equivalence, such as the one spelled out in Rice’s theorem, lift to the abstract equivalences in suitably generalised forms. On the other hand, the generalised framework gives rise to interesting and important new properties, and allows refined, non-extensional analyses. In particular, since programs turn out to be extensionally equivalent if and only if they are equivalent just for the concrete interpretation, it follows that any non-trivial abstract interpretation uncovers some intensional aspect of programs. This striking result is also effective, in the sense that it allows constructing, for any non-trivial abstraction, a pair of programs that are extensionally equivalent but have different abstract semantics. The construction is based on the fact that abstract interpretations are always sound, but that they can be made incomplete through suitable code transformations. To construct these transformations, we introduce a novel technique for building incompleteness cliques of extensionally equivalent yet abstractly distinguishable programs: They are built together with abstract interpretations that produce false alarms. While programs are forced into incompleteness cliques using both control-flow and data-flow transformations, the main result follows from limitations of data-flow transformations with respect to control-flow ones. A further consequence is that the class of incomplete programs for a non-trivial abstraction is Turing complete. The obtained results also shed new light on the relation between the techniques of code obfuscation and the precision in program analysis.
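A minimal sketch of the phenomenon, using the textbook sign abstraction rather than the paper's incompleteness cliques: the two programs below compute the same function, yet the abstract semantics distinguishes them and raises a false alarm for the first.

    data Sign = Neg | Zero | Pos | Top deriving (Eq, Show)

    -- sound abstract subtraction: Top over-approximates the unknown cases
    subS :: Sign -> Sign -> Sign
    subS Zero Zero = Zero
    subS Pos  Neg  = Pos
    subS Neg  Pos  = Neg
    subS _    _    = Top

    -- abstract runs of the extensionally equal programs \x -> x - x and \x -> 0;
    -- the abstraction forgets that both occurrences of x carry the same value
    p1, p2 :: Sign -> Sign
    p1 x = subS x x
    p2 _ = Zero

    -- p1 Pos = Top but p2 Pos = Zero:
    -- equivalent concretely, distinguishable abstractly
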
2018
Sound up-to techniques and Complete abstract domains
with Filippo Bonchi, Pierre Ganty and Roberto Giacobazzi
Abstract interpretation is a method to automatically find invariants of programs or pieces of code whose semantics is given via least fixed-points. Up-to techniques have been introduced as enhancements of coinduction, an abstract principle to prove properties expressed via greatest fixed-points.
While abstract interpretation is always sound by definition, the soundness of up-to techniques needs some ingenuity to be proven. For completeness, the situation is reversed: up-to techniques are always complete, while abstract domains need not be.
In this work we show that, under reasonable assumptions, there is a precise correspondence between sound up-to techniques and complete abstract domains.
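The two fixed-point disciplines can be sketched on a finite transition system (a hypothetical example with invented names, not from the paper): an invariant computed as a least fixed point by iterating up from the bottom of the lattice, and a safety property computed as a greatest fixed point by iterating down from the top.

    import qualified Data.Set as S

    -- Kleene iteration to a fixed point, from a given starting element
    fixFrom :: Eq a => (a -> a) -> a -> a
    fixFrom f x = let x' = f x in if x' == x then x else fixFrom f x'

    edges :: [(Int, Int)]
    edges = [(0,1), (1,2), (2,1), (3,3)]

    post :: S.Set Int -> S.Set Int
    post xs = S.fromList [ t | (s, t) <- edges, s `S.member` xs ]

    -- least fixed point, from below: the states reachable from 0
    reach :: S.Set Int
    reach = fixFrom (\xs -> S.insert 0 (post xs)) S.empty

    -- greatest fixed point, from above: the states that stay inside the
    -- good region {0,1,2} along every transition
    safe :: S.Set Int
    safe = fixFrom step univ
      where
        univ    = S.fromList [0 .. 3]
        good    = S.fromList [0, 1, 2]
        step xs = good `S.intersection`
                  S.filter (\s -> post (S.singleton s) `S.isSubsetOf` xs) univ
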
2017
Monoidal computer III: A coalgebraic view of computability and complexity
with Muzamil Yahia
Monoidal computer is a categorical model of intensional computation, where many different programs correspond to the same input-output behavior. The upshot of yet another model of computation is that a categorical formalism should provide a much needed high level language for theory of computation, flexible enough to allow abstracting away the low level implementation details when they are irrelevant, or taking them into account when they are genuinely needed. A salient feature of the approach through monoidal categories is the formal graphical language of string diagrams, which supports visual reasoning about programs and computations.
In the present paper, we provide a coalgebraic characterization of monoidal computer. It turns out that the availability of interpreters and specializers, which make a monoidal category into a monoidal computer, is equivalent to the existence of a *universal state space*, which carries a weakly final state machine for any pair of input and output types. Being able to program state machines in monoidal computers allows us to represent Turing machines, to capture their execution, count their steps, as well as, e.g., the memory cells that they use. The coalgebraic view of monoidal computer thus provides a convenient diagrammatic language for studying computability and complexity.
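The coalgebraic idea can be conveyed with ordinary Mealy machines (a simplification of the paper's setting, where a single universal state space serves all input and output types; the names below are mine): any state-machine coalgebra maps into the machine type, and running a machine can count steps along the way.

    -- Mealy machines over inputs a and outputs b, as a coinductive type
    newtype Mealy a b = Mealy { step :: a -> (b, Mealy a b) }

    -- anamorphism: every coalgebra c : s -> a -> (b, s) induces a homomorphism
    -- into Mealy; weak finality only asks that some such map exists
    unfoldM :: (s -> a -> (b, s)) -> s -> Mealy a b
    unfoldM c s = Mealy (\a -> let (b, s') = c s a in (b, unfoldM c s'))

    -- run a machine on an input word, counting steps as a crude time measure
    run :: Mealy a b -> [a] -> ([b], Int)
    run _ []       = ([], 0)
    run m (a : as) = let (b, m') = step m a
                         (bs, n) = run m' as
                     in (b : bs, n + 1)

    -- example: a running-total machine, obtained by unfolding a coalgebra;
    -- run totals [1,2,3] == ([1,3,6], 3)
    totals :: Mealy Int Int
    totals = unfoldM (\acc a -> (acc + a, acc + a)) 0
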
Refinement for Signal Flow Graphs
with Filippo Bonchi, Joshua Holland and Pawel Sobocinski
The symmetric monoidal theory of Interacting Hopf Algebras provides a sound and complete axiomatisation for linear relations over a given field. As is the case for ordinary relations, linear relations have a natural order that coincides with inclusion. In this paper, we give a presentation for this ordering by extending the theory of Interacting Hopf Algebras with a single additional inequation. We show that the extended theory gives rise to an abelian bicategory—a concept due to Carboni and Walters—and highlight similarities with the algebra of relations. Most importantly, the ordering leads to a well-behaved notion of refinement for signal flow graphs.
2015
Testing randomness by matching pennies
In the game of Matching Pennies, Alice and Bob each hold a penny, and at every tick of the clock they simultaneously display the head or the tail sides of their coins. If they both display the same side, then Alice wins Bob’s penny; if they display different sides, then Bob wins Alice’s penny. To avoid giving the opponent a chance to win, both players seem to have nothing else to do but to randomly play heads and tails with equal frequencies. However, while not losing in this game is easy, not missing an opportunity to win is not. Randomizing your own moves can be made easy. Recognizing when the opponent’s moves are not random can be arbitrarily hard.
The notion of randomness is central in game theory, but it is usually taken for granted. The notion of outsmarting is not central in game theory, but it is central in the practice of gaming. We pursue the idea that these two notions can be usefully viewed as two sides of the same coin.
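The asymmetry can be made concrete (a hypothetical sketch, with all names invented here): exploiting the crudest deviation from randomness, a frequency bias, takes a few lines; an opponent whose moves come from an unknown pseudorandom program defeats any such simple test, and recognizing non-randomness in general runs into undecidability.

    data Move = H | T deriving (Eq, Show)

    -- Alice wins on a match, so she predicts Bob's next move and plays it;
    -- the prediction here is simply Bob's majority side so far
    exploit :: [Move] -> Move
    exploit past
      | 2 * length (filter (== H) past) >= length past = H
      | otherwise                                      = T

    -- against an i.i.d. opponent playing H with probability 0.7, this wins
    -- about 70% of the pennies; against one playing the bits of a good
    -- pseudorandom stream it does no better than chance
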
2014
Monoidal computer II: Normal complexity by string diagrams
In Monoidal Computer I, we introduced a categorical model of computation where the formal reasoning about computability was supported by the simple and popular diagrammatic language of string diagrams. In the present paper, we refine and extend that model of computation to support a formal complexity theory as well. This formalization brings to the foreground the concept of normal complexity measures, which allow decompositions akin to Kleene’s normal form. Such measures turn out to be just those where evaluating the complexity of a program does not require substantially more resources than evaluating the program itself. The usual time and space complexity are thus normal measures, whereas the average and the randomized complexity measures are not. While the measures that are not normal provide important design-time information about algorithms and are useful for theoretical analyses, normal measures can also be used at run time, as practical tools of computation, e.g. to set the bounds for hypothesis testing, inductive inference and algorithmic learning.
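The flavor of a normal measure can be conveyed by an instrumented interpreter (a hypothetical toy, far from the paper's string-diagrammatic treatment): the step count is produced alongside the result, at essentially the cost of the evaluation itself, which is what normality requires.

    data Expr = Lit Int | Add Expr Expr | Mul Expr Expr

    -- evaluate and count evaluation steps in a single pass
    eval :: Expr -> (Int, Int)
    eval (Lit n)   = (n, 1)
    eval (Add x y) = let (a, m) = eval x
                         (b, n) = eval y
                     in (a + b, m + n + 1)
    eval (Mul x y) = let (a, m) = eval x
                         (b, n) = eval y
                     in (a * b, m + n + 1)

    -- eval (Add (Lit 2) (Mul (Lit 3) (Lit 4))) == (14, 5)
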
2012
Monoidal computer I: Basic computability by string diagrams
We present a new model of computation, described in terms of monoidal categories. It conforms to the Church-Turing Thesis, and captures the same computable functions as the standard models. It provides a succinct categorical interface to most of them, free of their diverse implementation details, using the ideas and structures that in the meantime emerged from research in semantics of computation and programming. The salient feature of the language of monoidal categories is that it is supported by a sound and complete graphical formalism, string diagrams, which provide a concrete and intuitive interface for abstract reasoning about computation. The original motivation and the ultimate goal of this effort is to provide a convenient high level programming language for a theory of computational resources, such as one-way functions and trapdoor functions, by adopting the methods for hiding the low level implementation details that emerged from practice. Only a modest first step towards this ambitious goal is presented.
Quantitative Concept Analysis
Formal Concept Analysis (FCA) begins from a context, given as a binary relation between some objects and some attributes, and derives a lattice of concepts, where each concept is given as a set of objects and a set of attributes, such that the first set consists of all objects that satisfy all attributes in the second, and vice versa. Many applications, though, provide contexts with quantitative information, telling not just whether an object satisfies an attribute, but also quantifying this satisfaction. Contexts in this form arise as rating matrices in recommender systems, as occurrence matrices in text analysis, as pixel intensity matrices in digital image processing, etc. Such applications have attracted a lot of attention, and several numeric extensions of FCA have been proposed. We propose the framework of proximity sets (proxets), which subsume partially ordered sets (posets) as well as metric spaces. One feature of this approach is that it extracts quantified concepts from quantified contexts, and thus allows full use of the available information. Another feature is that the categorical approach allows analyzing any universal properties that the classical FCA and the new versions may have, and thus provides structural guidance for aligning and combining the approaches.
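For concreteness, here is the classical, Boolean starting point in code (a sketch with invented names; the paper's proxets replace the yes/no incidence by a quantity): the two derivation operators form a Galois connection, and the concepts are its fixed points.

    -- a formal context: which objects have which attributes
    type Context o a = [(o, a)]

    -- the attributes shared by all of the given objects
    intent :: (Eq o, Eq a) => Context o a -> [a] -> [o] -> [a]
    intent ctx atts objs = [ a | a <- atts, all (\o -> (o, a) `elem` ctx) objs ]

    -- the objects having all of the given attributes
    extent :: (Eq o, Eq a) => Context o a -> [o] -> [a] -> [o]
    extent ctx objs atts = [ o | o <- objs, all (\a -> (o, a) `elem` ctx) atts ]

    -- a concept is a pair (os, as) with os = extent ctx objs as and
    -- as = intent ctx atts os; in the quantitative setting, the membership
    -- test `elem` becomes a degree of satisfaction
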
2009
A semantical approach to equilibria and rationality
Game theoretic equilibria are mathematical expressions of rationality. Rational agents are used to model not only humans and their software representatives, but also organisms, populations, species and genes, interacting with each other and with the environment. Rational behaviors are achieved not only through conscious reasoning, but also through spontaneous stabilization at equilibrium points.
Formal theories of rationality are usually guided by informal intuitions, which are acquired by observing some concrete economic, biological, or network processes. Treating such processes as instances of computation, we reconstruct and refine some basic notions of equilibrium and rationality from some basic structures of computation.
It is, of course, well known that equilibria arise as fixed points; the point is that the semantics of fixed point computation seems to provide novel methods, algebraic and coalgebraic, for reasoning about them.
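One way to watch an equilibrium emerge as the fixed point of a computation (a hypothetical illustration, not from the paper): in fictitious play for Matching Pennies, each player best-responds to the opponent's empirical frequencies, and by Robinson's classical theorem for zero-sum games those frequencies converge to the mixed equilibrium (1/2, 1/2).

    -- (headsSoFar, roundsPlayed) for each player
    type Counts = (Int, Int)

    bestA, bestB :: Counts -> Int                -- 1 = heads, 0 = tails
    bestA (h, n) = if 2 * h >= n then 1 else 0   -- matcher copies the likelier side
    bestB (h, n) = if 2 * h >= n then 0 else 1   -- mismatcher avoids it

    rounds :: Int -> (Counts, Counts) -> (Counts, Counts)
    rounds 0 st = st
    rounds k ((ah, an), (bh, bn)) =
      let a = bestA (bh, bn)                     -- Alice responds to Bob's history
          b = bestB (ah, an)                     -- Bob responds to Alice's history
      in rounds (k - 1) ((ah + a, an + 1), (bh + b, bn + 1))

    -- after many rounds, both empirical heads-frequencies approach 1/2:
    -- rounds 100000 ((0,0),(0,0))
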
2007
Labelled Markov Processes as generalised stochastic relations
with Michael Mislove and James Worrell
Labelled Markov processes (LMPs) are labelled transition systems in which each transition has an associated probability. In this paper we present a universal LMP as the spectrum of a commutative C*-algebra consisting of formal linear combinations of labelled trees. This yields a simple trace-tree semantics for LMPs that is fully abstract with respect to probabilistic bisimilarity. We also consider LMPs with distinguished entry and exit points as stateful stochastic relations. This allows us to define a category LMP, with measurable spaces as objects and LMPs as morphisms. Our main result in this context is to provide a predicate-transformer duality for LMP that generalises Kozen’s duality for the category SRel of stochastic relations.
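The basic objects can be sketched in code as follows (a simplification with invented names: the paper works over general measurable spaces, whereas here the state space is finite and distributions are finite lists):

    -- a sub-probability distribution over next states
    type Dist s = [(s, Double)]

    -- a labelled Markov process: each label gives each state either a
    -- distribution over successors, or no transition at all
    type LMP l s = l -> s -> Maybe (Dist s)

    data Label = A | B deriving (Eq, Show)

    demo :: LMP Label Int
    demo A 0 = Just [(1, 0.5), (2, 0.5)]
    demo B 1 = Just [(0, 1.0)]
    demo _ _ = Nothing
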
2006
Testing semantics: Connecting processes and process logics
with Michael Mislove and James Worrell
We propose a methodology based on testing as a framework to capture the interactions of a machine represented in a denotational model and the data it manipulates. Using a connection that models machines on the one hand, and the data they manipulate on the other, testing is used to capture the interactions of each with the objects on the other side: just as the data that are input into a machine can be viewed as tests that the machine can be subjected to, the machine can be viewed as a test that can be used to distinguish data. This approach is based on generalizing from duality theories that now are common in semantics to logical connections, which are simply contravariant adjunctions. In the process, it accomplishes much more than simply moving from one side of a duality to the other; it faithfully represents the interactions that embody what is happening as the computation proceeds.
Our basic philosophy is that tests can be used as a basis for modeling interactions, as well as processes and the data on which they operate. In more abstract terms, tests can be viewed as formulas of process logics, and testing semantics connects processes and process logics, and assigns computational meanings to both.
2004
Duality for Labelled Markov Processes
with Michael Mislove, Joel Ouaknine and James Worrell
Labelled Markov processes (LMPs) are probabilistic labelled transition systems. In this paper we present a ‘universal’ LMP as the Stone-Gelfand-Naimark dual of a C*-algebra consisting of formal linear combinations of labelled trees. We characterize the state space of the universal LMP as the set of homomorphisms from an ordered commutative monoid of labelled trees into the multiplicative unit interval. This yields a simple semantics for LMPs which is fully abstract with respect to probabilistic bisimilarity. We also consider LMPs with entry points and exit points in the framework of Elgot’s iterative theories. We define an iterative theory of LMPs by specifying its categorical dual: a category of commutative rings consisting of C*-algebras of trees and ‘shapely maps’ between them. We find that the basic operations for composing LMPs have simple definitions in the dual category.
2001
Towards semantics of self-adaptive software
When people perform computations, they routinely monitor their results, and try to adapt and improve their algorithms when a need arises. The idea of self-adaptive software is to implement this common facility of the human mind within the framework of the standard logical methods of software engineering. The ubiquitous practice of testing, debugging and improving programs at the design time should be automated, and established as a continuing run time routine.
Technically, the task thus requires combining functionalities of automated software development tools and of runtime environments. Such combinations lead not just to challenging engineering problems, but also to novel theoretical questions. Formal methods are needed, and the standard techniques do not suffice.
As a first contribution in this direction, we present a basic mathematical framework suitable for describing self-adaptive software at a high level of semantical abstraction. A static view leads to a structure akin to the Chu construction. A dynamic view is given by a coalgebraic presentation of adaptive transducers.
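The coalgebraic view can be sketched as follows (a hypothetical rendering of the idea, with invented names, not the paper's formal definition): an adaptive transducer not only consumes inputs and emits outputs, but also consumes feedback on its own performance and resumes as a possibly improved machine.

    -- inputs i, feedback f, outputs o
    newtype Adaptive i f o =
      Adaptive { react :: i -> (o, f -> Adaptive i f o) }

    -- example: a predictor that, after each guess, nudges itself toward
    -- the true value supplied as feedback
    estimator :: Double -> Adaptive () Double Double
    estimator guess = Adaptive $ \() ->
      (guess, \truth -> estimator ((guess + truth) / 2))
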
Categories of processes enriched in final coalgebras
with Sava Krstic and John Launchbury
Simulations between processes can be understood in terms of coalgebra homomorphisms, with homomorphisms to the final coalgebra exactly identifying bisimilar processes. The elements of the final coalgebra are thus natural representatives of bisimilarity classes, and a denotational semantics of processes can be developed in a final-coalgebra-enriched category where arrows are processes, canonically represented.
In the present paper, we describe a general framework for building final-coalgebra-enriched categories. Every such category is constructed from a multivariant functor representing a notion of process, much like Moggi’s categories of computations arising from monads as notions of computation. The “notion of process” functors are intended to capture different flavors of processes as dynamically extended computations. These functors may involve a computational (co)monad, so that a process category in many cases contains an associated computational category as a retract. We further discuss categories of resumptions and of hyperfunctions, which are the main examples of process categories. Very informally, resumptions can be understood as computations extended in time, whereas hyperfunctions are extended in space.
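The two running examples can be sketched in Haskell (the resumption type is standard; the hyperfunction type below is the one known from the functional-programming literature; the helper arr is my illustration):

    -- resumptions: computations extended in time, one monadic step per input
    newtype Resumption m a b =
      Resumption { resume :: a -> m (b, Resumption m a b) }

    -- hyperfunctions: the argument of such an arrow is an arrow running the
    -- other way, so composition interleaves the two sides, extending in space
    newtype Hyper a b = Hyper { invoke :: Hyper b a -> b }

    -- lifting an ordinary function to a hyperfunction
    arr :: (a -> b) -> Hyper a b
    arr f = Hyper (\k -> f (invoke k (arr f)))
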
2000
Process fusion
The technique of build fusion, also known as deforestation, removes intermediate results in a composition involving the “build” of an initial (inductive, finite) data structure, followed by its consumption. Here we show that it is analogously possible to do process fusion, removing intermediate final (coinductive, potentially infinite) data passing between a producer and a consumer.
The key observation leading to our results is the fact that the Curry-Howard isomorphism, relating types to propositions, programs to proofs, and sequential composition to cut, extends to the correspondence of fusion to cut elimination. This simple idea gives us logical interpretations of the basic methods of generic and transformational programming. In the present paper, we provide a logical analysis of the general form of build fusion over the inductive data types, regular or nested. The analysis is based on a novel logical interpretation of parametricity in terms of the paranatural transformations, introduced in the paper. We extend it to cover process fusion on coinductive data types.
The results obtained are truly generic, in the sense of applying to all coinductive (final) data types, including nested ones, and allow a far wider range of optimizations than previously possible. By the standard embedding of initial into final data types, it also applies to arbitrary initial-final mixtures (e.g., infinitely unfolding trees of finite lists).
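For the list instance, the coinductive half of the story is the known destroy/unfoldr rule, sketched below (helper names are mine): a consumer abstracted over the unfolding cancels against a producer given by unfoldr, so the intermediate, possibly infinite, list is never built.

    {-# LANGUAGE RankNTypes #-}
    import Data.List (unfoldr)

    -- a consumer abstracted over the way the list is unfolded
    destroy :: (forall s. (s -> Maybe (a, s)) -> s -> b) -> [a] -> b
    destroy g = g uncons
      where
        uncons []       = Nothing
        uncons (x : xs) = Just (x, xs)

    -- fusion law: destroy g (unfoldr psi e) = g psi e

    -- example consumer: sum the first n elements of whatever is unfolded
    sumN :: Int -> (s -> Maybe (Int, s)) -> s -> Int
    sumN 0 _   _ = 0
    sumN n psi s = case psi s of
                     Nothing      -> 0
                     Just (x, s') -> x + sumN (n - 1) psi s'

    -- consuming the naturals without ever building them:
    -- destroy (sumN 3) (unfoldr (\i -> Just (i, i + 1)) 0) == 3
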
1999
Semantics of first order parametric specifications
Parametricity is one of the most effective ways to achieve compositionality and reuse in software development. Parametric specifications have been thoroughly analyzed in the algebraic setting and are by now a standard part of most software development toolkits. However, an effort towards classifying, specifying and refining algorithmic theories, rather than mere datatypes, quickly leads beyond the realm of algebra, and often to full first order theories. We extend standard semantics of parametric specifications to this more general setting.
The familiar semantic characterization of parametricity in the algebraic case is expressed in terms of the free functor, i.e. using the initial models. In the general case, the initial models may not exist, and the free functor is not available. Various syntactic, semantic and abstract definitions of parametricity have been offered, but their exact relationships are often unclear. Using the methods of categorical model theory, we establish the equivalence of two well known, yet so far unrelated definitions of parametricity, one syntactic, one semantic. Besides providing support for both underlying views, and a way for aligning the systems based on each of them, the offered general analysis and its formalism open up several avenues for future research and applications.
1997
Specifying Interaction Categories
with Samson Abramsky
We analyse two complementary methods for obtaining categorical models of process calculi. They allow adding new features to the captured notions of process and of type respectively. By alternating these two methods, all the familiar examples, as well as some new interaction categories, can be derived from basic monoidal categories.
Using the proposed constructions, interaction categories can be built and analysed without fixing a set of axioms for them. They are thus approached via specifications, just like algebras are approached via equations and relations, independently of the intrinsic characterisation of varieties.
Categorical logic of names and abstraction in action calculi
Milner’s action calculus implements abstraction in monoidal categories, so that the familiar lambda-calculi, together with the pi-calculus and Petri nets, can be subsumed. Variables are generalised to names: only a restricted form of substitution is allowed.
In the present paper, the well-known categorical semantics of the lambda-calculus is generalised to the action calculus. A suitable functional completeness theorem for symmetric monoidal categories is proved: we determine the conditions under which the abstraction is definable. Algebraically, the distinction between the variables and the names boils down to the distinction between the transcendental and the algebraic elements. The former lead to the polynomial extensions, like e.g. the ring Z[x], the latter to the algebraic extensions like Z[i].
Building upon the work of P. Gardner, we introduce action categories, and show that they are related to the static action calculus exactly as cartesian closed categories are related to the lambda-calculus. Natural examples of this structure arise from allegories and cartesian bicategories. On the other hand, the free algebras for any commutative Moggi monad form an action category. The general correspondence of action calculi and Moggi monads will be worked out in a sequel to this work.
1996
Convenient categories of processes and simulations II: Asynchronous cases
In the preceding paper, we have argued for a representation of processes taking into account computationally relevant morphisms. It has been shown that the category of synchronous processes, i.e. modulo strong bisimilarity, is isomorphic with the category of labelled irredundant trees.
In the present paper, we extend this representation to asynchronous processes, namely modulo the weak and the branching bisimulations and congruences. They are shown to correspond to interesting subcategories of the category of labelled irredundant trees. The form of the representatives in the case of the branching bisimilarity suggests possible connections with game theory. An abstract construction of a category of processes in a general setting is presented in the appendix.
1995
Categorical logic of concurrency and interaction I: Synchronous processes
This is a report on a mathematician’s effort to understand some concurrency theory. The starting point is a logical interpretation of Nielsen and Winskel’s account of the basic models of concurrency. Upon the obtained logical structures, we build a calculus of relations which yields, when cut down by bisimulations, Abramsky’s interaction category of synchronous processes. It seems that all interaction categories arise in this way. The obtained presentation uncovers some of their logical contents and perhaps sheds some more light on the original idea of processes as relations extended in time.
The sequel of this paper will address the issues of asynchronicity, preemption, noninterleaving and linear logic in the same setting.
Convenient categories of processes and simulations I: Synchronous case
Deep categorical analyses of various aspects of concurrency have been developed, but a uniform categorical treatment of the very first concepts seems to be hindered by the fact that the existing representations of processes as bisimilarity classes do not provide an account of computational morphisms.
In the present paper, we describe a category of processes modulo strong bisimulations, with the bisimilarity preserving simulations as morphisms, and show that it is equivalent to the category of labelled irredundant trees and the label preserving tree morphisms. The developed representation turns out to be applicable to weaker notions of process as well. They will be studied in the sequel.
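The representation can be sketched for finite trees (the paper handles the general, infinitely deep case; the names below are mine): pruning merges duplicate branches at every node, and two finite processes are strongly bisimilar exactly when they prune to the same irredundant tree.

    import Data.List (nub, sort)

    -- synchronization trees with labelled branches
    data Tree l = Node [(l, Tree l)] deriving (Eq, Ord, Show)

    -- the irredundant representative: prune duplicates bottom-up,
    -- and sort so that the branch order is canonical
    prune :: Ord l => Tree l -> Tree l
    prune (Node bs) = Node (sort (nub [ (l, prune t) | (l, t) <- bs ]))

    -- a.(b + b) and a.b are strongly bisimilar: they prune to the same tree
    p, q :: Tree Char
    p = Node [('a', Node [('b', Node []), ('b', Node [])])]
    q = Node [('a', Node [('b', Node [])])]
    -- prune p == prune q
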