Seminar series of the Mathematical Foundations group

Tuesdays 13.15 - 15.15 in 1W 2.102

Seminars are open to all

Organisers: Willem Heijltjes and Ben Ralph

Email us to suggest speakers

**Upcoming seminars**

**27 April**

**Pino Rosolini (University of Genoa)**

**Frames from topology, algebraically**

We describe a connection between frames and algebras for the double exponential monad on the Sierpinski space. Instrumental for the presentation is Dana Scott's category Equ of equilogical spaces. We present a subcategory of Equ, closed under the double exponential monad, on which the category of algebras is equivalent to that of frames (and frame homomorphisms). I hope to connect this with Taylor's work on Abstract Stone Duality.

This is joint work with Giulia Frosoni and Alessio Santamaria.


**3 May**

**Ekaterina Komendantskaya (University of Dundee/Heriot-Watt University)**

**Automated Theorem Proving for Type Inference, Constructively**

Two facts are universally acknowledged: critical software must be subject to formal verification, and modern verification tools need to scale and become more user-friendly in order to make more impact in industry.

There are two major styles of verification:

(*) algorithmic: verification problems are specified in an automated prover, e.g. a (constraint) logic programming engine or an SMT solver, and properties of interest are verified by the prover automatically.

Such provers can be fast, but their trustworthiness is hard to establish without producing and checking proofs. This is due to the complexity of modern-day solvers: SMT solvers, for example, have codebases of 100K lines of C++. These tools exhibit bugs and are not trustworthy enough for critical systems.

An alternative is (**) a typeful approach to verification: instead of verifying programs in an external prover, a programmer may record all properties of interest as types of functions in their programs. Thanks to the Curry-Howard isomorphism, type inhabitants also play the role of runnable functions and proof witnesses, thus completing the certification cycle.

At their strongest, types can cover most of the properties of interest to the verification community, e.g. recent dialects Liquid Haskell and F* allow pre- and post-condition formulation at type level. But, when properties expressed at type level become rich, type inference engines have to assume the role of automated provers, e.g. Liquid Haskell and F* connect directly to SMT solvers. Thus, once again, we delegate trust without having proper certification of automated proofs.

In this talk, I will present our recent work that resolves the above dichotomy of “scale versus trust” by offering a new, typeful approach to automated proving for type inference. Recently, we designed a new method of using logic programming in Haskell type class inference: Horn clauses can be represented as types, and proofs by resolution as proof terms inhabiting those types. Thus, automated proofs by resolution plug directly into Haskell’s type system and compiler, while keeping high standards of certification for the proofs used in type inference.

This is joint work with Peng Fu and Tom Schrijvers; the talk is based on our FLOPS’16 paper “Proof-relevant Corecursive Resolution”.
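The mechanism can be pictured with a toy resolution engine. The sketch below is illustrative Python, not the authors' Haskell implementation: it resolves propositional Horn clauses into explicit proof terms, the role played in the paper by the dictionaries that inhabit Haskell types.

```python
# Toy illustration (hypothetical names, not the paper's code): Horn
# clauses as a dict from head atom to a list of alternative bodies,
# in the spirit of class instances such as  Eq [a] :- Eq a.
program = {
    "nat(z)":       [[]],                # fact:  nat(z).
    "nat(s(z))":    [["nat(z)"]],        # instance of  nat(s(N)) :- nat(N).
    "nat(s(s(z)))": [["nat(s(z))"]],
}

def resolve(goal):
    """Return a proof term (goal, subproofs), or None if no proof exists.

    The proof term records which clause resolved each subgoal, just as
    a dictionary records how a type class constraint was discharged."""
    for body in program.get(goal, []):
        subproofs = [resolve(g) for g in body]
        if all(subproofs):
            return (goal, subproofs)
    return None

proof = resolve("nat(s(s(z)))")
# proof == ("nat(s(s(z)))", [("nat(s(z))", [("nat(z)", [])])])
```

The nested tuple is the certificate: checking it needs only the clauses it mentions, independently of how the search was performed.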

**Past seminars**

__2016__

**20 April**

**Jamie Vicary (University of Oxford)**

**Geometrical Proofs for Linear Logic**

Linear logic is a fundamental way to reason about resources that cannot be duplicated or deleted. In this talk, I will present a new approach to the proof theory of linear logic, in which proofs are represented as surfaces embedded in 3-dimensional space. Proof equivalence then has a simple definition: two proofs are logically equivalent just when their surfaces are geometrically equivalent. The technical basis for the work comes from higher category theory, and I will give a simple and accessible introduction to this.

**19 April**

**Paul Harrenstein (University of Oxford)**

**Expressiveness and Nash Equilibrium in Iterated Boolean Games**

We introduce and investigate a novel notion of expressiveness for temporal logics that is based on game theoretic properties of multi-agent systems. We focus on iterated Boolean games, where each player has a goal, represented using (a fragment of) Linear Temporal Logic (LTL). This goal captures the player's preferences: the models of the goal represent system behaviours that would satisfy the player. Moreover, each player is assumed to act strategically, taking into account the goals of the other players, in order to bring about computations satisfying their goal. In this setting, we apply the standard game-theoretic concept of Nash equilibria: the Nash equilibria of an iterated Boolean game can be understood as a (possibly empty) set of computations, each computation representing one way the system could evolve if players chose strategies in Nash equilibrium. Such an equilibrium set of computations can be understood as expressing a temporal property—which may or may not be expressible within a particular LTL fragment. The new notion of expressiveness that we study is then as follows: what LTL properties are characterised by the Nash equilibria of games in which agent goals are expressed in fragments of LTL? We formally define and investigate this notion of expressiveness and some related issues, for a range of LTL fragments.

**12 April**

**Harry Gunn (University of Bath, Master's student)**

**Nature-based Cryptography**

This is a review of several papers by D. Grigoriev and V. Shpilrain on a novel approach to public key cryptography. The authors write:

"We use various laws of classical physics to offer several solutions of Yao’s millionaires’ problem without using any one-way functions. We also describe several informationally secure public key encryption protocols, i.e., protocols secure against passive computationally unbounded adversary. This introduces a new paradigm of decoy-based cryptography, as opposed to “traditional” complexity-based cryptography. In particular, our protocols do not employ any one-way functions.”

**05 April**

**Sam Staton (University of Oxford)**

**Semantics for probabilistic programming**

I'll talk about the semantics of probabilistic programming languages. This is an old subject, but recently probabilistic programming has attracted a lot of interest as a method of statistical modelling, through languages like Anglican and Church. These raise some new problems, such as how to combine continuous distributions with higher types. I'll describe our work on operational semantics and denotational semantics (based on sheaves and measurable spaces).
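For the discrete case, the standard starting point for such a semantics is a probability monad. The Python sketch below is purely illustrative (it sidesteps the continuous distributions and higher types that motivate the sheaf- and measurable-space constructions in the talk): distributions are weighted lists, and `unit` and `bind` give them monadic structure.

```python
from fractions import Fraction

def unit(x):
    """The point distribution on x."""
    return [(x, Fraction(1))]

def bind(dist, f):
    """Sequence a probabilistic computation: sample x from dist,
    then run f(x); weights multiply along each branch."""
    return [(y, p * q) for (x, p) in dist for (y, q) in f(x)]

def coin():
    """A fair coin, as a distribution on {0, 1}."""
    return [(0, Fraction(1, 2)), (1, Fraction(1, 2))]

def prob(dist, pred):
    """Probability that pred holds of a sample from dist."""
    return sum(p for (x, p) in dist if pred(x))

# Two independent flips, summed: a tiny probabilistic program.
two_flips = bind(coin(), lambda a: bind(coin(), lambda b: unit(a + b)))
```

The sum `two_flips` assigns probability 1/2 to the value 1 and 1/4 each to 0 and 2, as expected of two independent fair coins.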

**15 March**

**John Power (University of Bath)**

**Category theoretic semantics for theorem proving in logic programming: embracing the laxness**

(joint with Ekaterina Komendantskaya)

A propositional logic program P may be identified with a P_{f}P_{f}-coalgebra on the set of atomic propositions in the program. The corresponding C(P_{f}P_{f})-coalgebra, where C(P_{f}P_{f}) is the cofree comonad on P_{f}P_{f}, describes derivations by resolution. Using lax semantics, that correspondence may be extended to a class of first-order logic programs without existential variables. The resulting extension captures proofs by term-matching resolution in logic programming. Refining the lax approach, we further extend it to arbitrary logic programs.
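Concretely, the P_{f}P_{f}-coalgebra sends each atom to the finite set of its clause bodies, each body a finite set of atoms. A minimal Python sketch of this reading (illustrative only, not from the paper; the cofree comonad would record whole derivation trees, where this sketch only checks derivability):

```python
# Propositional program:  p :- q, r.   p :- s.   q.   r.
# The coalgebra maps an atom to the list of its clause bodies,
# each body a (finite) set of atoms.
coalgebra = {
    "p": [{"q", "r"}, {"s"}],
    "q": [set()],      # a fact has one empty body
    "r": [set()],
    "s": [],           # no clauses for s
}

def derivable(atom, depth=10):
    """Unfold the coalgebra: an atom is derivable by resolution if
    some clause body consists entirely of derivable atoms.  The depth
    bound stands in for the finite stages of the cofree construction."""
    if depth == 0:
        return False
    return any(all(derivable(a, depth - 1) for a in body)
               for body in coalgebra.get(atom, []))
```

Here `p` is derivable through its first clause, while `s`, having no clauses, is not.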

**8 March**

**John Gowers (University of Bath)**

**Games with ordinal sequences of moves**

I shall present a modification of the Abramsky-Jagadeesan games model to allow sequences of moves indexed by transfinite ordinals. The motivation for this construction arises from work by Laird and Churchill in [1,2] concerning the sequoid operator. In [2], the author constructs an exponential in the category of games that is a cofree commutative comonoid for the 'tensor on the left' functor and a final coalgebra for the 'sequoid on the left' functor. In the category of finite games and strategies, this exponential can be constructed from the sequoid functor as the limit of a diagram indexed by the ordinal ω. If we try to extend this result to the 'sequoidal categories' introduced in [1], then we find that this construction does not always produce a final coalgebra, but that for natural categories of games a similar construction using a higher ordinal will work. If the lengths of plays in an ordinal game can be bounded by a limit ordinal κ, then we may construct the final coalgebra for the sequoid using a suitable diagram indexed by κ. Conversely, if a game contains plays of length greater than an ordinal κ, then the limit of the natural diagram indexed by κ does not have the natural structure of a coalgebra.

There is a sizeable body of research in the field of games with plays indexed by transfinite ordinals (sometimes called 'long games'). In [3], Itay Neeman presents results concerning whether or not such games are determined. More recently, Laird has applied a similar model to the study of unbounded nondeterminism in [4]. The construction given in this talk is a straightforward extension of the games model outlined in [1,5]. A nice feature of the construction is that it includes as special cases both the 'games with winning condition on infinite plays' given in [5], and the pure finite games introduced by Blass in [6].

After shifting focus from finite to infinite ordinals, it becomes convenient to treat plays (ordinal sequences of moves), rather than moves, as primitive, and one possible formulation is to define a game to be a sheaf of sets on some given ordinal κ, where the ordinal β < κ is sent to the set of legal plays of length β. In contrast to the Abramsky-Jagadeesan model, in which moves are designated as Player-moves or Opponent-moves, we have a function that designates plays as Player-plays or Opponent-plays. If a play is indexed by a limit ordinal, then it has no last move, so this distinction is important. For example, in the case of games played over the ordinal ω + 1, we are free to specify whether a play of length ω should belong to Player or to Opponent, and this corresponds exactly to choosing a set of infinite plays that are Player-winning, as in [5].
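A toy Python sketch of this shift of perspective (the names and the parity convention below are mine, not the talk's): over the ordinal ω + 1, finite plays inherit their polarity from their last move, while the polarity of the single limit play of length ω must be specified separately, since it has no last move.

```python
# Illustrative sketch of games over the ordinal w + 1.  Finite plays
# are tuples of moves; the one limit play of length w is represented
# by the token "omega".  Polarity is assigned to *plays*, not moves.

def polarity(play, omega_owner="Player"):
    """Designate a play as a Player-play or an Opponent-play.

    Convention (assumed here): Opponent moves first, so an odd-length
    finite play ends with an Opponent-move and counts as an
    Opponent-play; the empty play is taken to be a Player-play.  The
    limit play has no last move, so its owner is a free parameter --
    choosing it is exactly choosing whether the infinite play is
    Player-winning."""
    if play == "omega":
        return omega_owner
    return "Opponent" if len(play) % 2 == 1 else "Player"
```

The free `omega_owner` parameter is the point: for finite plays the function is forced by parity, but at the limit ordinal the designation is genuinely extra data.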

I shall outline the motivation and construction for games played over transfinite ordinals, and shall discuss briefly some tentative questions about links to ordinals occurring elsewhere in the theory - in particular, game-value ordinals for winning positions and consistency-strength ordinals in proof theory.

[1]: Martin Churchill, James Laird and Guy McCusker. Imperative programs as proofs via game semantics. LICS 2011: 65-74, June 2011

[2]: James Laird. A Categorical Semantics of Higher Order Store. Proceedings of CTCS '02, Elsevier, 2002

[3]: Itay Neeman. The Determinacy of Long Games. De Gruyter Series in Logic and its Applications, 2004

[4]: James Laird. Sequential Algorithms for Unbounded Nondeterminism. MFPS XXXI: 271-287, 2015

[5]: Samson Abramsky, Radha Jagadeesan. Games and full completeness for multiplicative linear logic. Journal of Symbolic Logic 59 (02): 543-574, 1994

[6]: A. Blass. A game semantics for linear logic. Annals of Pure and Applied Logic, 56(1-3): 183-220, 1992

**1 March**

**Willem Heijltjes (University of Bath)**

**Proof Nets and Complexity**

In this talk I will give an overview of some recent and some very recent developments in linear logic proof nets.

__2015__

**8 December**

**Giuseppe Primiero (Middlesex University)**

**SecureND: Natural Deduction for Secure Trust**

Applications in computational domains complement verified knowledge with information sharing processes. From a logical viewpoint, formulating assertion operations in terms of a trust function is challenging, both conceptually and technically. In this talk we overview SecureND, a natural deduction calculus for knowledge derivation under trust. Its design is motivated by the problem of trust transitivity. We also present its implementation as the Coq protocol SecureNDC, used to deal with trusted sources in software management systems. We conclude with an overview of current and future extensions of our language.

**1 December**

**Andrea Aler Tubella (University of Bath)**

**A generalised cut-elimination procedure through subatomic logic**

Through subatomic logic we are able to present sufficient conditions for a proof system to enjoy cut-elimination. In this talk I will present subatomic logic, how it enables us to present proof systems that have a single (linear) rule scheme, and a recent result: we can generalise the splitting procedure for cut-elimination to any proof system whose rules and connectives have certain properties.

**17 & 24 November**

**John Power (University of Bath)**

**Lawvere Theories**

I plan to give two talks about Lawvere theories; they are not aimed at experts but rather give the details. Lawvere introduced the notion in his PhD thesis in 1963, providing not only a category theoretic account of universal algebra but one that is also presentation-independent. Remarkably, his definition was embraced, albeit with a caveat and in different terms, by universal algebraists but not by category theorists. The latter, from 1966, generally preferred monads as a model of universal algebra, owing to a little more generality but at a very considerable cost. Computer scientists were then influenced to adopt monads, but much could be, and has been, gained by recasting some of the latter's concerns in terms of Lawvere theories. Ultimately, I think Lawvere theories are a superior approach, but they benefit very much from the relationship with monads, which I duly plan to explain.

**27 October**

**Guillaume Munch-Maccagnoni (University of Cambridge)**

**Polarised realizability structures, models, and depolarisation**

Polarisation describes the presence of an evaluation order, and is characterised denotationally by a non-associativity of compositions. We recently proposed a polarised, Curry-style approach to the λ-calculus with extensional sums, in correspondence with polarised intuitionistic logic. We suggested that associativity of composition in this context should not be seen as a syntactic axiom, but as an emergent property akin to termination. Traditionally, issues with sums in denotational semantics have been attributed to extensionality rather than to associativity. This will be explained in an introductory fashion in the first part.

In the second part, I will more formally relate termination in the λ-calculus with sums to depolarisation, i.e. the associativity of composition, or more familiarly the fact that the order of evaluation does not matter. First, a general setting of polarised realizability structures for polarised calculi, with or without control operators, is developed. Then, a general technique to build observational models from these structures is explained. Finally, under broad conditions, the observational models that the non-associative syntactic structure gives rise to satisfy the associativity of composition (and are therefore cartesian closed categories with binary coproducts). I will sketch an analogy between intuitionistic depolarisation and parametricity.

**20 October**

**Matthijs Vákár (University of Oxford)**

**Game Semantics for Dependent Types**

Game semantics can act as a unifying semantic framework, providing compelling models for a strikingly wide range of programming languages, type theories and logics. A notable exception has been dependent type theory, which had so far defied a game theoretic description. We present a proposal to fill this gap in the form of a new categorical model of dependent type theory, based on a category of games and history-free winning strategies. We model dependent type theory with 1-, Sigma-, Pi- and intensional Id-types as well as finite inductive type families (which act as ground types, like calendars). We discuss the place of the Id-types in the intensionality spectrum as well as the strong completeness properties the model satisfies.

Most of the talk should be understandable without prior knowledge of game semantics and dependent type theory.

**15 October**

**Ugo dal Lago (Università di Bologna)**

**Higher-Order Probabilistic Computation: Calculi, Observational Equivalence, and Implicit Complexity**

Probabilistic models are more and more pervasive in computer science, and randomized algorithms are the ones offering the best performances in many domains. Higher-order probabilistic computation – in which a probabilistic function may be passed as a parameter and returned as a result – is on the other hand a relatively underdeveloped field, which is however receiving more and more attention. We give a survey of what is known about probabilistic lambda-calculi, later focusing on some of our recent results on implicit complexity and on inductive and coinductive techniques for program equivalence. Finally, we hint at how all this could be useful when structuring proofs of security for cryptographic primitives, but also when expressing probabilistic models in the context of machine learning.

**10 June**

**Willem Heijltjes (University of Bath)**

**Complexity Bounds for Sum-Product Logic via Additive Proof Nets and Petri Nets**

This is joint work with Dominic Hughes. We investigate efficient algorithms for the additive fragment of linear logic. This logic is an internal language for categories with finite sums and products, and describes concurrent two-player games of finite choice. In the context of session types, typing disciplines for communication along channels, the logic describes the communication of finite choice along a single channel.

We give a simple linear time correctness criterion for unit-free propositional additive proof nets via a natural construction on Petri nets. This is an essential ingredient to linear time complexity of the combinatorial proofs for classical logic by Dominic Hughes.

For full propositional additive linear logic, including the units, we give a proof search algorithm that runs in time linear in the product of the sizes of the source and target formulas, and an algorithm for proof net correctness of the same time complexity. We prove that proof search in first-order additive linear logic is NP-complete.

**2 June**

**Anupam Das (ENS Lyon)**

**A complete axiomatisation of MSOL on infinite trees**

We show that an adaptation of Peano's axioms for second-order arithmetic to the language of monadic second-order logic (MSOL) completely axiomatises the associated theory (SkS) over infinite trees. This continues a line of work begun by Büchi and Siefkes with axiomatisations of MSOL over various classes of linear orders. Our proof formalises, in the axiomatisation, a translation of MSO formulas to alternating parity tree automata. The main ingredient is the formalised proof of positional determinacy for the corresponding parity games which, as usual, allows us to complement automata and to deal with the negation of MSO formulas. The Comprehension Scheme of MSOL is used to obtain uniform winning strategies, whereas most usual proofs of positional determinacy rely on instances of the Axiom of Choice or transfinite induction. (Consequently we obtain an alternative decision procedure for MSOL over infinite trees, via proof search, that remains entirely internal to the language.)

This talk is based on joint work with Colin Riba that will be presented at LICS '15.

**12 May**

**Georg Struth (University of Sheffield)**

**Completeness Theorems for Bi-Kleene Algebras and Series-Parallel Rational Pomset Languages**

Pomsets form a standard model of true concurrency. In this lecture I present a completeness result for a class of pomset languages, which generalises the regular languages to the realm of concurrency. More precisely I show that the congruence on series-parallel rational pomset expressions generated by series-parallel rational pomset language identity is axiomatised by the axioms of Kleene algebra plus those of commutative Kleene algebra. A decision procedure is extracted from this proof. On the way to this result, series-parallel rational pomset languages are proved to be closed under the operations of co-Heyting algebras and homomorphisms. These results form a significant step towards a decision procedure for the equational theory of concurrent Kleene algebras, which have recently been proposed for concurrency verification (joint work with Michael Laurence).
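As an illustrative aside (not part of the talk's formal development), finite series-parallel pomsets can be represented as terms normalised up to the associativity of sequential composition and the associativity-commutativity of parallel composition, which is where the Kleene algebra and commutative Kleene algebra axioms respectively enter. A Python sketch, for the star-free fragment only:

```python
# Series-parallel pomsets in normal form: sequential composition is an
# ordered, flattened tuple ("seq", ...); parallel composition is a
# flattened and sorted tuple ("par", ...), so that commutativity holds
# on the nose.  For this finite fragment, term equality after
# normalisation coincides with pomset isomorphism (illustrative sketch;
# the talk's completeness result concerns full rational expressions).

def atom(a):
    return ("atom", a)

def seq(x, y):
    """Sequential (series) composition: associative, not commutative."""
    parts = (x[1:] if x[0] == "seq" else (x,)) + \
            (y[1:] if y[0] == "seq" else (y,))
    return ("seq",) + parts

def par(x, y):
    """Parallel composition: associative and commutative, hence sorted."""
    parts = (x[1:] if x[0] == "par" else (x,)) + \
            (y[1:] if y[0] == "par" else (y,))
    return ("par",) + tuple(sorted(parts))
```

After normalisation, `par(a, b) == par(b, a)` and both compositions associate, while `seq(a, b)` and `seq(b, a)` remain distinct, mirroring which axioms each operation satisfies.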

**5 May**

**James Brotherston (University College London)**

**Parametric completeness for separation theories (via hybrid logic)**

In this talk, we consider the logical gap between the following two concepts:

(1) provability in a propositional axiomatisation of separation logic, which is usually given by the bunched logic BBI; and

(2) validity in an intended class of models of separation logic, as commonly considered in its program verification applications. Such intended classes are usually specified by a collection of algebraic axioms describing specific model properties, which we call a separation theory.

Here, we show first that several typical properties of separation theories are in fact not definable in BBI. Then, we show that these properties become definable in a natural hybrid extension of BBI, obtained by adding a theory of naming to BBI in the same way that hybrid logic extends normal modal logic. Finally, we show how to build an axiomatic proof system for our hybrid logic in such a way that adding any axiom of a certain form yields a sound and complete proof system with respect to the models satisfying those axioms. In particular, this yields sound and complete proof systems for any separation theory from our considered class (which, to the best of our knowledge, includes all those appearing in the literature).

This is joint work with Jules Villard, now at Facebook.

**28 April**

**Guilhem Jaber (Queen Mary)**

**Reasoning on Equivalence of Stateful Programs with Operational Game Semantics**

Contextual equivalence of programs written in a functional language with references (i.e. local mutable state) is a notoriously hard problem, especially with higher-order references (i.e. references that can store functions). In the last twenty years, different techniques have been introduced for this purpose: Kripke logical relations, bisimulations and algorithmic game semantics.

In this talk, we will see how to use operational game semantics, namely the trace semantics for a language with references introduced by Laird, to build a new technique, Kripke Open Bisimulations, for reasoning about the equivalence of programs, taking the best of the previous methods. This technique is simple enough to be largely automated: it becomes possible to model-check equivalence of programs.

If time permits, we will see how to extend this technique to polymorphism.