Program
Monday
From 9:30: Welcome & coffee
10:10–12:30: Talks, chaired by Ralph Matthes
Valentin Blot (Paris-Sud), Polymorphism vs. Bar Recursion: towards a new termination proof for System F (invited)
Spector defined bar recursion in 1962 to extend Gödel’s Dialectica interpretation of arithmetic to full mathematical analysis, that is, arithmetic with the axiom scheme of comprehension. In the early 1970s, Girard and Reynolds discovered another, unrelated interpretation of second-order arithmetic: the polymorphic λ-calculus. While the logical equivalence between analysis and second-order arithmetic has always been clear, its computational counterpart has not received much attention. My talk takes a first step towards filling this gap by presenting an interpretation of System F into System T augmented with bar recursion. While both languages are total, their termination proofs are very different: the only known proofs of termination for System F rely on the impredicative notion of reducibility candidates (or saturated sets), while termination of bar recursion is usually proved with the axiom of dependent choice. My goal is to combine the interpretation of System F into System T + bar recursion with a proof of termination of bar recursion, so as to obtain a new termination proof for System F that does not rely on the impredicative notion of reducibility candidates.
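As background, Spector's bar recursion can be sketched as a program: it recurses over finite prefixes of an infinite sequence and stops as soon as a continuous functional is already determined by the prefix. The sketch below is purely illustrative (all names are ours, not from the talk), with integer sequences and zero-padding as the canonical extension of a prefix.

```python
def extend(s):
    """Extend a finite prefix s to a total sequence by padding with zeros."""
    return lambda n: s[n] if n < len(s) else 0

def barrec(y, g, h, s):
    """Spector-style bar recursion: stop with g once the functional y is
    already determined by the prefix s (the 'bar' condition y(s^) < |s|);
    otherwise recurse on all one-step extensions via h."""
    if y(extend(s)) < len(s):
        return g(s)
    return h(s, lambda x: barrec(y, g, h, s + [x]))

# y inspects only positions 0 and 1, so the recursion hits the bar:
y = lambda f: f(0) + f(1)
g = lambda s: len(s)       # result at the bar: the length of the prefix
h = lambda s, k: k(1)      # explore the extension of s by 1
print(barrec(y, g, h, []))  # -> 3
```

Termination here relies on the continuity of `y`; in a partial language like Python a discontinuous `y` would make the recursion diverge, which is exactly why termination of bar recursion needs a non-trivial proof.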
Pierre Vial (IRIF), The complete unsoundness of system R: typing non-trivially every λ-term with infinite formulas
Certain type assignment systems are known to ensure or characterize normalization. The grammar of the types they feature is usually inductive. It is easy to see that, when types are coinductively generated, we obtain unsound type systems (meaning here that they are able to type some mute terms: terms that do not normalize in any reasonable sense). For most of those systems, it is not difficult to find an argument proving that every term is typable (complete unsoundness). However, this argument does not hold for relevant intersection type systems (ITS), which are more restrictive because they forbid weakening. Thus, the question remains: are relevant ITS featuring coinductive types, despite being unsound, still able to characterize some larger class of terms? We show that this is actually not the case: every term is typable (in a non-trivial way) in a standard coinductive ITS called R. Moreover, we prove that semantic information can be extracted from the typing derivations of R, such as the order (i.e., the arity) of the typed terms. Our work also implicitly provides a new non-sensible relational model for the pure λ-calculus.
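A classic illustration of the phenomenon (simpler than system R's intersection types, and not taken from the talk): with a single coinductive type ρ satisfying ρ = ρ → ρ, the paradigmatic mute term Ω = (λx.xx)(λx.xx) becomes typable.

```latex
% With \rho coinductively satisfying \rho = \rho \to \rho,
% self-application is typable:
\[
\frac{x:\rho \vdash x : \rho \to \rho \qquad x:\rho \vdash x : \rho}
     {x:\rho \vdash x\,x : \rho}
\qquad\leadsto\qquad
\vdash \lambda x.\,x\,x : \rho \to \rho = \rho
\]
% Applying this term to itself yields
% \vdash (\lambda x.\,x\,x)\,(\lambda x.\,x\,x) : \rho,
% although \Omega has no normal form.
```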
Lutz Strassburger (Inria), Intuitionistic combinatorial proofs
Joint work with Willem Heijltjes and Dominic Hughes.
In this talk I will introduce intuitionistic combinatorial proofs together
with a normalization procedure, and I will compare this normalization with
the cut elimination procedure for classical combinatorial proofs.
Hugo Herbelin (Inria, IRIF), A proof with side effects of completeness with respect to Tarski semantics
There is a standard way to normalize proofs by composing appropriate proofs of soundness and completeness with respect to a semantics, such as Kripke semantics. We shall investigate how to do the same with Tarski semantics by placing ourselves in a logical system with a memory side effect.
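The standard construction alluded to above is normalization by evaluation: soundness evaluates a term into a semantic domain, completeness quotes the value back to a normal form. A minimal untyped sketch (our own illustration with de Bruijn indices; the talk's contribution concerns Tarski semantics and side effects, not this construction):

```python
# Terms: ("var", i) de Bruijn index, ("lam", body), ("app", f, a)

def eval_(t, env):
    """Soundness direction: evaluate a term into semantic values."""
    if t[0] == "var":
        return env[t[1]]
    if t[0] == "lam":
        return ("clo", t[1], env)           # a closure
    return apply_(eval_(t[1], env), eval_(t[2], env))

def apply_(f, a):
    if f[0] == "clo":
        _, body, env = f
        return eval_(body, [a] + env)
    return ("napp", f, a)                   # neutral application

def quote(v, depth):
    """Completeness direction: read a value back as a normal form."""
    if v[0] == "clo":
        fresh = ("nvar", depth)             # fresh neutral variable
        return ("lam", quote(apply_(v, fresh), depth + 1))
    if v[0] == "napp":
        return ("app", quote(v[1], depth), quote(v[2], depth))
    return ("var", depth - 1 - v[1])        # de Bruijn level -> index

def normalize(t):
    return quote(eval_(t, []), 0)

# (λx. x) (λx. λy. x)  normalizes to  λx. λy. x
I = ("lam", ("var", 0))
K = ("lam", ("lam", ("var", 1)))
print(normalize(("app", I, K)))   # -> ('lam', ('lam', ('var', 1)))
```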
Buffet
14:00–16:20: Talks, chaired by Pierre Clairambault
Paul-André Melliès (IRIF), Higher-order parity automata (invited)
In this talk, I will explain how to define a notion of parity automaton able to recognize (or reject) a simply typed λ-term with fixed points. The notion of parity automaton I will formulate greatly extends the traditional scope of higher-order model checking. It also sheds light on a deep and novel connection between automata theory and relational models of linear logic. On the technical side, the decidability proof I will present relies on a combination of ideas from denotational semantics and infinitary rewriting. This work marks a clarifying step in a research program launched alongside Charles Grellois (LIF), with elective affinities to the work of Sylvain Salvati (CRIStAL) and Igor Walukiewicz (LaBRI).
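For background, ordinary parity acceptance can be made concrete on ultimately periodic words u·v^ω: the word is accepted when the maximum priority visited infinitely often is even. The sketch below (our own illustration; the talk's higher-order automata over λ-terms are substantially more general) runs a deterministic parity automaton given as a transition table.

```python
def accepts(delta, priority, q0, u, v):
    """Does a deterministic parity automaton accept the word u·v^ω?
    Accepted iff the maximum priority visited infinitely often is even."""
    q = q0
    for a in u:                      # read the finite prefix u
        q = delta[(q, a)]
    seen, trace = {}, []             # detect the lasso while reading v^ω
    i = 0
    while (q, i) not in seen:
        seen[(q, i)] = len(trace)
        trace.append(priority[q])
        q = delta[(q, v[i])]
        i = (i + 1) % len(v)
    loop = trace[seen[(q, i)]:]      # priorities repeated forever
    return max(loop) % 2 == 0

# Two states: 0 "last letter was b" (priority 1), 1 "last letter was a"
# (priority 2); this automaton accepts words with infinitely many a's.
delta = {(0, "a"): 1, (0, "b"): 0, (1, "a"): 1, (1, "b"): 0}
priority = {0: 1, 1: 2}
print(accepts(delta, priority, 0, "", "ab"))   # -> True
print(accepts(delta, priority, 0, "a", "b"))   # -> False
```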
Léo Stefanesco (IRIF), Asynchronous games and concurrent separation logic
We develop an asynchronous account of concurrent separation logic (CSL): we interpret both programs and derivation trees of CSL as 2-player asynchronous graphs, in which the Code and the Environment interact. Given a program C and a derivation tree for a specification of C, there is always an asynchronous graph morphism from the interpretation of the derivation tree to the interpretation of C. The soundness theorem states that this morphism is an op-fibration and a 2-fibration. This implies in particular the usual soundness theorem of CSL, including the absence of data races.
Benjamin Dupont (Université Claude Bernard Lyon 1), Explicit bases in representation theory by rewriting
In this talk, we will explain how to present higher-dimensional linear categories using higher-dimensional linear rewriting systems. In particular, we will deal with 2-categories whose spaces of 2-cells are vector spaces. We will see that the convergence of those rewriting systems allows us to find explicit bases of each space of 2-cells. We will illustrate these methods on a family of algebras arising in representation theory for categorifying quantum groups: the quiver Hecke algebras. We will introduce a diagrammatic presentation of these algebras using string diagrams and explain how to obtain Poincaré-Birkhoff-Witt-type bases by rewriting.
Philippe Malbos (Institut Camille Jordan – Université Claude Bernard Lyon 1), Coherence by decreasingness
For a confluent and terminating string rewriting system, the relations amongst rewriting reductions are generated by confluence diagrams of critical pairs. This result corresponds to a homotopical version of the critical pair lemma based on a homotopical interpretation of the notion of reduction strategy. In particular, this interpretation gives a constructive method to solve coherence problems for monoids.
In this talk, I will present these constructions on algebraic prototypical examples and I will show how to weaken the termination hypothesis in the description of all the relations amongst rewriting reductions for string and linear rewriting systems using a decreasingness method.
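The critical pairs mentioned above are computable by overlapping left-hand sides of rules. A small sketch for string rewriting (suffix/prefix overlaps only; inclusion overlaps are omitted, and the rule sets below are our own toy examples):

```python
def critical_pairs(rules):
    """Critical pairs from overlaps l1 = u+v, l2 = v+w with v nonempty:
    the peak u+l2 == l1+w rewrites in two different ways."""
    pairs = set()
    for i1, (l1, r1) in enumerate(rules):
        for i2, (l2, r2) in enumerate(rules):
            for k in range(1, min(len(l1), len(l2)) + 1):
                if i1 == i2 and k == len(l1):
                    continue          # trivial full self-overlap
                u, v = l1[:-k], l1[-k:]
                if l2.startswith(v):
                    w = l2[k:]
                    pairs.add((r1 + w, u + r2))
    return pairs

# idempotent generator aa -> a: one (convergent) critical pair
print(critical_pairs([("aa", "a")]))    # -> {('aa', 'aa')}
# aba -> b: a non-joinable critical pair, so the system is not confluent
print(critical_pairs([("aba", "b")]))   # -> {('bba', 'abb')}
```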
Coffee break
16:50–17:50: Talks, chaired by Alexis Saurin
Marie Kerjean (IRIF), Smooth models of Differential Linear Logic
Differential Linear Logic was constructed following a study of discrete vectorial models of Linear Logic. We want to extend the semantics of Linear Logic to its natural domain of continuous objects and analysis. Starting from the basic fact that Seely's isomorphisms directly interpret the Kernel Theorem for distributions, we explain two developments. On the one hand, we axiomatize a Smooth Differential Linear Logic with a graded syntax in which an exponential is associated to each solvable linear partial differential equation, and we construct a model of this syntax in nuclear Fréchet and DF spaces. On the other hand, we argue that the interpretation of ⅋ as Schwartz's epsilon product should be the cornerstone of the construction of a smooth classical model of DiLL. From a first model of k-reflexive spaces, and building on pioneering work by Kriegl, Michor and Meise, we construct several (at least two) new models of DiLL. This part is joint work with Y. Dabrowski.
Luc Pellissier (ENS Lyon), Subsumed by linear approximations
In this talk, we will show how different familiar operations on proofs/programs can be carried out on linear approximations rather than on general programs. This clarifies the role that linearity plays in the understanding of computation.
17:50–: Working group (GT) meetings
20:00: Dinner at La Passagère
Tuesday
8:40–10:00: Talks, chaired by Beniamino Accattoli
Amina Doumane (ENS Lyon), Completeness of mu-calculi and the mystery of circular proofs (invited)
Fixed point logics are useful in many areas of computer science. In programming languages, they make it possible to encode (co)inductive data structures in a modular way. In verification, they are used as a specification language to describe the properties of reactive systems.
There are two main families of proof systems for logics with fixed points: finitary proof systems with an explicit rule of (co)induction and infinitary proof systems. While the first family has received much attention from the scientific community and is now very well understood, much less work has been done to develop the second one from a proof-theoretical viewpoint.
In this talk, we show that infinitary proof systems deserve to be considered as proper proof systems. We show also how they can be used to prove completeness in a constructive way for various settings.
Ralph Matthes (CNRS, IRIT), Predicting the number of inhabitants in simply-typed λ-calculus, a coinductive approach
Joint work with José Espírito Santo and Luís Pinto
Using a coinductive approach to the representation of the entire search space in proof search, developed previously by the authors, we reprove refinements of the coherence theorem for balanced types: even for merely negatively non-duplicated types, there is at most one normal inhabitant, and for positively non-duplicated types, there are only finitely many normal inhabitants. The emphasis is on the method, which allows us to prove in a uniform way these two related results that had originally been obtained by very different means.
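The notion of "number of normal inhabitants" can be made concrete with a naive depth-bounded count of η-long β-normal inhabitants (a standard enumeration, unrelated to the authors' coinductive method; the representation and names are ours):

```python
# types: ("base", name) or ("arrow", a, b)

def args_target(t):
    """Decompose a1 -> ... -> an -> o into ([a1, ..., an], "o")."""
    args = []
    while t[0] == "arrow":
        args.append(t[1])
        t = t[2]
    return args, t[1]

def count(ctx, depth, t):
    """Count eta-long beta-normal inhabitants of t in context ctx,
    cutting off the search at the given depth."""
    if depth == 0:
        return 0
    if t[0] == "arrow":                       # λ-abstract the argument
        return count(ctx + [t[1]], depth, t[2])
    total = 0
    for a in ctx:                             # head variable x : a
        args, tgt = args_target(a)
        if tgt == t[1]:                       # x t1 ... tn has type t
            p = 1
            for arg in args:
                p *= count(ctx, depth - 1, arg)
            total += p
    return total

o = ("base", "o")
a = ("base", "a")
# a -> a: exactly one normal inhabitant (the identity), at any depth
print(count([], 5, ("arrow", a, a)))                              # -> 1
# (o -> o) -> o -> o: one new (Church-numeral) inhabitant per depth
print(count([], 3, ("arrow", ("arrow", o, o), ("arrow", o, o))))  # -> 3
```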
Coffee break
10:30–12:30: Talks, chaired by Gabriel Scherer
Rodolphe Lepigre (Inria, LSV), Practical Curry-Style using Choice Operators and Local Subtyping
In a recent (submitted) work with Christophe Raffalli, we designed a rich type system for an extension of System F with subtyping. It includes primitive sums and products, existential types, (co-)inductive (sized) types, and it supports general recursion with termination checking. Despite its Curry-style nature, the system can be (and has been) implemented thanks to new techniques based on choice operators, local subtyping and circular proofs. During the talk, I will give you a flavour of the first two techniques. In particular, I will show how choice operators can be used to get rid of typing contexts, and how subtyping can be used to handle Curry-style quantification in a syntax-directed way.
Kenji Maillard (IRIF), Towards a “standard” metatheory for F*
F* is a dependently typed language featuring monadic effects, standing at the intersection of programming languages and proof assistants. As a programming language, F* allows one to write programs in an ML style with a minimal amount of annotations and to prove properties of those programs only as needed, e.g. termination. This effective style is used quite extensively in the Everest project, whose goal is to provide a fully verified HTTPS stack. As a proof assistant, F* features an extensional type theory with inductive types and a hierarchy of universes enabling a rich logic. To enable a swift style of reasoning, F* uses refinement types backed by effect-based weakest-precondition calculi and SMT automation. These refinement types generate in turn a subsumptive subtyping relation, which is known to interact badly with usual presentations of inductive types. In this talk, we will present and motivate the current theory of F* and explain how we plan to resolve some of its current limitations with respect to subtyping by embedding it in a standard but exceptional type theory.
Guillaume Genestier (LSV, Inria), Termination checking in the λΠ-calculus modulo theory
Dedukti (Assaf et al., 2016) is a type checker for the λΠ-calculus modulo theory, whose distinctive feature is to let the user declare rewriting rules, in particular to encode the logic being used. Thanks to the Curry-Howard-De Bruijn correspondence, type checking can then be used to verify proofs. Unfortunately, for type checking to be decidable, Dedukti requires that the rewriting rules declared by the user, combined with β-reduction, form a confluent and terminating system. An external tool can be called to check confluence, but no automatic tool exists to check termination in Dedukti, so this check currently has to be done by hand.
This talk will present a termination criterion based on reducibility candidates and the Size-Change Principle (Lee, Jones and Ben-Amram, 2001) for a fragment of the λΠ-calculus modulo theory, inspired by Wahlstedt (2007), together with a prototype implementation for Dedukti (work in progress; I cannot guarantee it will be finished by the time of the talk).
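The Size-Change Principle at the heart of this criterion is easy to state concretely: collect size-change graphs for the recursive calls, close them under composition, and require every idempotent graph to carry a strictly decreasing self-edge. A toy single-function sketch (our own illustration, not Dedukti's implementation):

```python
STRICT, NONSTRICT = 1, 0

def compose(g1, g2):
    """Compose size-change graphs; keep the best decrease per edge."""
    out = {}
    for (i, j, d1) in g1:
        for (j2, k, d2) in g2:
            if j == j2:
                d = max(d1, d2)           # STRICT wins over NONSTRICT
                out[(i, k)] = max(out.get((i, k), 0), d)
    return frozenset((i, k, d) for (i, k), d in out.items())

def closure(graphs):
    """Close the set of call graphs under composition."""
    acc = {frozenset(g) for g in graphs}
    while True:
        new = acc | {compose(g, h) for g in acc for h in acc}
        if new == acc:
            return acc
        acc = new

def terminates(graphs):
    """SCT: every idempotent graph in the closure has a strict self-edge."""
    return all(
        compose(g, g) != g
        or any(i == j and d == STRICT for (i, j, d) in g)
        for g in closure(graphs))

# Ackermann-style calls ack(m-1, ...) and ack(m, n-1): terminating
ack = [{(0, 0, STRICT)}, {(0, 0, NONSTRICT), (1, 1, STRICT)}]
print(terminates(ack))                                       # -> True
# swap f(x, y) = f(y, x): no decrease, loops forever
print(terminates([{(0, 1, NONSTRICT), (1, 0, NONSTRICT)}]))  # -> False
```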
Daniel El Ouraoui (Inria, Loria), Higher-order in SMT
Higher-order (HO) logic is a pervasive setting for reasoning about numerous real-world applications. In particular, it is widely used in proof assistants (also known as interactive theorem provers) to provide trustworthy, machine-checkable formal proofs of theorems. A major challenge in these applications is to automate the production of these formal proofs as much as possible, thereby reducing the “burden of proof” on the users. An effective approach towards stronger automation is to rely on less expressive but more automatic theorem provers to discharge some of the proof obligations. Systems such as HOLyHammer, MizAR, Sledgehammer, and Why3, which provide a one-click connection from proof assistants to first-order provers, have led in recent years to considerable improvements in proof assistant automation.
Today, the leading automatic provers for first-order classical logic are based either on the superposition calculus or on CDCL(T). Those based on the latter are usually called satisfiability modulo theories (SMT) solvers. Our goal, as part of the Matryoshka project, is to extend SMT solvers to natively handle higher-order problems, thus avoiding completeness and performance issues with clumsy encodings.
In this talk, we present our first steps towards two contributions within our established goal: to extend the input (problems) and output (proofs) of SMT solvers to support higher-order constructs. Most SMT solvers support SMT-LIB as an input format. We provide a syntax extension for augmenting SMT-LIB with higher-order functions with partial applications, λ-abstractions, and quantification on higher-order variables.
Buffet
14:00–16:20: Talks, chaired by Hugo Herbelin
Pierre-Marie Pédrot (MPI-SWS), Taming effects in a dependent world (invited)
In the traditional view of the Curry-Howard isomorphism, Martin-Löf type theory is a total functional programming language, or equivalently a hardcore intuitionistic theory. Although it seemed close to impossible to alter this property by adding effects to it, we recently challenged this belief. Using the so-called syntactic model approach, which consists in compiling a type theory into another one through crazily dependent program transformations, we showed that it is actually possible to extend MLTT with impure constructions. There is an implicit tension between the features of MLTT, though, and a few assumptions need to be relaxed. Intuitively, in an effectful world, there is a type-theoretical ‘triangle of fire’, and one is forced to pick only two properties out of the following three: unrestricted conversion (i.e. call-by-name theory), logical consistency, and dependent elimination. Led by our bottomless faith in Curry-Howard and riding our flawless Coq destrier, we will jump into the chasm to explore the parallel universes in which one of the latter two conditions is weakened. Along our journey, we will discover that lies can bring truth, that value restriction has an evil twin, and that the semantics of MLTT is really bad for your mental health.
Eric Finster (Inria), Lex Modalities in Type Theory
In ordinary topos theory, one can encode the notion of a subtopos as a Lawvere-Tierney topology on the subobject classifier. In homotopy type theory, the natural replacement for such a topology is a modality on types. But whereas the localization associated to a Lawvere-Tierney topology always preserves finite limits, this fails in a higher setting, so that not every modality corresponds to a subtopos (n-truncation provides a simple counterexample).
In this talk, I will discuss the problem of lex modalities in type theory and exhibit a criterion which allows us to detect when modalities do indeed generate subtoposes, as well as tools to generate such modalities.
Matthieu Sozeau (Inria Paris & IRIF), Cumulative Inductive Types in Coq
Joint work with Amin Timany (imec-Distrinet, KU Leuven, Belgium)
In order to avoid well-known paradoxes associated with self-referential definitions, higher-order dependent type theories stratify the theory using a countably infinite hierarchy of universes (also known as sorts), Type(0) : Type(1) : … Such type systems are called cumulative if for any type T we have that T : Type(i) implies T : Type(i+1). The Predicative Calculus of Inductive Constructions, which forms the basis of the Coq proof assistant, is one such system.
We will present the Predicative Calculus of Cumulative Inductive Constructions which extends the cumulativity relation to inductive types, and its soundness proof. We also discuss cumulative inductive types as they are supported in the recently released Coq version 8.7.
Samuel Mimram (École Polytechnique), A Type-Theoretical Definition of Weak ω-Categories
Joint work with Eric Finster
We introduce a dependent type theory whose models are weak ω-categories, generalizing Brunerie’s definition of ω-groupoids. Our type theory is based on the definition of ω-categories given by Maltsiniotis, itself inspired by Grothendieck’s approach to the definition of ω-groupoids. In this setup, ω-categories are defined as presheaves preserving globular colimits over a certain category, called a coherator. The coherator encodes all the operations required to be present in an ω-category: the compositions of pasting schemes as well as their coherences. Our main contribution is a canonical type-theoretical characterization of pasting schemes as contexts which can be derived from inference rules. Finally, we present an implementation of a corresponding proof system.