08:45  Modalities and Linearity SPEAKER: Elaine Pimentel ABSTRACT. This work shows how to use linear logic (with or without subexponentials) as a logical framework to specify sequent calculus proof systems as well as concurrent computational systems. In particular, we present simple and decidable conditions for guaranteeing that the cut rule and non-atomic initial rules can be eliminated, and we show how to use the so-called subexponentials in order to deal with more involved logical and computational systems. 
09:45  The inhabitation problem for non-idempotent intersection SPEAKER: Simona Ronchi Della Rocca ABSTRACT. The inhabitation problem for intersection types is known to be undecidable. We study the problem in the case of non-idempotent intersection, in particular for a type assignment system characterizing solvable terms, and we prove decidability through a sound and complete algorithm. 
08:45  Isabelle tutorial for beginners (part 1) SPEAKER: Tobias Nipkow ABSTRACT. This is a hands-on tutorial introduction to the proof assistant Isabelle/HOL. Slide presentations alternate with practical exercises: participants should bring their own laptop with Isabelle installed. There will be seasoned Isabelle tutors to help you use the system. The tutorial takes 3.5h and has three sections:
See also Programming and Proving with Isabelle/HOL. NOTE: Please install this Isabelle release candidate (http://isabelle.in.tum.de/websiteIsabelle2014RC0/) before coming to the tutorial. 
08:45  A formally verified proof of the Central Limit Theorem SPEAKER: Johannes Hölzl ABSTRACT. We describe a formally verified proof of the Central Limit Theorem in the Isabelle proof assistant. 
09:15  Primitively (Co)recursive Definitions for Isabelle/HOL SPEAKER: Lorenz Panny ABSTRACT. Isabelle/HOL has recently been enriched with a new definitional package for datatypes and codatatypes. The package introduces the specified types and derives auxiliary constants and characteristic theorems, notably (co)recursors and (co)induction principles. We now introduce support for high-level specifications of (co)recursive functions, in the form of three commands: primrec, primcorec, and primcorecursive. The new commands internally reduce the specifications to arguments to the (co)recursor and generate a number of theorems about the function definition, automating an otherwise tedious process. 
09:45  Pattern-based Subterm Selection In Isabelle SPEAKER: unknown ABSTRACT. This article presents a pattern-based language designed to select (a set of) subterms of a given term in a concise and robust way. Building on this language, we implement a single-step rewriting tactic in the Isabelle theorem prover, which removes the need for obscure "occurrence numbers" for subterm selection. The language was inspired by the language of patterns of Gonthier and Tassi, but provides an elegant way of handling bound variables and a modular semantics. 
08:45  A Unified Proof System for QBF Preprocessing SPEAKER: Marijn Heule ABSTRACT. For quantified Boolean formulas (QBFs), preprocessing is essential to solve many real-world formulas. The application of a preprocessor, however, has prevented the extraction of proofs for the original formula. Such proofs are required to independently validate the correctness of the preprocessor's rewritings and the solver's result. For universal expansion in particular, proof checking was not possible so far. In this talk, we present a unified proof system based on three simple and elegant quantified resolution asymmetric tautology (QRAT) rules. In combination with an extended version of universal reduction, they are sufficient to efficiently express all preprocessing techniques used in state-of-the-art preprocessors, including universal expansion. Moreover, these rules give rise to new preprocessing techniques. We equip the preprocessor bloqqer with QRAT proof logging and provide a proof checker for QRAT proofs. 
09:30  On Instantiation-Based Calculi for QBF SPEAKER: Mikolas Janota ABSTRACT. Several calculi for quantified Boolean formulas (QBFs) exist, but the relations between them are not yet fully understood. This talk defines a novel calculus, which is resolution-based and enables unification of Q-resolution and the expansion-based calculus ∀Exp+Res. These calculi play an important role in QBF solving. The talk shows simulation results for the new calculus and discusses further possible extensions. 
08:45  Welcome SPEAKER: Organisers 
09:00  Using HOL4 to Formalize Physical Systems SPEAKER: Umair Siddique ABSTRACT. Recently, HOL4 has been successfully used in both the formalization of pure mathematics (e.g., extended real number theory and fractional calculus) and the verification of engineering systems (e.g., wireless sensor networks, communication protocols). The aim of this talk is twofold. First, we present the perspective of the non-computer-science community, such as physicists and engineers, and offer some suggestions which can be helpful in extending the user community of HOL4 to new fields such as physics, biology, and economics; for example, developing an automatic translation from procedural to declarative style proofs, which are close to informal proofs (one such example is miz3 for HOL Light [1]). Second, we suggest introducing an archive of formal proofs [2] for HOL4, which is an efficient way to maintain large HOL4 scripts. In this way HOL4 scripts (particularly those related to applications) can be centralized, rather than scattered across the webpages of research groups or individual researchers. [1] F. Wiedijk, A Synthesis of the Procedural and Declarative Styles of Interactive Theorem Proving, Logical Methods in Computer Science (8), 2012 [http://arxiv.org/abs/1201.3601] [2] http://afp.sourceforge.net/ 
09:30  Modernising HOL's documentation SPEAKER: Magnus O. Myreen ABSTRACT. Much of HOL's documentation looks and feels old. I propose that we try to modernise the documentation to make it more appealing to, say, masters students who might like to do projects using HOL. I'm thinking that the documentation could be in some new format, e.g. screencasts, a blog, a new well-linked website, wiki or something even better. I hope there will be a fruitful discussion that leads to real improvements. 
08:45  Decision Procedures for Proving Inductive Theorems without Induction SPEAKER: Takahito Aoto ABSTRACT. Automated inductive reasoning for term rewriting has been extensively studied in the literature. Classes of equations and term rewriting systems (TRSs) with decidable inductive validity have been identified and used to automate inductive reasoning. In this talk, a new approach for deciding inductive theorems of TRSs is described. In particular, we give procedures for deciding the inductive validity of equations in some standard TRSs on natural numbers and lists. Our essential idea is to use validity in the initial algebras of TRSs, instead of validity guaranteed by the existence of inductive proofs. In contrast to previous decidability results, our procedures can decide the inductive validity of arbitrary equations for these TRSs without involving inductive reasoning, that is, without imposing any syntactic restrictions on the form of the equations. 
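As a minimal sketch (not the authors' decision procedure), the notion of validity in the initial algebra can be illustrated by normalizing ground instances of an equation in a TRS over Peano naturals; the term encoding and rule set below are assumptions for the illustration.

```python
# A minimal sketch: rewriting ground terms over Peano naturals with the
# usual rules for addition, then comparing normal forms of instances.
# Terms: zero is ("Z",), successor is ("S", t), addition is ("plus", s, t).

def plus(x, y):
    return ("plus", x, y)

def num(n):
    t = ("Z",)
    for _ in range(n):
        t = ("S", t)
    return t

def normalize(t):
    """Innermost normalization with plus(Z, y) -> y and
    plus(S(x), y) -> S(plus(x, y))."""
    if t[0] == "plus":
        x, y = normalize(t[1]), normalize(t[2])
        if x[0] == "Z":
            return y
        if x[0] == "S":
            return ("S", normalize(plus(x[1], y)))
    if t[0] == "S":
        return ("S", normalize(t[1]))
    return t

# Ground instances of commutativity x + y = y + x normalize identically
# in the initial algebra, even though the equation itself needs induction
# to prove from the rules.
assert normalize(plus(num(2), num(3))) == normalize(plus(num(3), num(2)))
```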
09:15  Discussion on the International School on Rewriting (ISR) SPEAKER: Aart Middeldorp 
09:00  Opening SPEAKER: Manfred Schmidt-Schauß 
09:15  HERMIT: An Equational Reasoning Model to Implementation Rewrite System for Haskell SPEAKER: Andrew Gill ABSTRACT. HERMIT is a rewrite system for Haskell. Haskell, a pure functional programming language, is an ideal candidate for performing equational reasoning. Equational reasoning, replacing equals with equals, is a tunneling mechanism between different, but equivalent, programs. The ability to be agile in representation and implementation, but retain equivalence, brings many benefits. Post-hoc optimization is one obvious application of representation agility. HERMIT operates at the Glasgow Haskell compiler's Core level, deep inside GHC, where type information is easy to obtain, and the language being rewritten is smaller. HERMIT provides three levels of support for transformation and prototyping: a strategic programming base with many typed rewrite primitives, a simple shell that can be used to interactively request rewrites and explore transformation possibilities, and a batch language that can mechanize focused, and optionally program-specific, optimizations. We will demonstrate all three of these levels, and show how they cooperate. 
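A toy illustration (not HERMIT itself) of "replacing equals with equals": the map-fusion law map f (map g xs) = map (f . g) xs applied as a rewrite on a small expression tree. The tuple-based node shapes are assumptions made for the sketch.

```python
# Apply the map-fusion rewrite bottom-up everywhere in an expression
# tree: ("map", f, ("map", g, xs)) becomes ("map", ("compose", f, g), xs).

def fuse_maps(expr):
    if isinstance(expr, tuple):
        expr = tuple(fuse_maps(e) for e in expr)
        if expr[0] == "map" and isinstance(expr[2], tuple) and expr[2][0] == "map":
            f, (_, g, xs) = expr[1], expr[2]
            return ("map", ("compose", f, g), xs)
    return expr

e = ("map", "f", ("map", "g", ("map", "h", "xs")))
print(fuse_maps(e))  # ('map', ('compose', 'f', ('compose', 'g', 'h')), 'xs')
```

The same program is represented more efficiently (one traversal instead of three) while equivalence is retained, which is the essence of representation agility described in the abstract.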
09:00  Welcome SPEAKER: Organizers 
09:05  Semantics and Proof Theory in Team Play: the case of Paraconsistent Logics SPEAKER: Anna Zamansky ABSTRACT. A useful modular semantics is an important property of any proof system: it can be used to characterize important syntactic properties of the system, provide insights into its underlying logic, and in many cases induce a decision procedure for it. Recent developments in the field of paraconsistent logics are an interesting case which exemplifies the cross-fertilization between semantic explorations and proof-theoretical developments. In this talk I will survey the framework of non-deterministic semantics and its proof-theoretical applications, and demonstrate how non-deterministic semantics pushed forward the development of analytic Gentzen-type calculi for the large family of Logics of Formal Inconsistency. 
09:50  On the Construction of Analytic Sequent Calculi for Subclassical Logics SPEAKER: Ori Lahav ABSTRACT. We study the question of when a given set of derivable rules in some basic analytic propositional sequent calculus itself forms an analytic calculus. First, a general syntactic criterion for analyticity in the family of pure sequent calculi is presented. Next, given a basic calculus admitting this criterion, we provide a method to construct weaker pure calculi by collecting simple derivable rules of the basic calculus. The obtained calculi are analytic-by-construction. While the criterion and the method are completely syntactic, our proofs are semantic, based on the interpretation of sequent calculi via non-deterministic valuation functions. In particular, this method captures calculi for a wide variety of paraconsistent logics, as well as some extensions of Gurevich and Neeman's primal infon logic. 
09:00  Opening SPEAKER: Temur Kutsia 
09:15  Extensible Symbolic System Analysis SPEAKER: Jose Meseguer ABSTRACT. Unification and narrowing are a key ingredient not only to solve equations modulo an equational theory, but also to perform symbolic system analysis. The key idea is that a concurrent system can be naturally specified as a rewrite theory (Sig,E,R), where (Sig,E) is an equational theory specifying the system's states as an algebraic data type, and R specifies the system's concurrent, and often nondeterministic, transitions. One can perform such symbolic analysis by describing sets of states as (the instances of) terms with logical variables, and using narrowing modulo E to symbolically perform transitions. Under reasonable conditions on the rewrite theory, this idea can be applied not only for complete reachability analysis, but also for temporal logic model checking. This approach is quite flexible but has some limitations. Could it be possible to make symbolic system analysis techniques more extensible and more widely applicable by simultaneously combining the powers of rewriting, narrowing, SMT solving and model checking? We give a progress report on current work aiming at such a unified symbolic approach. 
09:00  ImmermanFest Opening Remarks SPEAKER: Kousha Etessami 
09:15  27 and still counting: Iterated product, inductive counting, and the structure of P SPEAKER: Eric Allender ABSTRACT. This month marks an important anniversary. It was precisely 3^{3} years ago (July, 1987) that Neil Immerman announced a proof that NL = coNL. This shook the complexity-theoretic universe, triggering the collapse of multiple hierarchies, and spawning a series of aftershocks that still reverberate. This lecture will survey some of these developments, and will also trace the impact of some work of Immerman and Landau, which brought to the fore the central role that is played by iterated multiplication (in different algebras) in elucidating the structure of subclasses of P. 
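The NL = coNL proof rests on inductive counting: certifying, layer by layer, the number of vertices reachable from the start vertex in at most i steps. As a hedged sketch (the actual protocol is nondeterministic and runs in logarithmic space), the quantities being certified can be computed deterministically:

```python
# Compute the counts behind inductive counting: the number of vertices
# reachable from s in at most i steps, for i = 0 .. n-1. The
# nondeterministic log-space protocol certifies exactly these counts;
# here we simply compute them by growing the reachable set layerwise.

def reachable_counts(edges, n, s):
    reach = {s}
    counts = [1]  # only s is reachable in 0 steps
    for _ in range(n - 1):
        reach = reach | {v for (u, v) in edges if u in reach}
        counts.append(len(reach))
    return counts

edges = [(0, 1), (1, 2), (2, 3)]
print(reachable_counts(edges, 4, 0))  # [1, 2, 3, 4]
```

Knowing the exact count for layer i lets a verifier confirm that a vertex is *not* reachable, which is the key to complementing NL.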
09:00  FRP, LTL and GUIs SPEAKER: Alan Jeffrey ABSTRACT. Functional Reactive Programming (FRP) is a style of reactive programming whose semantics is given as time-indexed streams of values. Linear-time Temporal Logic (LTL) is a logic indexed by time, with modalities such as "at all times in the future". In this talk, I will discuss the use of constructive LTL as a type system for FRP, and show that functional reactive programs can serve two purposes: as executable programs, and as proof terms for constructive LTL. This style of programming has been implemented in Agda, compiled to JavaScript, and linked against the DOM API for Graphical User Interfaces (GUIs). The result is a programming environment which combines interactive code with proofs of tautologies in temporal logic. 
09:00  Fun With Formal Methods 2014: Opening Remarks SPEAKER: Nikolay Shilov ABSTRACT. Part of the reason for students' and engineers' poor attitude to Formal Methods is very simple: FM experts do not care about primary education in this field at the early stages of higher education. In particular, many courses on Formal Semantics start with fearful terms like state machine, logic inference, denotational semantics, etc., without elementary explanations of the basic notions. 
09:15  Algebraic description of restricted programming and elements of its logics SPEAKER: Nikolay Nepejvoda ABSTRACT. Algebras describing higher-order programming notions in a maximally abstract manner (general algebraic program structures, GAPS) are introduced and investigated. A criterion for when a given programming language can be enriched with a system of program transformations is stated and proved. Specializations of GAPS for fully invertible and injective programs are investigated. Numerous examples are presented. 
09:00  TBC SPEAKER: Alexandre Miquel ABSTRACT. TBA 
10:00  Extensional Models of Typed Lambda-mu Calculus SPEAKER: Koji Nakazawa ABSTRACT. This paper shows that the stream models introduced by Nakazawa and Katsumata can be adapted to a typed setting. A variant of de'Liguoro's type system for an extension of the Lambda-mu calculus, called $\Lambda\mu_{cons}$, is sound and complete with respect to the stream models. It is proved that any individual stream model with whole function spaces and infinite domains for base types characterizes the extensional equality of $\Lambda\mu_{cons}$. This result corresponds to Friedman's theorem for the simply-typed lambda calculus and the full type hierarchies. 
09:15  Numeral Systems in the Lambda Calculus SPEAKER: Ian Mackie ABSTRACT. We investigate numeral systems in the λ-calculus; specifically 
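One standard numeral system in the λ-calculus, the Church numerals, can be sketched directly with Python lambdas; this is an illustrative example, not necessarily the system the talk focuses on.

```python
# Church numerals: the numeral n is the function that applies its first
# argument n times. Successor and addition are definable directly.

zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Decode a numeral by applying it to the integer successor.
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```

Different numeral systems trade off the complexity of defining successor, predecessor, and test-for-zero, which is the kind of question such investigations address.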
09:15  Dolius: A Distributed Parallel SAT Solving Framework SPEAKER: Benoît Hoessen ABSTRACT. Over the years, parallel SAT solving has become more and more important. However, most state-of-the-art parallel SAT solvers are portfolio-based: they run the same solver several times with different parameters. In this paper, we propose a tool called Dolius, based mainly on the divide-and-conquer paradigm. In contrast to most current efficient parallel engines, Dolius does not need shared memory, can be distributed, and scales well when a large number of computing units is available. Furthermore, our tool contains an API allowing any SAT solver to be plugged in simply. 
09:45  Validating Unsatisfiability Results of Clause Sharing Parallel SAT Solvers SPEAKER: Tobias Philipp ABSTRACT. As satisfiability (SAT) solver performance has improved, so has solver complexity, which makes it more likely that SAT solvers contain bugs. One important source of increased complexity is clause sharing in parallel SAT solvers. SAT solvers can emit a proof of unsatisfiability to gain confidence that their results are correct. Such proofs must contain deletion information in order to be checked efficiently. Computing deletion information is easy and cheap for parallel solvers without clause sharing, but tricky for parallel solvers with clause sharing. We present a method to generate unsatisfiability proofs from clause sharing parallel SAT solvers. We show that the overhead of our method is small and that the produced proofs can be validated in a time similar to the solving (CPU) time. Proofs produced by parallel solvers without clause sharing, however, can be checked in a time similar to the solving (wall-clock) time. This raises the question of whether our method can be improved so that proofs from solvers with clause sharing can be checked in a time comparable to proofs from solvers without clause sharing. 
09:15  Quasipolynomial Size Frege Proofs of Frankl's Theorem on the Trace of Finite Sets SPEAKER: James Aisenberg ABSTRACT. We extend results of Bonet, Buss and Pitassi on Bondy's Theorem and of Nozaki, Arai and Arai on Bollobás's Theorem by proving that Frankl's Theorem on the trace of sets has quasipolynomial size Frege proofs. For constant values of the parameter t, we prove that Frankl's Theorem has polynomial size, constant depth proofs from instances of the pigeonhole principle. 
09:45  n^O(log log n) size monotone proofs of the weak pigeonhole principle SPEAKER: Anupam Das ABSTRACT. We present n^{O(log log n)}-size proofs in the (tree-like) monotone sequent calculus of the pigeonhole principle with (1 + ε)n pigeons and n holes, for ε = 1/polylog n. This improves on the bound of n^{O(log n)} inherited from proofs of the usual pigeonhole principle, with n + 1 pigeons and n holes, by Atserias et al., and supports the more general conjecture that the monotone sequent calculus polynomially simulates Hilbert-Frege systems. 
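The statement being proved, in its usual (n + 1 pigeons, n holes) form, says there is no injective map from n + 1 pigeons into n holes. A brute-force check of small instances (feasible only for tiny n; the proofs above handle the general principle) makes the combinatorial content concrete:

```python
# Brute-force check, for tiny n, that the pigeonhole principle holds:
# no injective map from n+1 pigeons into n holes exists, i.e. the
# standard propositional encoding PHP^{n+1}_n is unsatisfiable.

from itertools import product

def php_has_injection(pigeons, holes):
    # Try every assignment of a hole to each pigeon.
    for assignment in product(range(holes), repeat=pigeons):
        if len(set(assignment)) == pigeons:  # all assigned holes distinct
            return True
    return False

for n in range(1, 6):
    assert not php_has_injection(n + 1, n)
print("PHP holds for n = 1..5")
```

The weak pigeonhole principle of the abstract is the easier statement with (1 + ε)n pigeons, for which smaller proofs are possible.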
09:15  An algebraic approach for inferring and using symmetries in rule-based models SPEAKER: Jerome Feret ABSTRACT. Symmetries arise naturally in rule-based models, and under various forms. Besides automorphisms between site graphs, which are usually built within the semantics, symmetries can take the form of pairs of sites having the same capability of interactions, of some protein variants behaving exactly the same way, or of some linear, planar, or 3D molecular complexes which can be seen modulo permutations of their axes and/or mirror-image symmetries. 
09:15  De haut en bas: relating high-level and low-level abstractions SPEAKER: Nick Benton ABSTRACT. We need to reason about low-level programs because they are what actually runs on our machines. As low-level languages tend to be messy and complex, so does proving things about them. But a slightly subtler and more interesting problem is that the things we want to prove about such programs are often phrased in terms of high-level notions, such as being a valid implementation of a particular function, or behaving like something with a particular high-level type. It can be far from obvious how to even translate such specifications into terms that make sense at the low level, let alone verify that programs meet them. I'll discuss the problem, and ways it can be addressed, drawing on examples from compositional correctness and type soundness of compilers, and the design of logics for low-level programs. 
09:25  On the Formalization of Lambda-Calculus Confluence and Residuals SPEAKER: Beniamino Accattoli 
10:45  Some Reflections on Definability and Complexity SPEAKER: Phokion Kolaitis 
11:45  Crane Beach Revisited SPEAKER: Nicole Schweikardt 
10:45  Undecidability of Multiplicative Subexponential Logic SPEAKER: Kaustuv Chaudhuri ABSTRACT. Subexponential logic is a variant of linear logic with a family of exponential connectives, called subexponentials, that are indexed and arranged in a preorder. Each subexponential has or lacks the associated structural properties of weakening and contraction. We show that classical propositional multiplicative linear logic extended with one unrestricted and two incomparable linear subexponentials can encode the halting problem for two-register Minsky machines, and is hence undecidable. 
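The two-register Minsky machines whose halting problem is encoded can be sketched as a small interpreter; the instruction encoding below is an illustrative assumption, not the paper's.

```python
# A two-register Minsky machine interpreter. Instructions are
# ("inc", r, next) and ("dec", r, next_if_nonzero, next_if_zero);
# the machine halts when the program counter leaves the program.
# A fuel bound keeps the sketch total (the real halting problem is
# of course undecidable).

def run(program, r0=0, r1=0, fuel=10_000):
    regs = [r0, r1]
    pc = 0
    while 0 <= pc < len(program) and fuel > 0:
        fuel -= 1
        ins = program[pc]
        if ins[0] == "inc":
            regs[ins[1]] += 1
            pc = ins[2]
        else:  # "dec": decrement if nonzero, else take the zero branch
            if regs[ins[1]] > 0:
                regs[ins[1]] -= 1
                pc = ins[2]
            else:
                pc = ins[3]
    return regs

# Move the contents of register 0 into register 1, then halt.
prog = [("dec", 0, 1, 2), ("inc", 1, 0)]
print(run(prog, r0=3, r1=0))  # [0, 3]
```

Despite their simplicity, such machines are Turing-complete, which is why encoding them suffices for undecidability.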
11:15  Superstructural Reversible Logic SPEAKER: Zachary Sparks ABSTRACT. Linear logic refines conventional logic by being a ``resource-conscious'' logic. We analyze the notion of resource maintained by linear logic proofs and argue that it only focuses on a ``spatial'' dimension, omitting another important ``temporal'' dimension. We generalize (or further refine, using even more structural rules) linear logic to maintain combined spatial and temporal resources. The resulting logic is reversible and thus allows a study of reversible computation from a proof-theoretic perspective. 
11:45  A Linear/Producer/Consumer model of Classical Linear Logic SPEAKER: Jennifer Paykin ABSTRACT. This paper defines a new proof- and category-theoretic framework for classical linear logic that separates reasoning into one linear regime and two persistent regimes corresponding to ! and ?. The resulting linear/producer/consumer (LPC) logic puts the three classes of propositions on the same semantic footing, following Benton's linear/non-linear formulation of intuitionistic linear logic. Semantically, LPC corresponds to a system of three categories connected by adjunctions that reflect the linear/producer/consumer structure. The paper's metatheoretic results include admissibility theorems for the cut and duality rules, and a translation of the LPC logic into category theory. The work also presents several concrete instances of the LPC model, including one based on finite vector spaces. 
12:15  Cut Elimination in Multi-focused Linear Logic SPEAKER: unknown ABSTRACT. We study cut elimination for a multi-focused variant of linear logic in the sequent calculus. The multi-focused normal form of proofs yields problems that do not appear in the standard focused system, related to the constraints in grouping rule instances inside focusing phases. We show that cut elimination can be performed in a sensible way even though the proof requires specific lemmas to deal with multi-focusing phases, and discuss the difficulties arising with cut elimination when considering normal forms of proofs in linear logic. 
10:45  Verifying Optimizations for Concurrent Programs SPEAKER: William Mansky ABSTRACT. While program correctness for compiled languages depends fundamentally on compiler correctness, compiler optimizations are not usually formally verified due to the effort involved, particularly in the presence of concurrency. In this paper, we present a framework for stating and reasoning about compiler optimizations and transformations on programs in the presence of relaxed memory models. The core of the framework is the PTRANS specification language, in which program transformations are expressed as rewrites on control flow graphs with temporal logic side conditions. We demonstrate our technique by verifying the correctness of a redundant store elimination optimization in a simple LLVM-like intermediate language, relying on a theorem that allows us to lift single-thread simulation relations to simulations on multithreaded programs. 
11:15  Inverse Unfold Problem and Its Heuristic Solving SPEAKER: Masanori Nagashima ABSTRACT. Unfold/fold transformations have been widely studied in various programming paradigms and are used in program transformation, theorem proving, and so on. This paper shows, by means of an example, that restoring a one-step unfolding is a challenging task, since some rules used by the unfolding may be lost. We formalize this problem by regarding one-step program transformation as a relation. Next we discuss some issues in a specific framework, called pure-constructor systems, which constitute a subclass of conditional term rewriting systems. We show that the inverse of $T$ preserves rewrite relations if $T$ preserves rewrite relations and the signature. We propose a heuristic procedure to solve the problem and show successful examples. We then improve the procedure, and show examples for which the improvement takes effect. 
11:45  Short break 
12:00  Structural Rewriting in the Pi-Calculus SPEAKER: David Sabel ABSTRACT. We consider reduction in the synchronous pi-calculus with replication, without sums. Usual definitions of reduction in the pi-calculus use a closure w.r.t. structural congruence of processes. In this paper we operationalize structural congruence by providing a reduction relation for pi-processes which also performs the necessary structural conversions explicitly by rewrite rules. As we show, a subset of the structural congruence axioms is sufficient. We show that our rewrite strategy is equivalent to the usual strategy including structural congruence w.r.t. the observation of barbs and thus w.r.t. may- and should-testing equivalence in the pi-calculus. 
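A fragment of structural congruence can be operationalized very directly: parallel composition is associative and commutative with unit 0, so congruent compositions share a normal form. The sketch below illustrates this idea only; the process syntax is an assumption and is not the paper's rewrite system.

```python
# Normalize a process built from parallel composition "|" and the nil
# process "0" by flattening nested compositions, dropping the unit,
# and sorting the parallel components. Processes congruent modulo
# associativity, commutativity, and P|0 = P get equal normal forms.

def flatten(p):
    # p is "0", an atomic action string, or ("par", left, right).
    if p == "0":
        return []
    if isinstance(p, tuple) and p[0] == "par":
        return flatten(p[1]) + flatten(p[2])
    return [p]

def normalize(p):
    parts = sorted(flatten(p))
    return parts if parts else ["0"]

a = ("par", ("par", "x(y)", "0"), "z<w>")
b = ("par", "z<w>", "x(y)")
print(normalize(a) == normalize(b))  # True
```

Replacing the implicit closure under such axioms by explicit conversion rules is what makes the reduction relation in the abstract a genuine rewrite system.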
12:30  Attractor Equivalence: An Observational Semantics for Reaction Networks SPEAKER: unknown ABSTRACT. We study observational semantics for networks of chemical reactions as used in systems biology. Reaction networks without kinetic information, as we consider them, can be identified with Petri nets. We present a new observational semantics for reaction networks that we call attractor equivalence. The main idea of the attractor equivalence is to observe reachable attractors and the reachability of an attractor divergence in all possible contexts. The attractor equivalence can support powerful simplifications of reaction networks, as we illustrate with the example of the TetOn system. Alternative semantics based on bisimulations or traces, in contrast, do not support all the needed simplifications. 
10:45  Isabelle tutorial for beginners (part 2) SPEAKER: Tobias Nipkow ABSTRACT. This is a hands-on tutorial introduction to the proof assistant Isabelle/HOL. Slide presentations alternate with practical exercises: participants should bring their own laptop with Isabelle installed. There will be seasoned Isabelle tutors to help you use the system. The tutorial takes 3.5h and has three sections:
See also Programming and Proving with Isabelle/HOL. NOTE: Please install this Isabelle release candidate (http://isabelle.in.tum.de/websiteIsabelle2014RC0/) before coming to the tutorial. 
10:45  The CoCon Experiment SPEAKER: Andrei Popescu ABSTRACT. We report on the experience of designing, formally verifying and using CoCon, a conference management system with verified document confidentiality. 
10:55  The CAVA Automata Library SPEAKER: Peter Lammich ABSTRACT. We report on the graph and automata library that is used in the fully verified LTL model checker CAVA. Most components of CAVA use some type of graphs or automata. A common automata library simplifies assembly of the components to a working model checker and reduces redundancy. The CAVA automata library provides a hierarchy of graph and automata classes, together with some standard algorithms. Its object-oriented design allows for sharing of algorithms, theorems, and implementations between the different types of graphs, and also simplifies adding new graph types. Moreover, it is integrated into the Automatic Refinement Framework, supporting automatic refinement of the abstract automata types to efficient data structures. Note that the CAVA automata library is work in progress. Currently, it is very specifically tailored towards the requirements of the CAVA model checker. Nevertheless, the formalization techniques allow an extension of the library to a wider scope. Moreover, they are not limited to graph libraries, but apply to class hierarchies in general. 
11:20  From Types to Sets in Isabelle/HOL SPEAKER: Ondřej Kunčar ABSTRACT. HOL types are naturally interpreted as nonempty sets—this intuition is reflected in the type definition rule for the HOL-based systems (including Isabelle/HOL), where a new type can be defined whenever a nonempty set is exhibited. However, in HOL this definition mechanism cannot be applied inside proof contexts. We analyze some undesired consequences of this limitation and propose a more expressive type-definition rule that addresses it. The new expressive power opens the opportunity for a package that relativizes type-based statements to more flexible set-based variants—to streamline this process, we further implement a rule that transforms the implicit type-class constraints into explicit assumptions. Moreover, our tool is an interesting use case of Lifting and Transfer. 
11:45  Towards abstract and executable multivariate polynomials in Isabelle SPEAKER: Andreas Lochbihler ABSTRACT. This work-in-progress report envisions a library for multivariate polynomials developed jointly by experts from computer theorem proving (CTP) and computer algebra (CA). The urgency of verified algorithms has been recognised in the field of CA, but the cultural gap to CTP is considerable; CA users expect high usability and efficiency. This work collects the needs of CA experts and reports on the design of a proof-of-concept prototype in Isabelle/HOL. The CA requirements have not yet been fully settled, and the development is still at an early stage. The authors hope for lively discussions at the Isabelle Workshop. 
12:15  Tutorial: Isabelle/jEdit for seasoned Isabelle users SPEAKER: Makarius Wenzel ABSTRACT. Isabelle/jEdit is the default user interface and Prover IDE (PIDE) for Isabelle. After several years of development of Isabelle/PIDE concepts and implementation of Isabelle/Scala infrastructure, Isabelle/jEdit was first released in October 2011 as a proper application. Each subsequent release has made significant progress, so that we can already speak of about 5 generations of Isabelle Prover IDEs today. This short tutorial is meant to help seasoned users of Isabelle, who have not yet discovered the full potential of the jEdit text editor and the Isabelle/jEdit Prover IDE, to get up to date. The demonstration will show established functionality of Isabelle2013-2 (December 2013). 
10:45  Needed Computations Shortcutting Needed Steps SPEAKER: Sergio Antoy ABSTRACT. We define a compilation scheme for a constructor-based strongly-sequential graph rewriting system which shortcuts some needed steps. The object code is a constructor-based graph rewriting system as well, which is normalizing for the original system when using an innermost strategy. Hence, the object code can be easily implemented by eager functions in a variety of programming languages. Then, we modify our object code in a way that avoids totally or partially the construction of the contracta of some needed steps of a computation. When computing normal forms in this way, both memory consumption and execution time are reduced compared to ordinary rewriting computations in the original system. 
11:15  Proving Termination of Unfolding Graph Rewriting for General Safe Recursion SPEAKER: Naohi Eguchi ABSTRACT. We present a new termination proof and complexity analysis of unfolding graph rewriting which is a specific kind of infinite graph rewriting expressing the general form of safe recursion. We introduce a termination order over sequences of terms together with an interpretation of term graphs into sequences of terms. Unfolding graph rewrite rules expressing the general safe recursion can be embedded into the termination order by the interpretation, yielding the polynomial runtime complexity. 
11:45  Nested Term Graphs SPEAKER: Clemens Grabmayer ABSTRACT. We report on work in progress on `nested term graphs' for formalizing higher-order terms (e.g. finite or infinite lambda-terms), including those expressing recursion (e.g. terms in the lambda-calculus with letrec). The idea is to represent the nested scope structure of a higher-order term by a nested structure of term graphs. Based on a signature that is partitioned into atomic and nested function symbols, we define nested term graphs both intensionally, as tree-like recursive graph specifications that associate nested symbols with usual term graphs, and extensionally, as enriched term graph structures. These definitions induce corresponding notions of bisimulation between nested term graphs. Our main result states that nested term graphs can be implemented faithfully by first-order term graphs. 
10:45  Comparing Sequent Calculi via Hilbert-style Axioms SPEAKER: Björn Lellmann ABSTRACT. In this talk I will consider the question of how to compare and evaluate the expressive strength of different formats of Gentzen-style systems. More precisely, the general question is which logics can be captured in the different frameworks. Of course some results are clear: e.g. since every standard sequent can be seen as a hypersequent with only one component or as a nested sequent without the nesting, every logic which can be captured in the framework of standard sequent calculus can be captured in the framework of hypersequent or nested sequent calculi as well. Other results are given by explicitly translating one framework into another, see e.g. [Goré and Ramanayake:2012]. While such results certainly establish important connections between the different frameworks, they only provide information on the expressive strength of one Gentzen-style framework relative to another one. Yet we are also interested in the absolute expressive strength, and in particular the limits of what can be expressed in the different frameworks in a meaningful way. The qualification "in a meaningful way" here is important, since of course, given any set A of formulae, the sequent calculus with ground sequents => a for every a in A will serve to derive exactly the formulae in A, even in a cut-free way. Thus to make this question non-trivial we also need to restrict the format of the rules in the different frameworks. Finally, comparing the different frameworks and formats of rules can be done by identifying characterisations for them in a single common framework. Since we would like to compare extensions of sequent calculus, this should be a framework outside the class of Gentzen-style frameworks, and ideally the characterisations should provide a way of quickly determining which Gentzen-style framework is suitable for a particular logic. 
For this purpose we choose here the framework of Hilbert-style axiom systems. This gives a high degree of flexibility and in particular allows capturing non-normal modal logics and logics based on propositional logics other than the classical one. A characterisation of a specific Gentzen-style framework together with a specific format of rules is then given by an (ideally purely syntactically defined) class of axioms together with translations from rules of the considered format and framework into axioms of this class and vice versa. The prime example for this method is Kracht's result characterising those modal temporal logics which can be captured by structural rules in a display calculus satisfying Belnap's conditions for cut elimination as exactly the logics axiomatisable by primitive axioms [Kracht:1996]. Another example is the characterisation of structural rules in a sequent or hypersequent calculus for substructural logics by certain levels of the substructural hierarchy from [Ciabattoni et al:2008]. Apart from providing a first step for the construction of analytic calculi from Hilbert axioms, this method also yields a way to show limitative results: by showing that a certain logic cannot be axiomatised by axioms of a specific (syntactically given) form, we can show that the logic cannot be captured by the framework and format of rules corresponding to this class of axioms. In this talk I will discuss some recent results in this spirit concerning logical rules of different formats in the frameworks of standard sequent calculi and hypersequent calculi. The logics under consideration are propositional modal logics such as Gödel-Löb logic GL or the logic of 2-transitive Kripke frames, and more generally, extensions of classical or intuitionistic propositional logic with additional, not necessarily unary, connectives. 
11:10  Sequent Systems for Classical Modal Logics SPEAKER: Paolo Maffezioli ABSTRACT. This paper develops sequent calculi for several classical modal logics. Utilizing a polymodal translation of the standard modal language, we are able to establish a base system for the minimal classical modal logic, from which we generate extensions in a modular and uniform manner. We also briefly suggest potential applications to epistemic logic. 
11:35  Lyndon Interpolation for the Modal Mu-Calculus SPEAKER: Daniyar Shamkanov ABSTRACT. We prove Lyndon interpolation for the modal mu-calculus by applying so-called circular proofs. One of the proof systems for the modal mu-calculus is based on non-wellfounded derivation trees with a global condition which roughly says that in each infinite branch, there must be an outermost greatest fixed point unfolded infinitely often. We analyse regular proof trees (so-called circular proofs) in this proof system and obtain Lyndon interpolation for the modal mu-calculus. 
12:00  On cuts and cut-elimination in modal fixed point logics SPEAKER: Thomas Studer ABSTRACT. We present recent developments and discuss open questions concerning syntactic cut-elimination for modal fixed point logics. 
12:25  Display-type calculi and their cut elimination metatheorem SPEAKER: Giuseppe Greco ABSTRACT. The range of non-classical logics has been rapidly expanding, driven by influences from other fields which have opened up new opportunities for applications. However, the hurdles preventing a standard proof-theoretic development for these logics are due precisely to some of their defining features which make them suitable for applications, such as e.g. their not being closed under uniform substitution, or the fact that (the semantic interpretations of) key connectives are not adjoints. These difficulties have caused the existing proposals in the literature to be often ad hoc, not easily generalisable, and more generally lacking a smooth proof-theoretic behaviour. In particular, the difficulty in smoothly transferring results from one logic to another is a problem in itself, since these logics typically come in large families (consider for instance the family of dynamic logics), and hence proof-theoretic approaches which uniformly apply to each logic in a given family are in high demand. In this talk we focus on the core technical aspects of a proof-theoretic methodology and setup closely linked to Belnap's display logic and Sambin's basic logic. The present setup, which we refer to as display-type calculi, generalizes display calculi in two aspects: by allowing multi-type languages, and by dropping the full display property. The generalisation to a multi-type environment makes it possible to introduce specific tools enhancing expressivity, which have proved useful e.g. for a smooth proof-theoretic treatment of multi-modal and dynamic logics. The generalisation to a setting in which the full display property is not required makes it possible to account for logics which admit connectives which are neither adjoints nor residuals. 
One technical aspect which guarantees that the cut elimination metatheorem goes through for display-type calculi, even in the absence of the full display property, concerns the strengthening of the separation property (requiring principal formulas in introduction rules to appear in isolation) to the visibility property. Visibility requires all active formulas in introduction rules to occur in isolation. This property was recognized to be crucial for the cut elimination theorem of basic logic. However, in the present setup of display-type calculi, visibility is also weakened, in the sense that, in order to account for logics that are not closed under uniform substitution, principal formulas in axioms are not required to occur in isolation. In the proposed talk, we will illustrate the basic principles of the multi-type environment, and also how the above combination of weakenings and strengthenings of the separation property concurs to guarantee the cut elimination metatheorem for display-type calculi. Time permitting, we will also discuss some difficulties that still arise in the case of PDL and some ideas towards treating predicative logics. 
10:45  On the Relation between Resolution Calculi for QBFs and First-order Formulas SPEAKER: Uwe Egly ABSTRACT. Quantified Boolean formulas (QBFs) generalize propositional formulas by admitting quantifications over propositional variables. QBFs can be viewed as (restricted) formulas of first-order predicate logic, and easy translations from QBFs to first-order formulas exist. We analyze the effect of such a translation on the complexity of resolution proofs. More precisely, we show that there are classes $(\Phi_i)_{i\geq 1}$ of QBFs where any Q-resolution refutation of $\Phi_i$ has length superpolynomial in $i$, but the translated version of $\Phi_i$ possesses a short resolution refutation. Moreover, we discuss how first-order predicate logic can be used to succinctly represent refutations of QBFs obtained in different QBF calculi. 
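To make the kind of translation mentioned in this abstract concrete, one simple style (an illustrative sketch, not necessarily the translation analyzed in the paper) maps each propositional variable p to a first-order variable x_p ranging over a domain that contains both truth values, reading off truth with a unary predicate T:

```latex
% Illustrative QBF-to-first-order translation (a sketch; details vary).
% The atom p becomes T(x_p), where T(x) reads "x denotes true"; a
% premise guarantees that the domain contains both truth values.
\[
\exists p\,\forall q\;(p \lor \lnot q)
\quad\leadsto\quad
\bigl(\exists z\,T(z) \land \exists z\,\lnot T(z)\bigr)
\rightarrow
\exists x_p\,\forall x_q\,\bigl(T(x_p) \lor \lnot T(x_q)\bigr)
\]
```

The left formula is a true QBF (take p true), and under the stated premise the right formula is first-order valid, since some domain element satisfies T.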
11:15  Resolution Paths and Term Resolution SPEAKER: Friedrich Slivovsky ABSTRACT. We define Q(D)-term resolution, a family of proof systems for true Quantified Boolean Formulas (QBFs). These systems are generalizations of term resolution that capture the proof system(s) used by DepQBF to generate proofs of true formulas. DepQBF implements a version of the QCDCL algorithm that is parameterized by a so-called dependency scheme, a mapping that associates each formula with a binary relation on its variables that represents potential variable dependencies. Similarly, Q(D)-term resolution is parameterized by a dependency scheme D. We show that Q(Dres)-term resolution is sound, where Dres is the so-called Resolution-path Dependency Scheme. This entails soundness of Q(Dst)-term resolution, where Dst is the Standard Dependency Scheme currently implemented in DepQBF. 
11:45  Efficient Extraction of QBF (Counter)models from Long-Distance Resolution Proofs SPEAKER: Valeriy Balabanov ABSTRACT. Resolution is a fundamental part of modern search- and learning-based QBF solvers, and the extraction of Skolem-function models and Herbrand-function countermodels from QBF resolution proofs is a key enabler for various applications. Although long-distance resolution (LQ-resolution) came into existence a decade ago, its superiority to short-distance resolution (Q-resolution) was not recognized until recently. It remained open whether there exists a polynomial-time algorithm for model and countermodel extraction from LQ-resolution proofs, although an efficient algorithm does exist for the Q-resolution counterparts. This paper settles this open problem affirmatively by constructing a linear-time extraction procedure. Experimental results show the distinct benefits of the proposed method in extracting high-quality certificates. 
12:15  Dependency Quantified Boolean Formulas: Challenges, Applications, First Lines of Attack SPEAKER: Christoph Scholl ABSTRACT. We will give an introduction to Dependency Quantified Boolean Formulas (DQBF). 
10:45  Writing proof automation for HOL4 SPEAKER: Magnus O. Myreen ABSTRACT. The HOL-to-CakeML translator, which is described in Myreen and Owens's ICFP'12 paper, is a nice self-contained piece of proof automation. I propose describing its implementation. If the audience consists mostly of beginners, this talk can be geared towards a tutorial-style introduction to how proof automation for HOL can be implemented. If, on the other hand, the audience consists mostly of HOL experts, then this talk can serve as a discussion of methods by which proof automation is constructed. 
11:15  Hack Session 1 SPEAKER: Everyone 
10:45  Certification of Confluence Proofs using CeTA SPEAKER: René Thiemann ABSTRACT. CeTA was originally developed as a tool for certifying termination proofs, which have to be provided as certificates in the CPF format. Its soundness is proven as part of IsaFoR, the Isabelle Formalization of Rewriting. By now, CeTA can also be used for certifying confluence and non-confluence proofs. In this system description, we give a short overview of what kinds of proofs are supported and what information has to be given in the certificates. As we will see, only a small amount of information is required, and so we hope that CSI will not stay the only confluence tool which can produce certificates. 
11:15  Confluence and Critical-Pair-Closing Systems SPEAKER: Nao Hirokawa ABSTRACT. In this note we introduce critical-pair-closing systems, which are aimed at analyzing confluence of term rewriting systems. Based on this notion, two new confluence criteria are presented. One is a generalization of weak orthogonality based on relative termination, and the other is a partial solution to RTA Open Problem #13. 
10:45  Post Mortem Analysis of SAT Solver Proofs SPEAKER: Laurent Simon ABSTRACT. Conflict-Driven Clause Learning algorithms are well known from an engineering point of view. Thanks to Minisat, their designs are well understood, and most of their implementations follow the same ideas, with essentially the same components: same heuristics, fast restarts, same learning mechanism. However, their efficiency has an important drawback: they are more and more like complex systems and harder and harder to handle. Unfortunately, only a few works focus on understanding them rather than improving them. In most cases, these studies follow a generate-and-test pattern: an idea is added to an existing solver, and if it improves its efficiency the idea is published and kept. In this paper, we analyse ``{\em post mortem}'' the proofs given by one typical CDCL solver, Glucose. The originality of our approach is that we consider it only as a {\em resolution proof} builder, and then we analyze some of the proof characteristics on a set of selected UNSAT instances, by shuffling each of them 200 times. We particularly focus on trying to characterize useless and useful clauses in the proof, as well as proof shapes. We also show that despite their incredible efficiency, roughly 90\% of the time spent in a CDCL solver is useless for the final proof. 
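The distinction between clauses that are "useful" and "useless" for the final proof can be made concrete by a backward reachability pass over the resolution DAG. The sketch below uses a hypothetical proof-trace representation (not Glucose's actual output format) and marks exactly the clauses the empty clause depends on:

```python
def useful_clauses(proof):
    """proof: dict mapping a clause id to the list of ids of its
    antecedents in the resolution DAG; the final empty clause is
    assumed to have id 'empty'.  Returns the set of clause ids
    reachable backwards from 'empty', i.e. those actually used
    in the refutation."""
    useful, stack = set(), ["empty"]
    while stack:
        cid = stack.pop()
        if cid in useful:
            continue
        useful.add(cid)
        stack.extend(proof.get(cid, []))  # original clauses have no entry
    return useful

# Tiny example DAG: c3 is derived during the run but never used.
proof = {
    "empty": ["c1", "c2"],
    "c2": ["c1"],
    "c3": ["c1"],  # learnt, but useless for the final proof
}
assert useful_clauses(proof) == {"empty", "c1", "c2"}
```

Everything outside the returned set corresponds to work the solver did that never contributed to the refutation, which is the quantity the abstract reports as roughly 90%.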
11:15  Formula partitioning revisited SPEAKER: Zoltan Mann ABSTRACT. Dividing a Boolean formula into smaller independent subformulae can be a useful technique for accelerating the solution of Boolean problems, including SAT and #SAT. Nevertheless, and despite promising early results, formula partitioning is hardly used in state-of-the-art solvers. In this paper, we show that this is rooted in a lack of consistency in the usefulness of formula partitioning techniques. In particular, we evaluate two existing and a novel partitioning model, coupled with two existing and two novel partitioning algorithms, on a wide range of benchmark instances. Our results show that there is no one-size-fits-all solution: for different formula types, different partitioning models and algorithms are the most suitable. While these results might seem negative, they help to improve our understanding of formula partitioning; moreover, the findings also give some guidance as to which method to use for which kinds of formulae. 
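The basic partitioning idea, splitting a CNF into variable-disjoint components that can be solved independently, can be sketched in a few lines (a minimal illustration only; the models and algorithms compared in the paper are more refined than plain connected components):

```python
def split_components(clauses):
    """Partition a CNF (list of clauses, each a list of non-zero
    integer literals) into variable-disjoint subformulae using
    union-find on the variables."""
    parent = {}

    def find(v):
        while parent.setdefault(v, v) != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    for clause in clauses:
        vs = [abs(lit) for lit in clause]
        for v in vs[1:]:
            parent[find(v)] = find(vs[0])  # union all vars in a clause

    groups = {}
    for clause in clauses:
        groups.setdefault(find(abs(clause[0])), []).append(clause)
    return list(groups.values())

# (x1 v x2) and (-x2 v x3) share a variable; (x4) is independent.
parts = split_components([[1, 2], [-2, 3], [4]])
assert len(parts) == 2
```

Each returned group can then be handed to a solver on its own; for #SAT the model counts of the components simply multiply.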
11:45  New CNF Features and Formula Classification SPEAKER: Enrique Alfonso ABSTRACT. In this paper, we try to bridge the gap between using a solver as a black box and still exploiting its configuration space. We propose new features that can be extracted quickly from a CNF formula compared to the existing features of SATzilla. Then, we compare these features with the known features and construct a black-box SAT solver that selects a configuration based on the features and a classifier that uses random decision forests. Experiments on application and hand-crafted benchmarks show that the constructed classifier on the one hand improves the performance of the SAT solvers. On the other hand, the comparison shows that the set of new features can be computed very fast and results in a more precise classification. 
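Quickly computable CNF features of the general kind discussed here might look like the following sketch (these particular features are illustrative choices, not the paper's or SATzilla's actual feature set):

```python
from statistics import mean

def cnf_features(clauses, num_vars):
    """Cheap syntactic features of a CNF formula, suitable as input
    to a classifier.  clauses: list of clauses, each a list of
    non-zero integer literals."""
    lens = [len(c) for c in clauses]
    pos = sum(1 for c in clauses for lit in c if lit > 0)
    neg = sum(1 for c in clauses for lit in c if lit < 0)
    return {
        "num_vars": num_vars,
        "num_clauses": len(clauses),
        "clause_var_ratio": len(clauses) / num_vars,
        "mean_clause_len": mean(lens),
        "frac_binary": sum(1 for n in lens if n == 2) / len(clauses),
        # Horn clause: at most one positive literal.
        "frac_horn": sum(1 for c in clauses
                         if sum(1 for lit in c if lit > 0) <= 1)
                     / len(clauses),
        "pos_neg_ratio": pos / (pos + neg),
    }

f = cnf_features([[1, 2], [-1, 3], [-2, -3]], num_vars=3)
assert f["num_clauses"] == 3 and f["frac_binary"] == 1.0
```

A feature vector like this, computed per instance, is what a random-decision-forest classifier would consume to pick a solver configuration.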
12:15  Typical-case complexity and the SAT competitions SPEAKER: Zoltan Mann ABSTRACT. The aim of this paper is to gather insight into the typical-case complexity of the Boolean Satisfiability (SAT) problem by mining the data from the SAT competitions. Specifically, the statistical properties of the SAT benchmarks and their impact on complexity are investigated, as well as connections between different metrics of complexity. While some of the investigated properties and relationships are "folklore" in the SAT community, this study aims at scientifically showing what is true from the folklore and what is not. The standard approach of the SAT competitions is to use benchmarks to assess the efficiency of the submitted solvers; this talk takes a complementary point of view: using the solvers to assess the complexity of the benchmarks. 
10:45  Weak Pigeonhole Principles, Circuits for Approximate Counting, and Bounded-Depth Proofs SPEAKER: Albert Atserias ABSTRACT. I will start this talk by giving an overview of the status of the weak pigeonhole principle (WPHP) in proof complexity and the connection between this problem and some fundamental questions on the computational complexity of approximate counting. Then I will discuss some recent progress on the main remaining question about WPHP, namely whether the known quasipolynomial-size depth-2 proof is optimal in terms of size. We show that a relativized version of the WPHP that also has quasipolynomial-size depth-2 proofs does indeed require quasipolynomial size to prove in depth 2. To our knowledge, this gives the first example of a natural principle with matching but superpolynomial upper and lower bounds in a natural proof system. 
10:45  Rewriting, Proofs and Transition Systems SPEAKER: Gilles Dowek ABSTRACT. We give a new proof of the decidability of reachability in alternating pushdown systems, showing that it is a simple consequence of a cut-elimination theorem for some logic. 
11:30  Semantically-Guided Goal-Sensitive Theorem Proving SPEAKER: Maria Paola Bonacina ABSTRACT. SGGS, for Semantically-Guided Goal-Sensitive theorem proving, is a new inference system for first-order logic. It was inspired by the idea of generalizing to first-order logic the model-based style of the Davis-Putnam-Logemann-Loveland (DPLL) procedure for propositional logic. Model-based reasoning is a key feature of SAT solvers, SMT solvers, and model-constructing decision procedures for specific theories, and a crucial ingredient to their practical success. However, model-based reasoning in first-order logic is challenging, because the logic is only semi-decidable, the search space is infinite, and model representation is harder than in the propositional case. SGGS meets the challenge by realizing a seemingly rare combination of properties: it is model-based à la DPLL; semantically guided by an initial interpretation, to avoid blind guessing in an infinite search space; proof confluent, to avoid backtracking, which may be cumbersome for first-order problems; goal-sensitive, which is important when there are many axioms or a large knowledge base; and it uses unification to avoid enumeration of ground terms, which is inefficient, especially for rich signatures. In terms of operations, SGGS combines instance generation, resolution, and constraints in a model-centric approach: it uses sequences of constrained clauses to represent models, instance generation to extend the model, and resolution and other inferences to repair it. This talk advertises SGGS to the rewriting community, presenting the main ideas in the method; a main direction for future work is the extension to first-order logic with equality, which requires rewrite-based reasoning. 
A manuscript is available that includes all aspects of SGGS: the technical details, the proofs of refutational completeness and goal-sensitivity, a comparison with other work (e.g., resolution with set of support, the disconnection calculus, the model evolution calculus, the Inst-Gen method), and more references. 
12:00  Impacts of the Digital Revolution on two Main Activities of Scientists: Teaching and Publishing SPEAKER: Claude Kirchner 
12:30  Discussion on Massive Open Online Courses and Overlay Journals SPEAKER: Claude Kirchner 
10:45  Using answer set programming to integrate large-scale heterogeneous information about the response of a biological system SPEAKER: Anne Siegel ABSTRACT. The response of a biomolecular system is usually investigated with heterogeneous and incomplete data. More precisely, to address the control of a biological system, we may have at hand only partial knowledge about molecular interactions, together with observations of a subset of molecules. The issue we wish to address is how we can take advantage of this multi-scale amount of information. In this talk, we will explain how answer set programming can be used both to model in a uniform language several problems of data integration and to solve difficult combinatorial problems related to heterogeneity and data incompleteness. Altogether, these modelling and solving technologies allow identifying robust controllers of biological systems. We will illustrate this on issues concerning metabolic networks and logical signaling networks. 
11:30  Attractor equivalence: An observational semantics for reaction networks SPEAKER: Cedric Lhoussaine ABSTRACT. We study observational semantics for networks of chemical reactions as used in systems biology. Reaction networks without kinetic information, as we consider them, can be identified with Petri nets. We present a new observational semantics for reaction networks that we call attractor equivalence. The main idea of attractor equivalence is to observe reachable attractors and reachability of an attractor divergence in all possible contexts. The attractor equivalence can support powerful simplifications for reaction networks, as we illustrate with the example of the Tet-On system. Alternative semantics based on bisimulations or traces, in contrast, do not support all needed simplifications. 
12:15  A Logical Framework for Systems Biology SPEAKER: Joelle Despeyroux ABSTRACT. We propose a novel approach for the formal verification of biological systems based on the use of a modal linear logic. We show how such a logic can be used, with worlds as instants of time, as a unified framework to encode both biological systems and temporal properties of their dynamic behaviour. To illustrate our methodology, we consider a model of the P53/Mdm2 DNA-damage repair mechanism. We prove several properties that are important for such a model to satisfy and that serve to illustrate the promise of our approach. We formalize the proofs of these properties in the Coq Proof Assistant, with the help of a Lambda Prolog prover for partial automation of the proofs. 
10:45  Modeling Algorithms in SystemC and ACL2 SPEAKER: John O'Leary ABSTRACT. We describe the formal language MASC, based on a subset of SystemC and intended for modeling algorithms to be implemented in hardware. By means of a special-purpose parser, an algorithm coded in SystemC is converted to a MASC model for the purpose of documentation, which in turn is translated to ACL2 for formal verification. The parser also generates a SystemC variant that is suitable as input to a high-level synthesis tool. As an illustration of this methodology, we describe a proof of correctness of a simple 32-bit radix-4 multiplier. 
11:15  Development of a Translator from LLVM to ACL2 SPEAKER: unknown ABSTRACT. In our current work, a library of formally verified software components is to be created and assembled, using the Low-Level Virtual Machine (LLVM) intermediate form, into subsystems whose top-level assurance relies on the assurance of the individual components. We have thus undertaken a project to build a translator from LLVM to the applicative subset of Common Lisp accepted by the ACL2 theorem prover. Our translator produces executable ACL2 formal models, allowing us both to prove theorems about the translated models and to validate those models by testing. The resulting models can be translated and certified without user intervention, even for code with loops, thanks to the use of the \texttt{def::ung} macro, which allows us to defer the question of termination. Initial measurements of concrete execution for translated LLVM functions indicate that performance is nearly 2.4 million LLVM instructions per second on a typical laptop computer. In this paper we overview the translation process and illustrate the translator's capabilities by way of a concrete example, including both a functional correctness theorem and a validation test for that example. 
11:45  An ACL2 Mechanization of an Axiomatic Framework for Weak Memory SPEAKER: Benjamin Selfridge ABSTRACT. Proving the correctness of programs written for multiple processors is a challenging problem, due in no small part to the weaker memory guarantees afforded by most modern architectures. In particular, the existence of store buffers means that the programmer can no longer assume that writes to different locations become visible to all processors in the same order. However, all practical architectures do provide a collection of weaker guarantees about memory consistency across processors, which enable the programmer to write provably correct programs. 
12:15  Using ACL2 to Verify Loop Pipelining in Behavioral Synthesis SPEAKER: unknown ABSTRACT. Behavioral synthesis involves compiling an Electronic System-Level (ESL) design into its Register-Transfer Level (RTL) implementation. Loop pipelining is one of the most critical and complex transformations employed in behavioral synthesis. Certifying the loop pipelining algorithm is challenging because there is a huge semantic gap between the input sequential design and the output pipelined implementation, making it infeasible to verify their equivalence with automated sequential equivalence checking techniques. We discuss our ongoing effort using ACL2 to certify the loop pipelining transformation. The completion of the proof is work in progress. However, some of the insights developed so far may already be of value to the ACL2 community. In particular, we discuss the key invariant we formalized, which is very different from that used in most pipeline proofs. We discuss the need for this invariant, its formalization in ACL2, and our envisioned proof using the invariant. We also discuss some trade-offs, challenges, and insights developed in the course of the project. 
10:45  A Simple Parallel Implementation of Interaction Nets in Haskell SPEAKER: Wolfram Kahl ABSTRACT. Due to their ``inherent parallelism'', interaction nets have, since their introduction, been considered an attractive implementation mechanism for functional programming. We show that a simple highly concurrent implementation in Haskell can achieve promising speedups on multiple cores. 
11:15  Some observations for the parallel implementation of interaction nets SPEAKER: Shinya Sato ABSTRACT. There have been several studies about the parallel implementation of interaction nets. Here we look at the related question: when is an interaction net system suitable for parallel evaluation? Some systems are sequential, some have the potential to be highly parallel. This first investigation aims to throw some light on this topic, and perhaps help pave the way to wider use of this technology for parallel evaluation. In this short paper we focus on presenting experimental evidence in a number of small case studies. 
10:45  Abstract Self-Modifying Machines SPEAKER: Hubert Godfroy ABSTRACT. We describe a new framework for self-modifying programs. Our goal is to show how to extract a program abstraction focusing on self-modifications. On the one hand, we use an abstract machine which makes explicit some typical behaviors, such as turning data into executable code and vice versa. Moreover, memory is also separated between data (what we can read and write) and code (what we can execute). On the other hand, we add another level of granularity in memory locations, in order to build a more static program abstraction. 
11:15  Programming and Verifying with Effect Handling and Iteration SPEAKER: unknown ABSTRACT. Effect handling is a novel paradigm in programming that complements the established algebraic approach to programming with effects. Intuitively, algebraic effects are those which distribute over sequential composition and hence can affect the control flow only in a quite conservative fashion. Effect handling, by contrast, allows for expressing more powerful constructions, the most basic example being exception handling. More generally, effect handling can be viewed as a lightweight, well-behaved alternative to control operators like call/cc. We develop novel semantic foundations for effect handling in the presence of iteration, based on cofree extensions of monads. In the larger perspective, we view our current work as a prerequisite for designing a generic relatively complete Hoare calculus for programs with algebraic effects, iteration and handling. 
11:45  Tensorial logic with algebraic effects SPEAKER: Paul-André Melliès ABSTRACT. Tensorial logic is a primitive logic of tensor and negation which refines linear logic by relaxing the hypothesis that negation is involutive. One reason for introducing the logic is that its proof-nets coincide with innocent strategies between dialogue games. In this talk, we explain how to extend tensorial logic with various notions of algebraic effects. By way of illustration, we establish that the resulting notion of tensorial proof net coincides, in the particular case of local stores, with the notion of visible strategy formulated by Abramsky and McCusker in their study of Idealized Algol. 
12:15  Compiling Effectful Terms to Transducers: Prototype Implementation of Memoryful Geometry of Interaction SPEAKER: unknown ABSTRACT. We present a prototype implementation of the memoryful GoI framework of [Hoshino, Muroya and Hasuo, CSL-LICS 2014] that translates lambda terms with algebraic effects to transducers. Those transducers can be thought of as “proof nets with memories” and are constructed in a compositional manner by means of coalgebraic component calculi. The transducers thus obtained can be simulated in our tool, too, helping us to scrutinize the step-by-step interactions that take place in higher-order effectful computation. 
10:50  A Translation of Intersection and Union Types for the lambda-mu Calculus SPEAKER: Kentaro Kikuchi ABSTRACT. We introduce an intersection and union type system for the lambda-mu calculus, which includes a restricted version of the traditional union-elimination rule. Strong normalisation of terms typable in the system is shown via a translation of intersection and union types into intersection and product types, relying on the result of strong normalisation of terms typable in van Bakel, Barbanera and de'Liguoro's system. 
11:10  Dualized Type Theory SPEAKER: unknown ABSTRACT. We propose a new bi-intuitionistic type theory called Dualized Type Theory (DTT). It is a type theory with perfect intuitionistic duality, and corresponds to a single-sided polarized sequent calculus. We prove DTT strongly normalizing, and prove type preservation. DTT is based on a new propositional bi-intuitionistic logic called Dualized Intuitionistic Logic (DIL) that builds on Pinto and Uustalu's logic L. DIL is a simplification of L which removes several admissible inference rules from L while maintaining consistency and completeness. Furthermore, DIL is defined using a dualized syntax, by labeling formulas and logical connectives with polarities, thus reducing the number of inference rules needed to define the logic. We give a direct proof of consistency, but prove completeness by reduction to L. 
11:30  A proof-theoretic view on scheduling in concurrency SPEAKER: Emmanuel Beffara ABSTRACT. This paper elaborates on a new approach to the proof-theoretic study of concurrent interaction called "proofs as schedules". Observing that proof theory is well suited to the description of confluent systems, while concurrency has non-determinism as a fundamental feature, we develop a correspondence where proofs provide what is needed to make concurrent systems confluent, namely scheduling. In our logical system, processes and schedulers appear explicitly as proofs in different fragments of the proof language, and cut elimination between them does correspond to execution of a concurrent system. This separation of roles suggests new insights for the denotational semantics of processes and new methods for the translation of pi-calculi into prefix-less formalisms (like solos), as the operational counterpart of translations between proof systems. 
12:00  A type system for Continuation Calculus SPEAKER: unknown ABSTRACT. Continuation Calculus (CC), introduced by Geron and Geuvers [2], is a simple foundational model for functional computation. It is closely related to lambda calculus and term rewriting, but it has no variable binding and no pattern matching. It is Turing complete and evaluation is deterministic. Notions like “call-by-value” and “call-by-name” computation are available by choosing appropriate function definitions: e.g. there is a call-by-value and a call-by-name addition function. In the present paper we extend CC with types, to be able to define data types in a canonical way, and functions over these data types, defined by iteration. The iteration scheme comes in two flavors: a call-by-value and a call-by-name iteration scheme. Data type definitions follow the so-called “Scott encoding” of data, as opposed to the more familiar “Church encoding”. We give examples of how to define functions over data types and how to reason with these functions. 
12:30  Confluence for classical logic through the distinction between values and computations SPEAKER: Ralph Matthes ABSTRACT. We apply an idea originating in the theory of programming languages, the monadic metalanguage with a distinction between values and computations, in the design of a calculus of cut-elimination for classical logic. The cut-elimination calculus we obtain comprehends the call-by-name and call-by-value fragments of Curien-Herbelin's lambda-bar-mu-mu-tilde calculus without losing confluence, and is based on a distinction of "modes" in the proof expressions and "mode" annotations in types. Modes resemble colors and polarities, but are quite different: we give meaning to them in terms of a monadic metalanguage where the distinction between values and computations is fully explored. This metalanguage is a refinement of the classical monadic language previously introduced by the authors, and is also developed in the paper. 
11:00  TROvE: a Graphical Tool to Evaluate OWL Reasoners SPEAKER: Luca Pulina ABSTRACT. In this paper we present TROvE (Tool for Rapid OWL Reasoner Evaluation), a tool that offers non-expert users the possibility of evaluating OWL reasoners on several reasoning tasks by means of a simple “push-button” solution. 
11:20  Using OpenStreetMap Data to Create Benchmarks for Description Logic Reasoners SPEAKER: Guohui Xiao ABSTRACT. Engines for query answering over ontological knowledge bases are becoming increasingly popular and important in areas like the Semantic Web or information integration. They are mostly designed to answer queries over ontologies expressed using various Description Logics and in the presence of large amounts of instance data. This computational task, known as ontology-based query answering (OQA), is an important component in the more general area of ontology-based data access. Unfortunately, it has been acknowledged that judging the performance of current OQA reasoners and their underlying algorithms is difficult due to the lack of publicly available benchmarks that consist of large amounts of real-life instance data. In this paper we describe how benchmarks for OQA systems can be created from the publicly available OpenStreetMap (OSM) geospatial data. To this end, we first develop a formalization of OSM and present a rule-based language to specify the rules to extract instance data from OSM data. The declarative nature of the approach allows various variants of a benchmark to be created via small modifications to the rules of the specification. We describe a highly flexible engine to create a benchmark from a given OSM map and a given set of rules and present some evaluation results. 
11:40  A Scalable Benchmark for OBDA Systems: Preliminary Report SPEAKER: Davide Lanti ABSTRACT. In ontology-based data access (OBDA), the aim is to provide a high-level conceptual view over potentially very large (relational) data sources by means of a mediating ontology. The ontology is connected to the data sources through a declarative specification given in terms of mappings that relate each (class and property) symbol in the ontology to an (SQL) view over the data. Although prototype OBDA systems providing the ability to answer SPARQL queries over the ontology are available, a significant challenge remains: performance. To properly evaluate OBDA systems, benchmarks tailored towards the requirements in this setting are needed. OWL benchmarks, which have been developed to test the performance of generic SPARQL query engines, however, fail at 1) exhibiting a complex real-world ontology, 2) providing challenging real-world queries, 3) providing large amounts of data, and the possibility to test a system over data of increasing size, and 4) capturing important OBDA-specific measures related to the rewriting-based query answering approach in OBDA. In this work, we propose a novel benchmark for OBDA systems based on a real-world use case adopted in the EU project Optique. We validate our benchmark on the system Ontop, showing that it is more adequate than previous benchmarks not tailored for OBDA. 
12:00  Evaluating OWL 2 Reasoners in the Context of Checking Entity-Relationship Diagrams During Software Development SPEAKER: Alexander Kropotin ABSTRACT. This paper evaluates the performance of the OWL 2 reasoners HermiT, FaCT++ and TReasoner in the context of an ontological decision support system for designing entity-relationship diagrams during software development. First, I describe the developed ontology, which is the knowledge base of the application developed for designing databases. In the first set of experiments I compare how the classification and realization time of the DBOM ontology varies when increasing the ABox with ERD element individuals. In the second set of experiments the consistency checking time of the DBOM ontology is estimated by increasing the ABox with ERD element individuals. 
12:20  Just: a Tool for Computing Justifications w.r.t. ELH Ontologies SPEAKER: Michel Ludwig ABSTRACT. We introduce the tool Just for computing justifications for general concept inclusions w.r.t. ontologies formulated in the description logic EL extended with role inclusions. The computation of justifications in Just is based on saturating the input axioms under all possible inferences w.r.t. a consequence-based calculus. We give an overview of the architecture of the tool and we conclude with an experimental evaluation of its performance when applied to several practical ontologies. 
12:40  Android Went Semantic: Time for Evaluation SPEAKER: Fernando Bobillo ABSTRACT. Applications for mobile devices could often show a more intelligent behaviour by using a semantic reasoner to discover new knowledge. Unfortunately, using Description Logic reasoners on Android devices is not trivial. In this paper we continue our previous work on investigating the use of semantic reasoners on mobile devices. In particular, we port some new OWL 2 EL reasoners to Android and analyze the results of some experiments measuring the performance of several OWL 2 DL and OWL 2 EL reasoners on Android smartphones and tablets. 
11:00  Chekofv: Crowdsourced Formal Verification SPEAKER: unknown ABSTRACT. Over the past year, we have been developing Chekofv, a system for crowdsourced formal verification. Chekofv starts with an attempt to verify a given C program using the source code analysis platform Frama-C. Every time the analysis loses precision while analyzing looping control-flow, Chekofv tries to obtain an invariant to regain precision using crowdsourcing. To that end, Chekofv translates the problem of finding loop invariants into a puzzle game and presents it to players of a game, Xylem, that is being developed as part of the Chekofv system. In this paper, we report on the design and implementation of the Chekofv system, the challenges and merits of gamification of the invariant detection problem, and problems and obstacles that we have encountered so far. 
11:30  Using Esoteric Language for Teaching Formal Semantics SPEAKER: Nikolay Shilov ABSTRACT. The talk presents an approach to undergraduate teaching of formal semantics of programs. It explains what formal semantics is by developing operational, denotational and axiomatic semantics for an ``esoteric'' language. We hope that the approach will be of interest to the communities of program theoreticians and software engineers, and that it will help education bridge the cultural gap between formal-methods theory and software-engineering practice. 
12:00  Introducing Formal Verification with Lego SPEAKER: Dimitri Racordon ABSTRACT. “In the end, you are a mathematician, not a computer scientist” or “Have we not already discovered everything in computer science?”. Which theoretical computer scientist has not heard a similar sentence when trying to explain their research to a layperson? Promotion of theoretical computer science, and of formal methods in particular, is mainly hindered by the high level of abstraction commonly used in the field, and probably further complicated by a general lack of education on the subject. We present in this paper an educational experiment intended to explain the importance and aim of formal methods, along with the challenges they present. It uses Lego robots as an anchor to the real world, introducing fun into the presentation. Primarily targeting high-school students, it forms a solid and adaptable basis to reach various audiences. 
12:30  Explaining the decompositionality of MSO using applications to combinatorics SPEAKER: Tomer Kotek ABSTRACT. Courcelle's celebrated theorem forms a strong theoretical basis for the development of formal methods. This extended abstract argues that a good intuition into the underlying ideas behind Courcelle's theorem can be achieved by considering the applications of these ideas to combinatorics. Technical notions regarding computation are eliminated and replaced with simple notions of recurrence relations. 
12:00  Critical Pairs in Network Rewriting SPEAKER: Lars Hellström ABSTRACT. This extended abstract briefly introduces rewriting of networks (directed acyclic graphs with the extra structure needed to serve as expressions for PROducts and Permutations categories) and describes the critical pairs aspects of this theory. 
12:30  On Proving Confluence of Conditional Term Rewriting Systems via the Computationally Equivalent Transformation SPEAKER: Naoki Nishida ABSTRACT. This paper improves the existing criterion for proving confluence of a normal conditional term rewriting system (CTRS) via the Serbanuta-Rosu transformation, a computationally equivalent transformation of CTRSs into unconditional term rewriting systems (TRSs), showing that a weakly left-linear normal CTRS is confluent if the transformed TRS is confluent. Then, we discuss the usefulness of the optimization of the Serbanuta-Rosu transformation, which has been informally proposed in the literature. 
12:00  Total Space in Resolution SPEAKER: Ilario Bonacina ABSTRACT. We show Ω(n^{2}) lower bounds on the total space used in resolution refutations of random k-CNFs over n variables, and of the graph pigeonhole principle and the bit pigeonhole principle for n holes. This answers the longstanding open problem of whether there are families of k-CNF formulas of size O(n) requiring total space Ω(n^{2}) in resolution, and gives the first truly quadratic lower bounds on total space. The results follow from a more general theorem showing that, for formulas satisfying certain conditions, in every resolution refutation there is a memory configuration containing many clauses of large width. 
12:30  From Small Space to Small Width in Resolution SPEAKER: Mladen Miksa ABSTRACT. In 2003, Atserias and Dalmau established that the space complexity of formulas in resolution is an upper bound on the width needed to refute them. Their proof is beautiful but somewhat mysterious in that it relies on tools from finite model theory. We give an alternative proof that works by simple syntactic manipulations of resolution refutations. As a byproduct, we develop a "black-box" technique for proving space lower bounds. 
12:00  Quantitative semantics for higher-order probabilistic and quantum computation SPEAKER: Michele Pagani ABSTRACT. Various models of Linear Logic have been defined in categories of vector spaces (more generally, modules) and linear functions. These models provide semantics of quite expressive functional programming languages, where programs using their inputs exactly once correspond to linear functions and programs using their inputs an arbitrary number of times (like recursive programs) correspond to power series. Quantitative semantics are particularly suitable for interpreting nondeterministic computations, such as probabilistic and quantum algorithms. The addition between vectors expresses a kind of superposition of atomic states, and the scalars a quantitative estimation of this superposition. In this talk I will introduce the main ideas of quantitative semantics and present the most recent results achieved in the framework of higher-order probabilistic and quantum computation. 
14:00  On the power of fixed-point logic with counting SPEAKER: Anuj Dawar ABSTRACT. In 1982, Neil Immerman proposed an extension of fixed-point logic by means of counting quantifiers (which we denote FPC) as a logic that

15:00  Reasoning about transitive closure in Immerman's style SPEAKER: Mooly Sagiv 
14:30  Continuations, closed reduction and process networks SPEAKER: Ian Mackie ABSTRACT. In the early 1990s it was shown that the linear lambda-calculus 
15:30  Type Classes for Lightweight Substructural Types SPEAKER: Edward Gan ABSTRACT. Linear and substructural types are powerful tools, but adding them to standard functional programming languages often means introducing extra annotations and typing machinery. We propose a lightweight substructural type system design that recasts the structural rules of weakening and contraction as type classes; we demonstrate this design in a prototype language, Clamp. Clamp supports polymorphic substructural types as well as an expressive system of mutable references. At the same time, it adds little additional overhead to a standard Damas–Hindley–Milner type system enriched with type classes. We have established type safety for the core model and implemented a type checker for Clamp in Haskell. 
14:30  On Proving Soundness of the Computationally Equivalent Transformation for Normal Conditional Term Rewriting Systems by Using Unravelings SPEAKER: Naoki Nishida ABSTRACT. In this paper, we show that the SR transformation, a computationally equivalent transformation proposed by Serbanuta and Rosu, is sound for weakly left-linear normal conditional term rewriting systems (CTRSs). Here, soundness for a CTRS means that reduction of the transformed unconditional term rewriting system (TRS) creates no undesired reduction for the CTRS. We first show that every reduction sequence of the transformed TRS starting with a term corresponding to the one considered on the CTRS is simulated by the reduction of the TRS obtained by the simultaneous unraveling. Then, we use the fact that the unraveling is sound for weakly left-linear normal CTRSs. 
15:00  Contextual Equivalences in Call-by-Need and Call-by-Name Polymorphically Typed Calculi (Preliminary Report) SPEAKER: Manfred Schmidt-Schauss ABSTRACT. This paper presents a call-by-need polymorphically typed lambda-calculus with letrec, case, constructors and seq. The typing of the calculus is modelled in a System F style. Contextual equivalence is used as the semantics of expressions. We also define a call-by-name variant without letrec. We adapt several tools and criteria for recognizing correct program transformations to polymorphic typing, in particular an inductive applicative simulation. 
15:30  Verifying the Correctness of Tupling Transformations based on Conditional Rewriting SPEAKER: Yuki Chiba ABSTRACT. Chiba et al. (2010) proposed a framework of program transformation by templates based on term rewriting. Their framework can deal with tupling, which improves the efficiency of programs. Outputs of their framework, however, may not always be more efficient than the inputs. In this paper, we propose a technique to show the correctness of tupling based on conditional term rewriting. We give an extended equational logic in order to add conditional rules. 
14:30  Proving Gödel’s Incompleteness Theorems SPEAKER: Lawrence Paulson ABSTRACT. Gödel’s two incompleteness theorems have been proved in Isabelle. The nominal package was used to handle variable binding in conjunction with de Bruijn indexes. This is the first mechanical proof of the second incompleteness theorem. 
15:00  Isabelle/jEdit NEWS SPEAKER: Makarius Wenzel ABSTRACT. This system demo will give an overview of NEWS for Isabelle/jEdit in the forthcoming Isabelle2014 release, which is anticipated for summer 2014. Both the prover and the integration with the underlying jEdit editor have been refined in various ways. Here is an incomplete list of notable items: improved completion mechanism, integrated spell-checker, hyperlink navigation within the editor, URL and browser support, support for auxiliary files, and official SML'97 IDE support. See also Isabelle2014-RC0, which is the same prerelease snapshot that will be used in the Isabelle tutorial. 
15:30  Programming TLS in Isabelle/HOL SPEAKER: Andreas Lochbihler ABSTRACT. Isabelle/HOL is not just a theorem prover; it has become a functional programming language. Algebraic datatypes and (pure) recursive functions are defined with various packages and compiled to executable code with the code generator. In this work, we explore whether and how this programming language is suitable for developing applications that are stateful, interact with the environment, and use external libraries. To that end, we have implemented a prototype of the TLS network protocol as a case study. We present a model of interaction in HOL and its compilation, and discuss the challenges in application development that the theorem prover Isabelle/HOL poses. 
14:30  An Introduction to Higher-Dimensional Rewriting Theory SPEAKER: Samuel Mimram 
14:30  Cyclic proof for quantitative logics SPEAKER: Alex Simpson ABSTRACT. I will talk about ongoing work with Matteo Mio (University of Cambridge) to build useful proof systems for quantitative fixed-point logics. Our approach adapts cyclic-proof-based sequent calculi to the quantitative setting. Several technical issues arise, not all of which have been resolved. The overall aim of the programme is to develop machinery for reasoning about probabilistic concurrent systems. But the talk will focus on proof-theoretic issues, in a purely logical setting. 
15:15  Modular Systems for Intuitionistic Modal Logics in Nested Sequents SPEAKER: Sonia Marin ABSTRACT. In this talk we show for each of the modal axioms d, t, b, 4, and 5 an equivalent set of inference rules in a nested sequent system, such that, when added to the basic system for the intuitionistic modal logic IK, the resulting system admits cut elimination. The result is similar to the one presented at "Gentzen Systems and Beyond 2011" about classical modal logics. We use a combination of structural and logical rules to achieve modularity. 
15:40  An Intuitionistically Based Description Logic SPEAKER: Alexandre Rademaker ABSTRACT. This article presents iALC, an intuitionistic version of the classical description logic ALC, based on the framework for constructive modal logics presented by Simpson (1994) and related to description languages, via hybrid logics, by de Paiva (2006). This article corrects and extends the presentation of iALC appearing in de Paiva et al (2010). It points out the difference between iALC and the intuitionistic hybrid logic presented in de Paiva (2006). Completeness and soundness proofs are provided. A sequent calculus for iALC is given, and its computational complexity is discussed. It is worth mentioning that iALC is used to formalize legal knowledge, and in fact was specifically designed for this goal.

14:30  Incremental QBF Reasoning for the Verification of Partial Designs SPEAKER: Paolo Marin ABSTRACT. Incremental solving methods made SAT attractive for industrial use as the core technology of numerous electronic design automation tools, e.g. for verification tasks like Bounded Model Checking (BMC). On the other hand, when dealing with partial designs or unknown specifications, SAT formulas are not expressive enough, whereas a description via quantified Boolean formulas (QBF) is more adequate. Persuaded by the success of incremental SAT, we explore various ways to tackle QBF problems incrementally and thereby make this technology available. We developed the first incremental QBF solver based on the state-of-the-art QBF solver QuBE, which can now reuse some information from previous iterations, thereby pruning the search space. However, the need for a QBF preprocessing phase, which in general cannot be paired with incremental solving because of the unpredictable elimination of variables in future incremental steps, posed the question of incremental QBF preprocessing. We study an approach for retaining the QBF formula being preprocessed while extending its clauses and prefix incrementally. This procedure yields effectively preprocessed QBF formulas, though it may come with long runtimes. We consider various heuristics to dynamically skip incremental preprocessing when its overhead is no longer compensated by the reduced solving time. We experimentally analyze how the different methods positively affect BMC for partial designs (i.e. designs containing so-called black-boxes which represent unknown parts), both from the point of view of the effectiveness of the single procedure and of how a generic QBF solver can profit from it. The goal of this problem is to disprove realizability, that is, to prove that an unsafe state is reachable no matter how the black-boxes are implemented. On a domain of partial design benchmarks, engaging incremental QBF methods achieves significant performance gains over non-incremental BMC. 
15:00  Reactive Synthesis using (Incremental) QBF Solving SPEAKER: Robert Koenighofer ABSTRACT. Reactive synthesis algorithms construct a correct reactive system fully automatically from a given formal specification. Besides the construction of full systems, such synthesis algorithms are also used for program sketching and automatic program repair. This talk will explain how reactive synthesis can be realized efficiently with QBF solving, and how recently proposed approaches for incremental QBF solving can be exploited. We present first experimental results and compare our QBF-based solution to SAT- and BDD-based implementations. 
15:30  Conformant Planning as a Case Study of Incremental QBF Solving SPEAKER: Martin Kronegger ABSTRACT. We consider planning with uncertainty in the initial state as a case study of incremental quantified Boolean formula (QBF) solving. We report on experiments with a workflow to incrementally encode a planning instance into a sequence of QBFs. To solve this sequence of incrementally constructed QBFs, we use our general-purpose incremental QBF solver DepQBF. Since the generated QBFs have many clauses and variables in common, our approach avoids redundancy both in the encoding phase and in the solving phase. Experimental results show that incremental QBF solving outperforms non-incremental QBF solving. Our results are the first empirical study of incremental QBF solving in the context of planning. Based on our case study, we argue that incremental QBF solving also has the potential to improve QBF-based workflows in other application domains. 
14:30  HOL4 Hidden Features SPEAKER: Anthony Fox ABSTRACT. I propose briefly talking about the following HOL4 “hidden features”.

15:00  New Styles of Proof SPEAKER: Ramana Kumar ABSTRACT. I propose to talk in general about the tensions between different approaches to constructing proofs: maintainability, construction speed, replay speed, readability, etc. and also to highlight a few relatively unknown tactics, packages, and ways of using them: lcsymtacs, match_rename etc., quant heuristics (maybe), writing conversions, using MATCH_MP. I hope to both share and learn tips and tricks for writing proofs effectively. For another kind of hidden feature, I propose to highlight the issue tracker and other resources that are being used, and could be used more effectively, to improve HOL development. Along these lines, I also propose regular developer workshops. (Additionally, if there is time, I would propose some longerterm feature requests including some that already appear in the issue tracker, and others such as partial functions, BNF datatypes, logic programming in proof automation, and mechanisms for localestyle development.) 
15:30  Hack Session 2 SPEAKER: Everyone 
14:30  Exploring Reasoning with the DMOP Ontology SPEAKER: C. Maria Keet ABSTRACT. We describe the Data Mining OPtimization Ontology (DMOP), which was developed to support informed decision-making at various choice points of the knowledge discovery (KD) process. DMOP contains in-depth descriptions of DM tasks, data, algorithms, hypotheses, and workflows. Its development raised a number of non-trivial modeling problems, the solution to which demanded maximal exploitation of OWL 2's representational potential. The choices made led to v5.4 of the DMOP ontology. We report some evaluations on processing DMOP with a standard reasoner by considering different DMOP features. 
14:50  An update on Genomic CDS, a complex ontology for pharmacogenomics and clinical decision support SPEAKER: Matthias Samwald ABSTRACT. Genetic data can be used to optimize drug treatment based on the genetic profiles of individual patients, thereby reducing adverse drug events and improving the efficacy of pharmacotherapy. The Genomic Clinical Decision Support (Genomic CDS) ontology utilizes Web Ontology Language 2 (OWL 2) reasoning for this task. The ontology serves a clear-cut medical use case that requires challenging OWL 2 DL reasoning. We present an update of the Genomic CDS ontology which covers a significantly larger number of clinical decision support rules and where inconsistencies present in previous versions of the ontology have been removed. 
15:10  A Family History Knowledge Base in OWL 2 SPEAKER: Nicolas Matentzoglu ABSTRACT. This paper presents a challenging family history knowledge base (FHKB) authored in OWL 2 DL. Originally, the FHKB was designed to act as a tool for education, especially about OWL 2’s features and the use of automated reasoners. As a result, the FHKB has been constructed to maximise the use of inference. For individuals representing people, only genealogical assertions on parentage and sparse assertions of siblinghood are given explicitly. All other genealogical inferences are driven by a rich property hierarchy, property characteristics and subproperty chains. A rich collection of entailments is generated, but reasoners struggle to handle a version with all of Robert’s known relatives. 
14:30  On the Limits of Second-Order Unification SPEAKER: Jordi Levy ABSTRACT. Second-Order Unification is a problem that naturally arises when applying automated deduction techniques with variables denoting predicates. The problem is undecidable, but a considerable effort has been made to find decidable fragments and to understand the deep reasons for its complexity. Two variants of the problem, Bounded Second-Order Unification and Linear Second-Order Unification, where the use of bound variables in the instantiations is restricted, have been extensively studied in the last two decades. In this paper we summarize some decidability/undecidability/complexity results, trying to focus on those that could be more interesting for a wider audience and involve fewer technical details. 
15:30  From Admissibility to a New Hierarchy of Unification Types SPEAKER: unknown ABSTRACT. Motivated by the study of admissible rules, a new hierarchy of ``exact'' unification types is introduced where a unifier is more general than another unifier if all identities unified by the first are unified by the second. A Ghilardi-style algebraic interpretation of this hierarchy is presented that uses exact algebras rather than projective algebras. Examples of equational classes distinguishing the two hierarchies are also provided. 
14:30  Lingeling Essentials, A Tutorial on Design and Implementation Aspects of the SAT Solver Lingeling SPEAKER: Armin Biere ABSTRACT. One of the design principles of the state-of-the-art SAT solver Lingeling is to use data structures that are as compact as possible. These reduce memory usage, increase cache efficiency and thus improve runtime, particularly when using multiple solver instances on multi-core machines, as in our parallel portfolio solver Plingeling and our cube-and-conquer solver Treengeling. The scheduler of a dozen inprocessing algorithms is an important aspect of Lingeling as well. In this talk we explain these design and implementation aspects of Lingeling and discuss new directions in solver design. 
14:30  Are matrix identities hard instances for strong proof systems? SPEAKER: Iddo Tzameret ABSTRACT. I will first talk about the complexity of generating identities of matrix rings, and demonstrate an unconditional lower bound on the minimal number of generators needed to generate a matrix identity, where the generators are substitution instances of elements from any given finite basis of the matrix identities. Based on the above, I will argue that matrix identities may be good hard candidates for strong proof systems, and I will discuss an initial study of this approach under different settings and assumptions. Under certain conditions, this approach can potentially lead up to exponential-size lower bounds on arithmetic proofs, which are proofs of polynomial identities operating with arithmetic circuits and whose axioms are the polynomial-ring axioms (these proofs serve as an algebraic analogue of the Extended Frege propositional proof system). I will also briefly discuss the applicability of our approach to strong propositional proof systems. Based on joint work with Fu Li. 
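As a concrete instance of a matrix identity (an illustration, not an example taken from the talk): Hall's identity [[x, y]^2, z] = 0 holds for all 2x2 matrices over a commutative ring, because the commutator [x, y] is traceless, so by Cayley-Hamilton its square is a scalar matrix and thus commutes with everything. A quick randomized Python check:

```python
import random

# 2x2 integer matrices represented as lists of rows.
def mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def sub(a, b):
    return [[a[i][j] - b[i][j] for j in range(2)] for i in range(2)]

def comm(a, b):  # commutator [a, b] = ab - ba
    return sub(mul(a, b), mul(b, a))

def rand_mat():
    return [[random.randint(-5, 5) for _ in range(2)] for _ in range(2)]

# Hall's identity [[x, y]^2, z] = 0 for all 2x2 matrices.
for _ in range(100):
    x, y, z = rand_mat(), rand_mat(), rand_mat()
    c = comm(x, y)
    assert comm(mul(c, c), z) == [[0, 0], [0, 0]]
```

The identity fails in general for larger matrices, which is one reason the basis of identities of n x n matrix rings grows complicated as n increases.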
14:30  Basic Normalization SPEAKER: Nao Hirokawa ABSTRACT. The Normalization Theorem (O'Donnell 1977) states that for every left-normal orthogonal rewrite system, the leftmost outermost strategy reduces any term to the normal form if it exists. Although the theorem ensures the normalization property of important systems like Combinatory Logic, the left-normality condition rules out many functional programs. We revisit the problem to seek a solution for normalization of the leftmost outermost strategy. 
15:15  Rhythm Tree Rewriting SPEAKER: Florent Jacquemard ABSTRACT. In traditional western music notation, the durations of notes are expressed hierarchically by recursive subdivisions. This led naturally to a tree representation of melodies, widespread in systems for assisting music authoring. We will see how term rewriting techniques can be applied to computer music problems, in particular to the problem of rhythm quantization: the transcription of a sequence of dates (real time values) into a music score. Besides the question of rhythm representation, an interesting problem in this context is the design and concise description of large rewrite rule sets, and decision problems for these descriptions. 
14:30  TBA SPEAKER: Alessandra Carbone 
15:15  Model Checking the Evolution of Gene Regulatory Networks SPEAKER: Thomas Henzinger ABSTRACT. Evolutionary biologists use a weighted model of gene regulatory networks (GRNs) to define and simulate the robustness of biological systems against genetic perturbations. By replacing simulation with model checking we achieve higher degrees of assurance, precision, and scalability. We first present an algorithm for computing the robustness of a GRN with respect to viability properties expressed in linear temporal logic. We then employ symbolic bounded model checking to compute the space of GRNs that satisfy a given property, which amounts to synthesizing a set of linear constraints on symbolic weights. Finally, we compute the robustness of the satisfying space against perturbations. We apply our method to a number of gene regulatory networks and compare its performance and precision against simulation-based approaches. 
14:30  Machine-code verification: experience of tackling medium-sized case studies using 'decompilation into logic’ SPEAKER: Magnus Myreen ABSTRACT. All software executes in the form of machine code. Ideally, software verification should reach down to precise statements about the concrete machine code that runs. During my PhD, I developed infrastructure in the HOL4 theorem prover for verification of programs at the level of machine code. The infrastructure includes a programming logic and a proof-producing decompiler and compiler. In the subsequent five years, this infrastructure has been used in two medium-sized case studies: construction of a verified implementation of a read-eval-print loop for Lisp, which can run Jared Davis' Milawa prover; and verification of the accuracy of the GCC compiler's output for the C implementation of the seL4 microkernel. These case studies resulted in HOL4 theorems about thousands of lines of concrete x86-64 and ARM machine code, respectively. In this talk, I will explain the existing infrastructure for machine-code verification; describe how the infrastructure had to adapt to the case studies; and reflect on future research directions. This talk describes work done in collaboration with Anthony Fox (ARM ISA specification), Jared Davis (Milawa theorem prover), Thomas Sewell and Gerwin Klein (seL4 microkernel). 
14:30  Towards the Application of Formal Methods in Process Engineering SPEAKER: Umair Siddique ABSTRACT. Process engineering deals with the design, operation and optimization of different physical processes such as chemical reactions, biological mechanisms and petroleum processing. Recent technological advancements and short time to market in almost every industry bring new challenges, and hence complexity, to process engineering systems. One of the main techniques to analyze such models is to represent the underlying systems using signal-flow graphs, which provide a systematic procedure to evaluate system performance in the form of transfer functions. Traditionally, the analysis of signal-flow-graph based models has been carried out by paper-and-pencil proofs and numerical techniques. In this paper, we present an overview of using theorem proving to formally analyze process engineering models. The main motivation of this work is twofold: first, the application of formal methods in a new domain to improve analysis accuracy; second, an effort to reduce the gap between formal methods and different engineering communities such as mechanical engineering, chemical engineering and engineering management. 
15:00  An Example Demonstrating the Requirement of Fully Formal Verification Methods SPEAKER: Pratima K. Shah ABSTRACT. Concurrency and synchronization are important behaviors of hardware and software systems. Finding defects late in the system development process is no longer acceptable in industrial applications because of the high capital investment in modern complex systems. This paper demonstrates system verification by two approaches: model checking and a proof checker. Verification of a concurrently running architecture is demonstrated in both approaches to exhibit the need for a fully formal verification method in order to obtain better hardware and software quality. To support this, we will describe how the proof checker is used for complete verification of a system in terms of any input size and complexity. 
14:30  Cellular Automata are Generic SPEAKER: unknown ABSTRACT. We show that any effective algorithm over any arbitrary domain can be simulated by a particular kind of dynamic cellular automaton. 
15:00  Quantum Circuits for the Unitary Permutation Problem SPEAKER: unknown ABSTRACT. We consider the Unitary Permutation problem which consists, given n unitary gates U_1,…,U_n and a permutation σ of {1,…,n}, in applying the unitary gates in the order specified by σ, i.e. in performing U_σ(n)…U_σ(1). This problem was introduced and investigated by Colnaghi et al., where two models of computation are considered. The first is the (standard) model of query complexity: the complexity measure is the number of calls to any of the unitary gates U_i in a quantum circuit which solves the problem. The second model provides quantum switches and treats unitary transformations as inputs of second order. In that case the complexity measure is the number of quantum switches. In this seminal paper it was shown that the problem can be solved within n^2 calls in the query model and n(n-1)/2 quantum switches in the new model. We refine these results by proving that n log_2(n) + \Theta(n) quantum switches are necessary and sufficient to solve this problem, whereas n^2 - 2n + 4 calls are sufficient to solve it in the standard quantum circuit model. We prove, with an additional assumption on the family of gates used in the circuits, that n^2 - o(n^{7/4+r}) queries are required, for any r > 0. The upper and lower bounds for the standard quantum circuit model are established by pointing out connections with the permutation-as-substring problem introduced by Karp. 
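The problem statement above (though not the circuit constructions of the talk) can be sketched directly: given a list of unitary matrices and a permutation σ, apply U_σ(1) first and U_σ(n) last to a state vector. The `matmul`/`apply_permuted` helpers and the pure-list matrix representation are illustrative assumptions, not anything from the paper.

```python
# Sketch of the Unitary Permutation problem's specification: compute
# U_sigma(n) ... U_sigma(1) |psi> by applying gates in the order given by sigma.

def matmul(a, b):
    """Multiply matrix a (m x k) by matrix b (k x l), plain nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def apply_permuted(gates, sigma, state):
    """gates: list of d x d matrices; sigma: 1-based permutation of 1..n."""
    for i in sigma:                      # U_sigma(1) acts first
        state = matmul(gates[i - 1], state)
    return state

X = [[0, 1], [1, 0]]                     # Pauli-X (bit flip)
Z = [[1, 0], [0, -1]]                    # Pauli-Z (phase flip)
ket0 = [[1], [0]]                        # |0> as a column vector

# sigma = (2, 1): apply gate 2 (Z) first, then gate 1 (X), i.e. X·Z|0> = |1>
out = apply_permuted([X, Z], [2, 1], ket0)
```

Note that the order matters: with `sigma = [1, 2]` the same gates give Z·X|0> = -|1| instead, which is exactly why a circuit solving the problem for an unknown σ needs the call or switch counts analyzed in the abstract.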
15:30  Propositional Logics Complexity and the Subformula Property SPEAKER: Edward Hermann Haeusler ABSTRACT. In 1979 Richard Statman proved, using proof theory, that the purely implicational fragment of Intuitionistic Logic (\mil) is PSPACE-complete. He showed a polynomially bounded translation from full Intuitionistic Propositional Logic into its implicational fragment. By the PSPACE-completeness of S4, proved by Ladner, and the Gödel translation from S4 into Intuitionistic Logic, the PSPACE-completeness of \mil~ follows. The subformula principle for a deductive system for a logic \Log{L} states that whenever $\{\gamma_1,\ldots,\gamma_k\}\vdash_{\Log{L}}\alpha$ there is a proof in which each formula occurrence is either a subformula of $\alpha$ or of some $\gamma_i$. In this work we extend Statman's result and show that any propositional (possibly modal) structural logic satisfying a particular statement of the subformula principle is PSPACE-complete. As a consequence, EXPTIME-complete propositional logics, such as PDL and the common-knowledge epistemic logic with at least 2 agents, satisfy this particular subformula principle if and only if PSPACE = EXPTIME. 
14:30  Infinitary Classical Logic: Recursive Equations and Interactive Semantics SPEAKER: Michele Basaldella ABSTRACT. In this paper, we present an interactive semantics for derivations in an infinitary extension of classical logic. The formulas of our language are possibly infinitary trees labeled by propositional variables and logical connectives. We show that in our setting every recursive formula equation has a unique solution. As for derivations, we use an infinitary variant of the Tait calculus to derive sequents. The interactive semantics for derivations that we introduce in this article is presented as a debate (interaction tree) between a test T (derivation candidate, Proponent) and an environment ¬S (negation of a sequent, Opponent). We show a completeness theorem for derivations, which we call the interactive completeness theorem: the interaction between T (test) and ¬S (environment) does not produce errors (i.e., Proponent wins) just in case T comes from a syntactical derivation of S. 
15:00  Separable Sequent Calculus for First-order Classical Logic (Work in Progress) SPEAKER: Dominic Hughes ABSTRACT. This paper presents Separable Sequent Calculus as an extension of Gentzen’s first-order classical sequent calculus with Herbrand-style structural rules (subformula contraction and weakening). Every proof can be transformed into separated form, in which all logical rules precede all structural rules, a result in the spirit of Gentzen’s mid-sequent theorem or sharpened Hauptsatz (where all propositional rules precede all quantifier rules), and Herbrand’s theorem. Every separated proof has a direct graph-theoretic semantics as a *combinatorial proof*. Since a standard Gentzen proof is a special case of a separable proof, every Gentzen proof has a combinatorial proof semantics via its separated form. The semantics identifies sequent proofs which differ by rule transpositions (such as pushing a contraction below a weakening). Since combinatorial proofs can be verified in polynomial time (conjectured linear time), they constitute a syntax in their own right, arguably a canonical abstraction of sequent calculus modulo inessential rule transposition. 
15:20  Cut-Elimination in Schematic Proofs and Herbrand Sequents SPEAKER: unknown ABSTRACT. In a recent paper [2], a procedure was developed extending the first-order CERES method [1] so that it can handle cut-elimination in a schematic first-order calculus. The goal of this work was to circumvent the problems reductive cut-elimination methods face when the LK calculus is extended by an induction rule. The schematic calculus can be considered a replacement for certain types of induction. In this work, we used the schematic CERES method to analyse a proof formalized in a schematic sequent calculus. The statement being proved is a simple mathematical statement about total functions with a finite range. The goal of proof analysis using the first-order CERES method [1] has been to produce an ACNF (Atomic Cut Normal Form) as the final output of cut-elimination. However, due to the complexity of the schematic method, the value and usefulness of an ACNF quickly vanishes; it is not easily parsable by humans. The Herbrand sequent corresponding to an ACNF turned out to be a valuable, compact and informative structure, which may be considered the essence of a cut-free proof in first-order logic [3]. We provide a method for extracting a schematic Herbrand sequent from the formalized proof and hint at how, in future work, we can generalize the procedure to handle a class of proofs via a suitable schematic language and calculus, and not just a particular instance. 
15:40  Stratified Nested Linear Logic SPEAKER: Matthieu Perrinel ABSTRACT. Implicit computational complexity (ICC) is the field aiming to characterize complexity classes by syntactic restrictions on models of computation. For example, Light Linear Logic (LLL) is a linear logic subsystem characterizing Ptime. Finding more expressive characterizations of Ptime is a big challenge in ICC. This led, for instance, to L^4 and MS, which are orthogonal extensions of LLL. Here, we define a linear logic subsystem, Stratified Nested Linear Logic (SNLL), which extends LLL, L^4 and MS. This system is not just the union of those systems, as it can type lambda-terms which are typed by none of the above systems. 
15:00  On Machines, Virtual Memory and Successful System Verification SPEAKER: Rafal Kolanski ABSTRACT. Formal verification of programs and systems lacks a concept of objective and absolute correctness. One may only claim correctness with respect to a model. How detailed can we make this model? At which point does our desire for completeness infringe upon project tractability? For operating system verification in particular, overly naïve models result in poor performance and ineffective use of hardware features. On the other hand, overly complex models result in human resource exhaustion and a graveyard of ambitious projects that burnt out in the race for realism. In this talk, I will discuss these tradeoffs based upon my long experience with the L4.verified project, a successful 20 person-year effort to verify the functional correctness of the seL4 microkernel. I will also discuss my own efforts in searching for an elegant semantics for reasoning about virtual memory based on separation logic. 
15:30  Confluence of linear rewriting and homology of algebras SPEAKER: Philippe Malbos ABSTRACT. We introduce the notion of higher dimensional linear rewriting system for presentations of algebras generalizing the notion of noncommutative Gröbner bases. We show how to use this notion to compute homological invariants of associative algebras, related to confluence properties of presentations of these algebras. Our method constitutes a new application of confluence in algebra. 
15:30  Generic CDCL – A Formalization of Modern Propositional Satisfiability Solvers SPEAKER: Tobias Philipp ABSTRACT. Modern propositional satisfiability (or SAT) solvers are very powerful due to recent developments on the underlying data structures, the heuristics used to guide the search, the deduction techniques to infer knowledge, and the formula simplification techniques that are used during pre- and inprocessing. However, when all these techniques are put together, the soundness of the combined algorithm is not guaranteed any more, and understanding the complex dependencies becomes non-trivial. In this paper we present a small set of rules that allows us to model modern SAT solvers in terms of a state transition system. With these rules, all techniques which are applied in modern SAT solvers can be modeled adequately. Furthermore, we show that this set of rules is sound, complete and confluent. Finally, we compare the proposed transition system to related systems, and show how widely used solving techniques can be modeled. 
15:30  Lower bounds for splittings by linear combinations SPEAKER: Dmitry Itsykson ABSTRACT. A typical DPLL algorithm for the Boolean satisfiability problem splits the input problem into two by assigning the two possible values to a variable; then it simplifies the two resulting formulas. In the talk we consider an extension of the DPLL paradigm. Our algorithms can split by an arbitrary linear combination of variables modulo two. These algorithms quickly solve formulas that explicitly encode linear systems modulo two, which were used for proving exponential lower bounds for conventional DPLL algorithms. 
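The formulas mentioned above as hard for conventional DPLL encode linear systems modulo two, and the contrast can be sketched (as an illustration, not the talk's construction): Gaussian elimination over GF(2) decides such systems in polynomial time, which is essentially what splitting by linear combinations exploits. The row representation and the `gf2_solvable` helper below are assumptions made for this sketch.

```python
# Sketch: deciding a linear system mod 2 by Gaussian elimination over GF(2).
# Each equation is (bitmask_of_variables, rhs_bit); bit i of the mask means
# variable x_{i+1} occurs in the equation.

def gf2_solvable(rows, n):
    """Return True iff the system of XOR equations over n variables is satisfiable."""
    rows = list(rows)
    for col in range(n):
        # pick a pivot row containing variable `col`
        pivot = next((r for r in rows if (r[0] >> col) & 1), None)
        if pivot is None:
            continue
        rows.remove(pivot)               # pivot row is always satisfiable
        # eliminate variable `col` from all remaining rows
        rows = [(m ^ pivot[0], b ^ pivot[1]) if (m >> col) & 1 else (m, b)
                for (m, b) in rows]
    # after elimination, inconsistency shows up as a row "0 = 1"
    return all(not (m == 0 and b == 1) for (m, b) in rows)

# x1+x2 = 1, x2+x3 = 1, x1+x3 = 1 (mod 2): an odd cycle, hence unsatisfiable
rows = [(0b011, 1), (0b110, 1), (0b101, 1)]
```

Formulas propositionally encoding such odd-cycle systems are classic exponential-lower-bound instances for variable-splitting DPLL, while an algorithm allowed to split on x1+x2 (mod 2) handles them quickly.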
15:30  Injectivity of Relational Semantics for (Connected) MELL Proof-Nets via Taylor Expansion SPEAKER: Luc Pellissier ABSTRACT. We show that: (1) the Taylor expansion of a cut-free MELL proof-structure R with atomic axioms is the (most informative part of the) relational semantics of R; (2) every (connected) MELL proof-net is uniquely determined by the element of order 2 of its Taylor expansion; (3) the relational semantics is injective for (connected) MELL proof-nets. 
16:30  The Variable Hierarchy on Ordered Graphs SPEAKER: Benjamin Rossman 
17:30  Closing Remarks SPEAKER: Neil Immerman 
16:30  Ludics without Designs I: Triads SPEAKER: Michele Basaldella ABSTRACT. In this paper, we introduce the concept of triad. Using this notion, we study, revisit, discover and 
17:00  Wave-Style Token Machines and Quantum Lambda Calculi SPEAKER: unknown ABSTRACT. Particle-style token machines are a way to interpret proofs and programs, when the latter are written following the principles of linear logic. In this paper, we show that token machines also make sense when the programs at hand are those of a simple quantum lambda-calculus. This, however, requires generalizing the concept of a token machine to one in which more than one particle travels around the term at the same time. The presence of multiple tokens is intimately related to entanglement and allows us to give a simple operational semantics to the calculus, coherently with the principles of quantum computation. 
17:30  Geometry of Resource Interaction – A Minimalist Approach SPEAKER: Marco Solieri ABSTRACT. The Resource λ-calculus (RC) is a variation of the λ-calculus where arguments can be superposed and must be used linearly. Hence it is a model for non-deterministic and linear programming languages, and the target language of the Taylor expansion of λ-terms. In a strictly typed restriction of RC, we study the notion of path persistency and we define a Geometry of Interaction inspired by the dynamic algebra. We show that the GoI is invariant under reduction and characterises persistency by means of path regularity. Our construction is also complexity-aware, in the sense that it is able to count the non-zero addends in the normal forms of a resource term. 
18:00  A new point of view on the Taylor expansion of proof-nets and uniformity SPEAKER: Giulio Guerrieri ABSTRACT. We introduce (in the multiplicative and exponential fragment of linear logic) the notion of proto-differential net. We then define a coherence relation on the set of proto-differential nets and prove the analogue of the result proven for differential lambda-terms, which does not hold for differential nets: a set of proto-differential nets is a subset of the proto-Taylor expansion of some proof-structure if and only if it is a clique. 
16:30  Notes on Structure-Preserving Transformations of Conditional Term Rewrite Systems SPEAKER: Karl Gmeiner ABSTRACT. Transforming conditional term rewrite systems (CTRSs) into unconditional systems (TRSs) is a common approach to analyze properties of CTRSs via the simpler framework of unconditional rewriting. In the past many different transformations have been introduced for this purpose. One class of transformations, so-called unravelings, has been analyzed extensively in the past. In this paper we provide an overview of another class of transformations that we call structure-preserving transformations. In these transformations the structure of the conditional rules, in particular their left-hand sides, is preserved, in contrast to unravelings. We provide an overview of transformations of this type and define a new transformation that improves upon previous approaches. 
17:00  A Complexity Preserving Transformation from Jinja Bytecode to Rewrite Systems SPEAKER: unknown ABSTRACT. We revisit known transformations from object-oriented bytecode programs to rewrite systems from the viewpoint of runtime complexity. Suitably generalising the constructions proposed in the literature, we define an alternative representation of Jinja bytecode (JBC) executions as computation graphs, from which we obtain a representation of JBC executions as constrained rewrite systems. We show that the transformation is complexity preserving. We restrict to non-recursive methods and make use of heap shape pre-analyses. 
17:30  Closing SPEAKER: Manfred SchmidtSchauß 
16:30  Using Isabelle/HOL to Develop and Maintain Separation Invariants for an Operating System (Extended Abstract) SPEAKER: Burkhart Wolff ABSTRACT. We describe work on an Isabelle/HOL model for the specification of a separation kernel done within the EUROMILS (http://www.euromils.eu/) project. We chose extensible records to specify the state. Using the example of a theory specifying a group of "event" API calls, we show how lemmas on local state are used to obtain proof obligations for a global separation property. 
17:00  A Simpl Shortest Path Checker Verification SPEAKER: Christine Rizkallah ABSTRACT. Using current verification tools to verify complex algorithms in reasonable time is challenging. Certifying algorithms compute not only an output but also a witness certifying that the output is correct. A checker for a certifying algorithm is a simple program that decides whether the witness is correct for a particular input and output. Verification of checkers is feasible and leads to trustworthy computations. In previous work, we verified checkers from the algorithmic library LEDA using the interactive theorem prover Isabelle/HOL as a backend to the automatic code verifier VCC. More recently, we showed that verification can be carried out completely within Isabelle/HOL and compared this to the previous approach, concluding that the more recent approach is more trustworthy with comparable effort. Here, we use the more recent framework and implement a shortest path checker algorithm for graphs with non-negative edge weights in the imperative pseudocode language Simpl. Moreover, we formally verify the checker using Isabelle/HOL. 
17:20  Towards Structured Proofs for Program Verification (Ongoing Work) SPEAKER: Lars Noschinski ABSTRACT. The first step in program verification is often a call to a verification condition generator (VCG) which decomposes a program into a set of logical formulas, the verification conditions. These verification conditions must then be discharged manually. The process of constructing the verification conditions loses much of the structure of the original program. In this article, we extend the VCG to attach annotations to the verification conditions and hence preserve the program structure. We implemented this approach in the Isabelle theorem prover. This allows us to replace unstructured proof scripts by structured Isar proofs. 
17:40  Amortized Complexity Verified SPEAKER: Tobias Nipkow ABSTRACT. A framework for the analysis of the amortized complexity of (functional) data structures is formalized in Isabelle/HOL and applied to two non-trivial examples: skew heaps and splay trees. In the same spirit we show that the move-to-front algorithm for the list update problem is 2-competitive. 
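The list update problem mentioned in this abstract can be sketched concretely (this is an independent illustration, not the Isabelle formalization): serving a request costs the item's 1-based position in the list, and the move-to-front algorithm then moves the requested item to the head. The `mtf_cost` helper and the example request sequence are assumptions for illustration.

```python
# Sketch of the list update problem with the move-to-front (MTF) algorithm:
# accessing an item costs its 1-based position; MTF then moves it to the front.

def mtf_cost(initial, requests):
    """Total access cost of serving `requests` with MTF from list `initial`."""
    lst = list(initial)
    total = 0
    for x in requests:
        pos = lst.index(x) + 1           # access cost = current position
        total += pos
        lst.insert(0, lst.pop(pos - 1))  # move requested item to the front
    return total

# repeated requests to "c" become cheap once it is moved to the front
cost = mtf_cost(["a", "b", "c"], ["c", "c", "c", "b"])
```

Here the first access to "c" costs 3, but the next two cost 1 each; 2-competitiveness means such totals are within a factor of 2 (plus lower-order terms) of any offline strategy's cost.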
16:30  An Implementation Model for Interaction Nets SPEAKER: Shinya Sato ABSTRACT. To study implementations and optimisations of interaction net systems we propose a calculus to allow us to reason about nets, a concrete datastructure that is in close correspondence with the calculus, and a lowlevel language to create and manipulate this data structure. These work together so that we can describe the compilation process for interaction nets, reason about the behaviours of the implementation, and study the efficiency and properties. 
17:00  Fixed Point Theory for Consistent Rewriting in Logic of Action with Reverse SPEAKER: Susumu Yamasaki ABSTRACT. This paper studies an action structure to recursively derive actions by rewriting, where actions are abstract, as in process algebra, and each action has a reverse. A representation method for action denotations with action defaults is given by fixed point theory, where the denotations arise from the coexistence of actions and their reverses. As a second method, the reverse action is regarded as a multiplicative inverse in a semiring, which can be formed by algebraic compositions of action structures to primarily express sequences of actions with the action denotations. In both approaches, consistency is respected by classifying each action and its reverse into different sorts. 
16:30  Link formulas in implication fragments of substructural logics SPEAKER: Tomasz Kowalski ABSTRACT. Two well-known implication fragments of substructural logics are BCI (fragment of logic without contraction and weakening) and BCK (fragment of logic without contraction). The technique of link formulas was developed by the present author first for BCI, as a tool to solve an open problem. Then, a modification of the technique turned out to work for BCK and delivered a proof of its structural incompleteness. The technique can be used to obtain relatively easy proofs of admissibility of certain rules in BCK and BCI, yielding structural incompleteness proofs of several logics in the vicinity. I will present the technique in a uniform way, applicable to BCI and BCK, as well as to the implication fragments of the relevant logic R and the intuitionistic logic. Some applications (old and new) will be presented to illustrate the technique. 
16:55  A proof theoretic approach to standard completeness SPEAKER: Paolo Baldi ABSTRACT. We show extensions of the proof theoretic approach to standard completeness, based on the elimination of the density rule in suitable hypersequent calculi. 
17:20  On Affine Logic and Łukasiewicz Logic SPEAKER: Rob Arthan ABSTRACT. We report on a study of intuitionistic fragments of Łukasiewicz logic. We present, in a structured human-readable form, syntactic proofs of a number of important formulas originally found with the assistance of the Prover9 automated theorem prover. The identities include homomorphism properties that we put to work in an analysis of the double negation translations of Kolmogorov, Gödel, Gentzen and Glivenko. 
17:45  Simpler Proof Net Quantifiers: Unification Nets (Work in Progress) SPEAKER: Dominic Hughes ABSTRACT. Girard formulated an approach to proof net quantifiers over a series of three papers. Since the approach uses explicit witnesses for existential quantifiers, it suffers from two problems: limited proof identification and nonlocal cut elimination. This paper introduces a more abstract approach, unification nets, which solves these problems by leaving existential witnesses implicit: a unification net is merely an axiom linking. 
16:30  Synthesis of distributed systems using QBF SPEAKER: Adrià Gascón ABSTRACT. We revisit the interactive consistency problem introduced We discovered our algorithm

17:00  The QBF Gallery 2014 SPEAKER: unknown 
16:30  Discussion Session SPEAKER: Everyone ABSTRACT. Interspersed with our hack sessions on implementing and discussing bugs and new features for HOL4, we will have time for discussion of e.g. future directions for HOL workshops, cool applications and theorem proving ideas, impromptu talks, Q&A, etc. 
17:00  Hack Session 3 SPEAKER: Everyone 
16:30  Normalization Equivalence of Rewrite Systems SPEAKER: Aart Middeldorp ABSTRACT. Métivier (1983) proved that every confluent and terminating rewrite system can be transformed into an equivalent canonical rewrite system. He also proved that equivalent canonical rewrite systems which are compatible with the same reduction order are unique up to variable renaming. In this note we present simple and formalized proofs of these results. The latter result is generalized to the uniqueness of normalization equivalent reduced rewrite systems. 
17:00  Non-E-overlapping and weakly shallow TRSs are confluent (Extended abstract) SPEAKER: Mizuhito Ogawa ABSTRACT. This paper briefly sketches that non-E-overlapping and weakly shallow TRSs are confluent, by extending the reduction graph of our previous work (M. Sakai and M. Ogawa. Weakly-non-overlapping non-collapsing shallow term rewriting systems are confluent. Information Processing Letters vol. 110, 2010) with constructor expansion. A term is weakly shallow if each defined function symbol appears either at the root or in the ground subterms, and a TRS is weakly shallow if both sides of its rules are weakly shallow. The non-E-overlapping property is undecidable for weakly shallow TRSs; a decidable sufficient condition is the strongly non-overlapping condition. A Turing machine can be simulated by a weakly shallow TRS (p. 27 in J.W. Klop, Term Rewriting Systems, 1993); thus the word problem is undecidable, in contrast to shallow TRSs. 
17:30  Confluence Competition SPEAKER: Harald Zankl ABSTRACT. The 3rd Confluence Competition (CoCo 2014) runs live. The competition is part of the FLoC'14 Olympic Games. 
16:30  MiniME 2.0: powering the Semantic Web of Things SPEAKER: Floriano Scioscia ABSTRACT. This paper presents an updated version of MiniME, a mobile reasoner for the Semantic Web of Things. Building upon its previous strong points, i.e. computational efficiency and support for non-standard inference services, novel features have been added. In particular, the Concept Covering reasoning task for request answering via service/resource composition has been included among the allowed inferences, Protégé plugins have been released, and support for the OWLlink protocol is now available. As a proof of concept, two use cases are presented, both in the mobile and ubiquitous computing field: a wireless semantic sensor network and a mobile semantic augmented reality scenario. 
16:50  Incremental and Persistent Reasoning in FaCT++ SPEAKER: Dmitry Tsarkov ABSTRACT. Reasoning in complex DLs is well known to be expensive. However, in numerous application scenarios, the ontology in use is either never modified at all (e.g., in query answering), or the amount of updates is negligible in comparison with the whole ontology (e.g., minor manual edits, addition of a few individuals). In order to efficiently support these scenarios, FaCT++ implements the following two techniques: persistent and incremental reasoning. In persistent reasoning mode, after classification, the reasoner saves its internal state, including computed information (e.g., concept taxonomy) on a persistent medium; the next use of the ontology will not require classification to be performed from scratch, but just reading an input file. In incremental reasoning mode, the reasoner is notified of a change and identifies a (usually small) portion of its internal state that is affected by the change. This is the only part that requires recomputation. This approach can lead to greater overall efficiency, when compared with having to reload and reclassify the whole ontology. 
16:30  iDQ: Instantiation-Based DQBF Solving SPEAKER: Gergely Kovásznai ABSTRACT. Dependency Quantified Boolean Formulas (DQBF) are obtained by adding Henkin quantifiers to Boolean formulas and have seen growing interest in the last years. Since deciding DQBF is NEXPTIME-complete, efficient ways of solving it would have many practical applications. Still, there is only little work on solving this kind of formula in practice. In this paper, we present an instantiation-based technique to solve DQBF efficiently. Apart from providing a theoretical foundation, we also propose a concrete implementation of our algorithm. Finally, we give a detailed experimental analysis evaluating our prototype iDQ on several DQBF as well as QBF benchmarks. 
17:00  Instance-based Selection of CSP Solvers using Short Training SPEAKER: unknown ABSTRACT. Many different approaches for solving Constraint Satisfaction Problems (CSP) (and related Constraint Optimization Problems (COP)) exist. However, there is no single solver that performs well on all classes of problems, so many portfolio approaches for selecting a suitable solver based on simple syntactic features of the input CSP instance have been developed. Unlike other existing portfolio approaches for CSP, our methodology is based on training with very short timeouts, significantly reducing overall training time. Still, thorough evaluation has been performed on large publicly available corpora; our portfolio method gives good results, improves upon the efficiency of the single state-of-the-art tools used in comparison, and is comparable to classical methods that use long timeouts during the training phase. 
17:30  Incremental SAT-based Method with Native Boolean Cardinality Handling for the Hamiltonian Cycle Problem SPEAKER: Takehide Soh ABSTRACT. The Hamiltonian cycle problem (HCP) is the problem of finding a spanning cycle in a given graph. HCP is NP-complete and has been known as an important problem due to its close relation to the travelling salesman problem (TSP). TSP can be seen as an optimization variant: finding a minimum cost cycle. In this paper, we propose an incremental SAT-based method for solving HCP. The increase of clauses often prevents SAT-based methods from being scalable. Our method reduces the number of encoded clauses by relaxing constraints and using the native handling of cardinality constraints which is realized by a special propagator in Sat4j. Experimental evaluations are carried out on four benchmark sets, and we compare our incremental SAT-based method with an existing eager SAT-based method and specialized methods. 
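As a point of reference for the problem this abstract targets (and not the SAT encoding or the Sat4j propagator of the paper), HCP can be stated as a brute-force search: find an ordering of all vertices whose consecutive pairs, plus the wrap-around pair, are all edges. The `has_hamiltonian_cycle` helper and the example graphs are assumptions made for this sketch; it is exponential, which is exactly why encodings into incremental SAT are of interest.

```python
# Brute-force baseline for the Hamiltonian cycle problem (HCP): check every
# cyclic ordering of the vertices. Vertices are 0..n-1, edges undirected.
from itertools import permutations

def has_hamiltonian_cycle(n, edges):
    """Return True iff the graph has a cycle visiting every vertex exactly once."""
    adj = {(u, v) for u, v in edges} | {(v, u) for u, v in edges}
    # fix vertex 0 as the start to avoid checking rotations of the same cycle
    for perm in permutations(range(1, n)):
        path = (0,) + perm
        if all((path[i], path[i + 1]) in adj for i in range(n - 1)) \
                and (path[-1], path[0]) in adj:
            return True
    return False

# the 4-cycle 0-1-2-3-0 is itself a Hamiltonian cycle
square = [(0, 1), (1, 2), (2, 3), (3, 0)]
```

A path graph such as 0-1-2-3 has no Hamiltonian cycle, since the wrap-around edge is missing.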
16:30  Narrow Proofs May Be Maximally Long SPEAKER: Massimo Lauria ABSTRACT. We prove that there are 3-CNF formulas over n variables that can be refuted in resolution in width w but require resolution proofs of size n^{Ω(w)}. This shows that the simple counting argument that any formula refutable in width w must have a proof of size n^{O(w)} is essentially tight. Moreover, our lower bounds can be generalized to polynomial calculus resolution (PCR) and Sherali-Adams, implying that the corresponding size upper bounds in terms of degree and rank are tight as well. Our results do not extend all the way to Lasserre, however: the formulas we study have Lasserre proofs of constant rank and size polynomial in both n and w. 
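The counting argument referred to above can be made concrete (as an illustration; the function name is an assumption): a width-w resolution refutation only ever mentions clauses of width at most w, and over n variables there are at most sum over k ≤ w of C(n, k) · 2^k such clauses, since a width-k clause picks k variables and a polarity for each. For fixed w this is n^{O(w)}, which bounds the proof size.

```python
# Counting argument behind the n^{O(w)} size upper bound for width-w
# resolution: count all clauses of width at most w over n variables.
from math import comb

def clauses_of_width_at_most(n, w):
    """Choose k of n variables and a sign for each, for every k <= w."""
    return sum(comb(n, k) * 2 ** k for k in range(w + 1))

# n = 4 variables, width <= 2: 1 + 8 + 24 = 33 possible clauses
count = clauses_of_width_at_most(4, 2)
```

The paper's result says this crude bound cannot be improved in general: some formulas refutable in width w really do need proofs of size n^{Ω(w)}.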
17:00  Parity games and propositional proofs SPEAKER: Neil Thapen ABSTRACT. Parity games are a class of two-player games played by moving a token around a finite graph. They have important applications in automata theory, logic and verification. The main computational problem for such games is to decide which player has a winning strategy. This problem is reducible to a search problem in the intersection of the classes PLS and PPAD, but is not known to be in P, despite intensive research. A propositional proof system is weakly automatizable if there is a polynomial-time algorithm which separates satisfiable formulas from formulas which have a short refutation in the system, with respect to a given length bound. We show that if resolution is weakly automatizable, then the decision problem for parity games is in P. We also give simple proofs that the same holds for depth-1 propositional calculus (where resolution has depth 0) with respect to the related classes of mean payoff games and simple stochastic games. We define a new type of combinatorial game and prove that resolution is weakly automatizable if and only if one can separate, by a set decidable in polynomial time, the games in which the first player has a positional winning strategy from the games in which the second player has a positional winning strategy. Our main technique is to show that a suitable weak bounded arithmetic theory proves that both players in a game cannot simultaneously have a winning strategy, and then to translate this proof into propositional form. 
16:30  Discussion on the Future of RTA (+ TLCA) SPEAKER: Georg Moser 
17:00  Discussion on the Open Problems List in Rewriting SPEAKER: Nachum Dershowitz 
17:30  Business Meeting SPEAKER: Jürgen Giesl 
16:30  The role of domain-specific languages in simulation studies: Analyzing the impact of membrane dynamics on the Wnt signaling pathway SPEAKER: Adelinde Uhrmacher ABSTRACT. In a simulation study aimed at developing a model that answers questions of interest about a target system, phases of model development and refinement are interwoven with phases of experimentation. During this process, a variety of experiments are typically executed, including parameter scans, optimization, or sensitivity analysis. Domain-specific languages, with their expressiveness and succinctness, play a central role, not only to describe and revise a model, but also to specify and generate experiments, and thus to facilitate their reproduction. We will elucidate this role with the external domain-specific language ML-Rules (a rule-based multi-level modeling language for biochemical systems), SESSL (an internal domain-specific language for specifying simulation experiments), and a case study analyzing the impact of membrane dynamics on the Wnt signaling pathway. 
16:30  Scrapping Your Dependently-Typed Boilerplate is Hard SPEAKER: Ahmad Salim Al-Sibahi ABSTRACT. Complex programs that model programming languages or specific domain information end up with large nested data structures, and as a consequence changing even simple properties requires a full traversal of the structure. The Haskell community has developed various techniques that reduce this boilerplate by providing generic frameworks for querying and traversal, such as Scrap Your Boilerplate (SYB) [1] and Uniplate [2]. Since many modern dependently-typed languages such as Agda [3] or Idris [4] are inspired by Haskell, it would be reasonable to expect that such frameworks could be provided in these languages. However, as I will show in my presentation, there are some unresolved issues that prevent these approaches from being used as-is. 
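For readers unfamiliar with the SYB style of generic traversal mentioned above, here is a tiny Python stand-in (not Agda or Idris, and not the SYB library itself): one combinator that applies a function at every node of a nested structure, eliminating the per-type boilerplate that a hand-written traversal would need.

```python
from dataclasses import is_dataclass, fields, replace

def everywhere(f, value):
    """Apply f at every node of a structure built from dataclasses,
    lists and tuples, bottom-up -- a minimal analogue of SYB's
    `everywhere` combinator (illustrative only)."""
    if is_dataclass(value) and not isinstance(value, type):
        value = replace(value, **{
            fld.name: everywhere(f, getattr(value, fld.name))
            for fld in fields(value)})
    elif isinstance(value, list):
        value = [everywhere(f, v) for v in value]
    elif isinstance(value, tuple):
        value = tuple(everywhere(f, v) for v in value)
    return f(value)

# Increment every int anywhere in a nested structure, with no
# per-type traversal code:
print(everywhere(lambda v: v + 1 if isinstance(v, int) else v,
                 [1, (2, 3)]))  # → [2, (3, 4)]
```

In a dependently-typed setting the challenge is to give such a combinator a type that works uniformly over indexed datatypes, which is where the talk's "unresolved issues" arise.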
17:00  On a style of presentation of type systems SPEAKER: Alvaro Tasistro ABSTRACT. We outline a way of formulating and explaining type systems/functional programming languages that we believe could be both logically wellfounded and didactic. We propose to discuss the details for both simple and dependent type systems. 
17:30  Sequential decision problems and a computational theory of avoidability SPEAKER: unknown ABSTRACT. Please see the uploaded PDF file. 
16:30  Probabilistic Types and Function Overloading SPEAKER: Alessandra Di Pierro ABSTRACT. We present a theory of types where, besides the function type, we introduce a new type constructor for probabilistic choice. This constructor allows for the selection of a particular type among a finite set of options according to a given probability. This theory induces a type assignment system for a probabilistic $\lambda$-calculus which we show to be sound with respect to a probabilistic term reduction. We also illustrate the use of this type system in order to express a new form of ad hoc polymorphism, which allows for a probabilistic selection of the code of a polymorphic function to be executed. 
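A rough operational intuition for the probabilistic selection of code described above, sketched in untyped Python (the paper's system is a typed λ-calculus; this helper and all names in it are purely illustrative):

```python
import random

def make_probabilistic(choices):
    """Build a function that, on each call, runs one implementation
    drawn from `choices`, a list of (probability, function) pairs.
    A rough analogue of probabilistically selecting among the bodies
    of an overloaded function."""
    probs = [p for p, _ in choices]
    impls = [f for _, f in choices]
    def dispatch(*args):
        impl = random.choices(impls, weights=probs, k=1)[0]
        return impl(*args)
    return dispatch

# "double" has an arithmetic body with probability 0.7 and a
# string body with probability 0.3:
double = make_probabilistic([(0.7, lambda x: x * 2),
                             (0.3, lambda x: str(x) + str(x))])
print(double(42))  # either 84 or "4242", chosen probabilistically
```

The type system described in the abstract would assign such a function a probabilistic choice type recording both possible result types with their probabilities.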
17:00  Differential privacy at work: Verification of approximate probabilistic programs and models for choosing epsilon SPEAKER: unknown ABSTRACT. This submission surveys some ongoing work on the application of differential privacy—a strong probabilistic privacy guarantee that has recently been under intensive study. We will discuss ideas introduced in previous works, with the overall theme of putting differential privacy to practical use. We have three main goals. First, we wish to explain the concept of differential privacy and give an idea of why it is useful. Next, from a verification perspective, differential privacy can be seen as an approximate notion of equivalence between probabilistic programs. So, our second goal is to direct attention towards problems related to the probabilistic programming model, which has so far been less explored. Finally, verifying the differential privacy property is only half the challenge—as we will see, the property is parameterized by a numeric value ε, which controls the strength of the guarantee. Hence, a program might be successfully verified to be differentially private, but for an ε that gives a uselessly weak guarantee. Thus, our final, more open-ended goal is to investigate which values of ε give practically useful guarantees, and what this means for verification. 
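For reference, the guarantee parameterized by ε mentioned above is the standard definition of ε-differential privacy: a randomized mechanism M is ε-differentially private if for all neighbouring databases D and D' (differing in a single record) and all sets S of outputs,

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S].
```

Smaller ε means a stronger guarantee, which is why a program can be successfully verified differentially private and yet useless in practice if the verified ε is large.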
17:30  Interacting Particle Systems and Random Walks on Hypergraphs SPEAKER: unknown ABSTRACT. We study hypergraph analogues of interacting particle systems and random walks, notably generalizations of coalescing and annihilating random walks. Their definition is motivated by the problem of analyzing the expected running time of a local search procedure for the $k$-XOR-SAT problem, as well as a certain constrained triad dynamics in the theory of social balance. 
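As a toy illustration of the annihilating random walks mentioned above (on an ordinary cycle graph, not the talk's hypergraph generalization), here is a hypothetical simulation sketch; since particles meeting at a site cancel in pairs, the parity of the particle count is invariant and the count never increases.

```python
import random

def annihilating_walk(n, particles, steps, seed=0):
    """Annihilating random walks on a cycle of n sites: each step every
    particle moves to a uniformly random neighbour, and particles
    meeting at a site cancel in pairs.  Returns the number of
    survivors.  (A plain-graph toy, not the hypergraph setting.)"""
    rng = random.Random(seed)
    occupied = set(rng.sample(range(n), particles))
    for _ in range(steps):
        counts = {}
        for site in occupied:
            nxt = (site + rng.choice((-1, 1))) % n
            counts[nxt] = counts.get(nxt, 0) + 1
        # an odd count leaves one survivor; even counts annihilate fully
        occupied = {s for s, c in counts.items() if c % 2 == 1}
    return len(occupied)

print(annihilating_walk(n=20, particles=6, steps=100))
```

On a hypergraph, a "neighbour" move would traverse a hyperedge containing the current site, which changes the meeting probabilities that drive the expected running time analysis.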
16:30  A Cross-Language Framework for Verifying Compiler Optimizations SPEAKER: William Mansky ABSTRACT. Most compiler correctness efforts, whether based on validation or once-and-for-all verification, are tightly tied to the particular language(s) under consideration. Proof techniques may be replicated for other targets, and parts of the compiler chain may be shared for new input or output languages, but the extent to which common elements can be generalized across multiple targets has not been fully explored. In this paper, we lay out a general approach to specifying and verifying optimizations and transformations on low-level intermediate languages. By generalizing across elements such as concurrent memory models and single-thread operational semantics, we can build a library of facts that can be reused in verifying optimizations for dramatically different target languages, such as stack-machine and register-machine languages. 
17:00  Call-by-Value in a Basic Logic for Interaction SPEAKER: Ulrich Schöpp ABSTRACT. In game semantics and related approaches to programming language semantics, programs are modelled by interaction dialogues. Such models have recently been used by a number of authors for the design of compilation methods, in particular for applications where resource control is important. The work in this area has focused on call-by-name languages. In this talk I will consider the compilation of call-by-value into a first-order target language by means of an interpretation in an interactive model. The main result is that Plotkin's standard call-by-value CPS translation and its soundness proof can be refined to target this intermediate language. This refined CPS translation amounts to a direct compilation of the source language into a first-order language. 
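Plotkin's standard call-by-value CPS translation mentioned above can be written down concretely. A minimal Python sketch over tuple-encoded λ-terms (the talk's refinement targeting a first-order language is not reproduced here):

```python
import itertools

_fresh = itertools.count()

def fresh(base):
    """Generate a fresh variable name (illustrative naming scheme)."""
    return f"{base}{next(_fresh)}"

def cps(term):
    """Plotkin's call-by-value CPS translation on λ-terms encoded as
    tuples: ('var', x), ('lam', x, body), ('app', fn, arg)."""
    tag = term[0]
    k = fresh("k")
    if tag == "var":
        # [[x]] = \k. k x
        return ("lam", k, ("app", ("var", k), term))
    if tag == "lam":
        # [[\x. M]] = \k. k (\x. [[M]])
        _, x, body = term
        return ("lam", k, ("app", ("var", k), ("lam", x, cps(body))))
    if tag == "app":
        # [[M N]] = \k. [[M]] (\m. [[N]] (\n. (m n) k))
        _, m_t, n_t = term
        m, n = fresh("m"), fresh("n")
        inner = ("lam", n, ("app", ("app", ("var", m), ("var", n)),
                            ("var", k)))
        return ("lam", k, ("app", cps(m_t),
                           ("lam", m, ("app", cps(n_t), inner))))
    raise ValueError(tag)

print(cps(("var", "x")))
```

Every translated term expects a continuation k, so evaluation order is made explicit in the term structure, which is what makes the translation a useful stepping stone towards a first-order intermediate language.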
16:30  A sheaf model of the algebraic closure SPEAKER: unknown ABSTRACT. In constructive algebra one cannot in general decide the irreducibility of a polynomial over a field K. This poses problems for showing the existence of the algebraic closure of K. We give a possible constructive interpretation of the existence of the algebraic closure of a field in characteristic 0 by building, in a constructive metatheory, a suitable site model in which such an algebraic closure exists. One can then extract computational content from this model. We give examples of computation based on this model. 
17:00  A "Game Semantical" Intuitionistic Realizability Validating Markov's Principle SPEAKER: unknown ABSTRACT. We propose a very simple modification of Kreisel's modified realizability in order to computationally realize Markov's Principle in the context of Heyting Arithmetic in all finite types. Intuitively, realizers correspond to arbitrary strategies in Hintikka-Tarski games, while in Kreisel's realizability they can only represent winning strategies. Our definition, however, does not directly employ game-semantical concepts and remains in the style of functional interpretations. As term calculus, we employ a purely functional language: Gödel's System T enriched with some syntactic sugar. Our work differs from the Dialectica interpretation in two ways: first, the realizability notion is in the style of Kreisel's and thus less technically involved; second, the term assignment to proofs follows the standard Curry-Howard correspondence for Heyting Arithmetic and is thus considerably simpler. Note: talk extracted from the paper http://perso.ens-lyon.fr/federico.aschieri/Types2013AschieriZorzi.pdf to appear in the Post-Proceedings of TYPES 2013. 
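The principle being realized above is Markov's Principle, which for a decidable (e.g. primitive recursive) predicate P states

```latex
\mathrm{MP}:\quad \neg\neg\,\exists x\,P(x) \;\rightarrow\; \exists x\,P(x).
```

Classically trivial, MP is not derivable in Heyting Arithmetic, which is why a realizability interpretation that computationally validates it is of interest.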
17:20  Relative Computability and Uniform Continuity of Non-Extensional (aka Multi-valued) 'Functions' SPEAKER: Arno Pauly ABSTRACT. In this short submission we report on (and advertise :) our recent work that appeared in Journal of Logic & Analysis vol. 5:7 (2013) 1–39. A computable real function is necessarily continuous, and this remains true for computations relative to any oracle. Conversely, by the Weierstrass Approximation Theorem, every continuous f : [0;1] -> R is computable relative to some oracle. In their search for a similar topological characterization of relatively computable MULTI-valued functions f : [0;1] => R (also known as multifunctions or relations), Brattka and Hertling (1994) have considered two notions: weak continuity (which is weaker than relative computability) and strong continuity (which is stronger than relative computability). Observing that UNIFORM continuity plays a crucial role in the Weierstrass Theorem, we propose and compare several notions of uniform continuity for relations. Here, due to the additional quantification over values y in f(x), new ways arise of (linearly) ordering the quantifiers, yet none turns out satisfactory. We are thus led to a concept of uniform continuity based on the Henkin quantifier, and prove it necessary for relative computability of compact real relations. In fact, iterating this condition yields a strict hierarchy of notions, each necessary (and the omega-th level also sufficient) for relative computability. A refined, quantitative analysis exhibits a similar topological characterization of relative polynomial-time computability. 
17:40  Negative Translations and Heyting Algebra Expansions SPEAKER: Tadeusz Litak ABSTRACT. I discuss the behaviour of generalizations of the standard negative translations (Kolmogorov, Kuroda, Goedel-Gentzen, Glivenko) in extensions of intuitionistic logic with additional propositional connectives and additional axioms. The main focus is on extensions with a single, unary, normal "box-like" modality. The question, in other words, is: for which axioms does the negative translation faithfully embed the resulting classical modal logic into its intuitionistic counterpart? As it turns out, even the Kolmogorov translation can go wrong with rather natural modal axioms. Nevertheless, there are several sufficient syntactic and semantic criteria which ensure that most of these translations play well with most standard normal modal logics. 
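For concreteness, the first-order core of one of the translations discussed above, the Goedel-Gentzen negative translation, is defined by recursion on formulas:

```latex
\begin{aligned}
p^{N} &= \neg\neg p && (p\ \text{atomic})\\
(A \wedge B)^{N} &= A^{N} \wedge B^{N}\\
(A \vee B)^{N} &= \neg(\neg A^{N} \wedge \neg B^{N})\\
(A \to B)^{N} &= A^{N} \to B^{N}\\
(\forall x\,A)^{N} &= \forall x\,A^{N}\\
(\exists x\,A)^{N} &= \neg\,\forall x\,\neg A^{N}
\end{aligned}
```

The Kolmogorov translation instead prefixes double negation to every subformula, and Glivenko's theorem prefixes it only to the whole (propositional) formula; the talk asks how such clauses should extend to a box modality and which modal axioms break the faithfulness of the embedding.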