VSL 2014: VIENNA SUMMER OF LOGIC 2014
PROGRAM FOR MONDAY, JULY 21ST, 2014

09:00-10:15 Session 137: Modeling Clinical Guidelines (KR4HC)
Location: EI, EI 8
09:00
Preliminary Result on Finding Treatments for Patients with Comorbidity
SPEAKER: unknown

ABSTRACT. According to some research, comorbidity is reported in 35 to 80% of all ill people. Multiple guidelines are needed for patients with comorbid diseases. However, automating the application of multiple guidelines to a patient remains a challenging problem because of redundant, contraindicated, and potentially discordant recommendations. In this paper, we propose a mathematical model of the problem. It formalizes and generalizes a recent approach proposed by Wilks and colleagues. We also demonstrate that our model can be encoded, in a straightforward and simple manner, in Answer Set Programming (ASP) -- a class of Knowledge Representation languages. Our preliminary experiments also show that our ASP-based implementation is efficient enough to process the examples used in the literature.
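
As a rough illustration of the combinatorial core of the problem (not the authors' model or their ASP encoding), the following Python sketch enumerates treatment plans that pick one recommendation per guideline while avoiding contraindicated pairs; all guideline, drug, and conflict names are invented for the example:

```python
from itertools import product

# Each guideline offers alternative recommendations (invented data).
guidelines = {
    "diabetes":     ["insulin", "metformin"],
    "hypertension": ["ace_inhibitor", "beta_blocker"],
}
# Invented conflict for illustration only; not medical advice.
contraindicated = {frozenset({"metformin", "ace_inhibitor"})}

def consistent_plans(guidelines, contraindicated):
    """Enumerate one-recommendation-per-guideline plans without conflicts."""
    for plan in product(*guidelines.values()):
        pairs = {frozenset({a, b}) for a in plan for b in plan if a != b}
        if not pairs & contraindicated:
            yield dict(zip(guidelines, plan))

for plan in consistent_plans(guidelines, contraindicated):
    print(plan)
```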

09:25
Towards a Conceptual Model for Enhancing Reasoning about Clinical Guidelines: A case-study on Comorbidity
SPEAKER: unknown

ABSTRACT. Computer-Interpretable Guidelines (CIGs) are representations of Clinical Guidelines (CGs) in computer-interpretable languages. CIGs have been proposed as an alternative for dealing with the various limitations of paper-based CGs in supporting healthcare activities. Despite the improvements offered by existing CIG languages, the complexity of the medical domain requires advanced features in order to reuse, share, update, combine or personalize their contents. To this end, we propose a conceptual model for representing the content of CGs, the result of an iterative approach that takes into account the content of real CGs, existing CIG languages and foundational ontologies in order to enhance the reasoning capabilities required to address CG use cases. In particular, we apply our approach to the comorbidity use case and illustrate it with a realistic case study.

09:50
Using First-Order Logic to Represent Clinical Practice Guidelines and to Mitigate Adverse Interactions
SPEAKER: unknown

ABSTRACT. Clinical practice guidelines (CPGs) were originally designed to help with evidence-based management of a single disease, and this single-disease focus has shaped research on CPG computerization. This computerization is mostly concerned with supporting different representation formats and identifying potential inconsistencies in the definitions of CPGs. However, one of the biggest challenges facing physicians is the application of multiple CPGs to comorbid patients. Various research initiatives propose ways of mitigating adverse interactions in concurrently applied CPGs; however, there have been no attempts to develop a generalized mitigation framework that captures generic characteristics of the problem while handling nuances such as precedence relationships. In this paper we present our research towards developing a mitigation framework that relies on a first-order logic-based representation and related theorem proving and model finding techniques. The application of the proposed framework is illustrated with a simple clinical example.

10:45-12:45 Session 138A: Description Logic 1 (KR)
Location: EI, EI 7
10:45
Polynomial Combined Rewritings for Existential Rules

ABSTRACT. We consider the scenario of ontology-based data access where a conjunctive query is evaluated against an ontological database. It is generally accepted that true scalability of conjunctive query answering in this setting can only be achieved by using standard relational database management systems (RDBMSs). An approach to conjunctive query answering that enables the use of RDBMSs is the so-called polynomial combined approach. We investigate this approach for the main guarded- and sticky-based classes of existential rules, and we highlight the assumptions on the underlying schema which are sufficient for the polynomial combined first-order rewritability of those classes. To the best of our knowledge, this is the first work which explicitly studies the polynomial combined approach for existential rules.

11:15
Nested Regular Path Queries in Description Logics
SPEAKER: unknown

ABSTRACT. Two-way regular path queries (2RPQs) have received increased attention recently due to their ability to relate pairs of objects by flexibly navigating graph structured data. They are present in property paths in SPARQL 1.1, the new standard RDF query language, and in the XML query language XPath. In line with XPath, we consider the extension of 2RPQs with nesting, which allows one to require that objects along a path satisfy complex conditions, in turn expressed through (nested) 2RPQs. We study the computational complexity of answering nested 2RPQs and conjunctions thereof (CN2RPQs) in the presence of domain knowledge expressed in description logics (DLs). We establish tight complexity bounds in data and combined complexity for a variety of DLs, ranging from lightweight DLs (DL-Lite, EL) up to highly expressive ones. Interestingly, we are able to show that adding nesting to (C)2RPQs does not affect worst-case data complexity of query answering for any of the considered DLs. However, in the case of lightweight DLs, adding nesting to 2RPQs leads to a surprising jump in combined complexity, from P-complete to ExpTime-complete.
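
For readers unfamiliar with path queries, the following minimal Python sketch (invented data, no nesting) evaluates a plain 2RPQ by a breadth-first search over the product of the graph and the query automaton, treating inverse roles as reversed edges; the paper's nested queries and DL reasoning go well beyond this:

```python
from collections import deque

# Labelled graph: edges as (source, label, target); an inverse role r-
# is traversed by following r edges backwards.
edges = [("a", "knows", "b"), ("b", "partOf", "c"), ("c", "knows", "d")]

def adjacency(edges):
    adj = {}
    for s, r, t in edges:
        adj.setdefault(s, []).append((r, t))
        adj.setdefault(t, []).append((r + "-", s))   # inverse direction
    return adj

# Deterministic query automaton for the 2RPQ (knows . partOf)* . knows:
# state 0 initial, state 1 final.
delta = {(0, "knows"): 1, (1, "partOf"): 0}
initial, final = 0, {1}

def answers(edges, delta, initial, final):
    """All pairs (x, y) connected by a path spelling an accepted word."""
    adj, result = adjacency(edges), set()
    for start in adj:
        seen = {(start, initial)}
        queue = deque(seen)
        while queue:
            node, state = queue.popleft()
            if state in final:
                result.add((start, node))
            for label, succ in adj.get(node, []):
                nxt = delta.get((state, label))
                if nxt is not None and (succ, nxt) not in seen:
                    seen.add((succ, nxt))
                    queue.append((succ, nxt))
    return result

print(sorted(answers(edges, delta, initial, final)))
# -> [('a', 'b'), ('a', 'd'), ('c', 'd')]
```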

11:45
Query Inseparability for Description Logic Knowledge Bases
SPEAKER: unknown

ABSTRACT. We investigate conjunctive query inseparability of description logic (DL) knowledge bases (KBs) w.r.t. a given signature, a fundamental problem for KB versioning, module extraction, forgetting and knowledge exchange. We study the data and combined complexity of deciding KB query inseparability for fragments of Horn-ALCHI, including the DLs underpinning OWL 2 QL and OWL 2 EL. While all of these DLs turn out to be P-complete for data complexity, the combined complexity ranges from P to EXPTIME and 2EXPTIME. We also resolve two major open problems for OWL 2 QL by showing that TBox query inseparability and the membership problem for universal UCQ solutions in knowledge exchange are both EXPTIME-complete for combined complexity.

12:15
Answering Instance Queries Relaxed by Concept Similarity
SPEAKER: Andreas Ecke

ABSTRACT. In Description Logic (DL) knowledge bases (KBs), information is typically captured by crisp concepts. For many applications, querying the KB by crisp query concepts is too restrictive. A controlled way of gradually relaxing a query concept can be achieved by the use of concept similarity measures. In this paper we formalize the task of instance query answering for crisp DL KBs using concepts relaxed by concept similarity measures. We investigate computation algorithms for this task in the DL EL, analyse their complexity, and identify the properties required of the employed similarity measure, depending on whether unfoldable or general TBoxes are used.
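
A toy illustration of the idea of relaxed instance querying, assuming a naive Jaccard-style similarity over sets of atomic concept names (not one of the measures studied in the paper; all individuals and concepts are invented):

```python
def jaccard(c, d):
    """Toy concept similarity: Jaccard overlap of atomic features."""
    return len(c & d) / len(c | d) if c | d else 1.0

# ABox individuals described by the atomic concepts they satisfy.
abox = {
    "p1": {"Patient", "Diabetic", "Elderly"},
    "p2": {"Patient", "Diabetic"},
    "p3": {"Patient", "Smoker"},
}

def relaxed_instances(query, threshold, sim=jaccard):
    """Individuals whose description is similar enough to the query concept."""
    return [i for i, desc in abox.items() if sim(query, desc) >= threshold]

query = {"Patient", "Diabetic", "Elderly"}
print(relaxed_instances(query, 1.0))   # crisp answers only: ['p1']
print(relaxed_instances(query, 0.6))   # relaxed answers: ['p1', 'p2']
```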

10:45-12:45 Session 138B: Argumentation (KR)
Location: EI, EI 9
10:45
Characteristics of Multiple Viewpoints in Abstract Argumentation
SPEAKER: Paul E. Dunne

ABSTRACT. The study of extension-based semantics within the seminal abstract argumentation model of Dung has largely focused on definitional, algorithmic and complexity issues. In contrast, matters relating to comparisons of representational limits, in particular, the extent to which given collections of extensions are expressible within the formalism, have been under-developed. As such, little is known concerning conditions under which a candidate set of subsets of arguments is “realistic” in the sense that it corresponds to the extensions of some argumentation framework AF for a semantics of interest. In this paper we present a formal basis for examining extension-based semantics in terms of the sets of extensions that these may express within a single AF. We provide a number of characterization theorems which guarantee the existence of AFs whose set of extensions satisfy specific conditions and derive preliminary complexity results for decision problems that require such characterizations.
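
On very small argument sets, the realizability question can be explored by brute force. The hedged Python sketch below (stable semantics, invented data) searches all attack relations for an AF whose stable extensions equal a given candidate set; the paper's characterization theorems are precisely what makes such enumeration unnecessary:

```python
from itertools import chain, combinations

def powerset(xs):
    xs = list(xs)
    return chain.from_iterable(combinations(xs, k) for k in range(len(xs) + 1))

def stable_extensions(args, attacks):
    """Conflict-free sets that attack every argument outside themselves."""
    exts = set()
    for cand in map(set, powerset(args)):
        conflict_free = not any((a, b) in attacks for a in cand for b in cand)
        covers = all(any((a, b) in attacks for a in cand) for b in args - cand)
        if conflict_free and covers:
            exts.add(frozenset(cand))
    return exts

def realizable(args, target):
    """Search all attack relations over `args` for one realizing `target`."""
    pairs = [(a, b) for a in args for b in args]
    for rel in map(set, powerset(pairs)):
        if stable_extensions(args, rel) == target:
            return rel
    return None

args = {"a", "b"}
target = {frozenset({"a"}), frozenset({"b"})}
print(realizable(args, target))   # e.g. the mutual attack {('a','b'), ('b','a')}
```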

11:15
On the Revision of Argumentation Systems: Minimal Change of Arguments Statuses
SPEAKER: unknown

ABSTRACT. In this paper, we investigate the revision of argumentation systems à la Dung. We focus on revision as minimal change of the arguments' statuses. Contrary to most previous work on the topic, the addition of new arguments is not allowed in the revision process, so the revised system has to be obtained by modifying the attack relation only. We introduce a language of revision formulae which is expressive enough to represent complex conditions on the acceptability of arguments in the revised system. We show how AGM belief revision postulates can be translated to the case of argumentation systems. We provide a corresponding representation theorem in terms of minimal change of the arguments' statuses. Several distance-based revision operators satisfying the postulates are also pointed out, along with some methods to build revised argumentation systems. We also discuss some computational aspects of those methods.

11:45
An SCC Recursive Meta-Algorithm for Computing Preferred Labellings in Abstract Argumentation

ABSTRACT. This paper presents a meta-algorithm, based on the general recursive schema for argumentation semantics called SCC-Recursiveness, for the computation of preferred labellings. The idea is to recursively decompose a framework so as to compute semantics labellings on restricted sub-frameworks, in order to reduce the computational effort. The meta-algorithm can be instantiated with a specific "base algorithm", applied to the base case of the recursion, which can be obtained by generalizing existing algorithms so that they compute labellings of restricted sub-frameworks. To this purpose, we devise a generalization of a state-of-the-art SAT-based algorithm, and provide an empirical investigation showing the significant performance improvement obtained by exploiting the SCC-recursive schema.

12:15
A Dynamic Logic Framework for Abstract Argumentation
SPEAKER: Sylvie Doutre

ABSTRACT. We provide a logical analysis of abstract argumentation frameworks and their dynamics. Following previous work, we express attack relation and argument status by means of propositional variables and define acceptability criteria by formulas of propositional logic. We here study the dynamics of argumentation frameworks in terms of basic operations on these propositional variables, viz. change of their truth values. We describe these operations in a uniform way within a well-known variant of Propositional Dynamic Logic PDL: the Dynamic Logic of Propositional Assignments, DL-PA. The atomic programs of DL-PA are assignments of propositional variables to truth values, and complex programs can be built by means of the connectives of sequential and nondeterministic composition and test. We start by showing that in DL-PA, the construction of extensions can be performed by a DL-PA program that is parametrized by the definition of acceptance. We then mainly focus on how the acceptance of one or more arguments can be enforced and show that this can be achieved by changing the truth values of the propositional variables describing the attack relation in a minimal way.
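
The enforcement idea can be illustrated independently of DL-PA. The hedged Python sketch below brute-forces a minimal set of flips of attack-relation "variables" that makes a goal argument accepted, using grounded semantics as the (fixed) acceptance criterion; the paper instead encodes arbitrary acceptance definitions and the change operations inside DL-PA programs. All data is invented:

```python
from itertools import combinations

def grounded(args, attacks):
    """Grounded extension: least fixpoint of the defense function."""
    ext = set()
    while True:
        defended = {a for a in args
                    if all(any((d, b) in attacks for d in ext)
                           for b in args if (b, a) in attacks)}
        if defended == ext:
            return ext
        ext = defended

def enforce(args, attacks, goal):
    """Fewest flips of attack 'variables' making `goal` grounded-accepted."""
    universe = [(a, b) for a in args for b in args if a != b]
    for k in range(len(universe) + 1):
        for flips in combinations(universe, k):
            revised = set(attacks) ^ set(flips)   # flip those truth values
            if goal in grounded(args, revised):
                return set(flips), revised
    return None

args = {"a", "b", "c"}
attacks = {("b", "a"), ("c", "b")}   # grounded extension here is {a, c}
print(enforce(args, attacks, "b"))   # a single flip suffices
```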

10:45-13:05 Session 138C: Games and Synthesis (CAV)
Location: FH, Hörsaal 1
10:45
Safraless Synthesis for Epistemic Temporal Specifications

ABSTRACT. In this paper we address the synthesis problem for specifications given in linear temporal single-agent epistemic logic, KLTL (or KL_1), over single-agent systems having imperfect information of the system state. [Vardi and Van der Meyden, CONCUR'98] have shown that this problem is 2EXPTIME-complete. However, their procedure relies on complex automata constructions that are notoriously resistant to efficient implementations, as they use Safra-like determinization.

We propose a "Safraless" synthesis procedure for a large fragment of KLTL. The construction transforms first the synthesis problem into the problem of checking emptiness for universal co-B\"{u}chi tree automata using an information-set construction. Then we build a safety game that can be solved using an antichain-based symbolic technique exploiting the structure of the underlying automata. The technique is implemented and applied to a couple of case studies.

11:05
Minimizing Running Costs in Consumption Systems
SPEAKER: Petr Novotný

ABSTRACT. A standard approach to optimizing long-run running costs of discrete systems is based on minimizing the mean-payoff, i.e., the long-run average amount of resources ("energy") per transition. However, this approach inherently assumes that the energy source has an unbounded capacity, which is not always realistic. For example, an autonomous robotic device has a battery of finite capacity that has to be recharged periodically, and the total amount of energy consumed between two successive charging cycles is bounded by the capacity. Hence, a controller minimizing the mean-payoff must obey this restriction. In this paper we study the controller synthesis problem for consumption systems with a finite battery capacity, where the task of the controller is to minimize the mean-payoff while preserving the functionality of the system encoded by a given linear-time property. We show that an optimal controller always exists, and it may either need only finite memory or require infinite memory (it is decidable in polynomial time which of the two cases holds). Further, we show how to compute an effective description of an optimal controller in polynomial time. Finally, we consider the limit values achievable by larger and larger battery capacity, show that these values are computable in polynomial time, and we also analyze the corresponding rate of convergence. To the best of our knowledge, these are the first results about optimizing the long-run running costs in systems with bounded energy stores.
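
For intuition about the mean-payoff objective alone (ignoring the battery capacity, which is the paper's actual contribution), the minimum mean cycle of a weighted graph can be computed with Karp's classical algorithm; a compact Python sketch on an invented system:

```python
def min_mean_cycle(n, edges):
    """Karp's algorithm: minimum mean weight over all cycles.

    n     -- number of vertices 0..n-1
    edges -- list of (u, v, w) transitions with energy cost w
    """
    INF = float("inf")
    # d[k][v] = minimum weight of a walk with exactly k edges ending in v
    d = [[INF] * n for _ in range(n + 1)]
    for v in range(n):
        d[0][v] = 0.0
    for k in range(1, n + 1):
        for u, v, w in edges:
            if d[k - 1][u] + w < d[k][v]:
                d[k][v] = d[k - 1][u] + w
    best = INF
    for v in range(n):
        if d[n][v] == INF:
            continue
        worst = max((d[n][v] - d[k][v]) / (n - k)
                    for k in range(n) if d[k][v] < INF)
        best = min(best, worst)
    return best

# Cheap loop 0 -> 1 -> 0 (mean 1.5) vs. expensive self-loop at 2 (mean 4).
print(min_mean_cycle(3, [(0, 1, 1), (1, 0, 2), (2, 2, 4)]))   # -> 1.5
```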

11:25
CEGAR for Qualitative Analysis of Probabilistic Systems

ABSTRACT. We consider Markov decision processes (MDPs) which are a standard model for probabilistic systems. We focus on qualitative properties for MDPs that can express that desired behaviors of the system arise almost-surely (with probability 1) or with positive probability. We introduce a new simulation relation to capture the refinement relation of MDPs with respect to qualitative properties, and present discrete graph theoretic algorithms with quadratic complexity to compute the simulation relation. We present an automated technique for assume-guarantee style reasoning for compositional analysis of MDPs with qualitative properties by giving a counter-example guided abstraction-refinement approach to compute our new simulation relation. We have implemented our algorithms and show that the compositional analysis leads to significant improvements.

11:45
Optimal Guard Synthesis for Memory Safety
SPEAKER: Thomas Dillig

ABSTRACT. This paper presents a new synthesis-based approach for helping programmers write low-level memory-safe code. Specifically, given a partial program with missing guards, our algorithm synthesizes concrete predicates to plug in for the missing guards such that all buffer accesses in the program are guaranteed to be memory safe. Furthermore, our algorithm gives a guarantee of optimality in that the synthesized guards are provably the simplest (that is, with the fewest number of variables) and weakest among guards that guarantee memory safety. Our approach is fully automatic and does not require any hints, such as templates, from the user. We have implemented our proposed algorithm in a prototype synthesis tool for C programs. Our experimental evaluation on a set of real-world C programs shows that the proposed approach is able to successfully synthesize guards that closely match hand-written programmer code.

12:05
Don't sit on the fence: A static analysis approach to automatic fence insertion
SPEAKER: Vincent Nimal

ABSTRACT. Modern architectures rely on memory fences to prevent undesired weakenings of memory consistency. As the fences' semantics may be subtle, the automation of their placement is highly desirable. But precise methods restoring consistency do not scale to deployed systems code. We choose to trade some precision for genuine scalability: we present a novel technique suitable for large code bases. We implement this method in our new musketeer tool, and detail experiments on more than 350 executables of packages found in a Debian Linux distribution, e.g. memcached (about 10000 LoC).

12:25
MCMAS-SLK: A Model Checker for the Verification of Strategy Logic Specifications
SPEAKER: Petr Čermák

ABSTRACT. We introduce MCMAS-SLK, a BDD-based model checker for the verification of systems against specifications expressed in a novel, epistemic variant of strategy logic. We give the syntax and semantics of the specification language and introduce a labelling algorithm for epistemic and strategy logic modalities. We provide details of the checker, which can also be used for synthesising the agents' strategies so that a specification is satisfied by the system. We evaluate the efficiency of the implementation by discussing the results obtained for the dining cryptographers protocol and the cake-cutting problem.

12:35
Solving Games without Controllable Predecessor

ABSTRACT. Two-player games are a useful formalism for the synthesis of reactive systems. The traditional approach to solving such games iteratively computes the set of winning states for one of the players. This requires keeping track of all discovered winning states and can lead to space explosion even when using efficient symbolic representations. We propose a new method for solving reachability games. Our method works by exploring a subset of the possible concrete runs of the game and proving that these runs can be generalised into a winning strategy on behalf of one of the players. We use counterexample-guided backtracking search to identify a subset of runs that are sufficient to consider to solve the game. We evaluate our algorithm on several families of benchmarks derived from real-world device driver synthesis problems.
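
The "traditional approach" the abstract refers to is the attractor (controllable-predecessor) fixpoint; a minimal Python sketch on an invented game graph:

```python
def attractor(states, owner, succ, target, player):
    """States from which `player` can force a visit to `target`.

    owner[s] -- which player moves at s; succ[s] -- set of successors.
    Classic fixpoint of the controllable-predecessor operator: the
    winning set is accumulated explicitly, which is what can blow up.
    """
    attr = set(target)
    changed = True
    while changed:
        changed = False
        for s in states - attr:
            if not succ[s]:
                continue
            mine = owner[s] == player
            if (mine and succ[s] & attr) or (not mine and succ[s] <= attr):
                attr.add(s)
                changed = True
    return attr

states = {"s0", "s1", "s2", "goal"}
owner  = {"s0": 0, "s1": 1, "s2": 1, "goal": 0}
succ   = {"s0": {"s1", "s2"}, "s1": {"goal"}, "s2": {"s2"}, "goal": set()}
print(attractor(states, owner, succ, {"goal"}, player=0))  # {'goal','s1','s0'}
```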

12:45
G4LTL-ST: Automatic Generation of PLC Programs

ABSTRACT. G4LTL-ST automatically generates IEC 61131-3-compatible control code for industrial field-level devices from behavioral, timed specifications of input-output signals. Behavioral specifications are expressed in linear temporal logic (LTL) extended with (1) arithmetic constraints for describing data-intensive control behavior and (2) timers based on the industry standard IEC 61131-3. The code generated by G4LTL-ST is in the format of Structured Text which can readily be compiled into executable code for a large number of industrial field-level devices. The synthesis algorithm of G4LTL-ST implements a pseudo-Boolean abstraction of data constraints and a compilation of timer specifications into LTL, together with a CEGAR-like abstraction-refinement loop. Since temporal logic specifications are notoriously difficult to use in practice, G4LTL-ST includes a number of heuristics for suggesting additional environment assumptions for making the control problem at hand realizable.

12:55
SYNTCOMP - Synthesis Competition for Reactive Systems
SPEAKER: Swen Jacobs

ABSTRACT. We present results of the first competition for reactive synthesis tools. The performance of synthesis tools is compared on safety specifications in an extension of the AIGER format, and tools are ranked with respect to the time needed for realizability checks and the size of circuits produced by synthesis.

10:45-13:00 Session 138D: Technical Communications: Track 1 - Knowledge Representation (ICLP)
Location: FH, Hörsaal 8
10:45
Transaction Logic with (Complex) Events
SPEAKER: unknown

ABSTRACT. This work deals with the problem of combining reactive features, such as the ability to respond to events and define complex events, with the execution of ACID transactions over general Knowledge Bases (KBs).

With this as goal, we build on Transaction Logic (TR), a logic precisely designed to model and execute (ACID) transactions in KBs defined in arbitrary logic theories. In it, transactions are written in a logic-programming style, by combining primitive update operations over a general KB with the usual logic programming connectives and some additional connectives, e.g., to express sequences of actions.

While TR is a natural choice for dealing with transactions, the question remains whether TR can also be used to express complex events and, moreover, to deal simultaneously with the detection of complex events and the execution of transactions. In this paper we show that the former is possible while the latter is not. To that end, we start by illustrating how TR can express and detect complex events, and in particular, how SNOOP event expressions can be translated into the logic. Afterwards, we show why TR fails to deal with the two issues together, and propose Transaction Logic with Events to solve the intended problem.

The achieved solution is a non-monotonic conservative extension of TR, which guarantees that every complex event detected in a transaction is necessarily responded to. Along with its syntax, model theory and executional semantics, in this paper we prove the correspondence of Transaction Logic with Events with TR, and show that it enjoys important properties of non-monotonic logics, like support.

11:00
Properties of Stable Model Semantics Extensions

ABSTRACT. The stable model (SM) semantics lacks the properties of existence, relevance and cumulativity. This is due to the set of SM models being generally smaller than the set of all minimal models of a normal logic program (the semantics consisting of all the minimal models of a normal logic program, call it SEMmin, trivially verifies these three properties). If we consider the class of conservative extensions of the SM semantics (i.e., semantics that for each normal logic program P retrieve a superset of the set of stable models of P), other than the trivial SEMmin, one may wonder how the semantics in this class behave with respect to the aforementioned properties. That is the type of issue we deal with in this paper. We define a large class of conservative extensions of the SM semantics, dubbed affix stable model semantics, ASM, and study the above properties in two non-disjoint subfamilies of the class ASM, here dubbed ASMh and ASMm. From this study a number of results stem which facilitate the assessment of semantics in the class ASMh ∪ ASMm with respect to the properties of existence, relevance and cumulativity, whilst unveiling relations among these properties. The following results should be emphasized: (1) we present a refined definition of cumulativity for semantics in the class ASMh ∪ ASMm, which makes it easier to refute this property by resorting to counter-examples; (2) we divide the sets of rules of normal logic programs into layers, and use the decomposition of models into that layered structure to define three new (structural) properties, defectivity, excessiveness and irregularity, which allow us to state a number of relations between the properties of existence, relevance and cumulativity for semantics of the ASMh ∪ ASMm class, and at the same time facilitate the assessment of semantics in this class with respect to those properties; (3) as a result of our approach, light is shed on the characterization of the SM semantics, as we show that the properties of (lack of) existence and (lack of) cautious monotony are equivalent, which opposes statements on this issue that may be found in the literature; we also characterize the relevance failure of the SM semantics more clearly than usually stated in the literature.

11:15
A Well-Founded Semantics for FOL-Programs
SPEAKER: unknown

ABSTRACT. An FOL-program consists of a background theory in a decidable fragment of first-order logic and a collection of rules possibly containing first-order formulas. The formalism stems from recent approaches to tight integrations of ASP with description logics. In this paper, we define a well-founded semantics for FOL-programs based on a new notion of unfounded sets on consistent as well as inconsistent sets of literals, and study some of its properties. The semantics is defined for all FOL-programs, including those where it is necessary to represent inconsistencies explicitly. The semantics supports a form of combined reasoning by rules under closed world as well as open world assumptions, and it is a generalization of the standard well-founded semantics for normal logic programs. We also show that the well-founded semantics defined here approximates the well-supported answer set semantics for normal DL programs.

11:30
On Strong and Default Negation in Answer-Set Program Updates
SPEAKER: Martin Balaz

ABSTRACT. Existing semantics for answer-set program updates fall into two categories: either they consider only strong negation in heads of rules, or they primarily rely on default negation in heads of rules and optionally provide support for strong negation by means of a syntactic transformation. In this paper we pinpoint the limitations of both these approaches and argue that both types of negation should be first-class citizens in the context of updates. We identify principles that plausibly constrain their interaction but are not simultaneously satisfied by any existing rule update semantics. Then we extend one of the most advanced semantics with direct support for strong negation and show that it satisfies the outlined principles as well as a variety of other desirable properties.

11:45
C-Log: A Knowledge Representation Language of Causality
SPEAKER: Bart Bogaerts

ABSTRACT. Cause-effect relations are an important part of human knowledge. In real life, humans often reason about complex causes linked to complex effects. By comparison, existing formalisms for representing knowledge about causal relations are quite limited in the kind of specifications of causes and effects they allow. In this paper, we present the new language C-Log, which offers a significantly more expressive representation of effects. We show how Approximation Fixpoint Theory can be used to define a formal semantics that captures the intuitions underlying this language. We also compare C-Log with several related languages and paradigms, including inductive definitions, disjunctive logic programming, and extensions of Datalog, and we show that, in doing so, we provide a new semantics for (a generalisation of) disjunctive logic programming with existential quantification in rule heads.

12:00
ESmodels: An Epistemic Specification Solver

ABSTRACT. This paper introduces an epistemic specification solver, ESmodels, designed and implemented as an experimental platform to investigate the semantics, language, related reasoning algorithms, and possible applications of epistemic specifications. We first give the epistemic specification language of ESmodels and its semantics. The language employs only one modal operator K, but we prove that it can represent a rich variety of modal operators by presenting transformation rules. Then, we describe basic algorithms and optimization approaches used in ESmodels. After that, we discuss several applications of ESmodels in commonsense reasoning, conformant planning, and constraint satisfaction. Finally, we conclude with perspectives.

12:15
Grounding Bound Founded Answer Set Programs
SPEAKER: unknown

ABSTRACT. Bound Founded Answer Set Programming (BFASP) is an extension of Answer Set Programming (ASP) that extends stable model semantics to numeric variables. While the theory of BFASP is defined on ground rules, in practice BFASP programs are written as complex non-ground expressions. Flattening of BFASP is a technique used to simplify arbitrary expressions of the language to a small and well defined set of primitive expressions. In this paper, we first show how we can flatten arbitrary BFASP rule expressions, to give equivalent BFASP programs. Next, we extend the bottom-up grounding technique and magic set transformation used by ASP to BFASP programs. Our implementation shows that for BFASP problems, these techniques can significantly reduce the ground program size, and improve subsequent solving.

12:30
An ASP-Based Architecture for Autonomous UAVs in Dynamic Environments
SPEAKER: unknown

ABSTRACT. Traditional AI reasoning techniques have been used successfully in many domains, including logistics, scheduling and game playing. This paper is part of a project aimed at investigating how such techniques can be extended to coordinate teams of unmanned aerial vehicles (UAVs) in dynamic environments. Specifically challenging are real-world environments where UAVs and other network-enabled devices must communicate to coordinate -- and communication actions are neither reliable nor free. Such network-centric environments are common in military, public safety and commercial applications, yet most research (even multi-agent planning) usually takes communications among distributed agents as a given. We address this challenge by developing an agent architecture and reasoning algorithms based on Answer Set Programming (ASP). ASP has been chosen for this task because it enables high flexibility of representation, both of knowledge and of reasoning tasks. Although ASP has been used successfully in a number of applications, and ASP-based architectures have been studied for about a decade, to the best of our knowledge this is the first practical application of an ASP-based agent architecture. It is also the first practical application of ASP involving a combination of centralized reasoning, decentralized reasoning, execution monitoring, and reasoning about network communications. This work has been empirically validated using a distributed network-centric software evaluation testbed and the results provide guidance to designers in how to understand and control intelligent systems that operate in these environments.

10:45-13:05 Session 138E: Modal and Temporal Reasoning (IJCAR)
Location: FH, Hörsaal 5
10:45
Optimal tableaux-based decision procedure for testing satisfiability in the Alternating-time temporal logic ATL+
SPEAKER: Amélie David

ABSTRACT. We develop a tableaux-based decision method for constructive satisfiability testing and model synthesis in the fragment ATL+ of the full Alternating-time temporal logic ATL*. The method extends in an essential way a previously developed tableaux-based decision method for ATL and works in 2EXPTIME, which is the optimal worst-case complexity of the satisfiability problem for ATL+. We also discuss how suitable parameterizations and syntactic restrictions on the class of input ATL+ formulae can reduce the complexity of the satisfiability problem.

11:15
dTL²: Differential Temporal Dynamic Logic with Nested Modalities for Hybrid Systems
SPEAKER: unknown

ABSTRACT. The differential temporal dynamic logic dTL² is a logic to specify temporal properties of hybrid systems. It combines differential dynamic logic with temporal logic to reason about the intermediate states reached by a hybrid system. The logic dTL² supports some branching and linear time temporal properties of CTL*. We provide a semantics and a proof system for the logic dTL², and show its usefulness for nontrivial temporal properties of hybrid systems. We take particular care to handle the case of alternating universal dynamic and existential temporal modalities and its dual, solving an open problem formulated in previous work.

11:45
Axioms vs Hypersequent Rules with Context Restrictions: Theory and Applications

ABSTRACT. We investigate the connections between hypersequent rules with context restrictions and Hilbert axioms extending classical (and intuitionistic) propositional logic. We introduce sufficient syntactic conditions for cut elimination in such hypersequent calculi, translations between rules with context restrictions and axioms (and vice versa), and general decidability and complexity results. Our work subsumes many logic-tailored results, entails finite axiomatisability for a number of normal modal logics given by simple frame properties, and provides a method to construct cut-free hypersequent calculi from a set of axioms. As a case study we obtain a new cut-free hypersequent calculus and complexity bound for the logic of uniform deontic frames.

12:15
Clausal Resolution for Modal Logics of Confluence

ABSTRACT. We present a clausal resolution-based method for normal modal logics of confluence, whose Kripke semantics are based on frames characterised by appropriate instances of the Church-Rosser property. Here we restrict attention to eight families of such logics. We show how the inference rules related to the normal logics of confluence can be systematically obtained from the parametrised axioms that characterise such systems. We discuss soundness, completeness, and termination of the method. In particular, completeness can be modularly proved by showing that the conclusions of each newly added inference rule ensure that the corresponding conditions on frames hold. Some examples are given in order to illustrate the use of the method.

12:45
Implementing Tableaux Calculi Using BDDs: BDDTab System Description
SPEAKER: unknown

ABSTRACT. Binary Decision Diagrams (BDDs) have been used to decide satisfiability and validity in various non-classical logics by directly implementing the finite property method for these logics. We instead present a method for using Binary Decision Diagrams as an underlying data structure for implementing the tableau method itself. We demonstrate our method by implementing the standard tableau calculi for automated reasoning in the propositional modal logics K and S4, along with extensions to the multiple and inverse modalities of ALCI. We evaluate our implementation of such a reasoner using several K and S4 benchmark sets, as well as some ALC and ALCI ontologies. We show, with comparison to FaCT++ and InKreSAT, that it can compete with other state-of-the-art methods of reasoning in propositional modal logic. We also discuss how this technique extends to tableaux for other propositional logics.

10:45-12:45 Session 138F: Protocol Verification (CSF)
Location: FH, Hörsaal 6
10:45
Decidability for Lightweight Diffie-Hellman Protocols
SPEAKER: unknown

ABSTRACT. Many protocols use the Diffie-Hellman method for key agreement, in combination with certified long-term values or digital signatures for authentication. These protocols aim at security goals such as key secrecy, forward secrecy, resistance to key compromise attacks, and various flavors of authentication. However, analysis of these protocols has been challenging, both in the computational model and in the symbolic model. An obstacle in the symbolic model is the undecidability of unification in many theories such as rings.

In this paper, we develop an algebraic version of the symbolic approach, in which we work directly within finite fields. These are natural structures for the protocols. The adversary, in giving a counterexample to a protocol goal in a finite field, may rely on any identity in that field. He defeats the protocol if there are counterexamples in infinitely many finite fields.

We prove that, even for this strong adversary, security goals for a wide class of protocols are decidable.
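
For reference, the Diffie-Hellman agreement these protocols build on is just modular exponentiation; a textbook Python sketch with toy parameters (far too small for real security, shown only to fix notation):

```python
import secrets

# Toy public parameters: a Mersenne prime and a generator candidate.
p, g = 2**61 - 1, 2

a = secrets.randbelow(p - 2) + 1        # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1        # Bob's secret exponent
A, B = pow(g, a, p), pow(g, b, p)       # exchanged public values

# Both sides derive the same shared value g^(ab) mod p; the symbolic
# analyses above reason about which field identities an adversary may use.
assert pow(B, a, p) == pow(A, b, p)
print("shared secret established")
```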

11:15
Modeling Diffie-Hellman Derivability for Automated Analysis
SPEAKER: unknown

ABSTRACT. Automated analysis of protocols involving Diffie-Hellman key exchange is challenging, in part because of the undecidability of the unification problem in relevant theories. In this paper, we justify the use of a more restricted theory that includes multiplication of exponents but not addition, providing unitary and efficient unification.

To justify this theory, we link it to a computational model of non-uniform circuit complexity through several incremental steps. First, we give a model closely analogous to the computational model, in which derivability is modeled by closure under simple algebraic transformations. This model retains many of the complex features of the computational model, including defining success based on asymptotic probability for a non-uniform family of strategies. We describe an intermediate model based on formal polynomial manipulations, in which success is exact and there is no longer a parametrized notion of the strategy. Despite the many differences in form, we are able to prove an equivalence between the asymptotic and intermediate models by showing that a sufficiently successful asymptotic strategy implies the existence of a perfect strategy. Finally, we describe a symbolic model in which addition of exponents is not modeled, and prove that (for expressible problems), the symbolic model is equivalent to the intermediate model.

11:45
Actor Key Compromise: Consequences and Countermeasures
SPEAKER: Marko Horvat

ABSTRACT. Despite Alice’s best efforts, her long-term secret keys may be revealed to an adversary. Possible reasons include weakly generated keys, compromised key storage, subpoena, and coercion. However, Alice may still be able to communicate securely with other parties afterwards. Whether this is possible depends on the protocol used. We call the associated property resilience against Actor Key Compromise (AKC). We formalise this property in a symbolic model and identify conditions under which it can and cannot be achieved. In case studies that include TLS and SSH, we find that many protocols are not resilient against AKC. We implement a concrete AKC attack on the mutually authenticated TLS protocol.

12:15
A Sound Abstraction of the Parsing Problem

ABSTRACT. In formal verification, cryptographic messages are often represented by algebraic terms. This abstracts not only from the intricate details of the real cryptography, but also from the details of the non-cryptographic aspects: the actual formatting and structuring of messages.

We introduce a new algebraic model to include these details and define a small, simple language to precisely describe message formats. We support fixed-length fields, variable-length fields with offsets, tags, and encodings into smaller alphabets like Base64, thereby covering both classical formats as in TLS and modern XML-based formats.

We define two reasonable properties for formats. First, they should be un-ambiguous: for a given format, any string can be parsed in at most one way. Second, they should be pairwise disjoint: a string can be parsed as at most one of the formats. We show how to easily establish these properties for many practical formats.

An abstract model is obtained by replacing the formats with free function symbols. This abstract model is compatible with all existing verification tools. We prove that the abstraction is sound for un-ambiguous, disjoint formats: there is an attack in the concrete message model if (and actually only if) there is one in the abstract message model.
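
A hedged toy illustration of the two properties (not the paper's format-description language): fixed-layout formats with distinct leading tags are trivially un-ambiguous (each parser returns at most one parse) and pairwise disjoint (no string parses as both). All tags and field names are invented:

```python
def make_format(tag, field_lengths):
    """A fixed-layout format: a 1-byte tag followed by fixed-length fields."""
    def parse(msg):
        if not msg or msg[0] != tag or len(msg) != 1 + sum(field_lengths):
            return None                      # no parse at all
        fields, pos = [], 1
        for n in field_lengths:
            fields.append(msg[pos:pos + n])
            pos += n
        return fields                        # the unique parse
    return parse

login  = make_format(0x01, [8, 8])           # tag 1: user || nonce
logout = make_format(0x02, [8])              # tag 2: user

msg = bytes([0x01]) + b"alice   " + b"12345678"
print(login(msg))                            # un-ambiguous: one parse
assert not (login(msg) and logout(msg))      # disjoint: never both
```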

10:45-13:00 Session 138G: Technical Communications: Track 2 - Systems and Paradigms (ICLP)
Location: FH, Seminarraum 134
10:45
Multi-criteria optimal planning for Energy policies in CLP

ABSTRACT. The number of issues, stakeholders and regulations that a policy must consider in the current world is so high that trying to address them only with good-will and common-sense is unthinkable. Policy makers have to consider disparate issues, as diverse as economic development, pollution, traffic, as well as the acceptance of the policy from the population, just to name a few. This holds also for relatively low-scale levels, such as the Regional scope, not to mention National or European levels. A single person cannot be expert in all these subjects, and to obtain a suitable policy in the current complex world one can adopt decision support systems featuring optimization components.

Leveraging previous work on Strategic Environmental Assessment, we developed a fully-fledged system that is able to provide optimal plans with respect to a given objective, to perform multi-objective optimization and provide sets of Pareto-optimal plans, and to visually compare plans. While the previous version only considered qualitative information, the new version also provides the environmental footprint of the plan as quantitative information, considering emissions, global warming effect, human toxicity, and acidification. The heart of the system is an application developed in a popular Constraint Logic Programming system on the Reals sort. It has been equipped with a web service module that can be queried through standard interfaces. An intuitive graphical user interface has been added to provide easy access to the web service and the CLP application.

11:00
Numerical Properties of Min-closed CSPs
SPEAKER: G. Narboni

ABSTRACT. Min-closed constraints are numerical relationships characterised by a simple algebraic property. Yet, with finite domain variables, min-closed systems give rise to a polynomial class of Constraint Satisfaction Problems. Propagation alone checks them for satisfiability. Solving is therefore search-free. Can this noteworthy result be generalized from a discrete to a continuous (or mixed) setting, with constraints involving real variables, using interval solvers? In this paper, we show that the completeness property observed in the discrete case gracefully degrades into a 'close approximation' property in the continuous case, which is also more general. When switching from finite to infinite domains, we show that the pruning power of propagation remains intact in the sense that it provides a box enclosure whose lower bound cannot be further updated (even by domain splitting). In the corresponding integer case, this box, when not empty, has a least element. This witness point thus ensures satisfiability and restores decidability. Applications of this analysis to scheduling, rule-based reasoning and scientific simulation are finally briefly mentioned.
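
A minimal sketch of the propagation property on an invented min-closed system of inequalities of the form x >= y + c (such systems are min-closed: componentwise minima of solutions are again solutions). Iterating lower-bound propagation to a fixpoint yields lower bounds that no further search can improve, matching the "witness point" intuition above:

```python
# Toy min-closed system: each constraint reads x >= y + c.
constraints = [("x", "y", 1), ("y", "z", 2), ("x", "z", 0)]
lower = {"x": 0, "y": 0, "z": 5}   # initial lower bounds of the domains

def propagate(constraints, lower):
    """Fixpoint of lower-bound propagation; for min-closed systems the
    resulting lower bounds cannot be improved, even by domain splitting."""
    changed = True
    while changed:
        changed = False
        for x, y, c in constraints:
            if lower[y] + c > lower[x]:
                lower[x] = lower[y] + c
                changed = True
    return lower

print(propagate(constraints, lower))   # {'x': 8, 'y': 7, 'z': 5}
```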

11:15
Clingo = ASP + Control
SPEAKER: unknown

ABSTRACT. We present the new ASP system clingo 4. Unlike its predecessors, which were mere monolithic combinations of the grounder gringo with the solver clasp, the new clingo 4 series offers high-level constructs for realizing complex reasoning processes. Among others, such processes feature advanced forms of search, as in optimization or theory solving, or even interact with an environment, as in robotics or query answering. Common to them is that the problem specification evolves during the reasoning process, either because data or constraints are added, deleted, or replaced. In fact, clingo 4 carries out such complex reasoning within a single integrated ASP grounding and solving process. This avoids redundancies in relaunching grounder and solver programs and benefits from the solver's learning capacities. clingo 4 accomplishes this by complementing ASP's declarative input language with control capacities expressed via the embedded scripting languages lua and python. On the declarative side, clingo 4 offers a new directive that allows for structuring logic programs into named and parameterizable subprograms. The grounding and integration of these subprograms into the solving process is completely modular and fully controllable from the procedural side, viz. the scripting languages. By strictly separating logic and control programs, clingo 4 also abolishes the need for dedicated systems for incremental and reactive reasoning, like iclingo and oclingo, respectively, and its flexibility goes well beyond the advanced yet still rigid solving processes of the latter.
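
A minimal sketch of the control side, assuming the Python API of current clingo releases (the descendant of the embedded scripting interface introduced with clingo 4); grounding and solving are driven explicitly from the script:

```python
import clingo   # pip install clingo

ctl = clingo.Control(["0"])                 # "0": enumerate all answer sets
ctl.add("base", [], """
item(a). item(b). item(c).
1 { pick(X) : item(X) } 2.                  % choose one or two items
:- pick(a), pick(b).                        % a and b are incompatible
""")
ctl.ground([("base", [])])                  # grounding step, under script control
ctl.solve(on_model=lambda m: print("Answer:", m))
```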

11:30
Logic and Constraint Logic Programming for Distributed Constraint Optimization
SPEAKER: unknown

ABSTRACT. The field of Distributed Constraint Optimization Problems (DCOPs) has gained momentum, thanks to its suitability to capture complex problems (e.g., resource allocation and management, multi-agent coordination) that are naturally distributed and cannot be realistically addressed in a centralized manner. The state of the art in solving DCOPs relies on the use of ad-hoc infrastructures and ad-hoc constraint solving procedures. This paper investigates an infrastructure for solving DCOPs that is completely built on logic programming technologies. In particular, the paper explores the use of a general constraint solver (CLP in this context) to handle the agent-level constraint solving. The preliminary experiments show that logic programming provides benefits over a state-of-the-art DCOP system in terms of performance and scalability, opening the door to the use of more advanced technology (e.g., search strategies, complex constraints) for solving DCOPs.

11:45
Guarding Corecursion in Logic Programming
SPEAKER: unknown

ABSTRACT. Coalgebraic Logic Programming (CoALP) is a dialect of logic programming designed to work with coinductive definitions of infinite objects. Its main goal is to introduce guarded lazy corecursion, akin to that of functional theorem provers, into logic programming. In this paper, we give the first full formal account of guarded corecursion in CoALP, and present its implementation.

12:00
Adaptive MCMC-Based Inference in Probabilistic Logic Programs
SPEAKER: unknown

ABSTRACT. Probabilistic Logic Programming (PLP) languages enable programmers to specify systems that combine logical models with statistical knowledge. The inference problem, to determine the probability of query answers in PLP, is intractable in general, thereby motivating the need for approximate techniques. In this paper, we present a technique for approximate inference of conditional probabilities for PLP queries. It is an Adaptive Markov Chain Monte Carlo (MCMC) technique, where the distribution from which samples are drawn is modified as the Markov Chain is explored. In particular, the distribution is progressively modified to increase the likelihood that a generated sample is consistent with evidence. In our context, each sample is uniquely characterized by the outcomes of a set of random variables. Inspired by reinforcement learning, our technique propagates rewards to random variable/outcome pairs used in a sample based on whether the sample was consistent or not. The cumulative rewards of each outcome are used to derive a new "adapted distribution" for each random variable. For a sequence of samples, the distributions are progressively adapted after each sample. For a query with "Markovian evaluation structure", we show that the adapted distribution of samples converges to the query's conditional probability distribution. For Markovian queries, we present a modified adaptation process that can be used in adaptive MCMC as well as adaptive independent sampling. We empirically evaluate the effectiveness of the adaptive sampling methods for queries with and without Markovian evaluation structure.
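
A drastically simplified, purely illustrative Python sketch of the reward mechanism (not the paper's algorithm, and omitting the reweighting that proper inference needs for unbiased estimates): outcomes that appear in evidence-consistent samples have their sampling weights increased, so later samples are more likely to satisfy the evidence. All variables are invented:

```python
import random

# Two binary random variables with adaptable sampling weights (toy example).
weights = {"burglary": {True: 1.0, False: 1.0},
           "earthquake": {True: 1.0, False: 1.0}}

def sample(weights):
    """Draw one outcome per random variable from its current weights."""
    return {rv: random.choices(list(w), weights=list(w.values()))[0]
            for rv, w in weights.items()}

def consistent(world):
    """Evidence: the alarm went off (true if either cause holds)."""
    return world["burglary"] or world["earthquake"]

rounds, reward, hits = 5000, 0.05, 0
for _ in range(rounds):
    world = sample(weights)
    if consistent(world):
        hits += 1
        for rv, val in world.items():    # reward outcomes used in the sample
            weights[rv][val] += reward

print("fraction of evidence-consistent samples:", hits / rounds)
print("adapted weights:", weights)
```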

12:15
A Framework for Bottom-Up Simulation of SLD-Resolution
SPEAKER: Stefan Brass

ABSTRACT. This paper introduces a framework for the bottom-up simulation of SLD-resolution based on partial evaluation. The main idea is to use database facts to represent a set of SLD goals. For deductive databases it is natural to assume that the rules defining derived predicates are known at "compile time", whereas the database predicates are known only later at runtime. The framework is inspired by the author's own SLDMagic method, and a variant of Earley deduction recently introduced by Heike Stephan and the author. However, it opens a much broader perspective.

10:45-13:00 Session 138H: Technical Communications: Track 3 - Prolog (ICLP)
Location: FH, Seminarraum 134A
10:45
Joint Tabling of Logic Program Abductions and Updates

ABSTRACT. Abductive logic programs offer a formalism to declaratively represent and reason about problems in a variety of areas: diagnosis, decision making, hypothetical reasoning, etc. On the other hand, logic program updates allow us to express knowledge changes, be they internal (or self) or external (or world) changes. Abductive logic programs and logic program updates thus naturally coexist in problems that are susceptible to hypothetical reasoning about change. Taking this as a motivation, in this paper we integrate abductive logic programs and logic program updates by jointly exploiting tabling features of logic programming. The integration is based on and benefits from the two implementation techniques we separately devised previously, viz., tabled abduction and incremental tabling for query-driven propagation of logic program updates. A prototype of the integrated system is implemented in XSB Prolog.

11:00
A Simple and Efficient Lock-Free Hash Trie Design for Concurrent Tabling
SPEAKER: Miguel Areias

ABSTRACT. A critical component in the design of a concurrent tabling system is the implementation of the table space. One of the most successful data structures for representing tables is based on a two-level trie data structure, where one trie level stores the tabled subgoal calls and the other stores the computed answers. In previous work, we presented a sophisticated lock-free design where both levels of the tries were shared among threads in a concurrent environment. To implement lock-freedom we used the CAS atomic instruction, which nowadays can be widely found on many common architectures. CAS reduces the granularity of the synchronization when threads access concurrent areas. However, on current memory hierarchies, different concurrent areas can interfere with each other, creating problems such as false sharing or cache memory ping-pong effects. In this work, we present a simpler and more efficient lock-free design based on hash tries that minimizes these problems by dispersing the concurrent areas as much as possible. Experimental results show that our new lock-free design can effectively reduce the execution time and scales better than the previous designs as the number of threads increases.

11:15
Towards Assertion-based Debugging of Higher-Order (C)LP Programs
SPEAKER: unknown

ABSTRACT. Higher-order constructs extend the expressiveness of first-order (Constraint) Logic Programming ((C)LP). At the same time, assertions have been in use for some time in (C)LP systems, helping programmers detect errors and validate programs. However, these assertion-based approaches have not yet covered higher-order constructs. We report on our work towards filling this gap by extending the assertion-based approach to error detection and program validation to the higher-order context. We propose an extension of properties and assertions as used in (C)LP in order to be able to fully describe arguments that are predicates. We have developed a syntax and semantics for (higher-order) properties and assertions, as well as for programs which contain such assertions, together with several alternatives for performing run-time checking of such programs.

11:30
Analysis and Transformation Tools for Constrained Horn Clause Verification
SPEAKER: unknown

ABSTRACT. Several techniques and tools have been developed for verification in Horn clauses with constraints over some background theory (CHC). Current CHC verification tools implement intricate algorithms and are limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLP) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

11:45
Towards an Efficient Prolog System by Code Introspection

ABSTRACT. Several Prolog interpreters are based on the Warren Abstract Machine (WAM), an elegant model for compiling Prolog programs. Several strategies have been proposed to improve performance, such as optimizing clause selection, specializing unification, global analysis, native code generation and tabling. This paper proposes a different strategy for implementing an efficient Prolog system: the creation of specialized emulators on the fly. The proposed strategy was implemented and evaluated in the YAP Prolog system, and the experimental evaluation showed interesting results.

12:00
Customisable Handling of Java References in Prolog Programs
SPEAKER: unknown

ABSTRACT. Integration techniques for combining programs written in distinct language paradigms facilitate the implementation of specialised modules in the best language for their task. In the case of Java-Prolog integration, a known problem is the proper representation of references to Java objects on the Prolog side. To solve it adequately, multiple dimensions should be considered, including reference representation, opacity of the representation, identity preservation, reference life span, and scope of the inter-language conversion policies. This paper presents an approach that addresses all these dimensions, generalising and building on existing representation patterns of foreign references in Prolog, and taking inspiration from similar inter-language representation techniques found in other domains. Our approach maximises portability by making few assumptions about the Prolog engine interacting with Java (e.g., embedded or executed as an external process). We validate our work by extending JPC, an open-source integration library, with features supporting our approach.

12:15
Entanglement Patterns and Logic Programming Language Constructs
SPEAKER: Fahmida Hamid

ABSTRACT. Unification of logic variables instantly connects present and future observations of their value, independently of their location in the data areas of the runtime system.

The paper extends this property to "interclausal logic variables", an easy-to-implement Prolog extension that supports instant global information exchanges without dynamic database updates.

We illustrate their usefulness with two algorithms, graph coloring and minimum spanning tree.

Implementations of interclausal variables as source-level transformations and as abstract machine adaptations are given.

To address the need for globally visible chained transitions of logic variables we describe a DCG-based program transformation that extends the functionality of interclausal variables.

10:45-13:00 Session 138I: Exploring and Assessing Clinical Guidelines (KR4HC)
Location: EI, EI 8
10:45
Conformance Analysis of the Execution of Clinical Guidelines with Basic Medical Knowledge and Clinical Terminology

ABSTRACT. Clinical Guidelines (CGs) are developed for specifying the "best" clinical procedures for specific clinical circumstances. However, a CG is executed on a specific patient, with her peculiarities, and in a specific context, with its limitations and constraints. Physicians have to use Basic Medical Knowledge (BMK) in order to adapt the general CG to each specific case, even if the interplay between CGs and the BMK can be very complex, and the BMK should rely on medical terminological knowledge. In this paper, we focus on a posteriori analysis of conformance, intended as the adherence of an observed execution trace to CG and BMK knowledge. A CG description in the GLARE language is mapped to Answer Set Programming (ASP); the BMK and conformance rules are also represented in ASP. The BMK relies on the SNOMED-CT terminology and additional (post-coordinated) concepts. Conformance analysis is performed in Answer Set Programming and identifies non-adherence situations to the CG and/or BMK, pointing out, in particular, discrepancies from one knowledge source that could be justified by another source, and discrepancies that cannot be justified.

11:10
Semantic Representation of Evidence-based Clinical Guidelines
SPEAKER: unknown

ABSTRACT. Evidence-based Clinical Guidelines (EbCGs) are guidelines whose documents or recommendations have been created using the best clinical research findings of the highest value, to aid in the delivery of optimum clinical care to patients. In this paper, we propose a lightweight formalism for evidence-based clinical guidelines based on Semantic Web technology. With the help of tools developed for the Semantic Web and Natural Language Processing (NLP), generating formulations of evidence-based clinical guidelines becomes much easier. We discuss several use cases of the semantic representation of EbCGs, and argue that it is potentially useful for applications of Semantic Web technology in the medical domain.
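
A minimal sketch of what such a representation might look like, using the rdflib library and an invented vocabulary (the paper's actual ontology and property names are not shown here):

```python
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/guideline#")   # hypothetical vocabulary
g = Graph()
g.bind("ex", EX)

# One evidence-annotated recommendation as Linked Data triples.
rec = EX.rec1
g.add((rec, RDF.type, EX.Recommendation))
g.add((rec, EX.text, Literal("Prescribe statins for patients at CVD risk")))
g.add((rec, EX.evidenceLevel, Literal("1A")))
g.add((rec, EX.basedOn, EX.trial42))

print(g.serialize(format="turtle"))   # rdflib >= 6 returns a string here
```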

11:35
Real Rules, Real Data -- Addressing Challenges of Reasoning HIPAA Privacy Rule in Health Information Exchange (HIE)
SPEAKER: unknown

ABSTRACT. The rules associated with Health Information Exchange (HIE) are voluminous and complex. We had previously prototyped an Accountable System: using Linked Data, we demonstrated the ability to fully represent complex law in a machine-readable policy language, to employ it to reason over hypothetical data usage events, and to produce a result which explains likely compliance or non-compliance with the law. A small number of others had also worked on representing health law in program code. In this project, we prototyped an HIE under the Privacy Rule of the Health Insurance Portability and Accountability Act (HIPAA), in which a doctor seeks records for a patient without the patient's written consent. Such events can occur when a patient is unconscious or incompetent, as well as when records are misplaced and time is of the essence. In this paper we closely modeled real-world problems by incorporating many different data standards associated with the component parts of the problem, and anonymized but real patient data. During this process, we addressed a challenge previously identified by others: how to reason over rules that are not obviously operational? We did so in part by including serial rule traversal and by reasoning over obscured dependencies between rules. Both of these techniques are critical to a proper implementation of HIE and will be applicable in other domains as well.

12:00
META-GLARE: a meta-system for defining your own CIG system: Architecture and Acquisition

ABSTRACT. Clinical practice guidelines (CPGs) play an important role in medical practice, and computerized support for CPGs is now one of the most central areas of research in Artificial Intelligence in medicine. In recent years, many groups have developed different computer-assisted management systems for Computer Interpretable Guidelines (CIGs). On the one hand, there are several commonalities between the different approaches; on the other hand, each approach has its own peculiarities and is geared towards the treatment of specific phenomena. In our work, we propose a form of generalization: instead of defining “yet another CIG system”, we propose META-GLARE, a “meta”-system (or, in other words, a shell) for defining new CIG systems. On the one hand, we try to capture the commonalities by providing (i) a general tool for the acquisition, consultation and execution of hierarchical bipartite graphs (representing the control flow of actions in CIGs), parametrized over the types of nodes and arcs constituting them, and (ii) a library of elementary components of guideline nodes (actions) and arcs, in which each type definition specifies how objects of this type can be acquired, consulted and executed. On the other hand, we provide generality and flexibility by allowing free aggregations of such elementary components to define new primitive node and arc types. In this paper, we first present META-GLARE's general architecture and then, for the sake of brevity, focus only on the acquisition issue.

12:15
Assessment of Clinical Guideline Models Based on Metrics for Business Process Models
SPEAKER: unknown

ABSTRACT. The formalisation of clinical guidelines is a long and demanding task which usually involves both clinical and IT staff. Because of the features of guideline representation languages, a clear understanding of the final guideline model may prove complicated for clinicians. In this context, an assessment of the understandability of the guideline model becomes crucial. In the field of Business Process Modelling (BPM) there is research on structural metrics and their connection with the quality of process models, specifically with understandability and modifiability. In this paper we adapt the structural metrics that have been proposed in the field of BPM to the features of a specific guideline representation language, PROforma. Additionally, we present some experiments consisting of the application of these adapted metrics to the assessment of guideline models described in PROforma. Although it has not been possible to draw meaningful conclusions on the overall quality of the models, our experiments have served to shed light on important aspects to be considered, such as the hierarchical decomposition of processes.

12:30
An Algorithm for Guideline Transformation: from BPMN to PROforma
SPEAKER: unknown

ABSTRACT. In the healthcare domain, business process modelling technologies are able to support the clinical processes recommended in guidelines. It has been shown that BPMN is intuitively understood by all stakeholders, including domain experts. However, if we want to develop any computer system using clinical guidelines, we need them in an executable format. Thus, we need computer-interpretable guidelines. Although there are several formalisms tailored to capture medical processes, encoding a guideline in any of them is not as intuitive. We propose an automatic transformation from a guideline represented in BPMN to a computer-interpretable formalism, in this case PROforma. To tackle this problem, we have studied the approaches that transform graph-oriented languages into block-oriented languages. We have adapted the solution to our specific-domain problem and to our target language, PROforma, which has features of both graph- and block-oriented paradigms.

12:45
A Process-oriented Methodology for Modelling Cancer Treatment Trial Protocols
SPEAKER: unknown

ABSTRACT. Treatment of cancer in the context of a multi-center treatment trial requires following a complex, detailed process involving multidisciplinary patient treatment as well as study-related tasks, described in narrative protocol documents. We present a process-oriented approach for modelling clinical trial treatment protocols (CTTPs), to be used for enabling applications that support protocol-based care process delivery, monitoring and analysis. This modelling approach provides an understandable visualized representation of the protocol document, catering for change management, intra-center and national adaptations to the master protocol, and multi-level shareability. The methodology can be re-used in CTTPs of different cancer domains due to the similarity of the CTTPs in terms of required content.

12:45-13:00 Session 139A: 3 Short presentations of 5 minutes each. (KR)
Location: EI, EI 7
12:45
Rough Set Semantics for Identity on the Web
SPEAKER: Wouter Beek

ABSTRACT. Identity relations are at the foundation of many logic-based knowledge representations. We argue that the traditional notion of equality, based on Leibniz’ principle of indiscernibility, is unsuited for many realistic knowledge representation settings. The classical interpretation of equality is simply too strong when the equality statements are re-used outside their original context.

The most prominent modern example of such re-use of knowledge across contexts is the Linked Open Data initiative, and the Semantic Web in general. On the Semantic Web, equality statements are used to interlink multiple descriptions of the same object, using owl:sameAs assertions. And indeed, many practical uses of owl:sameAs are known to violate the formal Leibniz-style semantics.

First, we survey the problems with the classical definition of identity, both in general and on the Semantic Web in particular. In order to provide a more flexible semantics for identity, we then propose a method that assigns meaning to the subrelations of an identity relation using the predicates that occur in a knowledge base. Using those indiscernibility predicates, we define upper and lower approximations of equality in the style of rough-set theory, resulting in a quality measure for the identity relation.

We illustrate our approach on a realistic Semantic Web dataset, and we experimentally verify that our approach yields the intuitively desired results on this dataset.
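A minimal sketch of the rough-set construction, using invented resources and predicates: two resources are indiscernible with respect to a predicate set if they agree on all of its predicates, and the lower/upper approximations of an asserted identity set then yield a simple quality measure. This is only an illustration of the general idea, not the paper's exact definitions.

    from itertools import groupby

    # Invented resource descriptions and an asserted owl:sameAs cluster.
    data = {
        "ex:a1": {"name": "B. Obama", "born": 1961},
        "ex:a2": {"name": "B. Obama", "born": 1961},
        "ex:a3": {"name": "B. Obama", "born": 1960},   # noisy near-duplicate
        "ex:b1": {"name": "M. Obama", "born": 1964},
    }
    same_as = {"ex:a1", "ex:a2"}

    def indiscernibility_classes(predicates):
        key = lambda r: tuple(data[r].get(p) for p in predicates)
        return [set(g) for _, g in groupby(sorted(data, key=key), key=key)]

    def approximations(predicates, X):
        lower, upper = set(), set()
        for c in indiscernibility_classes(predicates):
            if c <= X:
                lower |= c
            if c & X:
                upper |= c
        return lower, upper

    # With both predicates the identity set is sharply approximated (1.0);
    # with "name" alone the noisy duplicate drags the quality down to 0.
    lower, upper = approximations(["name", "born"], same_as)
    print(len(lower) / len(upper))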

12:50
Predicting Performance of OWL Reasoners: Locally or Globally?
SPEAKER: unknown

ABSTRACT. We propose a novel approach to performance prediction of OWL reasoners. The existing strategies take a view of an entire ontology corpus: they extract multiple features from the ontologies in the corpus and use them for training machine learning models. We call these global approaches. In contrast, our approach is a local one: it examines a single ontology (independent of any corpus), selects suitable, small ontology subsets, and extrapolates their performance measurements to the whole ontology. Our results show that this simple idea leads to accurate performance predictions, comparable or superior to global approaches. Our second contribution concerns ontology features: we are the first to investigate intercorrelation of ontology features using Principal Component Analysis (PCA). We report that extracting multiple features - as global approaches do - makes surprisingly little difference for performance prediction. In fact, it turns out that the ontologies in two major corpora basically only differ in one or two features.

12:55
Concept Dissimilarity with Triangle Inequality
SPEAKER: unknown

ABSTRACT. Several researchers have developed properties that ensure the compatibility of a concept similarity or dissimilarity measure with the formal semantics of Description Logics. While these authors have highlighted the relevance of the triangle inequality, none of their proposed dissimilarity measures satisfies it. In this work we present several dissimilarity measures with this property: first, a simple dissimilarity measure based on description trees for the lightweight Description Logic EL; second, a general framework based on concept relaxations; and last, an instantiation of the general framework using dilation operators from mathematical morphology, exploiting the link between the Hausdorff distance and dilations using balls of the ground distance as structuring elements. A comparison between these definitions and their properties is provided as well.
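For reference, the Hausdorff distance alluded to above is, for non-empty closed bounded subsets $A$, $B$ of a space with ground distance $d$, standardly defined as (this is the textbook definition, not taken from the paper):

\[
  d_H(A,B) \;=\; \max\Bigl\{\,\sup_{a\in A}\,\inf_{b\in B} d(a,b),\;\; \sup_{b\in B}\,\inf_{a\in A} d(a,b)\Bigr\}.
\]

Equivalently, $d_H(A,B)$ is the least $r \ge 0$ such that each set is contained in the dilation of the other by a closed ball of radius $r$ of the ground distance; this is the link with morphological dilations mentioned above, and $d_H$ inherits the triangle inequality from $d$.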

12:45-13:00 Session 139B: 3 Short presentations of 5 minutes each. (KR)
Location: EI, EI 9
12:45
Interval Methods for Judgment Aggregation in Argumentation
SPEAKER: unknown

ABSTRACT. Given a set of conflicting arguments, there can exist multiple plausible opinions about which arguments should be accepted, rejected, or deemed undecided. Recent work explored some operators for deciding how multiple such judgments should be aggregated. Here, we generalize this line of study by introducing a family of operators called *interval aggregation methods*, and show that they contain existing operators as instances. While these methods fail to output a complete labelling in general, we show that it is possible to *transform* a given aggregation method into one that *does* always yield collectively rational labellings. This employs the *down-admissible* and *up-complete* constructions of Caminada and Pigozzi. For interval methods, collective rationality is attained at the expense of a strong *Independence* postulate, but we show that an interesting weakening of the Independence postulate is retained by this transformation. Our results open up many new directions in the study of generalized judgment aggregation in argumentation.

12:50
How to Argue for Anything: Enforcing Arbitrary Sets of Labellings Using AFs

ABSTRACT. We contribute to the study of the meta-logical properties of argumentation frameworks, as introduced by Dung. Such frameworks are used in a variety of contexts in contemporary AI, not just as a means to model argumentative scenarios, but also as an abstract mathematical representation of non-monotonic reasoning patterns more generally. Importantly, Dung observed that by representing such patterns by directed graphs, various semantics for non-monotonic reasoning could be formulated in graph theoretic terms. In this paper we address so-called sceptical reasoning about argumentation frameworks, asking the following question: what are the propositional validities of abstract argumentation? We answer this by giving graph constructions demonstrating that the set of validities for a range of different argumentation semantics coincide and that it is axiomatized by three-valued Lukasiewicz logic.

12:55
A Psychology-Inspired Approach to Automated Narrative Text Comprehension
SPEAKER: unknown

ABSTRACT. This paper brings together the theory of argumentation in AI with the psychology of story comprehension to develop a KR&R framework for narrative text comprehension. The proposed framework is based on a computational interpretation of the general psychological principles of coherency and cognitive economy for text comprehension. Part of a wider study, this paper concentrates on the central problem of integration (or elaboration) of explicit information from the text with implicit (in the reader's mind) common sense world knowledge pertaining to the topic(s) of the story. We show how some of the major challenges for successful story comprehension are closely related to those of NMR, as found in Default Reasoning and Reasoning about Action and Change, and more specifically Knowledge Qualification and Revision. We show how an appropriate formulation through argumentation can give a basis for automating the construction and revision of a coherent comprehension model of a narrative. We report on a first set of empirical experiments designed to harness the implicit world knowledge used by humans reading a story to guide the background knowledge given as input to our framework. We present a prototype system that is used to evaluate the ability of our approach to capture both the majority and the variability of understanding of a story by the human readers in the experiment.

13:00-14:30 Lunch Break
14:30-16:00 Session 140A: Planning, Strategies, Diagnosis 1 (KR)
Location: EI, EI 7
14:30
An Abstraction Technique for the Verification of Multi-Agent Systems Against ATL Specifications
SPEAKER: unknown

ABSTRACT. We introduce an abstraction methodology for the verification of multi-agent systems against specifications in alternating time temporal logic (ATL). Inspired by methodologies such as predicate abstraction, we define a three-valued semantics for the interpretation of ATL formulas on concurrent game structures and compare it to the standard two-valued semantics. We define abstract models and establish preservation results on the three-valued semantics between abstract models and their concrete counterparts. We illustrate the methodology on the large state spaces resulting from a card game.

15:00
Reasoning about Equilibria in Game-like Concurrent Systems

ABSTRACT. Our aim is to develop techniques for reasoning about game-like concurrent systems, where the components of the system act rationally and strategically in pursuit of logically-specified goals. We first present a computational model for such systems, and investigate its properties. We then define and investigate a branching-time logic for reasoning about the equilibrium properties of such systems. The key operator in this logic is a path quantifier [NE]φ, which asserts that φ holds on all Nash equilibrium computations of the system.

15:30
A Temporal Logic of Strategic Knowledge

ABSTRACT. The paper presents an extension of temporal epistemic logic with epistemic operators that capture what agents could deduce from knowledge of the strategies of some subset of the set of agents. Some examples are presented to motivate the framework, and relationships to several variants of alternating temporal epistemic logic are discussed. Results on model checking the logic are then presented.

14:30-16:00 Session 140B: Belief Revision and Nonmonotonicity 1 (KR)
Location: EI, EI 9
14:30
Belief Change and Semiorders
SPEAKER: Pavlos Peppas

ABSTRACT. A central result in the AGM framework for belief revision is the construction of revision functions in terms of total preorders on possible worlds. These preorders encode comparative plausibility: r <= r' states that the world r is at least as plausible as r'. Indifference in plausibility, denoted r ~ r', is defined as r ~ r' iff r <= r' and r' <= r. Herein we take a closer look at plausibility indifference. We contend that the transitivity of indifference assumed in the AGM framework is not, in general, an intuitive property for comparative plausibility. Our argument originates from similar concerns in preference modelling, where a structure weaker than a total preorder, called a semiorder, is widely considered to be a more adequate model of preference. In this paper we essentially re-construct revision functions using semiorders instead of total preorders. We formulate postulates to characterise this new, wider class of revision functions, and prove that the postulates are sound and complete with respect to the semiorder-based construction. Moreover, through the Levi and Harper Identities, we characterise axiomatically the corresponding class of contraction functions.
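For readers unfamiliar with semiorders, the standard Scott-Suppes-style definition is the following (the paper's exact postulates may differ): a strict relation $<$ is a semiorder iff for all $x, y, z, w$,

\[
  \begin{array}{ll}
  \text{(irreflexivity)} & \neg\,(x < x)\\[2pt]
  \text{(Ferrers condition)} & (x < y \wedge z < w) \;\rightarrow\; (x < w \vee z < y)\\[2pt]
  \text{(semitransitivity)} & (x < y \wedge y < z) \;\rightarrow\; (x < w \vee w < z).
  \end{array}
\]

The induced indifference $x \sim y$ (neither $x < y$ nor $y < x$) is then reflexive and symmetric but need not be transitive, which is exactly the behaviour the abstract argues is appropriate for plausibility indifference.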

15:00
On Egalitarian Belief Merging
SPEAKER: unknown

ABSTRACT. Belief merging aims at defining the beliefs of a group from the beliefs of each member of the group. It is related to more general notions of aggregation from economy (social choice theory). Two main subclasses of belief merging operators exist: majority operators which are related to utilitarianism, and arbitration operators which are related to egalitarianism. Though utilitarian (majority) operators have been extensively studied so far, there is much less work on egalitarian operators. In order to fill the gap, we investigate possible translations in a belief merging framework of some egalitarian properties and concepts coming from social choice theory, such as Sen-Hammond equity, Pigou-Dalton property, median, and Lorenz curves. We study how these properties interact with the standard rationality conditions considered in belief merging. Among other results, we show that the distance-based merging operators satisfying Sen-Hammond equity are mainly those for which leximax is used as the aggregation function.
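The leximax aggregation mentioned at the end can be sketched in a few lines of Python (a generic illustration of the aggregation function, not the paper's formal machinery): candidate merged worlds are ranked by their per-source distance vectors sorted in decreasing order.

    # Hypothetical per-source distances of two candidate merged worlds.
    candidates = {
        "w1": [3, 0, 0],   # one source is badly served
        "w2": [1, 1, 1],   # disagreement is spread evenly
    }

    def leximax_key(distances):
        # Compare vectors by their largest entries first: the egalitarian
        # choice minimises the worst-off source, then the next worst, ...
        return sorted(distances, reverse=True)

    best = min(candidates, key=lambda w: leximax_key(candidates[w]))
    print(best)   # "w2": [1, 1, 1] beats [3, 0, 0] under leximax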

15:30
David Poole’s Specificity Revised

ABSTRACT. In the middle of the 1980s, David Poole introduced a semantical, model-theoretic notion of specificity to the artificial-intelligence community. Since then it has found further applications in non-monotonic reasoning, in particular in defeasible reasoning. Poole’s notion, however, turns out to be intricate and problematic, which — as we show — can be overcome to some extent by a closer approximation of the intuitive human concept of specificity. Besides the intuitive advantages of our novel specificity ordering over Poole’s specificity relation in the classical examples of the literature, we also report some hard mathematical facts: Contrary to what was claimed before, we show that Poole’s relation is not transitive. Our new notion of specificity is also monotonic w.r.t. conjunction.

14:30-16:00 Session 140C: Concurrency (CAV)
Location: FH, Hörsaal 1
14:30
Automatic Atomicity Verification for Clients of Concurrent Data Structures
SPEAKER: unknown

ABSTRACT. Mainstream programming languages offer libraries of concurrent data structures. Each method call on a concurrent data structure appears to take effect atomically. However, clients of such data structures often require stronger guarantees. For instance, a histogram class that is implemented using a concurrent map may require a method to atomically increment a histogram bar, but its implementation requires multiple calls to the map and hence is not atomic by default. Indeed, prior work has shown that atomicity errors in clients of concurrent data structures occur frequently in production code. We present an automatic and modular verification technique for clients of concurrent data structures. We define a novel sufficient condition for atomicity of clients called condensability. We present a tool called Snowflake that generates proof obligations for condensability of Java client methods and discharges them using an off-the-shelf SMT solver. We applied Snowflake to an existing suite of client methods from several open-source applications. It successfully verified 76.9% of the atomic methods without any change and verified the rest of them with small code refactoring. It rejected all non-atomic methods.

14:50
Regression-free Synthesis for Concurrency

ABSTRACT. While fixing concurrency bugs, program repair algorithms may introduce new concurrency bugs. We present a new repair algorithm that avoids such regressions. The solution space is given by the program transformations we consider in the repair process. These include reordering of instructions within a thread, inserting atomic sections, and inserting wait-notify mechanisms. The new algorithm, PACES, is an extension of the CEGIS loop. It learns constraints on the space of candidate solutions, from both positive examples (error-free traces) and counterexamples (error traces). From counterexamples, the algorithm learns a constraint necessary to remove them. From positive examples, it learns constraints that are necessary in order to prevent the repair from turning the trace into a counterexample. We implemented the algorithm and evaluated it on simplified Linux device drivers with known bugs. The tool is able to fix the bugs while avoiding regressions.

15:10
Bounded Model Checking of Multi-Threaded C Programs via Lazy Sequentialization
SPEAKER: unknown

ABSTRACT. Bounded model checking (BMC) has successfully been used for many practical program verification problems, but concurrency still poses a challenge. Here we describe a new approach to BMC of sequentially consistent C programs using POSIX threads. Our approach first translates a multi-threaded C program into a non-deterministic sequential C program that preserves reachability for all round-robin schedules with a given bound on the number of rounds. It then re-uses existing high-performance BMC tools as back-ends for the sequential verification problem. Our translation is carefully designed to introduce very small memory overheads and very few sources of non-determinism, so that it produces tight SAT/SMT formulae, and is thus very effective in practice: our implementation won the concurrency category of SV-COMP14. It solved all verification tasks successfully and was 30x faster than the best tool with native concurrency handling.

15:30
An SMT-Based Approach to Coverability Analysis
SPEAKER: Filip Niksic

ABSTRACT. Model checkers based on Petri net coverability have been used successfully in recent years to verify safety properties of concurrent shared-memory or asynchronous message-passing software. We revisit a constraint approach to coverability based on classical Petri net analysis techniques. We show how to utilize an SMT solver to implement the constraint approach, and additionally, to generate an inductive invariant from a safety proof. We empirically evaluate our procedure on a large set of existing Petri net benchmarks. Even though our technique is incomplete, it can quickly discharge most of the safe instances. Additionally, the inductive invariants computed are usually orders of magnitude smaller than those produced by existing solvers.
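One classical Petri net analysis technique of the kind the abstract builds on is the state equation, a relaxation of reachability: if m0 + C*x >= target has no nonnegative integer solution x, the target marking is provably not coverable. Below is a hedged sketch with the z3 SMT solver on an invented three-place net; the paper's actual procedure is richer (it also extracts inductive invariants).

    from z3 import Int, Solver, unsat

    m0     = [1, 0, 0]                 # initial marking of places p0,p1,p2
    target = [0, 0, 2]                 # marking we ask to cover
    C = [[-1,  0],                     # incidence matrix: rows = places,
         [ 1, -1],                     # columns = transitions
         [ 0,  1]]                     # (t0: p0 -> p1, t1: p1 -> p2)
    x = [Int(f"x{j}") for j in range(2)]   # firing count of each transition

    s = Solver()
    for xj in x:
        s.add(xj >= 0)
    for i, row in enumerate(C):
        marking_i = m0[i] + sum(row[j] * x[j] for j in range(2))
        s.add(marking_i >= target[i])

    # unsat means the state equation already rules the target out.
    print("safe: target not coverable" if s.check() == unsat
          else "inconclusive: the state equation is only a relaxation")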

15:50
LEAP: A Tool for the Parametrized Verification of Concurrent Datatypes

ABSTRACT. This paper presents Leap, a tool for the verification of concurrent datatypes and parametrized systems composed of an unbounded number of threads that manipulate infinite data.

Concurrent datatypes are accessed by numerous threads which manipulate shared data in the heap using liberal patterns of synchronization like lock-freedom and fine-grained locking. Formally verifying concurrent datatypes is a challenging task because of the combination of complex dynamic memory manipulation and rather unstructured concurrent execution. The difficulty is increased because one must also deal with concurrent systems composed of an unbounded number of threads.

Leap receives as input a concurrent program description and a specification. Leap automatically generates a finite collection of verification conditions and discharges these to specialized decision procedures built on top of state-of-the-art SMT solvers. The validity of all verification conditions implies that the program executed by any number of threads satisfies the specification. Leap not only includes decision procedures for integers and booleans, but also implements specific theories for heap memory layouts such as linked lists and skiplists.

14:30-16:00 Session 140D: Best Doctoral Consortium Talk / Technical Session - Abduction (ICLP)
Location: FH, Hörsaal 8
14:30
Best Doctoral Consortium Talk
15:00
A Measure of Arbitrariness in Abductive Explanations
SPEAKER: unknown

ABSTRACT. In this paper, we study the framework of abductive logic programming extended with integrity constraints. For this framework, we introduce a new measure of the simplicity of an explanation in terms of its degree of arbitrariness: the more arbitrary the explanation, the less appealing it is, with constrained explanations having no arbitrariness and thus being the preferred ones. In the paper, we study basic properties of constrained explanations. For the case when the programs in abductive theories are stratified, we establish results providing a detailed picture of the complexity of deciding whether constrained explanations exist.

15:30
Contextual Abductive Reasoning with Side-Effects

ABSTRACT. The belief bias effect is a phenomenon which occurs when we think that we judge an argument based on our reasoning, but are actually influenced by our beliefs and prior knowledge. Evans, Barston and Pollard carried out a psychological syllogistic reasoning task to demonstrate this effect. Participants were asked whether they would accept or reject a given syllogism. We discuss one specific case which is commonly assumed to be believable but which is actually not logically valid. By introducing abnormalities, abduction and background knowledge, we adequately model this case under the weak completion semantics. Our formalization reveals new questions about possible extensions in abductive reasoning. For instance, observations and their explanations might include some relevant prior abductive contextual information concerning some side-effect, or leading to a contestable or refutable side-effect. A weaker notion indicates the support of some relevant consequences by a prior abductive context. Yet another definition describes jointly supported relevant consequences, which captures the idea of two observations containing mutually supportive side-effects. Though motivated with and exemplified by the running psychology application, the various new general abductive context definitions are introduced here and given a declarative semantics for the first time, and have a much wider scope of application. Inspection points, a concept introduced by Pereira and Pinto, allow us to express these definitions syntactically and to intertwine them into an operational semantics.

14:30-16:00 Session 140E: FLoC Inter-Conference topic: SAT/SMT/QBF (CAV and IJCAR)
Location: FH, Hörsaal 5
14:30
Approximations for Model Construction
SPEAKER: unknown

ABSTRACT. We consider the problem of efficiently computing models for satisfiable constraints, in the presence of complex background theories such as floating-point arithmetic (FPA). Model construction has various applications, for instance the automatic generation of test inputs. It is well-known that naive encoding of constraints into simpler theories (for instance, bit-vectors or propositional logic) can lead to a drastic increase in size, and be unsatisfactory in terms of memory and runtime needed for model construction. We define a framework for systematic application of approximations in order to speed up model construction. Our method is more general than previous techniques in the sense that approximations that are neither under- nor over-approximations (or the combination of both) can be used, and shows promising results in practice.

15:00
A Tool that Incrementally Approximates Finite Satisfiability in Full Interval Temporal Logic
SPEAKER: Martin Lange

ABSTRACT. Interval Temporal Logic (ITL) is a powerful formalism to reason about sequences of events that can occur simultaneously and in an overlapping fashion. Despite its importance for various application domains, little tool support for automated ITL reasoning is available, possibly also owing to ITL's undecidability.

We consider the bounded satisfiability problem, which approximates finite satisfiability and is only NP-complete. We provide an encoding into SAT that makes use of the power of modern incremental SAT solvers, and present a tool which tests an ITL specification for finite satisfiability.

15:20
StarExec: a Cross-Community Infrastructure for Logic Solving

ABSTRACT. We introduce StarExec, a public web-based service built to facilitate the experimental evaluation of logic solvers, broadly understood as automated tools based on formal reasoning. Examples of such tools include theorem provers, SAT and SMT solvers, constraint solvers, model checkers, and software verifiers. The service, running on a compute cluster with 380 processors and 23 terabytes of disk space, is designed to provide a single piece of storage and computing infrastructure to logic solving communities and their members. It aims at reducing duplication of effort and resources as well as enabling individual researchers or groups with no access to comparable infrastructure. StarExec allows community organizers to store, manage and make available benchmark libraries; competition organizers to run logic solver competitions; and community members to do comparative evaluations of logic solvers on public or private benchmark problems.

15:40
Skeptik [System Description]
SPEAKER: unknown

ABSTRACT. This paper introduces Skeptik: a system for checking, compressing and improving proofs obtained by SAT and SMT solvers.

14:30-16:00 Session 140F: Information Flow 2 (CSF)
Location: FH, Hörsaal 6
14:30
Compositional Information-flow Security for Interactive Systems
SPEAKER: unknown

ABSTRACT. To achieve end-to-end security in a system built from parts, it is important to ensure that the composition of secure components is itself secure. This work investigates the compositionality of two popular conditions of possibilistic noninterference. The first condition, progress-insensitive noninterference (PINI), is the security property enforced by practical tools like JSFlow, Paragon, LIO, Jif, FlowCaml, and SPARK Examiner. We show that this condition is not preserved under fair parallel composition: composing a PINI system fairly with another PINI system can yield an insecure system. We explore constraints that allow recovering compositionality for PINI. Further, we develop a theory of compositional reasoning and enforcement for progress-sensitive noninterference (PSNI). Our work is performed within a general framework for nondeterministic interactive systems.

15:00
Stateful Declassification Policies for Event-Driven Programs
SPEAKER: unknown

ABSTRACT. We propose a novel mechanism for enforcing information flow policies with support for declassification on event-driven programs. Declassification policies consist of two functions. First, a projection function specifies for each confidential event what information in the event can be declassified directly. This generalizes the traditional security labelling of inputs. Second, a stateful release function specifies the aggregate information about all confidential events seen so far that can be declassified. We provide evidence that such declassification policies are useful in the context of JavaScript web applications. An enforcement mechanism for our policies is presented and its soundness and precision are proven. Finally, we give evidence of practicality by implementing and evaluating the mechanism in a browser.
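A toy sketch of the two-part policy shape described above, with invented event names (this only illustrates the interface, not the paper's enforcement mechanism): a projection declassifies part of each confidential event, while a stateful release function declassifies only an aggregate over the events seen so far.

    # Confidential input events: (kind, payload).
    events = [("keypress", "h"), ("keypress", "i"), ("mouse", (10, 20))]

    def project(event):
        kind, _payload = event
        return kind                    # declassify the kind, never the payload

    class KeypressCounter:
        """Stateful release: only the running count of keypresses leaks."""
        def __init__(self):
            self.count = 0
        def release(self, event):
            if event[0] == "keypress":
                self.count += 1
            return self.count

    policy = KeypressCounter()
    print([(project(e), policy.release(e)) for e in events])
    # [('keypress', 1), ('keypress', 2), ('mouse', 2)]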

15:30
Additive and multiplicative notions of leakage, and their capacities

ABSTRACT. Protecting sensitive information from improper disclosure is a fundamental security goal. It is complicated, and difficult to achieve, often because of unavoidable or even unpredictable operating conditions that lead to breaches in planned security defences. An attractive approach is to frame the goal as a quantitative problem, and then to design methods that allow system vulnerabilities to be measured in terms of the amount of information that they leak. A consequence of this view is that the precise operating conditions, and any assumptions about prior knowledge, can play a crucial role in assessing the severity of any measured vulnerability.

We develop this theme by concentrating on the robustness of vulnerability measurements: we wish to allow general leakage bounds to be placed on a program, whatever its operating conditions and whatever the prior knowledge might be. In particular we propose a theory of channel capacity, generalizing Shannon capacity from information theory, that can apply both to additive and to multiplicative forms of a recently proposed measure known as g-leakage. Further, we explore the computational aspects of calculating these capacities: we show that one of these scenarios can be solved efficiently by expressing it as a Kantorovich distance, and that another is NP-complete. We also show capacity bounds for a scenario related to Dalenius's Desideratum.
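For orientation, the additive and multiplicative forms of g-leakage are usually defined as follows (standard definitions from the g-leakage literature; the paper's exact variants and capacity notions may differ). With prior $\pi$, channel $C$ and gain function $g$:

\[
  V_g(\pi) \;=\; \max_{w}\sum_{x}\pi(x)\,g(w,x), \qquad
  V_g(\pi,C) \;=\; \sum_{y}\max_{w}\sum_{x}\pi(x)\,C(x,y)\,g(w,x),
\]
\[
  \mathcal{L}^{+}_g(\pi,C) \;=\; V_g(\pi,C) - V_g(\pi), \qquad
  \mathcal{L}^{\times}_g(\pi,C) \;=\; \frac{V_g(\pi,C)}{V_g(\pi)},
\]

and the corresponding capacities are obtained by taking suprema over priors (and, in some scenarios, over gain functions).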

15:30-16:00 Session 141: Training and Learning (KR4HC)
Location: EI, EI 8
15:30
Development of Digital Repositories of Multimedia Learning Modules and Guideline-Driven Interactive Case Simulation Tools for New York State HIV/HCV/STD Clinical Education Initiative
SPEAKER: unknown

ABSTRACT. We have developed multimedia learning modules (LMs) and guideline-driven interactive case simulation tools (ICSTs) as two major categories of resources for the New York State HIV/HCV/STD Clinical Education Initiative. We have adopted the existing standards on health professional training and clinical guideline representation to capture the important information of LMs and ICSTs. We have synthesized a domain ontology of HIV/HCV/STD clinical topics to index the content of LMs and ICSTs. Based on this, we have built system functions for browse/search of specific resources and checking of related resources. We have disseminated LMs and ICSTs through multiple platforms and recorded heavy usage by audiences from around the world. Plans for next stages include cross-linkages of multiple categories of resources, deeper level of resource mapping through internal structures, and full-scale usage analyses.

15:45
Training residents in the application of clinical guidelines for differential diagnosis of the most frequent causes of arterial hypertension with decision tables
SPEAKER: unknown

ABSTRACT. Arterial hypertension (AH) is an abnormal high blood pressure in the arteries with many possible etiologies. Differential diagnosis of the causes of AH is a complex clinical process that requires the simultaneous consideration of many clinical practice guidelines. Training clinicians to manage, assimilate, and correctly apply the knowledge contained in the guidelines of the most frequent causes of AH is a challenge that we have addressed with the combined use of different sorts of decision tables. After extracting the diagnostic knowledge available in eight clinical practice guidelines of the most frequent secondary causes of hypertension, we have represented this knowledge as decision tables, and have used these tables to train 23 residents at the Hospital Clinic de Barcelona. During the training, the decisions of the residents along the differential diagnostic steps were compared with the decisions provided by the decision tables so that we could analyze the progressive adaptation of clinicians' decisions to the guidelines' recommendations. The study shows a progressive improvement of the adherence of the residents to the guidelines as new AH cases are considered, reaching full adherence after a training with 30 clinical cases.

16:00-16:50 Session 142: Modal Logic (IJCAR)
Location: FH, Hörsaal 5
16:00
Terminating Minimal Model Generation Procedures for Propositional Modal Logics
SPEAKER: unknown

ABSTRACT. Model generation and minimal model generation are useful for tasks such as model checking and for debugging of logical specifications. This paper presents terminating procedures for the generation of models minimal modulo subset-simulation for the modal logic K and all combinations of extensions with the axioms T, B, D, 4 and 5. Our procedures are minimal model sound and complete. Compared with other minimal model generation procedures, our procedures benefit from smaller search spaces and fewer models are returned. To make the models most effective for users, our minimal model criterion is designed to be semantically meaningful and intuitive, and to contain a minimal amount of information. Depending on the logic, termination is ensured by a variation of equality blocking.

16:30
COOL -- A Generic Satisfiability Checker For Coalgebraic Logics with Global Assumptions (System Description)

ABSTRACT. We describe the Coalgebraic Ontology Logic solver COOL, a generic reasoner that decides the satisfiability of modal formulas (with nominals) with respect to a set of global assumptions -- a TBox, in Description Logics parlance. The level of generality is that of coalgebraic logic, a logical framework covering a wide range of modal logics, beyond relational semantics. The core of COOL is an efficient unlabelled tableaux search procedure using global caching. Concrete logics are added by implementing the corresponding (one-step) tableaux rules. The logics covered at the moment are (multi-modal) K and KD (i.e., serial relations), graded modal logic and Pauly's Coalition Logic (the next-step fragment of Alternating-time Temporal Logic), plus every logic that arises from the above as a product (modal fusion) or coproduct of the underlying functors, which are handled automatically in a generic way. We compare the performance of COOL with state-of-the-art reasoners.

16:00-16:30 Coffee Break
16:30-17:30 Session 143A: Knowledge Representation and Reasoning 1 (KR)
Location: EI, EI 7
16:30
Generalized Multi-Context Systems

ABSTRACT. Multi-context systems (MCSs) define a versatile framework for integrating and reasoning about knowledge from different (heterogeneous) sources. In MCSs, different types of non-monotonic reasoning are characterized by different semantics such as equilibrium semantics and grounded equilibrium semantics [Brewka and Eiter, 2007].

We introduce a novel semantics of MCSs, a supported equilibrium semantics. Our semantics is based on a new notion of support. The "strength" of supports determines a spectrum of semantics that, in particular, contains the equilibrium and grounded equilibrium semantics. In this way, our supported equilibrium semantics generalizes these previously defined semantics. Moreover, the "strength" of supports gives us a measure to compare different semantics of MCSs.

17:00
Qualitative Spatial Representation and Reasoning in Angry Birds: the Extended Rectangle Algebra
SPEAKER: Peng Zhang

ABSTRACT. Angry Birds is a popular video game where the task is to kill pigs protected by a structure composed of different building blocks that observe the laws of physics. The structure can be destroyed by shooting the angry birds at it. The fewer birds we use and the more blocks we destroy, the higher the score. One approach to solve the game is by analyzing the structure and identifying its strength and weaknesses. This can then be used to decide where to hit the structure with the birds.

In this paper we use a qualitative spatial reasoning approach for this task. We develop a novel qualitative spatial calculus for representing and analyzing the structure. Our calculus allows us to express and evaluate structural properties and rules, and to infer for each building block which of these properties and rules are satisfied. We use this to compute a heuristic value for each block that corresponds to how useful it is to hit that block. We evaluate our approach by comparing the consequences of the suggested shot with other possible shots.

16:30-17:30 Session 143B: Automated Reasoning and Computation 1 (KR)
Location: EI, EI 9
16:30
On OBDDs for CNFs of Bounded Treewidth
SPEAKER: Igor Razgon

ABSTRACT. Knowledge compilation is a rewriting approach to propositional knowledge representation. The "knowledge base" is initially represented as a CNF, for which many important types of queries are NP-complete to answer. Therefore, the CNF is compiled into another representation for which the minimal requirement is that the clause entailment query (can the given partial assignment be extended to a complete satisfying assignment?) can be answered in polynomial time. Such a transformation can result in an exponential blow-up of the representation size. A possible way to circumvent this issue is to identify a structural parameter of the input CNF such that the resulting transformation is exponential in this parameter and polynomial in the number of variables. A notable result in this direction is an $O(2^kn)$ upper bound on the size of Decomposable Negation Normal Form (DNNF) proposed by (Darwiche, JACM 2001), where $n$ is the number of variables of the given CNF and $k$ is the treewidth of its primal graph. Quite recently this upper bound has been shown to hold for Sentential Decision Diagrams (Darwiche, IJCAI 2011), a subclass of DNNF that can be considered a generalization of the famous Ordered Binary Decision Diagrams (OBDDs) and shares with OBDDs the key nice features (e.g. poly-time equivalence testing). Under this parameterization, the best known upper bound for an OBDD is $O(n^k)$ (Ferrara, Pan, Vardi; LPAR 2005). A natural question is whether a fixed-parameter upper bound holds for OBDDs.

In this paper we answer this question negatively. In particular, we demonstrate an infinite class of CNFs of primal graph treewidth at most $k$ for which the OBDD size is at least $O(n^{k/4})/o(n)$, thus providing a separation from SDDs and essentially matching the upper bound of Ferrara, Pan, and Vardi. We also establish a more optimistic version of the $O(n^k)$ upper bound for the OBDD, showing that it in fact holds when $k$ is the treewidth of the incidence graph of the given CNF, and thus applies to "sparse" CNFs with large clauses.

17:00
Probabilistic Sentential Decision Diagrams
SPEAKER: unknown

ABSTRACT. In this paper, we propose the Probabilistic Sentential Decision Diagram (PSDD), which is a canonical representation for probability distributions that are defined over the models of a propositional theory. Each parameter of a given PSDD can be viewed as the (conditional) probability of making a decision in a corresponding Sentential Decision Diagram (SDD), which itself is a recently proposed canonical representation of propositional theories. We explore a number of interesting properties of PSDDs, including the context-specific independencies that underlie them. We further show how the parameters of a PSDD can be efficiently estimated, in closed form, from complete data. We finally empirically evaluate the quality of PSDDs learned from data, when we have knowledge, a priori, of the logical theory supporting an otherwise unknown probability distribution.
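The phrase "closed form" here means what it usually does for complete data: maximum-likelihood parameters are relative frequencies counted in the appropriate context. A generic, hedged illustration follows (the PSDD algorithm specializes this counting to the decisions of the diagram; the data and names below are invented):

    # Invented complete data over two binary variables.
    data = [dict(a=1, b=1), dict(a=1, b=0), dict(a=0, b=1), dict(a=1, b=1)]

    def mle(var, context, rows):
        """P(var = 1 | context) as a relative frequency over complete data."""
        matching = [r for r in rows
                    if all(r[k] == v for k, v in context.items())]
        return sum(r[var] for r in matching) / len(matching)

    print(mle("b", {"a": 1}, data))   # 2/3: fraction of a=1 rows with b=1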

16:30-17:30 Session 143C: Invited Talk (CSF)
Location: FH, Hörsaal 6
16:30
New Directions in Computer-Aided Cryptography
SPEAKER: Gilles Barthe
16:30-17:40 Session 143D: Linking Electronic Patient Records (KR4HC)
Location: EI, EI 8
16:30
Exploiting the Relation between Environmental Factors and Diseases: A Case Study on COPD
SPEAKER: unknown

ABSTRACT. The rise of chronic diseases poses a challenge for the health care sector worldwide. In many cases, diseases are affected by an environmental component that, until now, could hardly be controlled. However, with the recent advances in information and communication technologies applied to cities, it becomes possible to collect real-time environmental data and use them to provide chronic patients with recommendations able to adapt to the changing environmental conditions. In this article, we study the use of the sensing capabilities of the so-called smart cities in the context of the recently proposed concept of Smart Health. We propose a way to exploit the relation between environmental factors and diseases, and we show how to obtain a comfort level for patients. Moreover, we study the application of our proposal to outdoor exercises and rehabilitative activities related to the treatment of Chronic Obstructive Pulmonary Disease.

16:55
Towards Linked Vital Registration Data for Reconstituting Families and Creating Longitudinal Health Histories
SPEAKER: unknown

ABSTRACT. The Irish Record Linkage 1864-1913 project aims to create a knowledge base containing historical birth, marriage and death records translated into RDF to reconstitute families and create longitudinal health histories. The goal is to interlink the different persons across these records as well as with supplementary datasets that provide additional context. With the help of knowledge engineers who will create the ontologies and set up the platform and the digital archivist who will curate, ingest and maintain the RDF, the historians will be able to analyse reconstructed "virtual" families of Dublin in the 19th and early 20th centuries, allowing them to address questions about the accuracy of officially reported maternal mortality and infant mortality rates. In the longer term, this platform will allow researchers to investigate how official historical datasets can contribute to modern-day epidemiological planning.

17:10
A Logic-Based Framework for Medical Assessment Questionnaires
SPEAKER: unknown

ABSTRACT. In this position paper we propose a logic-based framework for the computation of scores used in standard medical assessment questionnaires. In line with the medical specification, our degree-based logic allows us to model the relationships between evaluated syndromes, corresponding items of the questionnaires, and their scores. The frequently used method of computing final scores in terms of mean values is naturally reproduced within our framework.

17:25
Process Information and Guidance Systems in the Hospital
SPEAKER: Marcin Hewelt

ABSTRACT. Doctors, nurses, and health care professionals interact in various hospital processes to achieve the desired outcome: an effective patient treatment. However, seldom is all the information relevant for diagnosis and treatment accessible at the point of care. This position paper therefore introduces the concept of process information and guidance systems, the aim of which is twofold: visualization of the treatment history, and recommendation of treatment steps based on medical guidelines, patient data, and organizational rules.

08:45-10:15 Session 144A: VSL Keynote Talk (VSL)
Location: EI, EI 7 + EI 9, EI 10 + FH, Hörsaal 1
08:45
VSL Keynote Talk: Ontology-Based Monitoring of Dynamic Systems
SPEAKER: Franz Baader

ABSTRACT. Our understanding of the notion "dynamic system" is a rather broad one: such a system has states, which can change over time. Ontologies are used to describe the states of the system, possibly in an incomplete way. Monitoring is then concerned with deciding whether some run of the system or all of its runs satisfy a certain property, which can be expressed by a formula of an appropriate temporal logic. We consider different instances of this broad framework, which can roughly be classified into two cases. In one instance, the system is assumed to be a black box, whose inner working is not known, but whose states can be (partially) observed during a run of the system. In the second instance, one has (partial) knowledge about the inner working of the system, which provides information on which runs of the system are possible.

In this talk, we will review some of our recent research that investigates different instances of this general framework of ontology-based monitoring of dynamic systems. We will also sketch possible extensions towards probabilistic reasoning and the integration of mathematical modelling of dynamical systems.

10:15-10:45 Coffee Break
19:00-20:00 Session 149A: VSL Public Lecture 2 (VSL)
Location: MB, Kuppelsaal
19:00
VSL Public Lecture: Vienna Circle(s) - Between Philosophy and Science in Cultural Context

ABSTRACT. The Vienna Circle of Logical Empiricism, which was part of the intellectual movement of Central European philosophy of science, is certainly one of the most important currents in the emergence of modern philosophy of science. Apart from this uncontested historical fact there remains the question of the direct and indirect influence, reception and topicality of this scientific community in contemporary philosophy of science in general as well as in the philosophy of the individual sciences, including the formal and social sciences. 

First, I will characterize the road from the Schlick-Circle to contemporary philosophy of science. Second, I will refer to "the present situation in the philosophy of science" by identifying relevant impacts, findings, and unfinished projects since the classical Vienna Circle. Third, I will address some specific European features of this globalized philosophical tradition up to the present, and outline future perspectives after the linguistic, historical and pragmatic turns – looking back to the "received view", or standard view of the Vienna Circle.

16:30-19:00 Session 151A: VSL Joint Award Ceremony 2 (VSL)
Location: MB, Kuppelsaal
16:30
FLoC Olympic Games Award Ceremony 2

ABSTRACT. The aim of the FLoC Olympic Games is to start a tradition in the spirit of the ancient Olympic Games, a Panhellenic sport festival held every four years in the sanctuary of Olympia in Greece, this time within the scientific community of computational logic. Every four years, as part of the Federated Logic Conference, the Games will gather together the challenging disciplines from across computational logic in the form of solver competitions.

At the Award Ceremonies, the competition organizers will have the opportunity to present their competitions to the public and give away special prizes, the prestigious Kurt Gödel medals, to their successful competitors. This reinforces the main goal of the FLoC Olympic Games, that is, to facilitate the visibility of the competitions associated with the conferences and workshops of the Federated Logic Conference during the Vienna Summer of Logic.

This award ceremony will host the

  • Fifth Answer Set Programming Competition (ASPCOMP 2014);
  • The 7th IJCAR ATP System Competition (CASC-J7);
  • Hardware Model Checking Competition (HWMCC 2014);
  • OWL Reasoner Evaluation (ORE 2014);
  • Satisfiability Modulo Theories solver competition (SMT-COMP 2014);
  • Competition on Software Verification (SV-COMP 2014);
  • Syntax-Guided Synthesis Competition (SyGuS-COMP 2014);
  • Synthesis Competition (SYNTCOMP 2014); and
  • Termination Competition (termCOMP 2014).
18:00
Lifetime Achievement Award

ABSTRACT. Werner "Jimmy" DePauli-Schimanovich is being honored for his contributions to Gödel research, for his efforts towards re-establishing Vienna as a center of logic in the second half of the past century, and for extending the outreach of formal logic and logical thinking to other disciplines. The books about Gödel he authored, co-authored, or edited contain precious contributions of historical, philosophical, and mathematical value. His film on Gödel, and related articles and TV testimonials, attracted the general public to Gödel and his work. Moreover, DePauli-Schimanovich has unceasingly and successfully fought to re-establish logic in Vienna. He has inspired and encouraged a large number of students and young researchers, has organised meetings and social gatherings, and has created an intellectual home for scholars of logic. His activities significantly contributed to filling the intellectual vacuum in logic research that had developed in Austria in the years 1938–1945 and well into the post-war period. He was a main initiator and co-founder of the International Kurt Gödel Society. Jimmy’s research in logic involved criticism of current axiomatisations of set theory and development of new systems of naïve set theory. In informatics, DePauli-Schimanovich was co-founder and co-organiser of EuroCAST, a conference series on computer-assisted systems theory meeting alternatively in Vienna and Las Palmas, Gran Canaria. Finally, Jimmy has furthered implementation of logical ideas in disparate fields such as mechanical engineering, automated game playing, urban planning, and by inventing a number of logical games.

18:10
Lifetime Achievement Award
SPEAKER: Mingyi Zhang

ABSTRACT. Zhang Mingyi is a pioneer in Artificial Intelligence and Knowledge Representation in China. He has built up a school of non-monotonic logic in Guiyang and has fostered the exchange between Chinese and Western scientists. He has contributed a large number of results on various forms of non-monotonic reasoning, including default logic, answer set programming, and belief revision. In particular, as early as 1992, he provided important characterizations of various forms of default logic that paved the way for clarifying their computational properties and for developing algorithms for default reasoning. Jointly with collaborators, he proposed a characterization of answer sets based on sets of generating rules, and introduced the concept of first-order loop formulas to relate answer set programming to classical logic. Other important results relate to theory revision (for example, the proof of the existence and uniqueness of finest splitting of Horn formulae) and to normal forms for Gödel logics. Many of Zhang Mingyi's former students have become internationally respected researchers.

18:20
EMCL Distinguished Alumni Award

ABSTRACT. The Free University of Bozen-Bolzano, the Technische Universität Dresden, the Universidade Nova de Lisboa, the Universidad Politécnica de Madrid (until 2010), the Technische Universität Wien, and Australia's Information Communications Technology Research Centre of Excellence (since 2007) have been jointly running the European Master's Program in Computational Logic (EMCL) since 2004. The program is supported by the European Union within Erasmus Mundus and Erasmus+. So far, more than 100 students from 39 different countries have successfully completed the distributed master's program with a double or joint degree.

For the first time, the partner institutions are announcing the EMCL Distinguished Alumni Award for outstanding contributions of EMCL graduates to the field of Computational Logic.

http://www.emcl-study.eu/

18:30
FLoC Closing Week 2
SPEAKER: Helmut Veith