VSL 2014: VIENNA SUMMER OF LOGIC 2014
ITP ON WEDNESDAY, JULY 16TH, 2014

10:15-10:45 Coffee Break
10:45-13:00 Session 56C: Mathematics
Location: FH, Hörsaal 8
10:45
A Computer-Algebra-Based Formal Proof of the Irrationality of ζ(3)

ABSTRACT. This paper describes the formal verification of an irrationality proof of ζ(3), the value of the Riemann zeta function at 3, using the Coq proof assistant. This result was first proved by Apéry in 1978, and the proof we have formalized follows the path of his original presentation. The crux of this proof is to establish that certain sequences satisfy a common recurrence. We formally prove this result by an a posteriori verification of calculations performed by computer algebra algorithms in a Maple session. The rest of the proof combines arithmetical ingredients and some asymptotic analysis, which we conduct by extending the Mathematical Components libraries.
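To make the role of the common recurrence concrete, here is a small numerical sketch (not taken from the paper's Coq development): Apéry's sequences a_n and b_n are commonly presented as two solutions of the same second-order recurrence, and their ratio converges rapidly to ζ(3).

    # Illustrative sketch only: the recurrence commonly attributed to Apery is
    #   n^3 u_n = (34 n^3 - 51 n^2 + 27 n - 5) u_{n-1} - (n-1)^3 u_{n-2},
    # with a_0 = 1, a_1 = 5 and b_0 = 0, b_1 = 6; then b_n / a_n -> zeta(3).
    from fractions import Fraction

    def apery_ratios(nmax):
        a = [Fraction(1), Fraction(5)]
        b = [Fraction(0), Fraction(6)]
        for n in range(2, nmax + 1):
            c = 34 * n**3 - 51 * n**2 + 27 * n - 5
            for u in (a, b):
                u.append((c * u[n - 1] - (n - 1)**3 * u[n - 2]) / n**3)
            yield n, float(b[n] / a[n])

    for n, ratio in apery_ratios(8):
        print(n, ratio)        # approaches zeta(3) = 1.2020569...

Here the a_n are integers and the b_n rationals; the paper verifies such computer-algebra-produced recurrences a posteriori inside Coq.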

11:15
A Coq Formalization of Finitely Presented Modules

ABSTRACT. This paper presents a formalization of constructive module theory in the intuitionistic type theory of Coq. We build an abstraction layer on top of matrix encodings in order to represent finitely presented modules, and we obtain clean definitions with short proofs that finitely presented modules form an abelian category. The goal is to use this as a first step towards computing certified topological invariants, such as homology groups and Betti numbers.
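As a rough illustration of the matrix encoding (a sketch under simplifying assumptions, not the paper's Coq code): a finitely presented module can be given as the cokernel of a matrix whose columns are the relations among a finite set of generators, and simple invariants such as the rank of the free part can be read off that matrix.

    # Hypothetical sketch: the Z-module presented by an integer matrix M with
    # n rows (generators) and m columns (relations) is Z^n / im(M); the rank
    # of its free part (a Betti number) is n - rank(M).
    import numpy as np

    M = np.array([[2, 0],
                  [0, 0],
                  [0, 3]])      # relations 2*e1 = 0 and 3*e3 = 0
    # presents Z/2 (+) Z (+) Z/3, whose free part has rank 1
    free_rank = M.shape[0] - np.linalg.matrix_rank(M)
    print(free_rank)            # -> 1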

11:45
Formal Verification of Optical Quantum Flip Gate

ABSTRACT. Quantum computers promise to solve hard computational problems efficiently, in particular problems in NP. In this paper, we propose to tackle the formal verification of quantum circuits using theorem proving. In particular, we focus on the verification of quantum computing based on coherent light, typically light produced by laser sources. We formally verify the behavior of the quantum flip gate in HOL Light: we prove that it flips a zero quantum bit to a one quantum bit and vice versa. To this end, we model two optical devices: the beam splitter and the phase conjugating mirror. This requires the formalization of some fundamental mathematics, such as the exponentiation of linear transformations.
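For intuition only, here is a simplified finite-dimensional sketch (it does not model the optical devices or coherent light treated in the paper): the flip gate acts as the Pauli-X matrix on the computational basis, and it arises, up to a global phase, as a matrix exponential, which is the kind of exponentiation of linear transformations mentioned above.

    # Simplified sketch, not the HOL Light development: the flip (Pauli-X)
    # gate exchanges |0> and |1>, and exp(i*pi/2 * X) equals i*X, i.e. the
    # flip gate up to a global phase.
    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)
    ket1 = np.array([0, 1], dtype=complex)
    X = np.array([[0, 1],
                  [1, 0]], dtype=complex)

    assert np.allclose(X @ ket0, ket1) and np.allclose(X @ ket1, ket0)

    theta = np.pi / 2
    exp_iXtheta = np.cos(theta) * np.eye(2) + 1j * np.sin(theta) * X
    assert np.allclose(exp_iXtheta, 1j * X)
    print("flip verified on both basis states")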

12:15
On the Formalization of Z-Transform in HOL

ABSTRACT. System analysis based on difference (recurrence) equations is a fundamental technique for analyzing biological, electronic, control and signal processing systems. The Z-transform is one of the most popular tools for solving such difference equations. In this paper, we present a formalization of the Z-transform to extend formal linear system analysis capabilities based on theorem proving. In particular, we use the differential, transcendental and topological theories of multivariate calculus to formally define the Z-transform in higher-order logic and to reason about the correctness of its properties, such as linearity, time shifting and scaling in the z-domain. To illustrate the practical effectiveness of the proposed formalization, we present the formal analysis of an infinite impulse response (IIR) digital signal processing filter.
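As a concrete, purely numerical illustration of the kind of result such a formalization supports (an assumption-laden sketch, not part of the HOL development): for the first-order IIR filter y[n] = a*y[n-1] + x[n], the Z-transform gives the transfer function H(z) = 1 / (1 - a*z^{-1}), whose inverse transform predicts the impulse response h[n] = a^n.

    # Numeric sanity check of the Z-transform prediction for a toy IIR filter.
    import numpy as np

    a, N = 0.5, 10
    x = np.zeros(N); x[0] = 1.0            # unit impulse
    y = np.zeros(N)
    for n in range(N):
        y[n] = a * (y[n - 1] if n > 0 else 0.0) + x[n]

    assert np.allclose(y, a ** np.arange(N))   # impulse response equals a^n
    print(y)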

12:45
Mechanical Certification of Loop Pipelining Transformations: A Preview

ABSTRACT. We describe our ongoing effort to use theorem proving to certify loop pipelining, a critical and complex transformation employed in behavioral synthesis. Our approach is mechanized in the ACL2 theorem prover. We discuss how our approach connects to a larger framework for certifying synthesized hardware through a combination of automated and interactive verification techniques. We also discuss some of the formalization and proof challenges and our early attempts at addressing them.
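To fix intuition about what such a certification must establish, here is a toy sketch (hypothetical, not the ACL2 formalization): a loop body is split into stages, the pipelined version overlaps the stages of consecutive iterations with a prologue and an epilogue, and certification amounts to showing that it computes the same results as the sequential loop.

    # Toy two-stage software pipeline; the names and stage split are illustrative.
    def sequential(a):
        b = [0] * len(a)
        for i in range(len(a)):
            t = a[i] * 2          # stage 1
            b[i] = t + 1          # stage 2
        return b

    def pipelined(a):
        b = [0] * len(a)
        if not a:
            return b
        t = a[0] * 2              # prologue: stage 1 of iteration 0
        for i in range(1, len(a)):
            b[i - 1] = t + 1      # stage 2 of iteration i-1 ...
            t = a[i] * 2          # ... overlapped with stage 1 of iteration i
        b[-1] = t + 1             # epilogue: drain the pipeline
        return b

    data = [3, 1, 4, 1, 5]
    assert sequential(data) == pipelined(data)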

08:45-10:15 Session 57: VSL Keynote Talk
Location: EI, EI 7 + EI 9, EI 10 + FH, Hörsaal 1
08:45
VSL Keynote Talk: The theory and applications of o-minimal structures
SPEAKER: Alex Wilkie

ABSTRACT. This is a talk in the branch of logic known as model theory, more precisely, in o-minimality. The first example of an o-minimal structure is the ordered algebraic structure on the set of real numbers, and I shall focus on expansions of this structure. Being o-minimal means that the first-order definable sets in the structure do not exhibit wild phenomena (this will be made precise). After discussing some basic theory of such structures, I shall present some recent applications to diophantine geometry.
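For reference, the standard definition that makes "no wild phenomena" precise (stated here as background, not quoted from the talk) can be written as follows.

    % Definition of o-minimality (standard textbook form)
    An expansion $\mathcal{R} = (\mathbb{R}, <, \dots)$ of the ordered set of reals is
    \emph{o-minimal} if every set $X \subseteq \mathbb{R}$ definable in $\mathcal{R}$
    (with parameters) is a finite union of points and open intervals:
    \[
      X = \{x_1\} \cup \dots \cup \{x_k\} \cup (a_1, b_1) \cup \dots \cup (a_m, b_m),
      \qquad a_i, b_i \in \mathbb{R} \cup \{-\infty, +\infty\}.
    \]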

13:00-14:30 Lunch Break
14:30-16:00 Session 59C: Invited Talk
Location: FH, Hörsaal 8
14:30
Showing invariance compositionally for a process algebra for network protocols

ABSTRACT. This paper presents the mechanization of a process algebra for Mobile Ad hoc Networks and Wireless Mesh Networks, and the development of a compositional framework for proving invariant properties. Mechanizing the core process algebra in Isabelle/HOL is relatively standard, but its layered structure necessitates special treatment. The control states of reactive processes, such as nodes in a network, are modelled by terms of the process algebra. We propose a technique based on these terms to streamline proofs of inductive invariance. This is not sufficient, however, to state and prove invariants that relate states across multiple processes (entire networks). To this end, we propose a novel compositional technique for lifting global invariants stated at the level of individual nodes to networks of nodes.
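A rough sketch of the compositional idea (illustrative Python with hypothetical names, not the Isabelle/HOL mechanization): an invariant is checked inductively for a single node's transition system, and the corresponding network invariant, stated pointwise over a tuple of node states, is preserved because every interleaved network step is a step of some component.

    # Toy illustration: a node is a counter that never decreases; the network
    # is an interleaving composition of such nodes.
    def node_steps(s):
        return [s, s + 1]

    def holds_inductively(inv, init, steps, depth=4):
        """Bounded check: inv holds initially and is preserved by every step."""
        frontier, seen = [init], set()
        for _ in range(depth):
            nxt = []
            for s in frontier:
                if s in seen:
                    continue
                seen.add(s)
                if not inv(s):
                    return False
                nxt.extend(steps(s))
            frontier = nxt
        return True

    node_inv = lambda s: s >= 0
    assert holds_inductively(node_inv, 0, node_steps)          # local proof

    # the invariant lifted to a network of three nodes
    network_inv = lambda states: all(node_inv(s) for s in states)
    def network_steps(states):
        return [states[:i] + (s,) + states[i + 1:]
                for i in range(len(states))
                for s in node_steps(states[i])]
    assert holds_inductively(network_inv, (0, 0, 0), network_steps, depth=3)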

15:00
Invited talk: Retrofitting Rigour
SPEAKER: Peter Sewell

ABSTRACT. We rely on an enormous number of software components, which continue to be developed ---as they have been since the 1940s--- by methods based on a test-and-debug cycle and informal-prose specifications. Replacing this legacy infrastructure and development practices wholesale is not (and may never be) feasible, so if we want to improve system quality by using mathematically rigorous methods, we need a strategy to do so incrementally. A first step is to focus on the more stable extant interfaces: processor architectures, programming languages, and key APIs, file formats, and protocols. By characterising these with new precise abstractions, we can start to retrofit rigour into the system, using those (1) to improve existing components by testing, (2) as interfaces for replacement formally verified components, and (3) simply to gain understanding of what the existing behavioural interfaces are. Many issues arise in building such abstractions and in establishing confidence in them. To validate them against existing systems, they must be made executable, not necessarily in the conventional sense but as test oracles, to check whether observed behaviour of the actual system is admitted by the specification. The appropriate treatment of loose specification is key here, and varies from case to case. To validate them against the intent of the designers, they must be made broadly comprehensible, by careful structuring, annotation, and presentation. And to support formal reasoning by as broad a community as possible, they must be portable into a variety of reasoning tools. Moreover, the scale of real-world specifications adds further engineering concerns. This talk will discuss the issues of loose specification and some lessons learnt from developing and using our Ott and Lem lightweight specification tools, which are aimed at supporting all these pre-proving activities (and which target multiple theorem provers).
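As a tiny illustration of the "specification as test oracle" and loose-specification points (a hypothetical example, not drawn from Ott or Lem): the specification does not compute the single correct answer; it decides whether an observed behaviour of the real system is admitted, and looseness means several outcomes may be admissible.

    # Hypothetical loose spec for a toy id-allocation API: any fresh,
    # non-negative id is admitted; the oracle only checks admissibility.
    def spec_admits(already_used, observed_id):
        return observed_id >= 0 and observed_id not in already_used

    used = {0, 1, 2}
    observed_id = 7              # whatever the actual implementation returned
    assert spec_admits(used, observed_id)
    assert not spec_admits(used, 1)      # reusing a live id is not admitted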

16:00-16:30 Coffee Break
16:30-18:00 Session 61C: Automation
Location: FH, Hörsaal 8
16:30
A heuristic prover for real inequalities

ABSTRACT. We describe a general method for verifying inequalities between real-valued expressions, especially the kinds of straightforward inferences that arise in interactive theorem proving. In contrast to approaches that aim to be complete with respect to a particular language or class of formulas, our method establishes claims that require heterogeneous forms of reasoning, relying on a Nelson-Oppen-style architecture in which special-purpose modules collaborate and share information. The framework is thus modular and extensible. A prototype implementation shows that the method is promising, complementing techniques that are used by contemporary interactive provers.
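The following toy sketch (hypothetical, not the paper's prover) illustrates the architectural point only: special-purpose modules each contribute facts to a shared pool, here lower bounds on named terms, and the prover iterates until the goal bound is derived.

    import math

    def additive_module(facts):
        # from x > a and y > b conclude x + y > a + b
        return {"(%s)+(%s)" % (x, y): a + b
                for x, a in facts.items() for y, b in facts.items()}

    def exp_module(facts):
        # monotonicity of exp: from t > a conclude exp(t) > exp(a)
        return {"exp(%s)" % t: math.exp(a) for t, a in facts.items()}

    def derives_lower_bound(facts, goal_term, goal_bound, rounds=3):
        for _ in range(rounds):
            for module in (additive_module, exp_module):
                for term, bound in module(facts).items():
                    facts[term] = max(bound, facts.get(term, float("-inf")))
            if facts.get(goal_term, float("-inf")) >= goal_bound:
                return True
        return False

    # from x > 1 and y > 2, derive exp(x) + exp(y) > 9
    print(derives_lower_bound({"x": 1.0, "y": 2.0}, "(exp(x))+(exp(y))", 9.0))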

17:00
Unified Decision Procedures for Regular Expression Equivalence
SPEAKER: Tobias Nipkow

ABSTRACT. We formalize a unified framework for verified decision procedures for regular expression equivalence. Five recently published formalizations of such decision procedures (three based on derivatives, two on marked regular expressions) can be obtained as instances of the framework. We discover that the two approaches based on marked regular expressions, which were previously thought to be the same, are different, and one seems to produce uniformly smaller automata. The common framework makes it possible to compare the performance of the different decision procedures in a meaningful way.
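For readers unfamiliar with the derivative-based approach, here is a compact standalone sketch in the same spirit (using Antimirov-style partial derivatives; it is not the Isabelle/HOL formalization): equivalence is decided by exploring pairs of derivative sets and failing as soon as the two sides disagree on acceptance of the empty word.

    # Regular expressions as nested tuples:
    # ("0",) empty language, ("1",) empty word, ("chr", c), ("alt", r, s),
    # ("cat", r, s), ("star", r)

    def nullable(r):
        tag = r[0]
        if tag in ("1", "star"):
            return True
        if tag in ("0", "chr"):
            return False
        if tag == "alt":
            return nullable(r[1]) or nullable(r[2])
        return nullable(r[1]) and nullable(r[2])          # "cat"

    def pder(c, r):
        """Antimirov partial derivatives: a finite set of regexes."""
        tag = r[0]
        if tag in ("0", "1"):
            return frozenset()
        if tag == "chr":
            return frozenset([("1",)]) if r[1] == c else frozenset()
        if tag == "alt":
            return pder(c, r[1]) | pder(c, r[2])
        if tag == "star":
            return frozenset(("cat", d, r) for d in pder(c, r[1]))
        left = frozenset(("cat", d, r[2]) for d in pder(c, r[1]))  # "cat"
        return left | pder(c, r[2]) if nullable(r[1]) else left

    def equivalent(r, s, alphabet):
        todo, seen = [(frozenset([r]), frozenset([s]))], set()
        while todo:
            R, S = todo.pop()
            if (R, S) in seen:
                continue
            seen.add((R, S))
            if any(map(nullable, R)) != any(map(nullable, S)):
                return False
            for c in alphabet:
                todo.append((frozenset().union(*(pder(c, x) for x in R)),
                             frozenset().union(*(pder(c, x) for x in S))))
        return True

    a, b = ("chr", "a"), ("chr", "b")
    print(equivalent(("star", ("alt", a, b)),
                     ("star", ("cat", ("star", a), ("star", b))), "ab"))  # True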

17:30
Verified Abstract Interpretation Techniques for Disassembling Low-level Self-modifying Code

ABSTRACT. Static analysis of binary code is challenging for several reasons. In particular, standard static analysis techniques operate over control flow graphs, which are not available when dealing with self-modifying programs which can modify their own code at runtime. We formalize in the Coq proof assistant some key abstract interpretation techniques that automatically extract memory safety properties from binary code. Our analyzer is formally proved correct and has been run on several self-modifying challenges, provided by Cai et al. in their PLDI 2007 paper.