The main aim of the paper is to construct a logic in which properties of programs can be formalized. Inductive definition, or recursive definition, plays a very important role for this purpose. Inductive definition has been studied for untyped theories, predicative typed theories, and impredicative typed theories. Monotone recursive definition in an untyped theory is studied in this paper. The main point is a realizability interpretation of monotone recursive definition. Untyped predicative theories TID0 and TID1, which have monotone recursive definition of predicates, are presented. TID1 has full monotone recursive definition and TID0 has only restricted monotone recursive definition. A q-realizability interpretation of TID0 and TID1 is defined. It is proved that the realizability interpretation of TID0 is sound and that the realizability interpretation of TID1 is not sound, though TID1 and its interpretation seem very natural.
Various theories have been discussed to formalize computation and to develop new methods of programming. The theory EON+mu, which has the facility of defining predicates recursively, is presented. The theory EON+mu is constructed by adding the facility of recursive definition of predicates to Beeson's EON. Recursively defined data such as lists and trees can be introduced naturally using recursively defined predicates. This facility also enables us to represent a specification formula of a program more easily. In Hayashi's PX, the facility of recursive definition of predicates, called conditional inductive generation, can be used under the syntactical condition that recursive calls of the original predicate occur only in Harrop subformulae. In this paper, a less restricted class of recursively defined predicates can be used: not only a predicate whose recursive calls occur in Harrop subformulae but also a predicate whose recursive calls occur at strictly positive positions is available. Because of this weaker restriction of strictly positive occurrences of recursive calls, this class includes the class of inductive definitions with Horn clauses used in Prolog. An extended q-realizability interpretation of EON+mu in EON+mu is given to treat recursive definition of predicates. By this q-realizability interpretation, a recursively defined predicate is interpreted as another recursively defined predicate naturally corresponding to the original predicate. The soundness of the interpretation is proved, relying on the syntactical condition on recursively defined predicates. A method of program synthesis using the q-realizability interpretation in this theory is presented and explained by several examples.
The main aim of this paper is to construct a logic by which properties of programs can be formalized for verification, synthesis and transformation of programs. This paper has two main points. One point is a realizability interpretation of coinductive definitions of predicates. The other point is extraction of programs which treat streams. An untyped predicative theory TIDnu is presented, which has the facility of coinductive definitions of predicates and is based on a constructive logic. Properties defined by the greatest fixed point, such as streams and the extensional equality of streams, can be formalized by the facility of coinductive definitions of predicates in TIDnu. A q-realizability interpretation for TIDnu is defined and the realizability interpretation is proved to be sound. By the realizability interpretation, a program which treats streams can be extracted from a proof of its specification in TIDnu. A general program extraction theorem and a stream program extraction theorem are presented.
A minimal formula is a formula which is minimal among provable formulas with respect to the substitution relation. This paper shows the following: (1) A beta-normal proof of a minimal formula of depth 2 is unique in NJ. (2) There exists a minimal formula of depth 3 whose beta-eta-normal proof is not unique in NJ. (3) There exists a minimal formula of depth 3 whose beta-eta-normal proof is not unique in NK.
The main aim of the paper is to construct a logical system in which properties of programs can be formalized for verification, synthesis and transformation of programs. The paper has two main points. One point is a realizability interpretation of coinductive definitions of predicates. The other point is extraction of programs which treat streams. An untyped predicative theory TIDnu is presented, which has the facility of coinductive definitions of predicates and is based on constructive logic. Properties defined by the greatest fixed point, such as streams and the extensional equality of streams, can be formalized by the facility of coinductive definitions of predicates in TIDnu. A q-realizability interpretation for TIDnu is defined and the soundness of the interpretation is proved. By the realizability interpretation, a program which treats streams can be extracted from a proof of its specification in TIDnu. A general program extraction theorem and a stream program extraction theorem are presented.
Generalized inductive definitions give a way of defining a predicate as the least solution P of the equation P <--> A[P], where the predicate variable P may occur positively in the formula A[P]. This paper gives a q-realizability interpretation of generalized inductive definitions and proves the soundness of the interpretation.
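For intuition only (an illustration, not taken from the paper): over a finite domain, the least solution of P <--> A[P] for a monotone operator A can be computed by Kleene iteration from the empty set. The operator `A` below, encoding "Even(0), and Even(n) implies Even(n+2)" bounded below 20, is a hypothetical example.

```python
def lfp(step):
    """Kleene iteration: apply a monotone operator starting from the
    empty set until a fixed point is reached; on a finite domain this
    yields the least fixed point."""
    current = frozenset()
    while True:
        nxt = step(current)
        if nxt == current:
            return current
        current = nxt

# Example: the even numbers below 20 as the least P with P <--> A[P],
# where A[P] holds of 0, and of n+2 whenever P holds of n.
def A(P):
    return frozenset({0}) | frozenset(n + 2 for n in P if n + 2 < 20)

evens = lfp(A)  # frozenset({0, 2, 4, ..., 18})
```

The iteration reaches the least fixed point in finitely many steps precisely because the operator is monotone and the domain is finite.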
Realizability of monotone inductive definitions is studied to construct a logical system in which recursive data structures and recursive control structures of programs are naturally formalized by using inductive definitions, and programs can be produced from constructive proofs by using realizability. Two q-realizability interpretations of monotone inductive definitions are presented. One interpretation is defined so that a realizer of an inductively defined predicate is the same as a realizer of its expansion. It is proved that the interpretation of restricted monotone inductive definitions is sound and that the interpretation of full monotone inductive definitions is not sound, though the interpretation seems very natural. The other interpretation is presented by using higher-order coding and q-realizability for second-order logic. The soundness of this interpretation of full monotone inductive definitions is also proved. To prove this result, greatest-lower-bound inductive definitions are introduced.
This paper proves that the division operation a/m is provably total in the system S_2^0 of Bounded Arithmetic if and only if m is of the form 2^n for some n.
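A computational intuition for why divisors of the form 2^n are special (a hypothetical illustration, not the paper's proof): dividing a natural number by 2^n is just an n-bit right shift on its binary notation, the kind of feasible operation that weak fragments of Bounded Arithmetic can define directly, while no such shortcut exists for other divisors.

```python
# Division by m = 2**n is an n-bit right shift on binary notation.
def div_pow2(a: int, n: int) -> int:
    return a >> n

assert div_pow2(100, 3) == 100 // 8   # 12
assert div_pow2(7, 0) == 7            # shifting by 0 bits is identity
```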
The uniform continuity property MCONT for a partial combinatory algebra with pairing and natural numbers states that there exists a functional which uniformly produces, from a type-2 functional, its associate in the finite type structure on the algebra. This paper shows that actually either the partial function model or the graph model does not have MCONT.
Two realizability interpretations of monotone coinductive definitions are studied. One interpretation is defined so that a realizer of a coinductively defined predicate is the same as that of its expansion. For this interpretation, the paper proves that full monotone coinductive definitions are not sound and restricted monotone coinductive definitions are sound. The other interpretation is based on second-order logic and can interpret least-upper-bound coinductive definitions, which are a generalization of monotone coinductive definitions. By using these interpretations, the paper shows that a program which treats coinductively defined infinite data structures, such as streams, can be synthesized from a constructive proof of its specification.
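As a concrete intuition for the coinductive data mentioned above (a hypothetical sketch, not the paper's formal system): a stream is never built in full, and its extensional equality can only be observed one element at a time, so programs extracted for streams are lazy producers.

```python
from itertools import count, islice

def map_stream(f, s):
    """Map f over a stream lazily; the stream is only observed finitely."""
    for x in s:
        yield f(x)

def eq_upto(s, t, n):
    """Extensional equality of two streams, observed up to depth n."""
    return list(islice(s, n)) == list(islice(t, n))

# map (+1) over the stream 0, 1, 2, ... agrees with the stream
# 1, 2, 3, ... on every finite prefix we care to observe.
assert eq_upto(map_stream(lambda x: x + 1, count(0)), count(1), 100)
```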
This paper gives a q-realizability interpretation for Feferman's constructive theory T0 of functions and classes by using a set completion program and without doubling variables, and proves its soundness. This result solves an open problem proposed by Feferman in 1979. Moreover by using this interpretation we can prove a program extraction theorem for T0, which enables us to use constructive sets of T0 for program synthesis.
The number of normal proofs has been studied widely. This paper discusses the notion of D-normal proofs to present one of the weakest conditions for uniqueness of normal proofs. This paper shows that a beta-eta-D-normal proof of a formula is unique, and by using this, provides another proof of the facts that (1) a beta-eta-normal proof of a formula satisfying the PNN-condition is unique, and (2) in BCK logic, a beta-eta-normal proof of a minimal formula is unique.
There has been much interest in the conditions for uniqueness of normal form proofs in intuitionistic logic. [Tatsuta99] introduced the notion of a D-normal form proof and showed that the D-normal form proof of any implicational formula is unique. But this result concerns only the uniqueness of D-normal form proofs; there may be other normal form proofs which are not D-normal. We show that D-normality yields the uniqueness of proofs as far as long-normal form proofs are concerned.
A simple and complete proof of strong normalization for first and second order intuitionistic natural deduction including disjunction, first-order existence and permutative conversions is given. The paper follows Tait-Girard approach via computability predicates (reducibility) and saturated sets. Strong normalization is first established for a set of conversions of a new kind, then deduced for the standard conversions. Difficulties arising for disjunction are resolved using a new logic where disjunction is restricted to atomic formulas.
A clear and complete proof of strong normalization of second order natural deduction with permutative conversions is given by using Prawitz's strong validity. This paper completes Prawitz's original proof.
A model characterising strong normalisation for Klop's extension of lambda-calculus is presented. The main technical tools for this result are an inductive definition of strongly normalising terms of Klop's calculus and an intersection type system for terms of Klop's calculus.
This paper analyses the computational behaviour of lambda-term applications. The properties we are interested in are weak normalisation (i.e. there is a terminating reduction) and strong normalisation (i.e. all reductions are terminating). One can prove that the application of a lambda-term M to a fixed number n of copies of the same arbitrary strongly normalising lambda-term is strongly normalising if and only if the application of M to n different arbitrary strongly normalising lambda-terms is strongly normalising. That is, M X ... X (n times) is strongly normalising, for an arbitrary strongly normalising X, if and only if M X1 ... Xn is strongly normalising for arbitrary strongly normalising X1,...,Xn. The analogous property holds when replacing strongly normalising by weakly normalising. As an application of the result on strong normalisation, the lambda-terms whose interpretation is the top element (in the environment which associates the top element to all variables) of the Honsell-Lenisa model turn out to be exactly the lambda-terms which, applied to an arbitrary number of strongly normalising lambda-terms, always produce strongly normalising lambda-terms. The proof uses a finitary logical description of the model by means of intersection types. This answers an open question stated by Dezani, Honsell and Motohama.
This paper gives simple saturated sets for disjunction and second-order existential quantification by using the idea of segments in Prawitz's strong validity. Saturated sets for disjunction are defined by Pi-0-1 comprehension and those for second-order existential quantification are defined by Sigma-1-1 comprehension. Saturated-set semantics and a simple strong normalization proof are given to the system with disjunction, second-order existential quantification, and their permutative conversions. This paper also introduces the contraction property to saturated sets, which gives us saturated sets closed under union. This enables us to have saturated-set semantics for the system with union types, second-order existential types, and their permutative conversions, and prove its strong normalization.
This paper gives the exact maximum length of sequences of mu-reductions and permutative conversions of disjunction for an untyped term in the lambda-mu-calculus with disjunction. This number is described by induction on the number of symbols in a term.
This paper shows the equivalence for provability between two infinitary systems with the omega-rule. One system is the positive one-sided fragment of Peano arithmetic without Exchange rules. The other system is two-sided Heyting Arithmetic plus the law of Excluded Middle for Sigma^0_1-formulas, and it includes Exchange. Thus, the logic underlying positive Arithmetic without Exchange, a substructural logic, is shown to be a logic intermediate between Intuitionism and Classical Logic, hence a subclassical logic. As a corollary, the authors derive the equivalence for positive formulas among provability in those two systems and validity in two apparently unrelated semantics: Limit Computable Mathematics, and Game Semantics with 1-backtracking.
This paper proves strong normalization of classical natural deduction with disjunction and permutative conversions, by using CPS-translation and augmentations. By the same method, this paper also proves strong normalization of classical natural deduction with general elimination rules for implication and conjunction, and their permutative conversions. This paper also proves that natural deduction can be embedded into natural deduction with general elimination rules, strictly preserving proof normalization.
Klop's Problem is finding a type for characterizing hereditary head normalizing terms, that is, lambda-terms whose Böhm trees do not contain the bottom. This paper proves that this problem does not have any solution by showing that the set of those terms is not recursively enumerable. This paper also gives a best-possible solution by providing an intersection type system with a countably infinite set of types such that typing in all these types characterizes hereditary head normalizing terms. By using the same technique, this paper also shows that the set of lambda-terms normalizing by infinite reduction is not recursively enumerable.
This paper answers the open problem of finding a type system that characterizes hereditary permutators. First this paper shows that there does not exist such a type system by showing that the set of hereditary permutators is not recursively enumerable. The set of positive primitive recursive functions is used to prove it. Secondly this paper gives a best-possible solution by providing a countably infinite set of types such that a term has every type in the set if and only if the term is a hereditary permutator. By the same technique as for the first claim, this paper also shows that a set of normalizing terms in infinite lambda-calculus is not recursively enumerable if it contains some term having a computable infinite path, and shows that the set of streams is not recursively enumerable.
This paper shows undecidability of type-checking and type-inference problems in domain-free typed lambda-calculi with existential types: a negation-conjunction fragment, and an implicational fragment. These are proved by reducing type-checking and type-inference problems of the domain-free polymorphic typed lambda-calculus to those of the lambda-calculi with existential types by continuation passing style translations.
The study of type isomorphisms for different lambda calculi started over twenty years ago, and a very wide body of knowledge has been established, both in terms of results and in terms of techniques. A notable missing piece of the puzzle was the characterization of type isomorphisms in the presence of intersection types. While at first thought this may seem to be a simple exercise, it turns out that not only finding the right characterization is not simple, but that the very notion of isomorphism in intersection types is an unexpectedly original element in the previously known landscape, breaking most of the known properties of isomorphisms of the typed lambda calculus. In particular, types that are equal in the standard models of intersection types may be non-isomorphic.
This paper shows that inhabitation in the lambda calculus with negation, product, and existential types is decidable. This is proved by showing that existential quantification can be eliminated and by reducing the problem to provability in intuitionistic propositional logic. By the same technique, this paper also shows that existential quantification followed by negation can be replaced by a specific witness, in both that system and the system with implication and bottom.
This paper investigates the system obtained from second-order natural deduction by removing implication and disjunction, and shows (1) the decidability of its provability, (2) Glivenko's theorem, and (3) Double Negation Shift. This paper also gives a simpler proof of (4) the implication from the negation of a universal quantification to the existential quantification of a negation, and (5) the equivalence, for a formula without assumptions, of provability in this system and in the classical logic obtained from this system.
Klop's Problem and TLCA open problem 20 were solved recently by using the same techniques: coding primitive recursive functions by infinite objects and using a countably infinite set of types. This talk discusses these techniques and gives future directions. Klop's Problem was finding a type for characterizing hereditary head normalizing terms, that is, lambda-terms whose Böhm trees do not contain the bottom. In the FLOPS2008 conference paper, Tatsuta proved that this problem does not have any solution by showing that the set of those terms is not recursively enumerable. The paper also gave a best-possible solution by providing an intersection type system with a countably infinite set of types such that typing in all these types characterizes hereditary head normalizing terms. TLCA open problem 20 was the problem of finding a type system that characterizes hereditary permutators. In the LICS2008 conference paper, Tatsuta showed that there does not exist such a type system by showing that the set of hereditary permutators is not recursively enumerable. The paper also gave a best-possible solution by providing a countably infinite set of types such that a term has every type in the set if and only if the term is a hereditary permutator. This talk explains how these techniques can be used to investigate types for streams, coinductively defined data, and infinite lambda-calculus.
This paper proves undecidability of type checking and type inference problems in some variants of typed lambda calculi with polymorphic and existential types. First, type inference in the domain-free polymorphic lambda calculus is proved to be undecidable, and then it is proved that type inference is undecidable in the negation, conjunction, and existence fragment of the domain-free typed lambda calculus. Secondly, their variants with multiple quantifier rules are introduced, and their type checking and type inference are proved to be undecidable. Finally, it is proved that undecidability of type checking and type inference problems in the negation, conjunction, and existence fragment of the Curry-style lambda calculus can be reduced to undecidability of those problems in another variant of the domain-free polymorphic lambda calculus.
This paper gives an extension of Dual Calculus by introducing inductive types and coinductive types. The same duality as in Dual Calculus is shown to hold in the new system; that is, this paper presents an involution for the new system and proves that it preserves both typing and reduction. The duality between inductive types and coinductive types is shown by the existence of the involution that maps an inductive type and a coinductive type to each other. Strong normalization in this system is also proved. First, strong normalization in second-order Dual Calculus is shown by translating it into second-order symmetric lambda calculus. Next, strong normalization in Dual Calculus with inductive and coinductive types is proved by translating it into second-order Dual Calculus.
The study of type isomorphisms for different lambda-calculi started over twenty years ago, and a very wide body of knowledge has been established, both in terms of results and in terms of techniques. A notable missing piece of the puzzle was the characterization of type isomorphisms in the presence of intersection types. While at first thought this may seem to be a simple exercise, it turns out that not only finding the right characterization is not simple, but that the very notion of isomorphism in intersection types is an unexpectedly original element in the previously known landscape, breaking most of the known properties of isomorphisms of the typed lambda-calculus. In particular, isomorphism is not a congruence and types that are equal in the standard models of intersection types may be non-isomorphic.
This paper investigates a non-commutative first-order sequent calculus NCLK. For that, this paper extends a non-commutative positive fragment to a full first-order sequent calculus LK- having antecedent-grouping and no right exchange rule. This paper shows (1) that NCLK is equivalent to LJ, (2) that NCLK with the exchange rule is equivalent to LK, and (3) that LK- is equivalent to LJ, and gives (4) translations between LK- and NCLK.
Reynolds' separation logical system for pointer program verification is investigated. This paper proves its completeness theorem as well as the expressiveness theorem that states the weakest precondition of every program and every assertion can be expressed by some assertion. This paper also introduces the predicate that represents the next new cell, and proves the completeness and the soundness of the extended system under deterministic semantics.
This paper shows that the inhabitation problem in the lambda calculus with negation, product, polymorphic, and existential types is decidable, where the inhabitation problem asks whether there exists some term that belongs to a given type. To that end, this paper proves the decidability of provability in the logical system obtained from second-order natural deduction by removing implication and disjunction. This is proved by showing the quantifier elimination theorem and reducing the problem to provability in propositional logic. Magic formulas are used for quantifier elimination, replacing the quantifiers. As a byproduct, this paper also shows the second-order witness theorem, which states that a quantifier followed by negation can be replaced by a witness obtained only from the formula. As a corollary of the main results, this paper also shows Glivenko's theorem, Double Negation Shift, and conservativeness for antecedent-empty sequents between the logical system and its classical version.
A multiple quantifier is a quantifier having inference rules that introduce or eliminate an arbitrary number of quantifiers by one inference. This paper introduces the lambda calculus with negation, conjunction, and multiple existential quantifiers, and the lambda calculus with implication and multiple universal quantifiers. Their type checking and type inference are proved to be undecidable. This paper also shows that the undecidability of type checking and type inference in the type-free-style lambda calculus with negation, conjunction, and existence is reduced to the undecidability of type checking and type inference in the type-free-style polymorphic lambda calculus.
Static analysis of multi-staged programs is challenging because the basic assumption of conventional static analysis no longer holds: the program text itself is no longer a fixed static entity, but rather a dynamically constructed value. This article presents a semantic-preserving translation of multi-staged call-by-value programs into unstaged programs and a static analysis framework based on this translation. The translation is semantic-preserving in that every small-step reduction of a multi-staged program is simulated by the evaluation of its unstaged version. Thanks to this translation we can analyze multi-staged programs with existing static analysis techniques that have been developed for conventional unstaged programs: We first apply the unstaging translation, then we apply conventional static analysis to the unstaged version, and finally we cast the analysis results back in terms of the original staged program. Our translation is practical enough; it handles staging constructs that have been evolved to be useful in practice (typified in Lisp's quasi-quotation): open code as values, unrestricted operations on references and intentional variable-capturing substitutions.
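A minimal flavor of staging, as a hypothetical illustration only (Python closures stand in for the quasi-quoted code values of the article's object language): at stage one a program is constructed as a value, and at a later stage it is run.

```python
# Stage one builds a "code value" specializing power to a fixed
# exponent (a closure stands in for a quoted code fragment);
# stage two runs the constructed program.
def power_code(n):
    if n == 0:
        return lambda x: 1
    rest = power_code(n - 1)
    return lambda x: x * rest(x)

cube = power_code(3)   # program constructed at run time
assert cube(2) == 8
```

The point relevant to the article: because `cube` exists only after `power_code(3)` runs, a conventional static analysis has no fixed program text to inspect, which is exactly what the unstaging translation restores.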
This paper shows (1) the undecidability of the type checking and the typability problems in the domain-free lambda calculus with negation, product, and existential types, (2) the undecidability of the typability problem in the domain-free polymorphic lambda calculus, and (3) the undecidability of the type checking and the typability problems in the domain-free lambda calculus with function and existential types. The first and the third results are proved by the second result and CPS translations that reduce those problems in the domain-free polymorphic lambda calculus to those in the domain-free lambda calculi with existential types. The key idea is the conservativity of the domain-free lambda calculi with existential types over the images of the translations.
This paper proposes bimorphic recursion, which is restricted polymorphic recursion such that every recursive call in the body of a function definition has the same type. Bimorphic recursion allows us to assign two different types to a recursively defined function: one for its recursive calls and the other for its calls outside its definition. Bimorphic recursion in this paper can be nested. This paper shows that bimorphic recursion has principal types and decidable type inference. Hence bimorphic recursion gives us flexible typing for recursion with decidable type inference. This paper also shows that typability becomes undecidable, because of nesting of recursions, when one removes the instantiation property from bimorphic recursion.
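As a hypothetical illustration of the two-type discipline (type annotations are conceptual comments; Python does not check them): in the function below, every recursive call is at the single type Nested(a) -> int, while callers outside the definition may instantiate the type variable differently at each call site.

```python
def count_leaves(x):
    # Conceptually count_leaves : Nested(a) -> int, where a value of
    # Nested(a) is either a leaf of type a or a list of Nested(a).
    # Every recursive call below is at the one type Nested(a) -> int,
    # while outside callers may instantiate a per call site: this is
    # the two-type discipline of bimorphic recursion.
    if isinstance(x, list):
        return sum(count_leaves(y) for y in x)
    return 1

assert count_leaves([1, [2, 3], [[4]]]) == 4   # used at a = int
assert count_leaves(["a", ["b"]]) == 2         # used at a = str
```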
Does there exist any sequent calculus such that it is a subclassical logic and it becomes classical logic when the exchange rules are added? The first contribution of this paper is answering this question for infinitary Peano arithmetic. This paper defines infinitary Peano arithmetic with non-commutative sequents, called non-commutative infinitary Peano arithmetic, so that the system becomes equivalent to Peano arithmetic with the omega-rule if the exchange rule is added to this system. This system is unique among non-commutative systems, since all the logical connectives have standard meaning and, in particular, the commutativity for conjunction and disjunction is derivable. This paper shows that provability in non-commutative infinitary Peano arithmetic is equivalent to Heyting arithmetic with the omega rule and the law of excluded middle for Sigma-0-1 formulas. Thus, non-commutative infinitary Peano arithmetic is shown to be a subclassical logic. The cut elimination theorem in this system is also proved. The second contribution of this paper is introducing infinitary Peano arithmetic having antecedent-grouping and no right exchange rules. The first contribution of this paper is achieved through this system. This system is obtained from the positive fragment of infinitary Peano arithmetic without the exchange rules by extending it from a positive fragment to a full system, preserving its 1-backtracking game semantics. This paper shows that this system is equivalent to both non-commutative infinitary Peano arithmetic, and Heyting arithmetic with the omega rule and the Sigma-0-1 excluded middle.
This paper considers Girard's internal coding of each term of System F by some term of a code type. This coding is the type-erasing coding definable already in the simply typed lambda-calculus using only abstraction on term variables. It is shown that there does not exist any decompiler for System F in System F, where the decompiler maps a term of System F to its code. An internal model of F is given by interpreting each type of F by some type equipped with maps between the type and the code type. This paper gives a decompiler-normalizer for this internal model in F, where the decompiler-normalizer maps any term of the internal model to the code of its normal form. It is also shown that for any model of F the composition of this internal model and the model produces another model of F whose equational theory is below untyped beta-eta-equality.
Substitution Theorem is a new theorem in untyped lambda calculus, which was proved in 2006. This theorem states that for a given lambda term and given free variables in it, the term becomes weakly normalizing when we substitute arbitrary weakly normalizing terms for these free variables, if the term becomes weakly normalizing when we substitute a single arbitrary weakly normalizing term for these free variables. This paper formalizes and verifies this theorem by using the higher-order theorem prover HOL. A control path, which is the key notion in the proof, explicitly uses names of bound variables in lambda terms, and it is defined only for lambda terms without bound variable renaming. The lambda terms without bound variable renaming are formalized by using the HOL package based on contextual alpha-equivalence. The verification uses 10119 lines of HOL code and 326 lemmas.
This paper extends the dual calculus with inductive types and coinductive types. The paper first introduces a non-deterministic dual calculus with inductive and coinductive types. In addition to the same duality as the original dual calculus, it has the duality of inductive and coinductive types, that is, the duality of terms and coterms for inductive and coinductive types, and the duality of their reduction rules. Its strong normalization is also proved, which is shown by translating it into a second-order dual calculus. The strong normalization of the second-order dual calculus is proved by translating it into the second-order symmetric lambda calculus. This paper then introduces a call-by-value system and a call-by-name system of the dual calculus with inductive and coinductive types, and shows the duality of call-by-value and call-by-name, their Church-Rosser property, and their strong normalization. Their strong normalization is proved by translating them into the non-deterministic dual calculus with inductive and coinductive types.
This paper introduces a game semantics for Arithmetic with various sub-classical logics that have implication as a primitive connective. This semantics clarifies the infinitary sequent calculus that the authors proposed for intuitionistic arithmetic with Excluded Middle for Sigma-0-1-formulas, a formal system motivated by proof mining and by the study of monotonic learning, for which no game semantics was previously known. This paper proposes games with Sequential Backtracking, and proves that they provide a sound and complete semantics for the logical system and various other subclassical logics. To that end, this paper also defines a one-sided version of the logical system, whose proofs have a tree isomorphism with respect to the winning strategies of the game semantics.
This paper extends Reynolds' separation logical system for pointer-based while program verification by adding inductive definitions. Inductive definitions give us a great advantage for verification, since they enable us, for example, to formalize linked lists and to support the lemma reasoning mechanism. This paper proves its completeness theorem, which states that every true asserted program is provable in the logical system. In order to prove its completeness, this paper shows an expressiveness theorem, which states that the weakest precondition of every program and every assertion can be expressed by some assertion.
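A standard example (for illustration; not taken from the paper) of the kind of inductive definition such a system admits is the list-segment predicate of separation logic:

```latex
% Inductive definition of the list segment predicate ls(x, y):
% either the segment is empty, or x points to some z heading a
% shorter segment ending at y.
ls(x, y) \;\equiv\; (x = y \wedge \mathrm{emp}) \;\vee\;
          \exists z.\, (x \mapsto z) * ls(z, y)
```

With such a predicate one can state, and prove via lemma reasoning, assertions like "the heap consists exactly of a linked list from x to nil".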
This paper proves the decidability of entailments in separation logic with monadic inductive definitions and implicit existentials. The system is obtained from the bounded-treewidth separation logic SLRDbtw of Iosif et al. (2013) by adding implicit existential variables and restricting inductive definitions to monadic ones. The system proposed in this paper is a decidable system in which one can use general recursive data structures carrying pointers as data, such as lists of pointers. The key idea is to reduce the problem to the decidability of SLRDbtw, by assigning local addresses or some distinguished address to implicit existential variables so that the resulting definition clauses satisfy the establishment condition of SLRDbtw. This paper also proves the undecidability of entailments when implicit existentials are added to SLRDbtw itself, which shows that implicit existentials are critical for decidability.
This paper discusses the implementation of an entailment checker for the logical system of symbolic heaps with monadic inductive definitions proposed by the same authors. The logical system is worth implementing, since its entailments are decidable and it has general inductive definitions as well as implicit existentials. The implementation works by translating the system into monadic second-order logic and checking the resulting MSO formula with the decision procedure MONA developed in Denmark. This paper discusses the equivalence theorem for the translation. This paper also explains some ideas for an efficient implementation, in particular the optimization of the equivalence relation on tree nodes.
This paper proves the completeness of an extension of Hoare's logic and separation logic for pointer programs with mutually recursive procedures. This paper shows the expressiveness of the assertion language as well. To obtain completeness, this paper constructs a new system by introducing two new inference rules, while removing an axiom that is unsound in separation logic as well as other redundant inference rules. It introduces a novel expression that describes the complete information of a given state in a precondition. This work also uses the necessary and sufficient precondition of a program for abort-free execution, which enables us to utilize strongest postconditions.
A cyclic proof system, called CLKIDomega, gives us an alternative way of formalizing inductive definitions and supports efficient proof search. The 2011 paper by Brotherston and Simpson showed that the provability of CLKIDomega includes the provability of Martin-Lof's system of inductive definitions, called LKID, and conjectured the equivalence. Since then, the equivalence had remained an open question. This paper shows that CLKIDomega and LKID are in fact not equivalent. This paper considers a statement called 2-Hydra in these two systems, over the first-order language consisting of 0, the successor, the natural-number predicate, and a binary predicate symbol used to express 2-Hydra. This paper shows that the 2-Hydra statement is provable in CLKIDomega, but not provable in LKID, by constructing a Henkin model in which the statement is false.
This paper proves the decidability of validity of entailments in the logical system of symbolic heaps with arrays, and discusses an implementation of a decision procedure based on it. Separation logic has been used successfully for verifying the memory safety of pointer programs. However, dynamically allocated arrays had been investigated only in a manuscript by Brotherston et al., which discusses a separation logic system that contains arrays but not the points-to predicate. This paper shows the first decidability result for a logical system that contains both arrays and the points-to predicate. This paper first gives a procedure that checks whether a given entailment in the logical system is valid; the decision procedure reduces the validity of entailments with arrays to that of Presburger arithmetic. The correctness of the procedure is then shown by induction on its execution steps. An implementation of the decision procedure is also discussed: it translates the given entailment into the corresponding formula in Presburger arithmetic, whose validity is then checked by an external SMT solver. Since a straightforward implementation is not very efficient, we also discuss some possible improvements towards a more efficient implementation.
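The overall shape of such a reduction can be illustrated on a toy quantifier-free fragment. The mini-syntax and the brute-force model enumeration below are ours: the enumeration merely stands in for the actual translation to Presburger arithmetic and the call to an SMT solver.

```python
from itertools import product

# Toy symbolic heaps with arrays (hypothetical mini-syntax, not the paper's):
#   ("pto", e)     --  e |-> _     heap is exactly {e}
#   ("arr", a, b)  --  Arr(a, b)   heap is {a, ..., b}, requires a <= b
# Endpoints are terms (variable_name, constant_offset) meaning var + offset.

def val(term, stack):
    var, off = term
    return stack[var] + off

def cells(heap, stack):
    """Set of addresses denoted by a symbolic heap under a stack, or None if
    the heap is unsatisfiable (empty array range or violated separation)."""
    used = set()
    for atom in heap:
        if atom[0] == "pto":
            block = {val(atom[1], stack)}
        else:
            a, b = val(atom[1], stack), val(atom[2], stack)
            if a > b:
                return None
            block = set(range(a, b + 1))
        if used & block:          # separating conjunction: disjointness
            return None
        used |= block
    return used

def entails(ante, succ, variables, bound=6):
    """Brute-force stand-in for the Presburger step: the entailment is valid
    iff every model of the antecedent is also a model of the succedent."""
    for vals in product(range(bound), repeat=len(variables)):
        stack = dict(zip(variables, vals))
        h = cells(ante, stack)
        if h is not None and cells(succ, stack) != h:
            return False
    return True
```

For example, `x |-> _ * Arr(y, z)` entails `Arr(x, x) * Arr(y, z)`, while `Arr(x, y)` does not entail `x |-> _ * Arr(x+1, y)` (it fails when `x = y`).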
A cyclic proof system, called CLKID-omega, gives us an alternative way of formalizing inductive definitions and supports efficient proof search. The 2011 paper by Brotherston and Simpson showed that the provability of CLKID-omega includes the provability of the classical system of Martin-Lof's inductive definitions, called LKID, and conjectured the equivalence. The equivalence remained an open question for several years. In general, the conjecture was proved to be false in the FoSSaCS 2017 paper by Berardi and Tatsuta. However, if both systems are restricted to the natural-number inductive predicate only and Peano arithmetic is added to both, the conjecture was proved to be true in the FoSSaCS 2017 paper by Simpson. This paper shows that if arithmetic is added to both systems, they become equivalent, namely, the conjecture holds; this result includes that of the paper by Simpson as a special case. In order to construct an LKID proof from a given cyclic proof, this paper shows that every bud in the cyclic proof is provable in LKID, by cutting the cyclic proof into subproofs such that in each subproof the conclusion is a companion and the assumptions are buds. The global trace condition yields an induction principle, via an extension of the Podelski-Rybalchenko termination theorem from well-foundedness to induction schemata. In order to show this extension, this paper also shows that the infinite Ramsey theorem is formalizable in Peano arithmetic.
We consider the satisfiability problem for a fragment of separation logic including inductive predicates with shape and arithmetic properties. We show that the fragment is decidable if the arithmetic properties can be represented as semilinear sets. Our decision procedure is based on a novel algorithm that infers a finite representation for each inductive predicate which precisely characterises its satisfiability. Our analysis shows that the proposed algorithm runs in exponential time in the worst case. We have implemented our decision procedure and integrated it into an existing verification system. Our experiments on benchmarks show that our procedure helps to verify the benchmarks effectively.
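For reference, a semilinear set is a finite union of linear sets, in standard notation:

```latex
% A linear set is generated by a base vector b and periods p_1, ..., p_k:
L(b;\, p_1, \ldots, p_k) =
  \{\, b + \lambda_1 p_1 + \cdots + \lambda_k p_k \mid \lambda_1, \ldots, \lambda_k \in \mathbb{N} \,\}
% A semilinear set is a finite union L_1 \cup \cdots \cup L_m of linear sets;
% these are exactly the sets definable in Presburger arithmetic.
```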
This paper gives a decision procedure for the validity of entailments of symbolic heaps in separation logic with Presburger arithmetic and arrays. The correctness of the decision procedure is proved under the condition that the sizes of arrays in the succedent are not existentially bound. This condition is independent of the condition proposed in the CADE 2017 paper by Brotherston et al., namely, neither of them implies the other. Some techniques for improving the efficiency of the decision procedure are also presented. The main idea of the decision procedure is a novel translation of an entailment of symbolic heaps into a formula in Presburger arithmetic, combined with an external SMT solver. This paper also gives experimental results for an implementation, which show that the decision procedure works efficiently enough for practical use.
We address a problem of unsupervised soft clustering that we call micro-clustering. The aim of the problem is to enumerate all groups of records strongly related to each other, whereas standard clustering methods find boundaries where records are sparse. Existing methods have several weak points: they generate an intractable number of clusters, produce heavily biased size distributions, lack robustness, etc. We propose a new methodology, data polishing. Data polishing clarifies the cluster structures in the data by perturbing the data according to a feasible hypothesis. More precisely, for graph clustering problems, data polishing replaces dense subgraphs, which would correspond to clusters, by cliques, and deletes edges not included in any dense subgraph. The clusters are thereby clarified as maximal cliques, thus easy to find, and the number of maximal cliques is reduced to a tractable size. We also propose an efficient algorithm so that the computation finishes in a few minutes even for large-scale data. The computational experiments demonstrate the effectiveness of our formulation and algorithm: the number of solutions is small (around 1,000), the members of each group are closely related, and the computation time is short.
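As an illustration, here is a minimal sketch of one polishing rule in this family: connect two vertices iff their closed neighborhoods share at least k vertices, iterate to a fixpoint, then enumerate maximal cliques. The threshold k, the fixpoint loop, and the small Bron-Kerbosch enumerator are our simplifications, not the paper's tuned algorithm.

```python
def polish(edges, n, k, rounds=10):
    """Iteratively rebuild the graph: u and v become adjacent iff their
    closed neighborhoods share at least k vertices; stop at a fixpoint."""
    adj = [set() for _ in range(n)]
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    for _ in range(rounds):
        new = [set() for _ in range(n)]
        for u in range(n):
            for v in range(u + 1, n):
                if len((adj[u] | {u}) & (adj[v] | {v})) >= k:
                    new[u].add(v); new[v].add(u)
        if new == adj:      # fixpoint reached: the graph is "polished"
            break
        adj = new
    return adj

def maximal_cliques(adj):
    """Bron-Kerbosch enumeration (with pivoting) of maximal cliques."""
    out = []
    def bk(r, p, x):
        if not p and not x:
            out.append(sorted(r)); return
        pivot = max(p | x, key=lambda w: len(adj[w] & p))
        for v in list(p - adj[pivot]):
            bk(r | {v}, p & adj[v], x & adj[v])
            p.remove(v); x.add(v)
    bk(set(), set(range(len(adj))), set())
    return out
```

On a small example with two 4-cliques each missing one edge, plus a noise edge between them, polishing with k = 2 restores both cliques, and the enumeration returns just a handful of maximal cliques instead of many noisy fragments.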
This paper gives a decision procedure for the validity of entailments of symbolic heaps in separation logic with Presburger arithmetic, arrays, and lists. The main idea of the decision procedure is to eliminate lists from a given entailment. In order to eliminate lists in the antecedent of the entailment, we extend the unroll collapsing technique (Berdine et al. 2004). After the lists in the antecedent are eliminated, the lists in the succedent can also be eliminated, by a proof-search method. Finally, we apply our previous decision procedure for entailments without lists, presented in our APLAS 2017 paper (Kimura and Tatsuta 2017). These steps together yield a new decision procedure.
A cyclic proof system gives us an alternative way of formalizing inductive and coinductive definitions and supports efficient proof search. The Podelski-Rybalchenko termination theorem is important for program termination analysis. This paper first shows that Heyting arithmetic HA proves the Kleene-Brouwer theorem for induction and the Podelski-Rybalchenko theorem for induction. Using these theorems, this paper then proves the equivalence between the provability of the intuitionistic cyclic proof system and that of the intuitionistic system of Martin-Lof's inductive definitions when both systems contain HA.
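The classical Podelski-Rybalchenko termination theorem, of which the paper proves an induction-schema version inside HA, reads:

```latex
% If the transitive closure of a relation R is covered by finitely many
% well-founded relations, then R itself is well-founded:
R^{+} \subseteq R_1 \cup \cdots \cup R_n,
\ \text{each } R_i \text{ well-founded}
\ \Longrightarrow\ R \ \text{well-founded}
```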
We propose the strong wand and the Factor rule in a cyclic-proof system for the entailment checking problem of separation logic with general inductive predicates. The strong wand is similar to, but stronger than, the magic wand, and is defined syntactically. The Factor rule, which uses the strong wand, is an inference rule for spatial factorization that exposes an implicitly allocated cell in inductive predicates. With this rule, we can avoid getting stuck in the Unfold-Match-Remove strategy. We present a semi-decision algorithm for proof search in the cyclic-proof system with the Factor rule, and report experimental results for its prototype implementation.
This talk defines g-realizability and r-realizability of Mendler-style nonmonotonic inductive and coinductive definitions, and proves their soundness.