*NII Logic Seminar* is a regular academic meeting held every week
at the National Institute of Informatics for those who are interested in

- Mathematical Logic (constructive logic, proof theory, linear logic)
- Theoretical Computer Science (lambda calculus, type theory, computational complexity)
- Mathematical Linguistics and Natural Language Semantics

- Makoto Tatsuta (Professor)
- Zhenjiang Hu (Professor)
- Makoto Kanazawa (Associate Professor)
- Kazushige Terui (Associate Professor)
- Daisuke Kimura (Postdoctoral Researcher)
- Takayuki Koai (PhD student)
- Mahmudul Faisal Al Ameen (PhD student)

- Ryo Yoshinaka (PhD student)
- Sylvain Salvati (Visiting Scholar)

- (Jan 24, 2008) Wei-Ngan Chin (National University of Singapore), Dependable Software via Automated Verification (abstract)
- (Dec 18, 2007) Benjamin Werner (INRIA and Ecole Polytechnique), Simple Types and Binders inside Coq (abstract)
- (Dec 5, 2007) Jean-Pierre Jouannaud (Ecole Polytechnique and INRIA), The Calculus of Congruent Inductive Constructions (abstract)
- (July 17, 2007) Kwangkeun Yi (Seoul National University), A Polymorphic Modal Type System for General Multi-Staged Languages (abstract)
- (June 6, 2007) Tatsuya Abe (University of Tokyo), Full completeness of an extended standard-like translation (abstract)
- (May 30, 2007) Nobuko Yoshida (Imperial College London), Mobile Processes, Types and Session Interaction (abstract)
- (December 12, 2006) Tadeusz Litak (JAIST), Algebraization of hybrid logic with binders and the bounded fragment (abstract)
- (Dec 8, 2006) Jean-Pierre Jouannaud (Univ of Paris-Sud, Ecole polytechnique), Modularity of Church-Rosser: the Complete Picture (abstract)
- (September 20, 2006) Elio Giovannetti (Torino University), Types for ambient and process mobility (abstract)
- (May 1, 2006) Philip Wadler (University of Edinburgh), Call-by-Value is Dual to Call-by-Name Reloaded (abstract)
- (April 27, 2006) Benjamin Werner (INRIA & Ecole polytechnique), Computational Formal Proofs (abstract)
- (March 30, 2006) Martin Berger (Queen Mary, Univ of London), A Logical Analysis of Aliasing in Imperative Higher-Order Functions (abstract)
- (March 29, 2006) Masahito Hasegawa (Kyoto Univ), Relational Parametricity and Control (abstract)
- (March 14, 2006) Agata Ciabattoni (Vienna University of Technology), Hypersequent calculi for non classical logics -- an overview (abstract)
- (March 1, 2006) Stefano Berardi (Torino Univ.), Semantic of Backtracking (abstract)
- (January 16, 2006) Shin-Cheng Mu (Univ. of Tokyo), An Algebraic Approach to Bi-directional Updating (abstract)
- (Dec 12, 2005) Anton Setzer (University of Wales Swansea, UK), Inductive-recursive definitions and initial algebras (abstract)
- (Dec 12, 2005) Toshiyasu Arai (Kobe University), A path ordering POP (abstract)
- (Nov 28, 2005) Sylvain Salvati (NII), Linear matching in the linear lambda-calculus: complexity insights and practical algorithms. (abstract)
- (Oct 31, 2005) Patrick Baillot (CNRS, University of Paris-Nord), Verification of Ptime reducibility for system F lambda-terms via Linear logic (abstract)
- (Oct 31, 2005) Claudia Faggian (University of Padova, University of Paris 7), Ludics nets, a game model of concurrent interaction (abstract)
- (Jun 20, 2005) Kentaro Sato (Kobe Univ. and Dept. of Math., Univ. of Michigan), Prime Filters in Substructural Logics from Topos Theoretic Point of View (abstract)
- (Apr 14, 2005) Patrick Baillot (CNRS, University of Paris-Nord), From Linear logic to Implicit computational complexity (abstract)
- (Apr 14, 2005) Ugo Dal Lago (University of Bologna), Quantitative models and implicit complexity (abstract)
- (Mar 15, 2005) Agata Ciabattoni (Vienna Univ. of Technology), Automated Generation of Analytic Calculi for Logics with Linearity (abstract)
- (Jan 19, 2005) Makoto Tatsuta (NII), Second Order Strong Normalization with Permutative Conversions (abstract)
- (Jan 12, 2005) Kazushige Terui (NII), Towards a semantic characterization of cut-elimination
- (Dec 8, 2004) Makoto Kanazawa (NII), Computing interpolants in implicational logics II
- (Dec 1, 2004) Makoto Kanazawa (NII), Computing interpolants in implicational logics I

- Third NII Type Theory Workshop (February 12, 2008) (program)
- Second NII Type Theory Workshop (March 7, 2007) (program)
- Third Workshop on Lambda Calculus and Formal Grammar (January 29, 2007) (program)
- Second Workshop on Lambda Calculus and Formal Grammar (February 2, 2006) (program, original information)
- Type Theory Workshop (January 23, 2006) (program)
- NII Workshop on Focus (Nov 10--11, 2005) (program, original information)
- Lambda Calculus and Formal Grammar (Feb 18, 2005) (program, original information)

Despite their popularity and importance, pointer-based programs with linked data structures remain a major challenge for program verification. We propose an automated verification system that is concise, precise and expressive for ensuring the safety of pointer-based programs. Our approach is based on separation logic and uses user-definable shape predicates to allow programmers to describe a wide range of data structures with their associated properties. To support automatic verification, we design a new entailment checking procedure that can handle well-founded inductive predicates using unfold/fold reasoning. To improve expressivity, we support sets of states for proof search, intersection types for methods, and coercion rules for related shape predicates. Recently, we have also applied these modular and reusable verification techniques to OO programs. We have proven the soundness and termination of our verification system, and have built a working system.
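The unfold step of such entailment reasoning can be shown in miniature. The sketch below is a toy illustration, not the authors' system: a hypothetical user-defined shape predicate ll(x, n) for null-terminated linked lists, checked against a concrete heap by unfolding its inductive definition one step at a time.

```python
# Toy unfold/fold-style check of an inductive shape predicate.
# ll(x, n) describes a null-terminated linked list of length n from x:
#   ll(x, 0)   <-> x = null
#   ll(x, n+1) <-> x |-> y * ll(y, n)   (cell x points to next cell y)

def unfold_ll(heap, x, n):
    """Check ll(x, n) against a concrete heap by repeated unfolding.
    The heap is a finite map from cell address to its 'next' field."""
    if n == 0:
        return x is None
    if x is None or x not in heap:
        return False
    return unfold_ll(heap, heap[x], n - 1)

heap = {1: 2, 2: 3, 3: None}
assert unfold_ll(heap, 1, 3)        # a 3-cell list starting at address 1
assert not unfold_ll(heap, 1, 2)    # wrong length: entailment fails
```

In the real system such predicates are symbolic and entailment is decided by proof search; the concrete-heap check above only illustrates the unfold direction.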

Simply typed lambda-calculus can be represented in type theory using either a shallow or a deep embedding. In the first case, one uses the fact that simply typed lambda-calculus is a fragment of type theory. It is thus immediate to view simply typed lambda-terms as (say) Coq programs. In the second case, one defines a data-type for representing lambda-terms (with named variables, or de Bruijn indexes...) and defines a well-typedness predicate over them. I will show how one can define translations from one to the other representation in Coq using the Normalization by Evaluation technique. In particular this yields a decompilation function for simply typed programs. I will then sketch how this suggests a new approach to the handling of syntax with binders in type theoretical provers like Coq. This is joint work with Francois Garillot. Part of it was published at TPHOLs 2007. The last part is work in progress.
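The deep-embedding half of this setup is easy to sketch outside Coq. Below is a minimal illustration in Python (constructor names are my own, not the Coq development's): lambda-terms as data with de Bruijn indices, and the well-typedness predicate as a type-inference function.

```python
# Deep embedding of simply typed lambda-calculus with de Bruijn indices.
# Types are 'o' (a base type) or ('->', a, b).
from dataclasses import dataclass

@dataclass
class Var: i: int                      # de Bruijn index

@dataclass
class Lam: argty: object; body: object # abstraction annotated with its arg type

@dataclass
class App: f: object; a: object        # application

def typeof(t, env=()):
    """Well-typedness predicate: the type of t in context env, or None."""
    if isinstance(t, Var):
        return env[t.i] if t.i < len(env) else None
    if isinstance(t, Lam):
        b = typeof(t.body, (t.argty,) + env)
        return None if b is None else ('->', t.argty, b)
    if isinstance(t, App):
        f, a = typeof(t.f, env), typeof(t.a, env)
        return f[2] if f and f[0] == '->' and f[1] == a else None

# identity at base type o: the deep term Lam('o', Var(0)) corresponds to the
# shallow embedding (lambda x: x)
assert typeof(Lam('o', Var(0))) == ('->', 'o', 'o')
```

The shallow embedding is simply a host-language function; the normalization-by-evaluation translation the talk describes moves between these two representations.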

It is commonly agreed that the success of future proof assistants will rely on their ability to incorporate computations within deductions in order to mimic the mathematician when replacing the proof of a proposition P by the proof of an equivalent proposition P' obtained from P thanks to possibly complex calculations. In this paper, we investigate a new version of the calculus of inductive constructions which incorporates arbitrary first-order decision procedures into deductions via the conversion rule of the calculus. Besides the novelty of the problem itself in the context of the calculus of inductive constructions, a major technical innovation of this work lies in the fact that the computation mechanism varies along proof-checking: goals are sent to the decision procedure together with the set of user hypotheses available from the current context. Our main result shows that this extension of the calculus of constructions does not compromise its main properties: confluence, strong normalization and decidability of proof-checking are all preserved. We finally discuss the implications of this work on the architecture of proof assistants.

Staged computation, which explicitly divides a computation into separate stages, is a unifying framework for the existing program-generation systems. Web document programming, partial evaluation, runtime code generation, function inlining, and macro expansion are all instances of staged computation. I will present a polymorphic modal type system and its principal type inference algorithm that conservatively extend ML with all practical staging constructs (as found in Lisp's quasi-quotation system). The combination is meaningful because ML is a practical higher-order, impure, and typed language, while Lisp's quasi-quotation system has long evolved to comply with the demands from multi-staged programming practices. Our type system supports open code, unrestricted operations on references, intentional variable-capturing substitution as well as capture-avoiding substitution, and lifting values into code, whose combination escaped all the previous systems. (This is joint work with Ik-soon Kim and Cristiano Calcagno, and was presented at the ACM Symposium on Principles of Programming Languages '06.)
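The staging idea can be seen in miniature: one stage generates specialised code, a later stage runs it. In this toy sketch (a hypothetical gen_power, with Python strings standing in for Lisp's quasi-quotation), a power function is split into a generation stage and a run stage, as in classic partial evaluation.

```python
# Two-stage computation: gen_power is stage 1 (code generation),
# eval of its output is stage 2 (running the generated code).

def gen_power(n):
    """Stage 1: emit code specialised to exponent n, multiplications only."""
    body = ' * '.join(['x'] * n) if n > 0 else '1'
    return f'lambda x: {body}'

assert gen_power(3) == 'lambda x: x * x * x'   # the generated "open code"
power3 = eval(gen_power(3))                    # stage 2: run it
assert power3(2) == 8
```

A real multi-staged language types the generated code instead of manipulating raw strings; that is exactly what the modal type system in the talk guarantees.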

We introduce lambda box, a lambda-calculus corresponding to an intuitionistic fragment of the minimal modal logic K via the Curry-Howard isomorphism. We also give a translation from lambda box into dependent type theory by extending the standard translation from the formula level to the proof level. We then clarify the close relationship between modal logic and first-order predicate logic not only at the formula level but also at the proof level by showing fullness and completeness of the extended standard translation. In this talk, we explain the definitions of lambda box and the extended standard translation, and the proofs of its fullness and completeness.
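The formula-level half of this correspondence is the classical standard translation, ST_x(box A) = forall y (R(x,y) -> ST_y(A)); the talk extends it to the proof level. A small sketch of the formula-level translation (my own encoding, using strings for first-order formulas):

```python
# Standard translation from (mono)modal formulas into first-order logic,
# relative to a world variable x. Formulas are ('p',) for atoms,
# ('->', a, b) for implication, ('box', a) for the modality.

def st(formula, x, fresh=None):
    """Translate a modal formula to a first-order string at world x."""
    if fresh is None:
        fresh = iter('yzuvw')          # supply of fresh world variables
    if formula[0] == 'box':
        y = next(fresh)                # box becomes a guarded quantifier
        return f'forall {y} (R({x},{y}) -> {st(formula[1], y, fresh)})'
    if formula[0] == '->':
        return f'({st(formula[1], x, fresh)} -> {st(formula[2], x, fresh)})'
    return f'{formula[0]}({x})'        # atom as a unary predicate of worlds

# K axiom: box(p -> q) -> (box p -> box q)
k = ('->', ('box', ('->', ('p',), ('q',))),
           ('->', ('box', ('p',)), ('box', ('q',))))
print(st(k, 'x'))
```

Note that the image of the translation uses only quantifiers guarded by R, which is why it lands in the bounded fragment mentioned in the December 12, 2006 talk.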

The talk starts from a general background to illustrate the usage of mobile processes and their types as an analysis tool for ubiquitous computing. Then I will give a tutorial on the session types of mobile processes, which have recently been actively studied. Their variants are also used in industrial products for Web Services and Operating Systems; starting from 1994, session types have been developed for the Pi and Ambient calculi, CORBA interfaces, Haskell, multithreaded functional languages, the C language, C#, distributed and multithreaded Java, and web description languages. After the introduction of session types, I shall talk about my recent work with Vasco Vasconcelos which investigates an issue of the subject reduction of the session types.
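The central operation of session types is duality: the two endpoints of a channel must follow mutually dual protocols. A minimal sketch (my own encoding, with sessions as nested tuples) of the duality function:

```python
# Session types as ('send', T, S), ('recv', T, S), or 'end'.
# dual(S) is the protocol the other endpoint must follow.

def dual(s):
    """The dual session: sends become receives and vice versa."""
    if s == 'end':
        return 'end'
    op, t, rest = s
    return ('recv' if op == 'send' else 'send', t, dual(rest))

# a client that sends an int, then receives a bool, then stops ...
client = ('send', 'int', ('recv', 'bool', 'end'))
# ... must be matched by a server following the dual protocol
assert dual(client) == ('recv', 'int', ('send', 'bool', 'end'))
assert dual(dual(client)) == client    # duality is an involution
```

Full session calculi add branching, selection, and recursion to this core, but duality remains the invariant that subject reduction must preserve.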

We introduce an algebraic semantics for hybrid logic with binders. It is known (Areces, Blackburn, Marx) that this formalism is a modal counterpart of the bounded fragment of first-order logic, studied by Feferman in the 1960s. The algebraization process leads to an interesting class of boolean algebras with operators, called substitution-satisfaction algebras (SSA's). We prove a representation theorem for these algebras and thus provide an algebraic proof of completeness for hybrid logic. Our proof is similar to the Andreka-Nemeti algebraic proof of completeness for first-order logic. The fact that the bounded fragment is a conservative reduction class for first-order logic hints that there is a connection between cylindric algebras and SSA's: we identify a subvariety of SSA's whose term-reducts are cylindric algebras.

In 1987, Toyama proved that the union of two confluent term-rewriting systems that share absolutely no function symbols or constants is likewise confluent, a property called modularity. The proof of this beautiful modularity result, technically based on slicing terms into a homogeneous cap and a so-called alien, possibly heterogeneous, substitution, was later substantially simplified. We present a further simplification of the proof of Toyama's result for confluence, which shows that the crux of the problem lies in two different properties: a cleaning lemma, whose goal is to anticipate the application of collapsing reductions; and a modularity property of ordered completion, which allows pairwise matching of the caps and alien substitutions of two equivalent terms. The approach allows for rules with extra variables on the right. We then show that Toyama's modularity result scales up to rewriting modulo equations.
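The cap/alien slicing the proof relies on is easy to sketch: with two disjoint signatures, a mixed term is cut into its topmost homogeneous cap and the maximal foreign subterms (aliens) hanging below it. A toy decomposition (my own encoding, with terms as (symbol, args) pairs):

```python
# Slice a term over mixed signatures into its cap and alien substitution.

def cap_and_aliens(term, sig):
    """Return (cap, aliens): cap keeps symbols from sig, with each maximal
    subterm rooted in the other signature replaced by a hole index."""
    aliens = []
    def go(t):
        f, args = t
        if f not in sig:               # rooted in the foreign signature
            aliens.append(t)           # record the alien ...
            return ('HOLE', len(aliens) - 1)   # ... and cut it off
        return (f, [go(a) for a in args])
    return go(term), aliens

# f, g from one rewrite system; h from the other
t = ('f', [('h', [('g', [])]), ('g', [])])
cap, aliens = cap_and_aliens(t, {'f', 'g'})
assert cap == ('f', [('HOLE', 0), ('g', [])])
assert aliens == [('h', [('g', [])])]
```

In the modularity proof, rewriting in the cap and rewriting inside the aliens can then be analysed separately, which is what makes the disjoint-signature hypothesis essential.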

Two kinds of ambient calculi will be presented, where the "open" capability is replaced by direct mobility of generic processes, and types express the communication, access and mobility properties of the modelled system by using group names to collect ambients with homologous features. In such a framework it will be shown how the introduction of co-moves and runtime type-checking allows typing to be local to the ambients, with no global assumptions.

The rules of classical logic may be formulated in pairs corresponding to De Morgan duals: rules about "and" are dual to rules about "or". Wadler (2003), inspired by Curien and Herbelin (2000), presents a dual calculus in which exchanging "and" with "or" in types also exchanges call-by-value with call-by-name in evaluations. Just as the lambda calculus (of Church 1932, 1940) corresponds to intuitionistic natural deduction (of Gentzen 1935), the dual calculus corresponds to the classical sequent calculus (also of Gentzen 1935). This talk describes the dual calculus, discusses its relationship to the lambda mu calculus of Parigot (1992) and the Symmetric Lambda Calculus of Barbanera and Berardi (1996), and suggests further directions of research that may be of interest to the rewriting community.
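The action of the duality on types can be sketched directly. The toy function below (my own encoding; one common presentation of the duality also flips the order of the two components, which this sketch follows) exchanges "and" with "or" and is an involution:

```python
# De Morgan duality on types: ('&', a, b) and ('|', a, b) exchange
# (with components swapped), negation ('~', a) dualizes inside,
# and atoms are left fixed.

def dual_ty(t):
    if isinstance(t, str):
        return t                        # atoms are self-dual here
    if t[0] == '~':
        return ('~', dual_ty(t[1]))
    op = '|' if t[0] == '&' else '&'
    return (op, dual_ty(t[2]), dual_ty(t[1]))   # swap the components

# not (A & B) is dual to not (B | A), mirroring the De Morgan laws
assert dual_ty(('~', ('&', 'A', 'B'))) == ('~', ('|', 'B', 'A'))
# applying the duality twice gives the original type back
assert dual_ty(dual_ty(('&', 'A', ('|', 'B', 'C')))) == ('&', 'A', ('|', 'B', 'C'))
```

In the dual calculus this type-level involution is mirrored by a term-level one, and it is that term-level duality that swaps call-by-value with call-by-name.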

The idea of formal mathematical proofs is that every logical step of reasoning is made explicit, so that the proof can be formally verified by a computer using a so-called proof assistant. On the other hand, large computations performed by a computer are necessary in order to establish some mathematical results, like the primality of large numbers, the four colour theorem or Kepler's conjecture. I will explain how computational formalisms, like the one implemented in Coq, allow us to reconcile these two roles of the computer for mathematics. I will illustrate this through examples coming from primality proofs and the recent formalization of the four colour theorem in Coq.
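The computational approach can be seen in miniature: instead of a long deductive proof of a primality claim, one runs a (verified) decision procedure and accepts its answer, which is what Coq's conversion rule makes possible. A sketch with trial division standing in for the verified checker:

```python
# The "large computation" inside a proof, in miniature: the proof
# obligation "97 is prime" reduces to evaluating is_prime(97).

def is_prime(n):
    """Trial division up to sqrt(n): the decision procedure."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

assert is_prime(97)          # the computation *is* the proof step
assert not is_prime(91)      # 91 = 7 * 13
```

The point of the formal setting is that the checker itself is proved correct once and for all, so each run of it yields a genuine theorem.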

We present a compositional program logic for call-by-value imperative higher-order functions with general forms of aliasing, which can arise from the use of reference names as function parameters, return values, content of references and parts of data structures. The program logic extends our earlier logic for alias-free imperative higher-order functions with new modal operators which serve as building blocks for clean structural reasoning about programs and data structures in the presence of aliasing. This has been an open issue since the pioneering work by Cartwright-Oppen and Morris twenty-five years ago. We illustrate the usage of the logic for description and reasoning through concrete examples, including a higher-order polymorphic Quicksort. The logical status of the new operators is clarified by translating them into (in)equalities of reference names. The logic is observationally complete in the sense that two programs are observationally indistinguishable iff they satisfy the same set of assertions. (Joint work with Kohei Honda and Nobuko Yoshida)

We study the equational theory of Parigot's second-order $\lambda\mu$-calculus in connection with a call-by-name continuation-passing style (CPS) translation into a fragment of the second-order $\lambda$-calculus. It is observed that the relational parametricity on the CPS target calculus induces a natural notion of equivalence on the $\lambda\mu$-terms. On the other hand, the unconstrained relational parametricity on the $\lambda\mu$-calculus turns out to be inconsistent with this CPS semantics. Following these facts, we propose to formulate the relational parametricity on the $\lambda\mu$-calculus in a constrained way, which might be called ``focal parametricity''. We sketch some use of focal parametricity for deriving ``free theorems'' for $\lambda\mu2$ syntactically. At the conceptual and abstract level, this story closely resembles the study of ``linear parametricity'' and recursion. Just as linear parametricity gives a way of accommodating non-termination and recursion in the polymorphic setting (as advocated by Plotkin), focal parametricity provides a way of accommodating control features in the polymorphic setting. We discuss a common generalization of linear parametricity and focal parametricity towards a theory of parametricity for computational effects.
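A classic instance of the "free theorems" mentioned here: any function of type forall a. [a] -> [a] must commute with map, since parametricity forbids it from inspecting the elements. A quick empirical check (Python standing in for the polymorphic calculus; the helper name is my own):

```python
# Free theorem for r : forall a. [a] -> [a]:
#   map g (r xs) == r (map g xs)  for every g and xs.

def free_theorem_holds(r, g, xs):
    """Check one instance of the naturality square for r."""
    return list(map(g, r(xs))) == r(list(map(g, xs)))

rev = lambda xs: xs[::-1]                       # a polymorphic list function
assert free_theorem_holds(rev, lambda n: n * n, [1, 2, 3])
assert free_theorem_holds(lambda xs: xs[1:], str, [1, 2, 3])
```

The talk's point is that in the presence of control operators ($\lambda\mu$), the unconstrained version of this reasoning breaks, and focal parametricity is the repaired form.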

The hypersequent framework is a simple but versatile tool for handling many important (families of) non-classical logics in Gentzen's spirit. In this talk I give an overview of the method of hypersequents. To this end I will present examples of hypersequent calculi for logics of different natures, ranging from intermediate to fuzzy logics. Cut elimination in hypersequent calculi, as well as the automated generation of analytic calculi for classes of logics satisfying suitable properties, will be described.

Backtracking is the possibility, for a given computation, to come back to a previous state and restart from it, forgetting everything that took place after it. We introduce a new notion, 1-backtracking, a particular case of backtracking in which forgetting is irreversible: if we forget a state, we can never restore it. The topic of the talk is a game-theoretical model for 1-backtracking, and its use for modelling Hayashi's Limit Computable Mathematics and Lambda Calculus. (with T. Coquand, U. de'Liguoro, S. Hayashi)

On many occasions one encounters the task of maintaining the consistency of two pieces of structured data that are related by some transform --- synchronising bookmarks in different web browsers, the source and the view in an editor, or views in databases, to name a few. We propose a formal model of such tasks, based on a programming language allowing injective functions only. The programmer designs the transformation as if she were writing a functional program, while the synchronisation behaviour is automatically derived by algebraic reasoning. The main advantage is being able to deal with duplication and structural changes. In this talk I will give an overview of the project, and talk about its applications and limitations.
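The synchronisation task can be phrased with a forward transform get and a derived backward putback satisfying round-trip laws. A minimal sketch (my own toy transform; the talk's model derives put from an injective get by algebraic reasoning, here the derivation is done by hand):

```python
# Bidirectional updating in miniature: get is injective, so put can
# recover the source, and edits on the view survive the round trip.

def get(src):
    """Transform: pair each bookmark title with its position (injective)."""
    return [(i, title) for i, title in enumerate(src)]

def put(view):
    """Putback derived from the injective get: recover the source."""
    return [title for _i, title in sorted(view)]

src = ['news', 'mail', 'wiki']
assert put(get(src)) == src                      # no edit, no change
view = get(src)
view[1] = (1, 'calendar')                        # edit the view ...
assert get(put(view)) == view                    # ... and the edit survives
```

The interesting cases in the talk are exactly those this toy avoids: duplication (a non-injective get) and structural changes such as insertions and deletions.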

Inductive-recursive definitions were originally developed by Peter Dybjer as a general concept which subsumes all standard extensions of Martin-Löf type theory. It substantially extends the concept of strictly positive inductive definitions by a general form of universes. We then present a closed formalization of inductive-recursive definitions. In this closed formalization there exists a data type of codes for inductive-recursive definitions. Using this data type one can write programs which take as input a code for a data type and elements of this data type, and compute from it another code for a data type and elements of it. This is a very general form of generic programming. If time permits we will give a model of the type theory of inductive-recursive definitions and indicate why its proof-theoretic strength is in the interval $[\mathrm{KPM},\mathrm{KPM}^+]$ (the details of the upper bound have however not yet been fully worked out). (joint work with Peter Dybjer, Gothenburg, Sweden)

We introduce a new path ordering POP for polynomial time functions, such that for a finite rewrite system R contained in POP, the derivation length of the term $f(t,\ldots)$ for R is bounded by a polynomial in the derivation lengths of the input terms $t,\ldots$. (joint work with Georg Moser (Innsbruck U.))

Linear lambda-calculus is the calculus associated to intuitionistic implicative linear logic through the Curry-Howard isomorphism. Linear matching equations in that calculus are matching equations where the occurrences of the unknowns are linear. In this talk we will introduce a new type system for the linear lambda-calculus, namely "syntactic descriptions". With that type system, checking whether normalizing t results in the term u amounts to checking whether t has a certain type d in a context gamma, both determined by u. Furthermore we will show that solving linear matching equations amounts to type inference and proof search in that system. We will then introduce a proof-net theory for this type system, from which we can conclude that under certain simple conditions solving linear matching in the linear lambda-calculus is polynomial. Finally we will define a tabular algorithm based on syntactic descriptions for solving those equations.

Type systems have been used extensively to verify properties on functional programs, like for instance termination. Type inference then provides a way to check if a program satisfies the property. However we are often more interested in feasible termination, meaning that the program will terminate in time polynomial in the size of its input (Ptime). System F is a polymorphic type system corresponding to second-order intuitionistic logic by the proofs-as-programs correspondence. As a type system it ensures termination, but typeability is undecidable. Linear logic itself can be seen as a type system refining system F, and Light Linear logic, or its variant Dual Light Affine Logic (DLAL), more specifically ensure polynomial time computability. In this talk we present a procedure for typing lambda-terms in DLAL: given a system F typed term as input, it decides whether this type can be refined into a DLAL type. This gives a sufficient criterion for an F-typed term to be not only terminating, but also Ptime terminating. Our algorithm proceeds by collecting and solving disjunctive linear constraints. We show that this procedure can be performed in time polynomial w.r.t. the size of the term. (Joint work with Kazushige Terui)

We present Ludics nets (L-nets) as a game model allowing for concurrent interaction. L-nets have been developed in the context of Ludics (Girard 2001), which can be seen as a game model of sequential interaction, abstracting away from proof-theory. A key aspect of the Ludics programme is to bring closer together syntax and semantics, making the former more abstract and the latter more concrete. In the same line, L-nets can be seen both as parallel strategies (in game-semantical terms) and as an abstract form of proof nets. Strategies capturing sequential interaction, such as Hyland-Ong innocent strategies or Girard's designs, are based on trees; during an interaction (play, or run) the order between the actions is totally specified. L-nets, on the contrary, are based on graphs, and interactions are partial orders; the intuition is that some actions can be performed in parallel (or scheduled in any order), while there are tasks which depend upon other tasks. From a proof-theoretical point of view, a tree strategy can be seen as an abstract sequent calculus derivation, while an L-net appears as an abstract (multiplicative-additive) proof-net. Moreover, as a tree strategy is a particular case of L-net, we have a homogeneous setting, in which it is possible to move between different degrees of sequentiality (both on the syntactical and on the semantical side). This talk is based on work in collaboration with Francois Maurel and with Pierre-Louis Curien.

Logic and lambda-calculus have provided powerful tools for high-level programming and program verification: functional languages, proof assistants, type systems... However quantitative aspects in these settings can be an issue: for instance, programs extracted from formal constructive proofs might have bad time or space complexity bounds. Thus Implicit computational complexity (ICC) aims at controlling complexity without explicitly measuring bounds, but instead by considering restricted calculi: for instance, languages in which all programs run in polynomial time. Linear logic has turned out to be a useful framework for ICC, because it gives a fine-grained control over the use of resources in the proofs-as-programs paradigm. In this talk we will present different approaches from Linear logic to characterize polynomial time computation (coming from the work of Girard, Lafont...). We will discuss the possibility of using such variants of Linear logic as type systems refining system F and ensuring complexity bounds on lambda-calculus terms, in the following way: if a lambda-term is well-typed then it can be executed in polynomial time. (This talk includes joint work with Kazushige Terui)

We give new proofs of soundness (all representable functions on base types lie in certain complexity classes) for Light Affine Logic, Elementary Affine Logic, LFPL and Soft Affine Logic. The proofs are based on a common semantical framework which is merely instantiated in four different ways. The framework consists of an innovative modification of realizability which allows us to use resource-bounded computations as realisers, as opposed to including all Turing computable functions as is usually the case in realizability constructions. For example, all realisers in the model for LFPL are polynomially bounded computations, whence soundness holds by construction of the model. The work then lies in being able to interpret all the required constructs in the model. While being the first entirely semantical proof of polytime soundness for light logics, our proof also provides a notable simplification of the original, already semantical, proof of polytime soundness for LFPL. A new result made possible by the semantic framework is the addition of polymorphism and a modality to LFPL.

A central task of logic in computer science is to provide automated generation of suitable analytic calculi for a wide range of non-classical logics. By analytic calculi we mean calculi in which the proof search proceeds by step-wise decomposition of the formula to be proved. In this talk we show how to automatically generate analytic hypersequent calculi for a large class of logics containing the linearity axiom (lin) (A -> B) \/ (B -> A) starting from existing cut-free sequent calculi for the corresponding logics without (lin).
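The hypersequent rules the method generates go beyond ordinary sequent rules by acting on several components at once. One common formulation of the key such rule, Avron's communication rule, which makes the linearity axiom (lin) derivable (notation assumed, not taken from the talk), is:

```latex
% Avron's communication rule: the two premises exchange parts of their
% contexts across the components of the conclusion, which is what makes
% (A -> B) \/ (B -> A) derivable.
\[
\frac{G \mid \Gamma_1, \Pi_1 \Rightarrow \Delta_1 \qquad
      G \mid \Gamma_2, \Pi_2 \Rightarrow \Delta_2}
     {G \mid \Gamma_1, \Gamma_2 \Rightarrow \Delta_1 \mid \Pi_1, \Pi_2 \Rightarrow \Delta_2}
\;(\mathit{com})
\]
```

The automated generation described in the talk produces rules of this shape from Hilbert axioms such as (lin), starting from a cut-free sequent calculus for the base logic.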

A simple and complete proof of strong normalization for first- and second-order intuitionistic natural deduction, including disjunction, first-order existence and permutative conversions, is given. The paper follows the Tait-Girard approach via computability predicates (reducibility) and saturated sets. Strong normalization is first established for a set of conversions of a new kind, then deduced for the standard conversions. Difficulties arising for disjunction are resolved using a new logic where disjunction is restricted to atomic formulas. (joint work with Grigori Mints (Stanford Univ))

Date: February 12, 2008, 11:00--16:40
Place: National Institute of Informatics, Room 2005 (20th floor)

Program:

- 11:00--11:40 Daisuke Kimura (National Institute of Informatics), Call-by-Value Is Dual to Call-by-Name, Extended
- 11:40--11:50 Tea Break
- 11:50--12:30 Makoto Kanazawa (National Institute of Informatics), A lambda calculus characterization of attributed tree transductions
- 12:30--14:00 Lunch Break
- 14:00--14:40 Makoto Tatsuta (National Institute of Informatics), Types for Hereditary Permutators
- 14:40--15:20 Satoshi Kobayashi (Kyoto Sangyo Univ), Game semantics and translational semantics for limit computable mathematics
- 15:20--15:40 Tea Break
- 15:40--16:40 Stefano Berardi (Torino Univ), A computational interpretation of classical proofs through parallel computations

Abstracts:

Daisuke Kimura (National Institute of Informatics), Call-by-Value Is Dual to Call-by-Name, Extended

We extend Wadler's work that showed duality between call-by-value and call-by-name by giving mutual translations between the lambda mu calculus and the dual calculus. We extend the lambda mu calculus and the dual calculus through two stages. We first add a fixed-point operator and an iteration operator to the call-by-name and call-by-value systems respectively. Secondly, we add recursive types, ?, and types to these systems. The extended duality between call-by-name with recursion and call-by-value with iteration has been suggested by Kakutani, who followed Selinger's category-theoretic approach; we completely follow Wadler's syntactic approach. We give mutual translations between our extended lambda mu calculus and dual calculus by extending Wadler's translations, and also show that our translations form an equational correspondence, which was defined by Sabry and Felleisen. By composing our translations with duality on the dual calculus, we obtain a duality on our extended lambda mu calculus. Wadler's duality on the lambda mu calculus was an involution, and our duality on our extended lambda mu calculus is also an involution.

Makoto Kanazawa (National Institute of Informatics), A lambda calculus characterization of attributed tree transductions

A characterization of the class of attributed tree transductions (an abstraction of attribute grammars) is given in terms of a model where each node label of a tree is replaced by an almost linear lambda-term, i.e., a simply typed lambda term in which every duplicated variable must have an atomic type.

Makoto Tatsuta (National Institute of Informatics), Types for Hereditary Permutators

This talk answers the open problem of finding a type system that characterizes hereditary permutators, which is TLCA open problem 20. First this talk shows that no such type system exists, by showing that the set of hereditary permutators is not recursively enumerable; the set of positive primitive recursive functions is used to prove it. Secondly this talk gives a best-possible solution by providing a countably infinite set of types such that a term has every type in the set if and only if the term is a hereditary permutator.

Satoshi Kobayashi (Kyoto Sangyo Univ), Game semantics and translational semantics for limit computable mathematics

TBA

Stefano Berardi (Torino Univ), A computational interpretation of classical proofs through parallel computations

We argue in favor of the following thesis: there is an intrinsic link between the computations we can unwind from classical proofs and parallel computations. We introduce a semantical model for computations extracted from classical proofs, based on parallel computations and on the concept of learning in the limit. Then we discuss how these ideas can be modeled in the lambda calculus with continuations. The final aim of our research is designing parallel extensions of the existing languages with continuations.

Date: March 7, 2007, 10:20--16:50 Place: National Institute of Informatics, Lecture Room 1 (12th floor) Program: 10:20--11:00 Yohji Akama (Tohoku Univ) Hierarchies of Delayed Consistent Learnings : Some Problems 11:00--11:40 Daisuke Kimura (National Institute of Informatics) Duality Between Call-by-value Reductions and Call-by-name Reductions 11:40--11:50 Tea Break 11:50--12:50 Agata Ciabattoni (TU Wien) Density Elimination and Rational Completeness for First-Order Logics 12:50--14:20 Lunch Break 14:20--15:00 Makoto Tatsuta (National Institute of Informatics) Simple saturated sets for disjunction and second-order 15:00--15:40 Satoshi Kobayashi (Kyoto Sangyo Univ) Game semantics for limit computable mathematics 15:40--15:50 Tea Break 15:50--16:50 Stefano Berardi (Torino Univ) A sequent calculus for monotonic learning Abstracts: ---------------------------------------------------------------------- Yohji Akama (Tohoku Univ), Hierarchies of Delayed Consistent Learnings : Some Problems A consistent learner is required to correctly and completely reflect in its actual hypothesis all data received so far. Though this demand sounds quite plausible, it may lead to the unsolvability of the learning problem. Therefore, in the present paper several variations of consistent learning are introduced and studied. These variations allow a so-called delta-delay relaxing the consistency demand to all but the last delta data. Additionally, we introduce the notion of coherent learning (again with delta-delay) requiring the learner to correctly reflect only the last datum (only the (n-delta)-th datum) seen. Our results are threefold. First, it is shown that all models of coherent learning with delta-delay are exactly as powerful as their corresponding consistent learning models with delta-delay. Second, we provide characterizations for consistent learning with delta-delay in terms of complexity. 
Finally, we establish strict hierarchies for all consistent learning models with delta-delay in dependence on delta. We remark that our series of hierarchical results is related to the difficult problem of Blum and Blum: "is totally consistent learning with arbitrary order a proper subclass of totally consistent learning?" (This talk is based on a joint LNAI paper with Prof. Thomas Zeugmann (Hokkaido Univ).)
----------------------------------------------------------------------
Daisuke Kimura (National Institute of Informatics), Duality Between Call-by-value Reductions and Call-by-name Reductions

Wadler proposed the dual calculus, which corresponds to the classical sequent calculus LK, and studied the relationship between the lambda-mu-calculus and the dual calculus as equational systems in order to explain the duality between call-by-value and call-by-name in a purely syntactic way. Wadler left open the question whether one can obtain similar results by replacing the equations with reductions. To answer this question, we first refine the lambda-mu-calculus as a reduction system by reformulating sum types and omitting problematic reduction rules that are not simulated by reductions of the dual calculus. Secondly, we give translations between the call-by-name lambda-mu-calculus and the call-by-name dual calculus, and show that they preserve the call-by-name reductions. We also show that the compositions of these translations become identity maps up to the call-by-name reductions. We give analogous translations for the call-by-value systems and show that they satisfy similar properties. Thirdly, we introduce translations between the call-by-value lambda-mu-calculus and the call-by-name one by composing the above translations with the duality on the dual calculus. We finally obtain results corresponding to Wadler's, but based on reductions.
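The operational distinction between call-by-value and call-by-name that the duality hinges on can be illustrated with a minimal Python sketch using thunks (an illustration of the evaluation strategies only, unrelated to the dual calculus itself):

```python
def cbv_apply(f, arg_expr):
    # call-by-value: evaluate the argument exactly once, up front
    return f(arg_expr())

def cbn_apply(f, arg_expr):
    # call-by-name: pass the unevaluated thunk; the body decides
    # how often (if at all) the argument is evaluated
    return f(arg_expr)

calls = []
def loud_arg():
    calls.append(1)      # record each evaluation of the argument
    return 21

# A function that uses its argument twice:
double_v = cbv_apply(lambda v: v + v, loud_arg)      # argument evaluated once
double_n = cbn_apply(lambda t: t() + t(), loud_arg)  # argument evaluated twice
# ... and one that ignores its argument:
ignore_n = cbn_apply(lambda t: 0, loud_arg)          # argument never evaluated
```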
----------------------------------------------------------------------
Agata Ciabattoni (TU Wien), Density Elimination and Rational Completeness for First-Order Logics (joint work with G. Metcalfe)

Elimination of the cut rule is a fundamental topic in proof theory, corresponding to the removal of lemmas in proofs. However, the addition and subsequent elimination of other rules can also be of considerable interest. In this talk we consider one such rule of importance in Fuzzy Logic: the so-called ``density rule'' of Takeuti and Titani. Density elimination by substitutions is introduced as a uniform method for removing applications of this rule from proofs in first-order hypersequent calculi. For a large class of calculi, density elimination by this method is guaranteed by known sufficient conditions for cut elimination. Moreover, adding the density rule to any axiomatic extension of a simple first-order logic gives a logic that is rational complete, i.e., complete with respect to linearly and densely ordered algebras: a precursor to showing that it is a fuzzy logic (complete for algebras with a real unit interval lattice reduct). Hence the sufficient conditions for cut elimination guarantee rational completeness for a large class of first-order substructural logics.
----------------------------------------------------------------------
Makoto Tatsuta (National Institute of Informatics), Simple saturated sets for disjunction and second-order existential quantification

This talk gives simple saturated sets for disjunction and second-order existential quantification by using the idea of segments in Prawitz's notion of strong validity. Saturated sets for disjunction are defined by Pi-0-1 comprehension, and those for second-order existential quantification are defined by Sigma-1-1 comprehension. Saturated-set semantics and a simple strong normalization proof are given for the system with disjunction, second-order existential quantification, and their permutative conversions.
This talk also introduces the contraction property for saturated sets, which gives us saturated sets closed under union. This enables us to give saturated-set semantics for the system with union types, second-order existential types, and their permutative conversions, and to prove its strong normalization.
----------------------------------------------------------------------
Satoshi Kobayashi (Kyoto Sangyo Univ), Game semantics for limit computable mathematics

TBA
----------------------------------------------------------------------
Stefano Berardi (Torino Univ), A sequent calculus for monotonic learning (joint work with M. Tatsuta)

We propose an infinitary sequent calculus deriving exactly the propositions we can obtain as the limit of a monotonic learning process. We show that the theorems of our sequent calculus are exactly the theorems of infinitary intuitionistic arithmetic plus Excluded Middle restricted to semi-decidable statements.

This is the third installment of an informal workshop series that started two years ago. (The programs of the last two workshops are available.) The theme of the workshop is the interface between the lambda calculus and formal grammar, with a particular emphasis on the grammar formalism known as Abstract Categorial Grammar or Lambda Grammar. The workshop is open to all interested people. The workshop is part of the NII joint research project Toward a Comprehensive Model of Grammar Based on the Typed Lambda Calculus, headed by Makoto Kanazawa.

Date and Place
Date: January 29, 2007
Place: Lecture Rooms 1 & 2, 20th floor, National Institute of Informatics (National Center of Sciences Building). (Map and access information.) When you enter the National Center of Sciences, tell the guard that you are attending the workshop on the 20th floor.

Program
Abstracts of some of the talks are available here.
9:50 Opening Remarks
10:00 Philippe de Groote (INRIA-Lorraine), Introduction to Abstract Categorial Grammars
10:40 Sylvain Pogodalla (INRIA-Lorraine), Different Views on Modeling Scope Ambiguity with ACGs
11:20 Coffee
11:40 Yoad Winter (Technion/NIAS), A Modular Approach to Intensional Semantics
12:20 Reinhard Muskens (Tilburg University), Sense and Reference in Classical Type Theory
13:00 Lunch
14:20 Philippe de Groote (INRIA-Lorraine), Earley-Like Parsing of Second-Order ACG
15:00 Sylvain Salvati (NII), Towards Parsing Non-Linear ACGs with Intersection Types
15:40 Coffee
16:00 Satoru Kuroda (Gunma Prefectural Women's University), Characterizations and Problems on LOGCFL
16:40 Makoto Kanazawa (NII), Parsing and Generation as Datalog Queries

This is the second installment of an informal workshop series that started last year. (See the program of last year's workshop.) The theme of the workshop is the interface between the lambda calculus and formal grammar, with a particular emphasis on the grammar formalism known as Abstract Categorial Grammar or Lambda Grammar. The workshop is open to all interested people. The workshop is part of the NII joint research project Toward a Comprehensive Model of Grammar Based on the Typed Lambda Calculus, headed by Makoto Kanazawa.

Invited Speakers
Philippe de Groote, INRIA-Lorraine
Sylvain Pogodalla, INRIA-Lorraine

Date and Place
Date: February 2, 2006
Place: Conference Room 1208, 12th floor, National Institute of Informatics (National Center of Sciences Building). (Map and access information.) When you enter the National Center of Sciences, tell the guard that you are attending the workshop on the 12th floor.

Program
9:45 Opening Remarks
10:00 Makoto Kanazawa (NII), Semantics-Driven Learning of Lexicalized Grammars
11:00 Sylvain Pogodalla (INRIA-Lorraine), About Higher-Order in Semantic Representation for TAGs
12:00 Lunch
13:30 Ryo Yoshinaka (University of Tokyo and NII), Simulation of Deterministic Tree Walking Transducers by Two-Dimensional ACGs
14:30 Sylvain Salvati (NII), Encoding Second-Order String ACGs with Deterministic Tree Walking Transducers
15:30 Break
16:00 Kazushige Terui (NII), Intersection Types for Computational Complexity
17:00 Philippe de Groote (INRIA-Lorraine), Type-Theoretic Extensions of Abstract Categorial Grammars

Date: January 23, 2006, 10:20--18:00
Place: National Institute of Informatics, Lecture Room 2 (12th floor)

Program:
10:20--10:50 Ryu Hasegawa (Univ of Tokyo), A Categorical Reduction System on Linear Logic
10:50--11:20 Sylvain Salvati (National Institute of Informatics), Syntactic descriptions: type systems to solve higher order matching equations
11:20--11:30 Tea Break
11:30--12:00 Kazushige Terui (National Institute of Informatics), Intersection types for computational complexity
12:00--12:30 Yukiyoshi Kameyama (Tsukuba Univ), Formalizing Equational Correspondence Proof for CPS Transformation based on Logical Framework
12:30--14:00 Lunch Break
14:00--14:30 Masako Takahashi, Origins of Programs
14:30--15:00 Yohji Akama (Tohoku Univ), Consistent Learning of Virus Signatures from the Behaviour
15:00--15:20 Tea Break
15:20--15:50 Ken-etsu Fujita (Gunma Univ), An embedding from universal types into existential types
15:50--16:20 Ryo Kashima (Tokyo Institute of Technology), Problems on intuitionistic logic
16:20--16:30 Tea Break
16:30--17:00 Mitsu Okada (Keio Univ), TBA
17:00--18:00 Mariangiola Dezani-Ciancaglini (Torino Univ), Intersection Types as Finitary Logical Semantics

Abstracts:
----------------------------------------------------------------------
Ryu Hasegawa (Univ of Tokyo), A Categorical Reduction System on Linear Logic

We give rewriting relations on the defining diagrams of the categorical semantics of linear logic. In this way we regard the free categorical model as a reduction system. We sketch the proof of the weak normalization property, and the ongoing attempts towards the strong normalization property and the Church-Rosser property.
----------------------------------------------------------------------
Sylvain Salvati (National Institute of Informatics), Syntactic descriptions: type systems to solve higher order matching equations

We have developed a formal tool, syntactic descriptions, in order to study a particular kind of syntactic equation: higher order matching in the linear lambda calculus. Syntactic descriptions have given us new theoretical and practical results for this kind of matching problem. First, they have given precise insights into the complexity of the problem. Second, they have helped us develop a new efficient algorithm for solving linear matching equations (equations whose unknowns have linear occurrences). Furthermore, syntactic descriptions seem to be an idea that can be adapted to many kinds of syntactic matching problems. We have therefore tried to adapt their definition to cope with higher order matching in the simply typed lambda calculus. To achieve this we have used ideas from intersection type theory (joint work with Kazushige Terui). This reduces the matching problem to the emptiness problem for particular types. We do not yet know whether this emptiness problem is decidable, nor whether it is equivalent, in terms of decidability, to the higher order matching problem. However, it is decidable at third order, which gives a rather simple proof that third order matching is decidable.
----------------------------------------------------------------------
Kazushige Terui (National Institute of Informatics), Intersection types for computational complexity

It is well known that intersection types provide static characterizations of dynamic properties of lambda terms such as strong/weak normalizability. By using linear/affine versions of intersection types, it is also possible to characterize finer resource-sensitive properties such as normalizability within a given number of reduction steps.
In this talk, we give a precise account of the latter aspect of intersection types, and discuss their possible uses in the complexity analysis of functional programs.
----------------------------------------------------------------------
Yukiyoshi Kameyama (Tsukuba Univ), Formalizing Equational Correspondence Proof for CPS Transformation based on Logical Framework (joint work with Hiroyuki Yoshihara)

In our previous work, we established the soundness and completeness of a CPS transformation for the lambda calculus with a certain kind of control operators, whose proof made use of the type structure of the target language of the transformation. Motivated by the fact that the completeness proof requires a great deal of calculation as well as a careful treatment of variables, we formalize such a proof in the Twelf proof assistant, which is based on a dependently typed metalanguage (a logical framework). An interesting fact we found through this formalization is that the straightforward proof of completeness for the case of the pure lambda calculus does not work, since the inverse of the CPS transformation is not compositional. By extending the calculus with a control operator such as call/cc, we can recover the compositionality of the inverse transformation, and the completeness proof can then be formalized within the metalanguage of the logical framework.
----------------------------------------------------------------------
Masako Takahashi, Origins of Programs

One of the interesting questions in the history of computers is where the notion of computer programs came from, and how. There is no doubt that the machine-language programs described in von Neumann's EDVAC report (1945) belong to the first generation in the history of computer programs, and it is said that in arriving at the idea he was influenced by McCulloch and Pitts' paper on nerve nets (1943) and by Turing's paper on Turing machines (1936/7). What kind of influence, then, did von Neumann receive from these papers?
And what influenced Turing to think of the instruction table of the Turing machine, which one might regard as another origin of computer programs in itself? We would like to discuss these questions in connection with the history of symbolism in mathematics.
----------------------------------------------------------------------
Yohji Akama (Tohoku Univ), Consistent Learning of Virus Signatures from the Behaviour

We observe that learning virus signatures from behaviours is dual to learning an administration policy from configuration sequences, and we formulate both as inductive learning of computable functions. From a practical point of view, various forms of consistency of each guess with the data presented so far should be demanded, but we prove the equivalence among them. We will also discuss refutation learning and related topics along these lines.
----------------------------------------------------------------------
Ken-etsu Fujita (Gunma Univ), An embedding from universal types into existential types

I talk about an embedding from System F into a system with second order existential types, which forms a Galois insertion. The target type system \lambda^\exists can logically be considered an intuitionistic (minimal) logic consisting of negation, conjunction, and the second order existential quantifier, and the embedding is based on a call-by-name CPS translation. This embedding gives an interpretation of polymorphic functions in terms of abstract data types. Moreover, fundamental properties such as normal forms, Church-Rosser, and weak and strong normalization correspond between System F and a core subsystem of \lambda^\exists.
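Both the Kameyama and Fujita abstracts rely on CPS translations; the basic idea can be sketched for arithmetic expressions (a minimal illustration only, not the typed call-by-name translation of the talks):

```python
def eval_direct(e):
    """Direct-style evaluator for ('+', l, r) trees over integers."""
    if isinstance(e, int):
        return e
    _, l, r = e
    return eval_direct(l) + eval_direct(r)

def eval_cps(e, k):
    """CPS evaluator: instead of returning a value, pass it to the
    continuation k, making evaluation order and control flow explicit."""
    if isinstance(e, int):
        return k(e)
    _, l, r = e
    return eval_cps(l, lambda lv: eval_cps(r, lambda rv: k(lv + rv)))

expr = ('+', 1, ('+', 2, 3))
result = eval_cps(expr, lambda v: v)   # run with the identity continuation
```

With the continuation reified as an argument, control operators such as call/cc become directly expressible: they simply capture or discard `k`.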
----------------------------------------------------------------------
Ryo Kashima (Tokyo Institute of Technology), Problems on intuitionistic logic

We investigate some properties of intuitionistic logic, e.g., a sufficient condition for the proposition: if [(\forall x.A(x)) \rightarrow B] is provable in second order propositional intuitionistic logic, then there are quantifier-free instances A(C_1),...,A(C_n) such that [A(C_1) & ... & A(C_n) \rightarrow B] is provable in propositional intuitionistic logic.
----------------------------------------------------------------------
Mitsu Okada (Keio Univ), TBA
----------------------------------------------------------------------
Mariangiola Dezani-Ciancaglini (Torino Univ), Intersection Types as Finitary Logical Semantics

{\em Stone dualities} allow one to describe special classes of topological spaces by means of (possibly finitary) partial orders. Typically, these partial orders are given by the topology, a basis for it, or a subbasis for it. The seminal result is the duality between the category of Stone spaces and that of Boolean algebras (see [john86]). Other very important examples are the descriptions of {\em Scott domains} as {\em information systems} [scot82] and the description of {\em SFP domains} as {\em pre-locales} [abra91]. Intersection types can also be viewed as a restriction of domain theory in logical form [abra91] to the special case of modeling the pure lambda calculus by means of $\omega$-algebraic complete lattices. Intersection types have been used as a powerful tool both for the analysis and for the synthesis of $\lambda$-models, see {\em e.g.} [coppdezahonslong84]. On the one hand, intersection type disciplines provide finitary inductive definitions of the interpretation of $\lambda$-terms in models. On the other hand, they are suggestive of the shape the domain model has to have in order to exhibit certain properties [dezahonsmoto2000].

References:
[abra91] S. Abramsky. Domain theory in logical form. {\em Ann. Pure Appl. Logic}, 51(1-2):1--77, 1991.
[coppdezahonslong84] M. Coppo, M. Dezani-Ciancaglini, F. Honsell, and G. Longo. Extended type structures and filter lambda models. In {\em Logic Colloquium '82}, pages 241--262. North-Holland, Amsterdam, 1984.
[dezahonsmoto2000] M. Dezani-Ciancaglini, F. Honsell, and Y. Motohama. Compositional characterization of $\lambda$-terms using intersection types. {\em Theoret. Comput. Sci.}, 340(3):459--495, 2005.
[john86] P. T. Johnstone. {\em Stone Spaces}. Cambridge University Press, Cambridge, 1986.
[scot82] D. S. Scott. Domains for denotational semantics. In {\em ICALP '82}, volume 140 of {\em LNCS}, pages 577--613. Springer, Berlin, 1982.
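As a concrete instance of the extra expressive power of intersection types (a standard textbook example, not taken from the talk), the self-application $\lambda x.\,x\,x$, which has no simple type, becomes typable once a variable may carry an intersection of types:

```latex
\lambda x.\, x\, x \;:\; (\sigma \wedge (\sigma \to \tau)) \to \tau
```

Here the bound variable is used once at type $\sigma \to \tau$ and once at type $\sigma$; this ability to assign several types to a single subterm is what underlies the finitary characterizations of normalization properties mentioned in the abstracts above.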

This is an informal workshop on semantic, pragmatic, and phonological aspects of focus in natural language. The workshop is open to all interested people. The workshop is an initiative of the NII joint research project "Focus in Natural Language" led by Chris Tancredi.

Date: November 10-11, 2005
Place: Large Conference Room (2208), 22nd floor, National Institute of Informatics (National Center of Sciences Building)

Program
Thursday, November 10
- Toshiyuki Ogihara, Counterfactual Conditionals and Foci Crosslinguistically
- Shinichiro Ishihara, On Embedded Foci
- Kayono Shiobara, Prominence at the Interfaces and Its Effects on Linearization
- Mariko Sugahara, Focus and Intonation: An Interaction among Contrastive Focus and Other Grammatical Factors
- Chris Tancredi, Association with Focus and Discourse Licensing
Friday, November 11
- David Beaver, The Semantics and Pragmatics of Focus Sensitivity
- Kimiko Nakanishi, Focus, Numerals, and Negative Polarity Items in Japanese
- Daniel Buring, Towards a Typology of Focus

This is an informal workshop on the interface between the lambda calculus and formal grammar, with a particular emphasis on the recently proposed grammar formalism known as Abstract Categorial Grammar or Lambda Grammar. The workshop is open to all interested people.

Invited Speakers
Philippe de Groote, INRIA-Lorraine
Sylvain Salvati, INRIA-Lorraine
Reinhard Muskens, Tilburg University

Date: February 18, 2005
Place: Lecture Rooms 1 and 2, 20th floor, National Institute of Informatics (National Center of Sciences Building). (Map and access information: http://www.nii.ac.jp/help-e.html.)

Program
- Reinhard Muskens, Separating Syntax and Combinatorics in Categorial Grammar
- Philippe de Groote, Introduction to Abstract Categorial Grammars
- Sylvain Salvati, Parsing Abstract Categorial Grammars: Complexity and Algorithms
- Ryo Yoshinaka, The Complexity and Generative Capacity of Lexicalized Abstract Categorial Grammars
- Makoto Kanazawa, Abstract Families of Abstract Categorial Languages
- Philippe de Groote, Discourse Representation Theory and Montague Semantics
- Reinhard Muskens, Lambda Grammars and Hyperintensionality