ILP NII-Satellite Meeting
organized by
Inoue Laboratory


August 24 (Monday), 14:30 - 17:30


Venue: NII
Room: #1901 (19th floor).


Invited speakers:
Fabrizio Riguzzi (ILP 2012 co-chair),
Gerson Zaverucha (ILP 2013 co-chair),
Yoshitaka Yamamoto (ILP 2010 Best Paper Award winner)


14:30 Inference and Learning for Probabilistic Logics Based on the Distribution Semantics.
Fabrizio Riguzzi (Associate Professor, University of Ferrara)

This talk will discuss recent results achieved at the University of Ferrara in inference and learning for probabilistic logic programs under the distribution semantics. After an overview of inference, I will present approaches for learning the parameters of programs. Then I will illustrate various systems for learning both the parameters and the structure at the same time. Since learning is particularly costly, techniques for scaling the systems will be discussed. The need to represent uncertainty is also felt in description logics. The talk will present the DISPONTE semantics for probabilistic description logics, which is based on the distribution semantics. I will also introduce inference and learning systems for DISPONTE.
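For readers unfamiliar with the distribution semantics, the core idea can be sketched in a few lines. This is a minimal illustration, not any of the Ferrara systems: each probabilistic fact is independently included in or excluded from a possible world, and the probability of a query is the sum of the probabilities of the worlds in which the query succeeds. The facts and query below are hypothetical examples.

```python
from itertools import product

# Hypothetical probabilistic facts: name -> probability of being true.
prob_facts = {"heads(c1)": 0.5, "heads(c2)": 0.6}

def query_holds(world):
    # Example query: at least one of the two coins lands heads.
    return world["heads(c1)"] or world["heads(c2)"]

def query_probability(prob_facts, query_holds):
    """Sum the probabilities of all worlds in which the query succeeds."""
    total = 0.0
    names = list(prob_facts)
    for bits in product([True, False], repeat=len(names)):
        world = dict(zip(names, bits))
        # A world's probability is the product over independent choices.
        p = 1.0
        for name, included in world.items():
            p *= prob_facts[name] if included else 1.0 - prob_facts[name]
        if query_holds(world):
            total += p
    return total

print(query_probability(prob_facts, query_holds))  # 1 - 0.5*0.4 = 0.8
```

Enumerating worlds is exponential in the number of probabilistic facts, which is why practical inference systems for this semantics rely on knowledge compilation and other optimizations rather than explicit enumeration.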
15:30 Fast Relational Learning Using Bottom Clause Propositionalization with Artificial Neural Networks.
Gerson Zaverucha (Professor, Federal University of Rio de Janeiro)

Relational learning can be described as the task of learning first-order logic rules from examples. Inductive Logic Programming (ILP) performs relational learning either directly, by manipulating first-order rules, or through propositionalization, which translates the relational task into an attribute-value learning task by representing subsets of relations as features. In this talk, we first review a fast method and system for relational learning based on a novel propositionalization called Bottom Clause Propositionalization (BCP). Bottom clauses are boundaries in the hypothesis search space used by the ILP systems Progol and Aleph. Bottom clauses carry semantic meaning and can be mapped directly onto numerical vectors, simplifying the feature extraction process. We have integrated BCP with a well-known neural-symbolic system, C-IL2P, to perform learning from numerical vectors. C-IL2P uses background knowledge in the form of propositional logic programs to build a neural network. The integrated system, which we call CILP++, handles first-order logic knowledge and is available for download from Sourceforge. We have evaluated CILP++ on seven ILP datasets, comparing results with Aleph and a well-known propositionalization method, RSD. The results show that CILP++ can achieve accuracy comparable to Aleph while being generally faster. BCP achieved a statistically significant improvement in accuracy in comparison with RSD when running with a neural network, but BCP and RSD perform similarly when running with C4.5.
(Joint work with Manoel Franca and Artur Garcez)
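The core encoding step behind BCP can be illustrated with a toy sketch. This is not the CILP++ implementation: the body literals of a bottom clause define a fixed feature space, and each example is mapped to a binary vector marking which of those literals it satisfies. The literals below are hypothetical.

```python
# Hypothetical bottom-clause body literals, used as the feature space.
bottom_body = ["mother(A,B)", "parent(A,B)", "female(A)", "male(A)"]

def bcp_vector(example_literals, bottom_body):
    """Encode an example (the set of bottom-clause literals it satisfies)
    as a binary feature vector over the bottom clause's body literals."""
    return [1 if lit in example_literals else 0 for lit in bottom_body]

# An example satisfying mother/2, parent/2 and female/1:
vec = bcp_vector({"mother(A,B)", "parent(A,B)", "female(A)"}, bottom_body)
print(vec)  # [1, 1, 1, 0]
```

The resulting vectors can be fed to any attribute-value learner, which is how the talk's pipeline connects bottom clauses to neural network training.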
16:30 Resource-Oriented Online Approach for Itemset Mining and Hypothesis Finding.
Yoshitaka Yamamoto (Assistant Professor, University of Yamanashi / JST Presto)

There is a common challenge involved in itemset mining and hypothesis finding. The former task is to find itemsets that frequently occur in a transactional database, while the latter is to find clauses that compactly subsume a bottom theory. Both tasks must cope with a combinatorial number of candidate solutions (i.e., itemsets and clauses). In this talk, we present our recent work addressing this combinatorial explosion problem in frequent itemset mining, which enables any bursty stream of transactions to be processed incrementally with fixed memory consumption. We then consider the possibility of embedding this "resource-oriented" online approach into the task of hypothesis finding as well.
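The "fixed memory consumption" idea can be illustrated with a classical bounded-counter summary. This is a hedged sketch of the general technique (a Misra-Gries frequent-items summary over single items), not the authors' algorithm for itemsets: the stream is processed with at most k counters, so memory stays bounded regardless of how bursty the stream is.

```python
def misra_gries(stream, k):
    """Approximate frequent-item counting with at most k counters."""
    counters = {}
    for item in stream:
        if item in counters:
            counters[item] += 1
        elif len(counters) < k:
            counters[item] = 1
        else:
            # No free counter: decrement all, dropping those that hit zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

stream = ["a", "b", "a", "c", "a", "b", "a"]
summary = misra_gries(stream, k=2)
print(summary)  # {'a': 3, 'b': 1}
```

Any item occurring more than n/(k+1) times in a stream of length n is guaranteed to survive in the summary, which is the kind of guarantee that makes fixed-memory online mining feasible.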

Links: ILP 2015



Webmaster: Tony Ribeiro (E-mail tony_ribeiro at