
Learning Reasoning Strategies in End-to-End Differentiable Proving

International Conference on Machine Learning (ICML)


Abstract

Attempts to render deep learning models interpretable, data-efficient, and robust have seen some success through hybridisation with rule-based systems such as Neural Theorem Provers (NTPs). These neuro-symbolic reasoning models can induce interpretable rules and learn representations from data via back-propagation, while providing logical explanations for their predictions. However, they are restricted by their computational complexity: they must consider all possible proof paths for explaining a goal, rendering them unfit for large-scale applications. We present Conditional Theorem Provers (CTPs), an extension to NTPs that learns an optimal rule selection strategy via gradient-based optimisation. We show that CTPs are scalable and yield state-of-the-art results on the CLUTRR dataset, which tests the systematic generalisation of neural models by training them to reason over smaller graphs and evaluating them on larger ones. Finally, CTPs achieve better link prediction results on standard benchmarks than other neuro-symbolic reasoning models, while retaining their explainability properties.
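The key idea can be sketched in a few lines. Where an NTP must unify a goal against every rule in the knowledge base, a CTP conditions on the goal and *generates* the (embeddings of the) rules worth trying via a trainable select module. The sketch below uses a simple linear map as the select module; all names, sizes, and predicate examples are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
EMB = 8  # embedding size (hypothetical)

class LinearSelect:
    """Goal-conditioned rule generator: maps the embedding of a goal
    predicate to embeddings of the body atoms of a generated rule.
    A linear map is one simple choice of select module; the class and
    parameter names here are illustrative."""

    def __init__(self, emb_size, n_body_atoms=2):
        # One weight matrix per body atom of the generated rule.
        self.weights = [rng.normal(size=(emb_size, emb_size))
                        for _ in range(n_body_atoms)]

    def __call__(self, goal_emb):
        # Instead of enumerating all rules (as an NTP would), produce
        # the body-atom embeddings directly from the goal embedding.
        return [goal_emb @ w for w in self.weights]

# Toy goal: the embedding of the predicate in, say, grandpaOf(X, Y).
goal = rng.normal(size=EMB)
select = LinearSelect(EMB)
body = select(goal)  # e.g. embeddings standing in for fatherOf, parentOf

assert len(body) == 2
assert body[0].shape == (EMB,)
```

Because the select module is differentiable, it can be trained end-to-end with the rest of the prover, and at proof time only the generated rules are explored rather than the whole rule set.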

Related Publications


AISTATS - April 13, 2021

Continual Learning using a Bayesian Nonparametric Dictionary of Weight Factors

Nikhil Mehta, Kevin J Liang, Vinay K Verma, Lawrence Carin

NeurIPS - December 6, 2020

Improved Sample Complexity for Incremental Autonomous Exploration in MDPs

Jean Tarbouriech, Matteo Pirotta, Michal Valko, Alessandro Lazaric

NeurIPS - December 7, 2020

Labelling unlabelled videos from scratch with multi-modal self-supervision

Yuki M. Asano, Mandela Patrick, Christian Rupprecht, Andrea Vedaldi

NeurIPS - December 7, 2020

Adversarial Example Games

Avishek Joey Bose, Gauthier Gidel, Hugo Berard, Andre Cianflone, Pascal Vincent, Simon Lacoste-Julien, William L. Hamilton
