Learning and Reasoning with Logic Tensor Networks

20.12.2017, 17:00 - 19:00
Lecturer

Luciano Serafini

Data and Knowledge Management Research Unit, Fondazione Bruno Kessler, Trento, Italy

Following the recent success of deep learning, attention has turned to neural artificial intelligence (AI) systems capable of harnessing knowledge as well as large amounts of data. Neural-symbolic integration has sought for many years to combine symbolic AI with neural computation, aiming at more versatile and explainable learning systems. Recently, constraints have been shown to offer a unifying theoretical framework for learning and reasoning, and constraint-based neural-symbolic computing (CBNSC) offers a methodology for unifying knowledge representation and machine learning.

The seminar introduces the theory and practice of a recent computational implementation of CBNSC called Logic Tensor Networks (LTNs). LTN is a logic-based formalism defined on a first-order language, where formulas have fuzzy semantics and terms are interpreted as feature vectors of real numbers. LTN allows a well-founded integration of deductive reasoning on a knowledge base with efficient data-driven relational machine learning, e.g. using tensor networks. LTNs enable relational knowledge to be infused into and distilled from deep networks, thus constraining data-driven learning with background knowledge and allowing deep networks to be interrogated for explainability.
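To make the grounding of logical symbols concrete, the sketch below shows, in plain Python/NumPy, how a unary predicate can be grounded as a tensor network mapping a real-valued feature vector to a truth degree in [0, 1], and how fuzzy connectives combine such degrees. It is a minimal illustration assuming the tensor-network parameterisation and the product t-norm described in the LTN literature; the names used here (Predicate, And, Implies, Smokes, Cancer) are illustrative and not part of any official LTN API.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class Predicate:
    # Grounds a unary predicate P as a tensor network:
    #   G(P)(v) = sigmoid(u . tanh(v^T W v + V v + b)),
    # yielding one truth degree in [0, 1] per feature vector v.
    def __init__(self, dim, k=4, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(k, dim, dim))  # k bilinear slices
        self.V = rng.normal(scale=0.1, size=(k, dim))       # linear part
        self.b = np.zeros(k)                                # bias
        self.u = rng.normal(scale=0.1, size=k)              # output weights

    def __call__(self, v):
        bilinear = np.einsum('i,kij,j->k', v, self.W, v)
        return sigmoid(self.u @ np.tanh(bilinear + self.V @ v + self.b))

# Fuzzy connectives (here: product t-norm and Reichenbach implication).
def And(a, b):
    return a * b

def Implies(a, b):
    return 1.0 - a + a * b

# Example: degree to which "Smokes(x) -> Cancer(x)" holds for a data point x.
Smokes, Cancer = Predicate(dim=5, seed=1), Predicate(dim=5, seed=2)
x = np.array([0.2, -1.0, 0.5, 0.0, 0.3])
print(Implies(Smokes(x), Cancer(x)))  # a truth degree in [0, 1]

Because every operation in such a grounding is differentiable, learning amounts to adjusting the parameters W, V, b, u by gradient descent so that the aggregated truth degree of all formulas in the knowledge base is maximised, which is how background knowledge acts as a constraint on data-driven learning.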