Directed Models For Statistical Relational Learning

Author: 
Date created: 
2012-10-12
Identifier: 
etd7554
Keywords: 
Bayesian Networks
Markov Logic Networks
Relational Data
Structure Learning
Abstract: 

Statistical Relational Learning is a branch of machine learning that aims to model a joint distribution over relational data. Relational data consist of objects of different types, where each type is characterized by its own set of attributes. The link structure of relational data lets objects carry additional information through their relationships and enables a model to capture correlations among objects and among the relationships themselves. This dissertation focuses on learning graphical models for such data. Learning graphical models for relational data is much more challenging than learning graphical models for propositional data: relational data, unlike propositional data, are not independent and identically distributed and cannot be represented in a single table. Relational data can be modeled as a graph, where objects are the nodes and relationships between the objects are the edges. This graph may contain multiple edges between two nodes, because objects can stand in several different relationships with each other. The existence of multiple paths of different lengths between objects makes the learning procedure much harder than learning from a single table. We use a lattice search approach with lifted learning to address this multiple-path problem. We focus on learning the structure of Markov Logic Networks, which are a first-order extension of Markov Random Fields. Markov Logic Networks are a prominent undirected statistical relational model that has achieved impressive performance on a variety of statistical relational learning tasks. Our approach combines the scalability and efficiency of learning in directed relational models with the inference power and theoretical foundations of undirected relational models. 
We utilize an extension of Bayesian networks based on first-order logic to learn class-level, or first-order, dependencies, which model the general database statistics over the attributes of linked objects and over their links. We then convert this model to a Markov Logic Network using the standard moralization procedure. Experimental results indicate that our methods are two orders of magnitude faster than state-of-the-art Markov Logic Network learners, with superior or competitive predictive performance.
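The moralization step mentioned above can be illustrated on an ordinary graph. This is a minimal sketch, not the dissertation's implementation: for each node, the directed parent-child edges are kept (with direction dropped) and all co-parents of a node are connected ("married"); the node names are hypothetical.

```python
from itertools import combinations

def moralize(parents):
    """Return the undirected edge set of the moral graph.

    `parents` maps each node to the list of its parents. The moral
    graph keeps every parent-child edge as an undirected edge and
    adds an edge between every pair of co-parents of the same child.
    """
    edges = set()
    for child, pas in parents.items():
        # keep each directed edge, dropping its direction
        for p in pas:
            edges.add(frozenset((p, child)))
        # "marry" all co-parents of the same child
        for p, q in combinations(pas, 2):
            edges.add(frozenset((p, q)))
    return edges

# toy v-structure A -> C <- B: moralization adds the edge A-B
moral = moralize({"A": [], "B": [], "C": ["A", "B"]})
```

In the context of this work, the same operation is applied at the class level: each family of a first-order Bayesian network node and its parents becomes a clique, whose conditional probability table supplies the weighted clauses of the resulting Markov Logic Network.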

Document type: 
Thesis
Rights: 
Copyright remains with the author. The author granted permission for the file to be printed and for the text to be copied and pasted.
File(s): 
Senior supervisor: 
Oliver Schulte
Department: 
Applied Sciences: School of Computing Science
Thesis type: 
(Thesis) Ph.D.