Link-based classification (LBC) is the problem of predicting the class of a target entity given the attributes of the entities linked to it. A natural approach to LBC is to upgrade standard classification methods from single-table learning. In this thesis, we propose two algorithms that upgrade decision tree and Bayes net learners for LBC. One issue that makes LBC harder than single-table learning is the large number of different types of dependencies a model may have to consider. A principled way to manage this complexity of correlation types is to consider model classes with explicitly stated independence assumptions. We define two independence assumptions, each weaker than the relational naive Bayes assumption, and use them to formulate our classification formulas. We evaluate our models on three real-world datasets. Experimental results indicate that our models are fast and achieve better performance than a variety of relational classifiers.
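To make the style of independence assumption concrete, the following is a minimal sketch of a link-based classifier under the (stronger) relational naive Bayes assumption that the abstract's proposed assumptions weaken: each attribute value of a linked entity is treated as independent given the target's class. The data format, function names, and Laplace smoothing here are illustrative assumptions, not the thesis's actual formulas.

```python
import math
from collections import defaultdict

def train(examples):
    """Count statistics from (class_label, [attribute values of linked entities]) pairs.

    Illustrative sketch only -- not the thesis's method.
    """
    class_counts = defaultdict(int)
    attr_counts = defaultdict(lambda: defaultdict(int))
    for label, linked in examples:
        class_counts[label] += 1
        for a in linked:
            attr_counts[label][a] += 1
    return class_counts, attr_counts

def predict(model, linked):
    """Pick the class maximizing log prior + sum of per-link log likelihoods,
    assuming linked attribute values are independent given the class."""
    class_counts, attr_counts = model
    total = sum(class_counts.values())
    vocab = {a for d in attr_counts.values() for a in d}
    best, best_score = None, float("-inf")
    for label, c in class_counts.items():
        score = math.log(c / total)  # log prior
        n = sum(attr_counts[label].values())
        for a in linked:
            # Laplace-smoothed conditional probability of each linked value
            score += math.log((attr_counts[label][a] + 1) / (n + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best
```

A weaker assumption, as the abstract suggests, would relax this full factorization, for instance by allowing some dependencies among linked attributes.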
Copyright is held by the author.
The author granted permission for the file to be printed and for the text to be copied and pasted.
Thesis advisor: Schulte, Oliver