Bayesian networks are a useful tool for representing probabilistic dependencies among several variables. This thesis presents a hybrid search strategy for structure learning in Bayesian networks, whose structure is a directed acyclic graph. The general strategy performs a local search that meets the following criteria: (1) the Markov blankets in the model are consistent with dependency information obtained from statistical tests; (2) the number of edges is minimized subject to the first constraint; (3) a given score function is maximized subject to those constraints. The strategy is adapted and optimized for learning structures for both discrete and continuous networks. Both algorithms are discussed and tested empirically, on synthetically generated structures and on real networks. We show that adding dependency constraints can improve the quality of the learned models. Furthermore, unlike purely structural strategies, our hybrid method is robust enough to output high-quality models even when the available data are sparse.
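The three criteria above can be illustrated with a toy sketch (this is an illustrative simplification, not the thesis's actual algorithm): a hypothetical `dependent_pairs` set stands in for the outcome of statistical independence tests, `edge_score` stands in for the score function, and a greedy hill-climb adds only admissible edges, penalizing each edge so the search also favors sparse graphs.

```python
from itertools import permutations

def creates_cycle(edges, u, v):
    # Adding u->v closes a cycle iff u is already reachable from v.
    stack, seen = [v], set()
    while stack:
        n = stack.pop()
        if n == u:
            return True
        if n in seen:
            continue
        seen.add(n)
        stack.extend(w for (p, w) in edges if p == n)
    return False

def hybrid_search(nodes, dependent_pairs, edge_score, penalty=0.1):
    """Greedy hill-climbing over DAGs (toy sketch).

    Constraint: an edge u->v is admissible only if the pair {u, v}
    was found dependent by the (hypothetical) statistical tests.
    Among admissible acyclic moves, add the edge with the largest
    penalized score gain; stop when no move improves the score,
    which keeps the edge count low.
    """
    edges = set()
    improved = True
    while improved:
        improved = False
        best, best_gain = None, 0.0
        for u, v in permutations(nodes, 2):
            if (u, v) in edges or frozenset((u, v)) not in dependent_pairs:
                continue
            if creates_cycle(edges, u, v):
                continue
            gain = edge_score.get((u, v), 0.0) - penalty
            if gain > best_gain:
                best, best_gain = (u, v), gain
        if best is not None:
            edges.add(best)
            improved = True
    return edges
```

For example, with nodes A, B, C, where only {A, B} and {B, C} test as dependent, the search never adds an A-C edge even if its score is high, because the dependency constraint rules it out.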
Copyright is held by the author.