Abstract

This paper describes a new approach to learning the structure of large Bayesian networks based on blocks obtained by clustering the feature space. The clustering is computed using normalized mutual information, and the subsequent aggregation of blocks is performed with classical structure learning methods, except that their input is compressed information about the combinations of feature values within each block. The approach is validated with Hill-Climbing as the graph enumeration algorithm for two score functions: BIC and MI. In this way, potentially parallelizable block-wise learning can be implemented even for score functions that are considered unsuitable for parallelized learning. The advantage of the approach is evaluated in terms of both runtime and the accuracy of the recovered structures.
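
The abstract outlines a two-stage procedure: features are first grouped into blocks using normalized mutual information, and each block is then handled by a classical structure learner such as Hill-Climbing. Below is a minimal sketch of the clustering step only, assuming discrete data and a recent version of scikit-learn; the function name cluster_features_by_nmi, the use of agglomerative clustering, and the fixed number of blocks are illustrative assumptions, not details taken from the paper.

import numpy as np
from sklearn.metrics import normalized_mutual_info_score
from sklearn.cluster import AgglomerativeClustering

def cluster_features_by_nmi(data: np.ndarray, n_blocks: int) -> list[list[int]]:
    """Group the columns of a discrete data matrix into n_blocks feature blocks.

    data: array of shape (n_samples, n_features) holding discrete feature values.
    Returns a list of blocks, each a list of column indices.
    """
    n_features = data.shape[1]
    # Pairwise normalized mutual information between features.
    nmi = np.zeros((n_features, n_features))
    for i in range(n_features):
        for j in range(i, n_features):
            score = normalized_mutual_info_score(data[:, i], data[:, j])
            nmi[i, j] = nmi[j, i] = score
    # Turn the similarity matrix into a distance matrix and cluster the features.
    distance = 1.0 - nmi
    labels = AgglomerativeClustering(
        n_clusters=n_blocks, metric="precomputed", linkage="average"
    ).fit_predict(distance)
    return [list(np.where(labels == b)[0]) for b in range(n_blocks)]

Each resulting block could then be passed to a classical structure learner (for example, Hill-Climbing with a BIC score), with the blocks processed independently and potentially in parallel, which is the property the abstract emphasizes.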
