
Finding the best complete lattice of discrete Bayesian networks for prediction

Disclosure Number: IPCOM000244513D
Publication Date: 2015-Dec-17
Document File: 7 page(s) / 227K

Publishing Venue

The Prior Art Database


When Bayesian networks are used for prediction, Bayesian theory calls for averaging over all possible models rather than selecting a single best model. Earlier studies have demonstrated how one can efficiently use (nearly) all the subnetworks of a single good network for prediction. That work, however, did not attempt to search specifically for a network whose subnetworks would yield good predictions. We demonstrate how to find such a best set of nested networks as fast as finding a single best network, using an enhanced version of the dynamic programming approach mentioned earlier. We also show that doing so is better than simply finding a single best network and then using its subnetworks for prediction.

A previous algorithm uses dynamic programming to find the single best Bayesian network structure when the goodness criterion satisfies the so-called decomposability requirement, which allows the goodness of the network to be expressed as a sum of terms, one per node of the network structure. Marginalizing over substructures also satisfies this decomposability criterion, but the individual terms may be very expensive to compute. We show that they can be computed in reasonable time using the discrete zeta transform.
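The discrete zeta transform mentioned above is the standard subset-sum transform over the lattice of subsets. As an illustrative sketch (not the disclosure's actual code), the following computes, for every candidate parent set S, the sum of a per-subset weight over all subsets T of S, in O(n·2^n) time instead of the naive O(3^n); the weight array `w` is a made-up stand-in for whatever per-substructure terms are being accumulated.

```python
def zeta_transform(w, n):
    """Subset-sum (zeta) transform over an n-element ground set.

    w: list of length 2**n, indexed by subset bitmask.
    Returns z with z[S] = sum of w[T] over all T that are subsets of S.
    """
    z = list(w)
    for i in range(n):                     # fold in one ground-set element at a time
        bit = 1 << i
        for mask in range(1 << n):
            if mask & bit:                 # a set containing element i also
                z[mask] += z[mask ^ bit]   # collects the sums from the set without it
    return z

# Tiny usage example, n = 2, subsets {} -> 1, {0} -> 2, {1} -> 3, {0,1} -> 4:
z = zeta_transform([1, 2, 3, 4], 2)
# z[{0,1}] = 1 + 2 + 3 + 4 = 10
```

The per-element folding is what brings the cost down: each of the n passes touches every mask once, rather than enumerating all subset pairs directly.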

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 12% of the total text.




Co-occurrences of discrete events, or more generally the value configurations of many finite discrete factors, can be conveniently modeled using discrete Bayesian networks. Such models are often used to compute conditional probabilities of some dimensions of the vector, given that we have values for some observed subset of its components. An example of such a computation would be inferring the topics of study relevant to a particular document, given that we have found out that certain topics have been discussed in it.
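The kind of computation described here, conditioning a joint distribution defined by a Bayesian network on observed components, can be illustrated with a toy network over three binary variables (all tables below are invented for illustration): the joint factors into per-node conditional tables, and a conditional probability is obtained by summing the joint over the unobserved variables.

```python
# Toy network A -> B, A -> C with hypothetical conditional probability tables.
p_a = {0: 0.6, 1: 0.4}                              # P(A)
p_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.3, 1: 0.7}}    # P(B | A), outer key is A
p_c = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.5, 1: 0.5}}    # P(C | A), outer key is A

def joint(a, b, c):
    """Joint probability: product of the per-node tables."""
    return p_a[a] * p_b[a][b] * p_c[a][c]

def prob_b_given_c(b, c):
    """P(B = b | C = c), marginalizing out the unobserved A."""
    num = sum(joint(a, b, c) for a in (0, 1))
    den = sum(joint(a, bb, c) for a in (0, 1) for bb in (0, 1))
    return num / den
```

Brute-force enumeration like this is exponential in the number of variables; it is only meant to make the quantity being computed concrete.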

In machine learning it is customary to try to learn the structure of these models from the data. Finding the best structure is an NP-hard problem for all the commonly used goodness criteria, but small instances of the problem (with up to a few dozen variables/nodes) can be solved using dynamic programming. For larger instances of the problem, heuristic search algorithms are commonly used.
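The dynamic-programming idea for a decomposable score can be sketched as follows. Because the score of a network is the sum of per-node local scores, the best network over a variable set W can be found by choosing its last node in a topological order (a "sink") v and recursing on W without v. This is a simplified sketch in the spirit of that approach, not the disclosure's algorithm: the `local` function, which gives the best local score of a node when its parents must be drawn from a given candidate set, is assumed to be supplied (in practice it is itself precomputed by another pass over parent sets).

```python
from itertools import combinations

def best_network_score(n, local):
    """Best decomposable score of a Bayesian network over n variables.

    local(v, mask) -> best local score of node v when its parents
    must be drawn from the variables in the bitmask `mask`.
    Runs over all 2**n variable subsets in order of increasing size.
    """
    best = {0: 0.0}                        # empty network scores zero
    for size in range(1, n + 1):
        for vs in combinations(range(n), size):
            W = sum(1 << v for v in vs)
            # choose the sink v of the subnetwork over W
            best[W] = max(best[W ^ (1 << v)] + local(v, W ^ (1 << v))
                          for v in vs)
    return best[(1 << n) - 1]

# Usage with a made-up local score that rewards larger candidate parent sets:
score = best_network_score(3, lambda v, mask: bin(mask).count("1"))
# sinks are chosen one by one, earning 0 + 1 + 2 = 3
```

Recovering the actual best network (rather than just its score) only requires remembering which sink attained each maximum.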
