Measuring the Hardness of Stochastic Sampling on Bayesian Networks with Deterministic Causalities: the k-Test

Computer Science – Artificial Intelligence

Scientific paper


Approximate Bayesian inference is NP-hard. Dagum and Luby defined the Local Variance Bound (LVB) to measure the approximation hardness of Bayesian inference on Bayesian networks, assuming the networks model strictly positive joint probability distributions, i.e., zero probabilities are not permitted. This paper introduces the k-test to measure the approximation hardness of inference on Bayesian networks with deterministic causalities in the probability distribution, i.e., when zero conditional probabilities are permitted. Approximation by stochastic sampling is a widely used inference method that is known to suffer from inefficiencies due to sample rejection. The k-test predicts when rejection rates of stochastic sampling on a Bayesian network will be low, modest, or high, or when sampling is intractable.
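The sample-rejection inefficiency described above can be illustrated with a minimal sketch (not taken from the paper; the network, probabilities, and function names below are hypothetical). In a two-node network A → B where P(B=1 | A=0) = 0 is a deterministic causality, forward sampling with evidence B=1 must discard every sample inconsistent with the evidence, and the rejection rate grows as the evidence probability shrinks:

```python
import random

# Hypothetical two-node network A -> B with a deterministic causality:
# P(B=1 | A=0) = 0.  All numbers are illustrative only.
P_A1 = 0.1                        # P(A = 1)
P_B1_GIVEN_A = {0: 0.0, 1: 0.8}   # P(B = 1 | A); the zero entry is deterministic

def rejection_sample(evidence_b, n=100_000, seed=0):
    """Forward-sample A then B; reject samples where B != evidence_b.

    Returns (estimate of P(A=1 | B=evidence_b), observed rejection rate).
    """
    rng = random.Random(seed)
    accepted = accepted_a1 = 0
    for _ in range(n):
        a = 1 if rng.random() < P_A1 else 0
        b = 1 if rng.random() < P_B1_GIVEN_A[a] else 0
        if b == evidence_b:            # keep only evidence-consistent samples
            accepted += 1
            accepted_a1 += a
    rejection_rate = 1 - accepted / n
    estimate = accepted_a1 / accepted if accepted else float("nan")
    return estimate, rejection_rate
```

With evidence B=1, P(B=1) = 0.1 × 0.8 = 0.08 here, so roughly 92% of samples are rejected, and the deterministic zero forces P(A=1 | B=1) = 1 exactly. This is the regime the k-test is designed to flag: as evidence probability approaches zero, rejection rates approach one and sampling becomes intractable.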
