Experimental design to maximize information
Computer Science – Information Theory
Scientific paper, May 2001
adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2001aipc..568..192s&link_type=abstract
Bayesian Inference and Maximum Entropy Methods in Science and Engineering: 20th International Workshop, AIP Conference Proceedings
Keywords: Data Analysis: Algorithms and Implementation; Data Management; Design of Experiments; Information Theory and Communication Theory
This paper considers different methods for measuring the gain of information that an experiment provides about the parameters of a statistical model. The approach is Bayesian and relies on the assumption that information about model parameters is represented by their probability distribution, so that a measure of information is any summary of the probability distribution satisfying some sensible assumptions. Robustness issues are considered and investigated in some examples using a new family of information measures that has the log-score and the quadratic score as special cases.
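The abstract leaves the family of information measures unstated. As a minimal sketch of the general idea (not the paper's actual definitions), the expected information gain of an experiment can be computed as the prior uncertainty minus the expected posterior uncertainty, where the uncertainty functional is induced by a proper scoring rule: the log-score induces Shannon entropy (so the gain is the mutual information between parameter and outcome), while the quadratic score induces a Brier-type uncertainty. All distributions below are hypothetical illustrative numbers.

```python
import math

# Hedged illustration (not the paper's formulas): expected information gain
# of a binary experiment about a binary parameter theta, measured as the
# prior uncertainty minus the expected posterior uncertainty, for two
# scoring-rule-based uncertainty functionals.

prior = {0: 0.5, 1: 0.5}                    # p(theta), hypothetical
likelihood = {0: {0: 0.8, 1: 0.2},          # p(y | theta), hypothetical
              1: {0: 0.3, 1: 0.7}}

def marginal(y):
    # p(y) = sum_theta p(theta) p(y | theta)
    return sum(prior[t] * likelihood[t][y] for t in prior)

def posterior(t, y):
    # Bayes' rule: p(theta | y)
    return prior[t] * likelihood[t][y] / marginal(y)

def log_uncertainty(dist):
    # Shannon entropy: uncertainty associated with the log-score (in nats).
    return -sum(p * math.log(p) for p in dist.values() if p > 0)

def quad_uncertainty(dist):
    # Uncertainty associated with the quadratic (Brier) score: 1 - sum p^2.
    return 1.0 - sum(p * p for p in dist.values())

def expected_gain(uncertainty):
    # Prior uncertainty minus the expected posterior uncertainty over outcomes.
    post = sum(marginal(y) * uncertainty({t: posterior(t, y) for t in prior})
               for y in (0, 1))
    return uncertainty(prior) - post

# Under the log-score, the expected gain equals the mutual information I(theta; y).
print(f"log-score gain:       {expected_gain(log_uncertainty):.4f} nats")
print(f"quadratic-score gain: {expected_gain(quad_uncertainty):.4f}")
```

For a proper scoring rule the expected gain is always non-negative, which is one of the "sensible assumptions" one would want a measure of information to satisfy.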
Sebastiani, P. and Wynn, H. P.