Computer Science – Information Theory
Scientific paper
2010-10-27
18 pages, 4 figures
A directed acyclic graph (DAG) partially represents the conditional independence structure among observations of a system if the local Markov condition holds, that is, if every variable is independent of its non-descendants given its parents. In general, there is a whole class of DAGs that represents a given set of conditional independence relations. We are interested in properties of this class that can be derived from observations of a subsystem only. To this end, we prove an information-theoretic inequality that allows for the inference of common ancestors of observed parts in any DAG representing some unknown larger system. More explicitly, we show that a large amount of dependence, in terms of mutual information, among the observations implies the existence of a common ancestor that distributes this information. Within the causal interpretation of DAGs, our result can be seen as a quantitative extension of Reichenbach's Principle of Common Cause to more than two variables. Our conclusions are also valid for non-probabilistic observations such as binary strings, since we state the proof for an axiomatized notion of mutual information that includes the stochastic as well as the algorithmic version.
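To make the central claim concrete, here is a minimal sketch (not from the paper) of the common-cause phenomenon: a hidden ancestor Z is copied to three observed variables with independent bit-flip noise, and the empirically estimated pairwise mutual information is strictly positive even though the observed variables never interact directly. The model, noise level, and sample size are illustrative assumptions.

```python
import math
import random

def mutual_information(pairs):
    """Estimate I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    joint, px, py = {}, {}, {}
    for x, y in pairs:
        joint[(x, y)] = joint.get((x, y), 0) + 1
        px[x] = px.get(x, 0) + 1
        py[y] = py.get(y, 0) + 1
    mi = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n
        # I(X;Y) = sum p(x,y) log2( p(x,y) / (p(x) p(y)) )
        mi += p_xy * math.log2(p_xy * n * n / (px[x] * py[y]))
    return mi

# Hypothetical toy model: one hidden common ancestor Z, three noisy copies.
random.seed(0)

def noisy_copy(z, flip=0.1):
    """Flip the bit z with probability `flip` (a binary symmetric channel)."""
    return z ^ 1 if random.random() < flip else z

samples = []
for _ in range(50_000):
    z = random.randint(0, 1)                       # hidden common ancestor
    samples.append(tuple(noisy_copy(z) for _ in range(3)))

# Dependence among the observed parts, induced solely by Z.
i12 = mutual_information([(s[0], s[1]) for s in samples])
print(f"I(X1;X2) ~ {i12:.3f} bits")
```

Analytically, two noisy copies through independent bit-flip channels with flip probability 0.1 disagree with probability 2(0.1)(0.9) = 0.18, giving I(X1;X2) = 1 - H(0.18) ≈ 0.32 bits; the estimate above should land near that value. The paper's inequality runs in the converse direction: it bounds from below the capacity a common ancestor must have to account for the observed dependence.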
Nihat Ay
Bastian Steudel
Information-theoretic inference of common ancestors