Causal inference using the algorithmic Markov condition
Mathematics – Statistics Theory
Scientific paper
2008-04-23
16 figures
Inferring the causal structure that links n observables is usually based on detecting statistical dependences and choosing simple graphs that make the joint measure Markovian. Here we argue why causal inference is also possible when only single observations are present. We develop a theory of how to generate causal graphs that explain similarities between single objects. To this end, we replace the notion of conditional stochastic independence in the causal Markov condition with the vanishing of conditional algorithmic mutual information and describe the corresponding causal inference rules. We explain why a consistent reformulation of causal inference in terms of algorithmic complexity implies a new inference principle that also takes into account the complexity of conditional probability densities, making it possible to select among Markov-equivalent causal graphs. This insight provides a theoretical foundation for a heuristic principle proposed in earlier work. We also discuss how to replace Kolmogorov complexity with decidable complexity criteria. This can be seen as an algorithmic analog of replacing the empirically undecidable question of statistical independence with practical independence tests that are based on implicit or explicit assumptions about the underlying distribution.
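To make the analogy in the abstract concrete, the following LaTeX display sketches it; the notation is ours rather than quoted from the paper, and the paper's precise conditioning conventions and additive constants are suppressed.

% Statistical causal Markov condition: each variable is conditionally
% independent of its non-descendants given its parents, i.e. the Shannon
% conditional mutual information vanishes:
\[
  X_j \perp\!\!\!\perp ND_j \mid PA_j
  \qquad\Longleftrightarrow\qquad
  I(X_j ; ND_j \mid PA_j) = 0 .
\]
% Algorithmic analog for single observations x_1, \dots, x_n: replace Shannon
% information with conditional algorithmic mutual information, defined via
% Kolmogorov complexity K (up to additive constants),
\[
  I(x : y \mid z) := K(x \mid z) + K(y \mid z) - K(x, y \mid z),
\]
% and require it to vanish along the causal DAG:
\[
  I(x_j : nd_j \mid pa_j) \stackrel{+}{=} 0 \quad \text{for every node } j .
\]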
Dominik Janzing
Bernhard Schoelkopf