Physics – Quantum Physics
Scientific paper
2000-11-30
Algorithmic Theories of Everything (Sections 1-5 in: Hierarchies of generalized Kolmogorov complexities and nonenumerable universal measures computable in the limit)
10 theorems, 50 pages, 100 refs, 20000 words. Minor revisions: added references; improved readability
The probability distribution P from which the history of our universe is sampled represents a theory of everything or TOE. We assume P is formally describable. Since most (uncountably many) distributions are not, this imposes a strong inductive bias. We show that P(x) is small for any universe x lacking a short description, and study the spectrum of TOEs spanned by two Ps, one reflecting the most compact constructive descriptions, the other the fastest way of computing everything. The former derives from generalizations of traditional computability, Solomonoff's algorithmic probability, Kolmogorov complexity, and objects more random than Chaitin's Omega; the latter derives from Levin's universal search and a natural resource-oriented postulate: the cumulative prior probability of all x incomputable within time t by this optimal algorithm should be 1/t. Between both Ps we find a universal cumulatively enumerable measure that dominates traditional enumerable measures; any such CEM must assign low probability to any universe lacking a short enumerating program. We derive P-specific consequences for evolving observers, inductive reasoning, quantum physics, philosophy, and the expected duration of our universe.
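
As background for the abstract above, the following sketch recalls the standard (non-generalized) notions it builds on. These are the usual textbook formulations for a fixed universal prefix machine U with program length l(p), not the paper's generalized hierarchies; the symbol time_FAST is our own illustrative notation for the runtime the optimal search algorithm needs to compute x.

% Prefix Kolmogorov complexity of a finite string x (standard definition):
K(x) \;=\; \min\{\, l(p) : U(p) = x \,\}

% Solomonoff's algorithmic prior, an enumerable semimeasure; it satisfies
% M(x) \ge c \cdot 2^{-K(x)} for some constant c > 0 independent of x:
M(x) \;=\; \sum_{p \,:\, U(p) = x*} 2^{-l(p)}

% Restatement of the abstract's resource-oriented postulate: the total prior
% weight of universes not computable within time t decays like 1/t:
\sum_{x \,:\, \mathrm{time}_{\mathrm{FAST}}(x) > t} P(x) \;\propto\; \frac{1}{t}

Under the last postulate, universe histories that are expensive to compute even by the fastest method receive correspondingly little prior weight, which is the intuition behind the "fast" end of the spectrum of TOEs described in the abstract.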