Shannon Revisited: Considering a More Tractable Expression to Measure and Manage Intractability, Uncertainty, Risk, Ignorance, and Entropy
Computer Science – Information Theory
Scientific paper
2010-06-05
Building on Shannon's lead, we consider a more malleable expression for tracking uncertainty, and the states of "knowledge available" vs. "knowledge missing," in order to better practice innovation, improve risk management, and measure the progress of intractable undertakings. Shannon's formula, and its common alternatives (Rényi, Tsallis), register an increase in knowledge whenever two competing choices exchange probability mass, however marginal the exchange. This and other distortions are corrected by anchoring knowledge to a reference challenge; entropy then expresses progress towards meeting that challenge. We introduce an 'interval of interest' outside which all probability changes should be ignored. The resulting formula for Missing Acquirable Relevant Knowledge (MARK) serves as a means to optimize intractable activities involving knowledge acquisition, such as research, development, risk management, and opportunity exploitation.
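To make the critique concrete, below is a minimal Python sketch (not taken from the paper) using the standard Shannon and Rényi entropy formulas. It shows that both measures report a reduction in entropy, i.e. a "gain in knowledge," after a marginal exchange of probability mass between two competing choices, regardless of whether that shift is relevant to any reference challenge. The MARK formula itself is defined in the paper and is not reproduced here.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def renyi_entropy(probs, alpha=2.0):
    """Renyi entropy of order alpha: H_a = log2(sum(p^a)) / (1 - a)."""
    return math.log2(sum(p ** alpha for p in probs)) / (1.0 - alpha)

# Two competing choices exchange a marginal amount of probability mass.
before = [0.50, 0.50]
after = [0.51, 0.49]  # a 0.01 shift between the two options

print(shannon_entropy(before), shannon_entropy(after))  # 1.0000 vs ~0.9997 bits
print(renyi_entropy(before), renyi_entropy(after))      # 1.0000 vs ~0.9994 bits
```

Both values drop after the 0.01 shift, so the measures treat a possibly irrelevant fluctuation as acquired knowledge. Anchoring the measure to a reference challenge, and discarding probability changes that fall outside the interval of interest, is the correction the MARK expression is proposed to provide.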