Upper and lower bounds for the mutual information in dynamical networks
Nonlinear Sciences – Chaotic Dynamics
Scientific paper
2011-04-18
We derive equations for upper and lower bounds on the mutual information per unit of time (MIR), the rate of information exchanged between two nodes (or two groups of nodes) in a dynamical network, in terms of Lyapunov exponents or expansion rates rather than probabilities. Because no probabilities need to be estimated, these equations provide a simple way to determine whether two nodes are information-correlated, and they can be conveniently used to study the relationship between structure and function in dynamical networks. The derivation of these bounds employs the same ideas Ruelle used to show that the sum of the positive Lyapunov exponents of a dynamical system is an upper bound for its Kolmogorov-Sinai entropy. If the equations of motion of the network are known, the upper and lower bounds for the MIR can be calculated analytically or semi-analytically. If the equations of motion are not known, our equations can be used to measure how much information per unit of time is shared between two data sets. We validate the theory with physical experiments.
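The abstract's central idea, bounding an information rate by Lyapunov exponents instead of probabilities, can be illustrated with a minimal numerical sketch. The code below is not the paper's method: the two coupled logistic maps, the coupling value, and the specific bound formed from the difference of the two exponents are all assumptions for illustration. It estimates the Lyapunov exponents of a two-node network with the standard QR (Benettin) procedure and forms a Ruelle-style nonnegative upper bound from them.

```python
# Illustrative sketch only (assumed example system, not the paper's setup):
# two mutually coupled logistic maps, Lyapunov exponents via QR iteration,
# and a hypothetical MIR upper bound taken as lambda_1 - lambda_2.
import numpy as np

def lyapunov_exponents(c=0.05, n=20000, transient=1000):
    """Lyapunov exponents of two mutually coupled logistic maps."""
    f = lambda x: 4.0 * x * (1.0 - x)      # fully chaotic logistic map
    df = lambda x: 4.0 - 8.0 * x           # its derivative
    x, y = 0.3, 0.6
    Q = np.eye(2)                          # orthonormal tangent frame
    logs = np.zeros(2)
    for i in range(transient + n):
        # Jacobian of the coupled map at the current state
        J = np.array([[(1 - c) * df(x), c * df(y)],
                      [c * df(x), (1 - c) * df(y)]])
        x, y = (1 - c) * f(x) + c * f(y), (1 - c) * f(y) + c * f(x)
        Q, R = np.linalg.qr(J @ Q)         # re-orthonormalize tangent vectors
        if i >= transient:                 # accumulate growth rates after transient
            logs += np.log(np.abs(np.diag(R)))
    return np.sort(logs / n)[::-1]         # sorted so lam1 >= lam2

lam1, lam2 = lyapunov_exponents()
mir_upper = max(lam1 - lam2, 0.0)          # illustrative (assumed) upper bound
print(lam1, lam2, mir_upper)
```

For weak coupling the largest exponent stays close to ln 2 (the uncoupled logistic-map value), so the bound is positive; as the maps synchronize the exponents approach each other and the bound shrinks, matching the intuition that the exchanged information is limited by how differently the two trajectories expand.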
Murilo S. Baptista
Celso Grebogi
E. R. V. Junior
Ulrich Parlitz
R. M. Rubinger