Evaluation of Mutual Information Estimators for Time Series
Scientific paper
2009-04-30
Physics
Data Analysis, Statistics and Probability
47 pages, 4 tables, 36 figures
We study some of the most commonly used mutual information estimators, based on histograms of fixed or adaptive bin size, $k$-nearest neighbors and kernels, and focus on the optimal selection of their free parameters. We examine the consistency of the estimators (convergence to a stable value with increasing time series length) and the degree of deviation among them. The optimization of parameters is assessed by quantifying the deviation of the estimated mutual information from its true or asymptotic value as a function of the free parameter. Moreover, some commonly used criteria for parameter selection are evaluated for each estimator. The comparative study is based on Monte Carlo simulations on time series from several linear and nonlinear systems of different lengths and noise levels. The results show that the $k$-nearest neighbor estimator is the most stable and the least affected by its method-specific parameter. A data-adaptive criterion for optimal binning is suggested for linear systems, but it turns out to be rather conservative for nonlinear systems. The binning and kernel estimators give the smallest deviation in identifying the lag of the first minimum of mutual information for nonlinear systems, and remain stable in the presence of noise.
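To make the abstract concrete, the following is a minimal sketch (not the authors' code) of the simplest estimator the paper compares — a fixed-bin histogram estimate of mutual information — applied to the delayed mutual information $I(x_t; x_{t+\tau})$, whose first minimum over the lag $\tau$ is the criterion mentioned at the end of the abstract. The function names, the bin count of 16, and the noisy-sine test series are illustrative assumptions.

```python
import numpy as np

def mi_histogram(x, y, bins=16):
    """Fixed-bin histogram estimate of mutual information I(X;Y) in nats.

    The joint distribution is approximated by a 2-D histogram; marginals
    are obtained by summing the joint over rows/columns.
    """
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                      # normalize counts to probabilities
    px = pxy.sum(axis=1)                  # marginal of X
    py = pxy.sum(axis=0)                  # marginal of Y
    nz = pxy > 0                          # sum over occupied cells only, avoiding log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

def delayed_mi(x, lag, bins=16):
    """Mutual information between x(t) and x(t + lag)."""
    return mi_histogram(x[:-lag], x[lag:], bins=bins)

# Illustrative series: a noisy sine (assumed here, not from the paper).
rng = np.random.default_rng(0)
x = np.sin(0.3 * np.arange(5000)) + 0.1 * rng.standard_normal(5000)
print([round(delayed_mi(x, lag), 3) for lag in (1, 3, 5)])
```

Note the free parameter `bins`: as the abstract emphasizes, the estimate (and hence the location of the first minimum over lags) depends on such method-specific parameters, which is exactly what the paper's comparison quantifies.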
Dimitris Kugiumtzis
Angeliki Papana