Computer Science – Information Theory
Scientific paper
2007-06-11
IEEE Transactions on Information Theory, vol. 55, no. 2, Feb. 2009
The problem of hypothesis testing against independence for a Gauss-Markov random field (GMRF) is analyzed. Assuming an acyclic dependency graph, an expression for the log-likelihood ratio of detection is derived. Assuming random placement of nodes over a large region according to a Poisson or uniform distribution, with a nearest-neighbor dependency graph, the error exponent of the Neyman-Pearson detector is derived using large-deviations theory. The error exponent is expressed as a dependency-graph functional, and its limit is evaluated through a special law of large numbers for stabilizing graph functionals. The exponent is analyzed for different values of the variance ratio and correlation. It is found that a more correlated GMRF has a higher exponent at low values of the variance ratio, whereas the situation is reversed at high values of the variance ratio.
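For context, a minimal sketch of the underlying Gaussian detection problem (the notation $\boldsymbol{\Sigma}_0$, $\boldsymbol{\Sigma}_1$, $n$ is chosen here for illustration and is not taken from the paper): under $\mathcal{H}_0$ the $n$ observations $\mathbf{y}$ are zero-mean Gaussian with diagonal covariance $\boldsymbol{\Sigma}_0$ (independence), while under $\mathcal{H}_1$ they form a zero-mean GMRF with covariance $\boldsymbol{\Sigma}_1$. The log-likelihood ratio is then the standard Gaussian expression
\[
\log\frac{f_1(\mathbf{y})}{f_0(\mathbf{y})}
 = \tfrac{1}{2}\,\mathbf{y}^{\mathsf{T}}\bigl(\boldsymbol{\Sigma}_0^{-1}-\boldsymbol{\Sigma}_1^{-1}\bigr)\mathbf{y}
 + \tfrac{1}{2}\log\frac{\det\boldsymbol{\Sigma}_0}{\det\boldsymbol{\Sigma}_1},
\]
and, by the Chernoff-Stein lemma, the Neyman-Pearson miss probability under a fixed false-alarm constraint decays with an exponent governed by the normalized Kullback-Leibler divergence
\[
\frac{1}{n}\,D\bigl(f_0\,\|\,f_1\bigr)
 = \frac{1}{2n}\Bigl[\operatorname{tr}\bigl(\boldsymbol{\Sigma}_1^{-1}\boldsymbol{\Sigma}_0\bigr) - n
 + \log\frac{\det\boldsymbol{\Sigma}_1}{\det\boldsymbol{\Sigma}_0}\Bigr].
\]
The paper's contribution lies in evaluating the large-$n$ limit of such a quantity for randomly placed nodes by expressing it as a nearest-neighbor dependency-graph functional; the closed forms above are only the generic zero-mean Gaussian formulas, included here to fix notation.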
Animashree Anandkumar
Ananthram Swami
Lang Tong