Generalization of Jeffreys' divergence based priors for Bayesian hypothesis testing
Statistics – Methodology
Scientific paper
2008-01-28
Journal of the Royal Statistical Society, Series B (2008), vol. 70, pp. 981–1003
In this paper we introduce objective proper prior distributions for hypothesis testing and model selection based on measures of divergence between the competing models; we call them divergence-based (DB) priors. DB priors have simple forms and desirable properties, such as information (finite-sample) consistency, and they are often similar to other existing proposals such as the intrinsic priors; moreover, in normal linear model settings they exactly reproduce the Jeffreys–Zellner–Siow priors. Most importantly, in challenging scenarios such as irregular models and mixture models, the DB priors remain well defined and very reasonable, whereas alternative proposals are not. We derive approximations to the DB priors as well as MCMC and asymptotic expressions for the associated Bayes factors.
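As background (the abstract does not give the paper's exact construction, so the following is only an illustrative sketch in assumed notation): the Jeffreys divergence referenced in the title is the symmetrized Kullback–Leibler divergence between the competing densities, here written with $f_0$ the null density and $f_\theta$ the alternative,

\[
J(\theta) \;=\; \mathrm{KL}\left(f_0 \,\|\, f_\theta\right) + \mathrm{KL}\left(f_\theta \,\|\, f_0\right),
\qquad
\mathrm{KL}(p \,\|\, q) \;=\; \int p(y)\,\log\frac{p(y)}{q(y)}\,dy .
\]

A divergence-based prior is then a proper prior $\pi^{DB}(\theta)$ built from such a divergence (how $J(\theta)$ is scaled and converted into a proper density is specified in the paper itself). For a point null $M_0\!: f_0$ versus $M_1\!: f_\theta$ with prior $\pi^{DB}$, the resulting Bayes factor is the standard marginal-likelihood ratio

\[
B_{10}(y) \;=\; \frac{\int f_\theta(y)\,\pi^{DB}(\theta)\,d\theta}{f_0(y)} ,
\]

whose MCMC and asymptotic approximations are among the quantities the paper derives.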
M. J. Bayarri
Gonzalo García-Donato