PMOG: The projected mixture of Gaussians model with application to blind source separation
Scientific paper – Statistics – Machine Learning
2010-08-16
46 pages, 9 figures
We extend the mixture of Gaussians (MOG) model to the projected mixture of Gaussians (PMOG) model. In the PMOG model, we assume that q-dimensional input data points z_i are projected onto a q-dimensional vector w to give 1-D variables u_i, and that these projected variables u_i follow a 1-D MOG model. We maximize the likelihood of observing the u_i to estimate both the 1-D MOG parameters and the projection vector w. First, we derive an EM algorithm for estimating the PMOG model. Next, we show how the PMOG model can be applied to the problem of blind source separation (BSS). In contrast to conventional BSS, where an objective function based on an approximation to differential entropy is minimized, PMOG-based BSS directly minimizes the differential entropy of the projected sources by fitting a flexible MOG model in the projected 1-D space while simultaneously optimizing the projection vector w. The advantage of PMOG over conventional BSS algorithms is that it fits non-Gaussian source densities more flexibly, without the near-Gaussianity assumptions of conventional BSS, while remaining computationally feasible.
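Concretely, the model assumes u_i = w^T z_i with density p(u_i) = Σ_k π_k N(u_i; μ_k, σ_k²), so the fitted MOG log-likelihood acts as a plug-in surrogate for the (negative) differential entropy of the projection. The following is a minimal NumPy sketch of that alternating idea, not the paper's actual EM derivation: the function names (fit_1d_mog, pmog), the finite-difference update of w, and the toy data are illustrative assumptions standing in for the joint EM updates the paper derives.

```python
import numpy as np

def fit_1d_mog(u, K=3, n_iter=50, seed=0):
    """Fit a K-component 1-D mixture of Gaussians to u with plain EM."""
    rng = np.random.default_rng(seed)
    n = u.shape[0]
    pi = np.full(K, 1.0 / K)                    # mixing weights pi_k
    mu = rng.choice(u, size=K, replace=False)   # component means mu_k
    var = np.full(K, u.var() + 1e-6)            # component variances sigma_k^2
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] proportional to pi_k N(u_i; mu_k, var_k)
        d = u[:, None] - mu[None, :]
        logp = -0.5 * d**2 / var - 0.5 * np.log(2 * np.pi * var) + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form updates of weights, means, variances
        nk = r.sum(axis=0) + 1e-12
        pi, mu = nk / n, (r * u[:, None]).sum(axis=0) / nk
        var = (r * (u[:, None] - mu)**2).sum(axis=0) / nk + 1e-6
    return pi, mu, var

def mog_loglik(u, pi, mu, var):
    """Log-likelihood of u under a 1-D MOG (negative-entropy surrogate)."""
    d = u[:, None] - mu[None, :]
    p = np.exp(-0.5 * d**2 / var) / np.sqrt(2 * np.pi * var)
    return np.log((p * pi).sum(axis=1) + 1e-300).sum()

def pmog(Z, K=3, n_outer=20, step=0.1, eps=1e-5, seed=0):
    """Alternate between fitting a 1-D MOG to u = Z @ w and nudging the
    unit-norm projection w uphill in that MOG's likelihood."""
    rng = np.random.default_rng(seed)
    n, q = Z.shape
    w = rng.standard_normal(q)
    w /= np.linalg.norm(w)
    for _ in range(n_outer):
        pi, mu, var = fit_1d_mog(Z @ w, K=K, seed=seed)
        # Finite-difference gradient of the MOG log-likelihood w.r.t. w
        base = mog_loglik(Z @ w, pi, mu, var)
        g = np.array([(mog_loglik(Z @ (w + eps * np.eye(q)[j]), pi, mu, var)
                       - base) / eps for j in range(q)])
        w += step * g / (np.linalg.norm(g) + 1e-12)
        w /= np.linalg.norm(w)  # keep w on the unit sphere
    return w, (pi, mu, var)

# Toy usage: a bimodal (non-Gaussian) source mixed with Gaussian noise.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    s = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 0.5, 500)])
    noise = rng.normal(0, 1, 1000)
    Z = np.column_stack([s, noise]) @ rng.standard_normal((2, 2))
    w, _ = pmog(Z, K=2)
    print("estimated projection:", w)
```

As in standard projection-pursuit style BSS, the sketch assumes Z would in practice be whitened first, so that the unit-norm constraint on w fixes the variance of u and maximizing the fitted log-likelihood corresponds to minimizing the entropy estimate over directions.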