Adaptive Mixture Methods Based on Bregman Divergences
Scientific paper
Computer Science – Learning
2012-03-20
Submitted to Digital Signal Processing, Elsevier
We investigate adaptive mixture methods that linearly combine the outputs of $m$ constituent filters running in parallel to model a desired signal. Using Bregman divergences, we obtain multiplicative updates to train the linear combination weights, either under an affine constraint or without any constraints. We use the unnormalized relative entropy and the relative entropy to define two different Bregman divergences, which yield an unnormalized exponentiated gradient update and a normalized exponentiated gradient update on the mixture weights, respectively. We then carry out the mean and mean-square transient analysis of these adaptive algorithms when they are used to combine the outputs of $m$ constituent filters. We illustrate the accuracy of our results and demonstrate the effectiveness of these updates for sparse mixture systems.
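As a rough illustration of the normalized exponentiated gradient update described above, the following NumPy sketch adapts the mixture weights of $m$ parallel "filter" outputs toward a sparse target mixture. All specifics (step size `eta`, the toy data model, the number of iterations) are illustrative assumptions, not taken from the paper; the unnormalized variant is obtained by simply omitting the renormalization step.

```python
import numpy as np

# Hedged sketch: normalized exponentiated gradient (EG) update on mixture
# weights combining m constituent filter outputs. The weights remain on the
# probability simplex, i.e. positive and summing to one (affine constraint).
rng = np.random.default_rng(0)
m, n_steps, eta = 3, 500, 0.5          # assumed parameters, not from the paper

true_w = np.array([0.7, 0.2, 0.1])     # sparse "oracle" mixture (toy model)
w = np.full(m, 1.0 / m)                # start from uniform weights

for _ in range(n_steps):
    x = rng.standard_normal(m)                      # m constituent filter outputs
    d = true_w @ x + 0.01 * rng.standard_normal()   # desired signal plus noise
    e = d - w @ x                                   # a-priori estimation error
    w = w * np.exp(eta * e * x)                     # multiplicative (EG) step
    w = w / w.sum()                                 # renormalize onto the simplex
                                                    # (skip this line for the
                                                    # unnormalized EGU variant)

print(np.round(w, 2))
```

Because the update is multiplicative, weights that start positive stay positive, and components matched to near-zero target weights decay quickly, which is why such updates suit sparse mixture systems.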
Mehmet A. Donmez, Huseyin A. Inan, Suleyman S. Kozat