Sharp Convergence Rate and Support Consistency of Multiple Kernel Learning with Sparse and Dense Regularization

Statistics – Machine Learning

Scientific paper


Details

26 pages, 1 figure


We theoretically investigate the convergence rate and support consistency (i.e., correctly identifying the subset of non-zero coefficients in the large sample limit) of multiple kernel learning (MKL). We focus on MKL with block-l1 regularization (inducing a sparse kernel combination), block-l2 regularization (inducing a uniform kernel combination), and elastic-net regularization (combining both block-l1 and block-l2 regularization). For the case where the true kernel combination is sparse, we show a sharper convergence rate for the block-l1 and elastic-net MKL methods than the existing rate for block-l1 MKL. We further show that elastic-net MKL requires a milder condition than block-l1 MKL to be support consistent. For the case where the optimal kernel combination is not exactly sparse, we prove that elastic-net MKL can achieve a faster convergence rate than the block-l1 and block-l2 MKL methods by carefully controlling the balance between the block-l1 and block-l2 regularizers. Overall, our theoretical results suggest the use of elastic-net regularization in MKL.
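To make the regularizers concrete, here is a minimal sketch (not the paper's implementation) of the elastic-net MKL penalty described above: given coefficient blocks f_1, ..., f_M, one per candidate kernel, it mixes a block-l1 term (sum of block norms, inducing sparsity) and a block-l2 term (sum of squared block norms, inducing a dense combination). The function name and the regularization parameters `lam1`, `lam2` are illustrative assumptions.

```python
import numpy as np

def elastic_net_mkl_penalty(blocks, lam1, lam2):
    """Elastic-net penalty over per-kernel coefficient blocks.

    blocks : list of 1-D arrays, one coefficient block per kernel
    lam1   : weight on the block-l1 term (sparsity-inducing)
    lam2   : weight on the block-l2 term (density-inducing)
    """
    norms = np.array([np.linalg.norm(b) for b in blocks])
    # block-l1: sum of block norms; block-l2: sum of squared block norms
    return lam1 * norms.sum() + lam2 * (norms ** 2).sum()
```

Setting `lam2 = 0` recovers the pure block-l1 (sparse) penalty, while `lam1 = 0` recovers the pure block-l2 (uniform) penalty; the abstract's point is that tuning the balance between the two yields the favorable rates.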

